SOT: Car battery conundrum

In message , "Dave Plowman (News)" writes

In the olden days, the ammeter went in after the feed to the starter motor. If it had been directly in the battery lead, the massive starter current would have made it whang hard over, and probably burnt it out.

My first car, a 1953 Ford Prefect with an 1172cc 93E engine, had an ammeter as standard.

My second car, an Austin A40 Farina, had no ammeter (although it did have a starting handle!), so I used a knackered RF ammeter which had had its thermocouple burnt out (a common fate of RF ammeters). The actual meter was a standard end-zero 500uA (I think) moving coil meter, so I fiddled the zeroing adjustments to get it to centre-zero. I used a piece of large gauge copper wire as a shunt to take most of the current, and calibrated the meter by passing various currents of up to +/- 20A through the shunt (initially at 20A, while sliding the meter connections along the shunt until I got full-scale in either direction). I then soldered the connections to the shunt, and fitted it somewhere in the car wiring where the whole current passed (except, of course, the starter motor and solenoid currents).
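
Sliding the connections along the wire is effectively trimming the shunt to the right resistance. The sums, sketched in Python with an assumed coil resistance (the 500 ohm figure is a guess for illustration, not a measurement of that meter):

# Back-of-envelope sums for the shunt, assuming (for illustration
# only) a 500uA movement with a 500 ohm coil.
METER_FSD_A = 500e-6   # full-scale deflection of the movement
METER_R_OHM = 500.0    # assumed coil resistance
TARGET_FSD_A = 20.0    # wanted full-scale current through the shunt

v_fsd = METER_FSD_A * METER_R_OHM               # volts across movement at FSD
shunt_r = v_fsd / (TARGET_FSD_A - METER_FSD_A)  # shunt resistance needed
print(f"movement drops {v_fsd * 1000:.0f} mV at full scale")
print(f"shunt must be about {shunt_r * 1000:.1f} milliohm")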

Reply to
Ian Jackson

What's that got to do with anything?

Nor does it need to be made to anything remotely like micron accuracy. Wiring harnesses are made to fit a template, and come out very close to the same length every time.

But with modern automated & reliable charging it simply isn't necessary. It was necessary in the early days of motoring when dynamos began to appear on cars, with, I gather, no regulation whatever.

NT

Reply to
meow2222

Typo aside, an actual toroid(al transformer) is only any use with AC current. A Hall-effect sensor will use a magnetic ring (with no toroidal windings) with a gap to concentrate the magnetic flux through the Hall-effect element fitted into the gap.

The only problem with such a setup is auto-calibrating the zero current point[1], but I suppose this could be achieved by assuming a fairly well defined standby current when the ignition is first turned on, and having the meter function calibrate out this offset from zero.

This calibration scheme would require some form of control of the electrics to inhibit the use of all electrical loads for the brief period required to complete the calibration phase (50 to 100ms?).
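
A rough Python sketch of that auto-zero idea; read_hall_raw(), inhibit_loads() and restore_loads() are stand-ins for whatever the real vehicle electronics would provide, not a real API:

import time

def auto_zero(read_hall_raw, inhibit_loads, restore_loads,
              settle_s=0.02, samples=50):
    """Return the sensor's zero-current offset.

    Briefly inhibits all switchable loads (the 50-100ms window
    suggested above), lets transients settle, then averages the raw
    reading and treats that average as the zero point.
    """
    inhibit_loads()
    try:
        time.sleep(settle_s)        # let switching transients die away
        total = sum(read_hall_raw() for _ in range(samples))
    finally:
        restore_loads()
    return total / samples

# Later readings would become: amps = (read_hall_raw() - offset) * SCALE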

[1] All the DC clamp meters I've ever seen have a DC zero calibration button. I've just switched mine on to see how big the zero error is before pressing the calibrate button. It showed 103.1A on the 200.0A scale before I hit the zero calibrate button, which suggests to me that such Hall-effect sensing of the DC current to provide an ammeter feature will require some form of automated zero calibrate function.

Reply to
Johny B Good

Clamp meters would need calibration every time you used one. However, a fixed device wouldn't, as there are fewer variables that can't be designed out.

Reply to
dennis

It was only regulated by the feeble amount of DC that the traditional dynamo could supply. Charging at an approximately fixed voltage through a simple series resistance (in practice the resistance of the coils themselves), the battery charge current falls off as its terminal voltage rises. Most vented batteries can survive charge rates of C/20 or less almost indefinitely without suffering too much harm.
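
As a rough illustration in Python (the 7.5V dynamo output and 0.5 ohm series resistance are invented round numbers for a 6V system, not measurements from any real dynamo):

# How the charge current tapers as the battery voltage rises.
V_DYNAMO = 7.5   # assumed roughly fixed output of a 6V dynamo, volts
R_SERIES = 0.5   # assumed effective series resistance of the coils, ohms

for v_batt in (6.0, 6.5, 7.0, 7.3):
    i = (V_DYNAMO - v_batt) / R_SERIES
    print(f"battery at {v_batt} V -> charge current {i:.1f} A")
# The current falls from 3A towards zero as the battery comes up.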

Car batteries back then didn't last all that long anyway, and died even quicker when people forgot to top them up regularly with distilled water, or used tap water instead.

Regulation only really became necessary when powerful alternators took over the job of the humble dynamo as they could pack enough punch to seriously damage a battery by overcharging it. Dynamos had trouble keeping enough charge in the battery in winter if you only did short commuter journeys mostly in the dark.

Regards, Martin Brown

Reply to
Martin Brown

snipped-for-privacy@care2.com pretended :

Exactly, you don't even need anything special, no shunt, just a basic multi-meter. Connect one probe to the battery terminal, the other to wherever the main positive feed goes, and it will show a voltage depending upon the voltage drop over the length of the cable. The voltage drop is proportional to the current flow, but the reading will need some translation.
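
The 'translation' is just Ohm's law. A minimal sketch in Python, assuming (purely for illustration) a cable run of 2 milliohms:

CABLE_R_OHM = 0.002            # assumed resistance of the cable run

def amps_from_mv(mv_reading):
    """Convert a millivolt drop along the cable to amps (I = V/R)."""
    return (mv_reading / 1000.0) / CABLE_R_OHM

print(amps_from_mv(20.0))      # a 20mV drop -> 10A flowing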

Reply to
Harry Bloomfield

An ammeter is a gauge of some sort. The clue is in 'meter'.

Just about every car had an ammeter at one time. But not for many years. They tended to go out of fashion at the same time as alternators replaced dynamos. Probably because alternators are generally more reliable. And since they also tend to have a higher output, fitting an ammeter becomes even more expensive.

Reply to
Dave Plowman (News)

OK smartarse. Name one production car which used this method.

You don't need a sensor at all. Just measure the voltage drop along the cable. Easy peasy these days.

Reply to
Dave Plowman (News)

snipped-for-privacy@care2.com laid this down on his screen :

Oh, they did have regulation; not very accurate, but it sort of worked. It used an electro-mechanical system to control the field current.

Reply to
Harry Bloomfield

And would have even more massive cables to it. Rather a problem with dash mounted gauges.

But it wasn't just the starter motor which wasn't wired through the ammeter on some cars.

They were standard fitment on some much later than that. Last I can think of was the Rover P6 made up to the mid '70s. But there will likely be others later than that.

Smiths (and others) made aftermarket ammeters ready to go. Seems hardly worth going to that effort. But 30-0-30 would have been the norm for a car with the same electrics as an A40.

Reply to
Dave Plowman (News)

on 14/01/2015, Martin Brown supposed :

WRONG!

This is what they used from the very earliest days of cars...

formatting link

Dynamos were barely able to support the full load of the car when the car was moving. When stood ticking over in traffic, you had to reduce the load drastically if you expected to be stood for a while.

Reply to
Harry Bloomfield

The difference between theory and practice.

You've obviously never made a shunt. 'Very close' doesn't hack it with one.

All you're showing is a lack of knowledge of dynamo regulation systems. An unregulated dynamo of a suitable size to run all the car electrics would boil the battery dry in short order - if it survived that long.

Some pretty early cheap cars did sort of manage with a high/low charge switch. And a starting handle.

Reply to
Dave Plowman (News)

In message , Harry Bloomfield writes

First, choose the length of cable through which you are going to measure the current, then use your multimeter to measure its resistance. [Note: If, for whatever reason, there is any quiescent current drain from the battery, you will need to temporarily disconnect one of its terminals while you do it.] Then switch the multimeter to the mV range, and measure the voltage when the battery is charging or discharging (as required). Finally, use Ohm's law to calculate the current.
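
Putting invented example figures through those three steps:

# Ian's three steps with example figures: suppose the chosen cable
# measures 2.5 milliohm (step 1) and you later read 30mV across it
# while the battery is charging (step 2). Both numbers are made up
# for illustration; measure your own cable.
r_cable = 0.0025              # ohms, measured with the battery disconnected
v_drop = 0.030                # volts, read on the meter's mV range
i_charge = v_drop / r_cable   # step 3: Ohm's law, I = V/R
print(f"charging current is roughly {i_charge:.0f} A")   # ~12A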

Reply to
Ian Jackson

What electrics did you think a 1920 car had?

Reply to
Dave Plowman (News)

snipped-for-privacy@care2.com explained :

Sorry, but you are wrong on both counts. Maybe not flat out, but cars would be driven at steady fast speeds just as much then as now. Not quite continuously as on motorways now, but none the less...

In my younger days, my uncle had a garage and I took a great interest in the cars, some of which were 1920s and 1930s models still on the road. I never came across one without a regulator. During my early driving career, I also had a few cars using dynamos and regulators. A favourite trick of mine, which earned me a few bob, was converting them from positive to negative earth.

Reply to
Harry Bloomfield

You have to regulate any dynamo charging a battery - if only to limit the maximum current. And if you do that in the most basic way, you'll also reduce the current at other times regardless of demand.

Reply to
Dave Plowman (News)

Dynamos had fairly sophisticated electro-mechanical regulators for much of their life. They provided both constant voltage charging and maximum current limiting, as well as a necessary cutout to disconnect the dynamo when not charging.

Of course these days you'd do the job with electronics.
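
A loose sketch of how the three jobs of the old electro-mechanical box (voltage regulation, current limiting and the cut-out) might be expressed electronically; every name and setpoint here is invented for illustration, not taken from any real regulator:

V_SET = 14.4     # assumed target system voltage for a 12V system
I_MAX = 30.0     # assumed maximum allowed charge current
K = 0.05         # crude proportional gain on the field drive

def regulator_step(v_batt, i_charge, v_dynamo, field):
    """One control tick: return (new_field_drive, cutout_closed)."""
    # Cut-out: only connect the dynamo when it can actually charge.
    if v_dynamo <= v_batt:
        return 0.0, False
    # Current limit overrides the voltage regulator.
    if i_charge > I_MAX:
        field -= K * (i_charge - I_MAX)
    else:
        field += K * (V_SET - v_batt)
    return min(max(field, 0.0), 1.0), True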

Reply to
Dave Plowman (News)

Ian Jackson formulated the question :

Measuring such low values of resistance will be problematic. All you need is a pot to set the reading on the meter, and a known load with which to calibrate. It doesn't need to be spot on.

I'm quite lucky, my car has one main lead from starter to battery, then a second one from battery to main fuse panel. Put a meter across the ends of the latter and it gives a rough idea of charge / discharge.

Reply to
Harry Bloomfield

And will kill the battery in short order with plenty of tap water.

Reply to
Dave Plowman (News)

In message , Harry Bloomfield writes

In my 6V 1953 Ford Prefect, I carefully adjusted the regulator so that with a fairly well-charged battery, the charging current was zero when driving at a reasonable speed on dipped headlights (2 x 40W).

If I switched to full beam (2 x 60W), there was initially a discharge of (say) 5A, but the meter soon crept up to around zero.

If I then switched back to dipped beam, the charge was initially around 5A, but soon dropped to zero.

I had additional electrical equipment in the car - an add-on rear window heater and a box heater with fan for the front passenger's legs. These were rarely used. However, I also had amateur radio equipment, which on transmit took around 10A - so at night I had to be rather careful about how much load I was taking. Nevertheless, I can't recall ever finding myself with a flat battery - and if I did, there was always the starting handle!

Reply to
Ian Jackson
