How does USB decide the charging rate? Especially if you just use a charging cable?

I thought the data line was used to signal what current the device requires and what the power supply or computer USB socket is capable of. But aren't charging cables missing the data lines?

Reply to
Commander Kinsey

This isn't the whole story, but it'll fill in what some of the other sites miss. USB-PD spec is another way to do it, instead of some of these other earlier attempts.
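
As a rough mental model, a USB-PD source advertises a menu of voltage/current offers over the CC wire and the sink requests one of them. A minimal sketch of that selection logic, in Python; the offer values below are illustrative examples, not the real PD message format:

    # Rough sketch of USB Power Delivery negotiation: the source advertises
    # voltage/current offers and the sink requests one.  Illustrative only;
    # real PD messages carry more fields and run over the CC wire.
    SOURCE_OFFERS = [       # hypothetical adapter capabilities
        (5.0, 3.0),         # 5 V at up to 3 A
        (9.0, 3.0),         # 9 V at up to 3 A
        (20.0, 2.25),       # 20 V at up to 2.25 A
    ]

    def pick_offer(offers, wanted_volts, wanted_amps):
        """Sink side: take the first offer that satisfies the request."""
        for volts, amps in offers:
            if volts == wanted_volts and amps >= wanted_amps:
                return volts, wanted_amps
        return None         # no match: fall back to plain 5 V default power

    print(pick_offer(SOURCE_OFFERS, 9.0, 2.0))   # -> (9.0, 2.0)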

formatting link
That makes it easier to understand why a cable might matter. As demonstrated here. It's too bad the chart does not have a column for the "suspected charging standard" used in each case. If they had included some names, it would make it easier for you to Google each (proprietary or otherwise) standard.

formatting link
Other cabling schemes may use more "active" means of signaling. "Passive" cabling (where there is no D+/D- continuity, but there are resistor straps on D+/D- at the cable's output end) is generally limited to lower currents.
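
In the passive case, the device is just reading DC levels (or a short) on D+ and D-. A sketch of that classification; the thresholds and current figures are placeholders, since the real schemes (BC 1.2, Apple, Samsung, etc.) each differ in detail:

    # Sketch of classifying a "dumb" charger from what the device sees on
    # D+ and D-.  Thresholds and current figures are placeholders; the real
    # schemes (BC 1.2, Apple, Samsung, ...) use different specific values.
    def classify_charger(d_plus_volts, d_minus_volts, d_lines_shorted):
        if d_lines_shorted:
            return "dedicated charging port, roughly 1.5 A class"
        if d_plus_volts > 2.0 and d_minus_volts > 2.0:
            return "proprietary strap, maybe 2 A class"
        if d_plus_volts > 0.25 or d_minus_volts > 0.25:
            return "proprietary strap, maybe 1 A class"
        return "no strap detected: assume 500 mA"

    print(classify_charger(2.7, 2.7, d_lines_shorted=False))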

If you have a 1 ampere wall adapter and you use a "charging cable that indicates 2 amperes", then when you plug your iPad into that, the iPad believes the 2 ampere resistor-strap indication, draws the 2 amperes, and the wall adapter shuts off on overcurrent. You cannot necessarily join just any old random cable between wall adapter and device; the combination must be intelligently selected by the person doing it.

When using one of the active cable standards, the hope is that the automation avoids some of these potential issues. But just because one end of your setup supports USB PD does not mean the other end does. You could even join two USB PD devices with *the wrong cable*.
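
To spell out the failure mode: with passive straps the device only sees the cable, never the adapter's real rating, so nothing enforces the lesser of the two. A toy illustration using the figures above:

    # Toy illustration of the mismatch described above: the device trusts
    # the cable's strap, not the adapter's actual rating.
    adapter_limit_a = 1.0    # what the wall adapter can really supply
    cable_strap_a   = 2.0    # what the cable's resistor strap advertises

    device_draw_a = cable_strap_a                       # device believes the strap
    safe_draw_a   = min(adapter_limit_a, cable_strap_a)

    if device_draw_a > adapter_limit_a:
        print("adapter overloaded -> shuts off on overcurrent")
    print("a safe combination would draw at most", safe_draw_a, "A")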

If you use a USB extension cable, with just the four wires and no resistors whatsoever, then I would expect the charging current to be one ampere or less.

I think it would take a significant number of web page URLs, to cover all the methods. Start Googling :-)

Paul

Reply to
Paul

There were some modems back in the days of USB 2 which drew about 570 mA, a bit over the 500 mA a USB 2 port is specified to supply. Some motherboards would shut the USB socket off when they did that.

Grrrr. My phone takes no more than 5 V at 1.1 A. It takes quite a while to charge if the CPU is running flat out on all 8 cores.

Reply to
Commander Kinsey

Surely nowadays the charger is in the device being charged, and is fed the 5 V (or whatever) directly? Lithium batteries are quite fussy about charging, particularly fast charging. The charge rate is specific to the battery, and the maximum current should typically be about 40-50% of the capacity in Ah. Once the battery voltage has reached 4.2 V/cell, the current is reduced to maintain 4.2 V until the current falls below a certain level, again depending on capacity.

The battery temperature must also be monitored, as the voltage depends on it, and also so charging can shut down if the temperature gets too high with a faulty battery. It makes sense to have all this stuff physically close to the battery.
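
That constant-current / constant-voltage behaviour is simple to sketch. The figures below (0.5C charge current, 4.2 V/cell, C/20 termination, 45 °C cut-off) are typical textbook values rather than anything from a particular phone:

    # Sketch of a lithium CC/CV charge controller, per the description above.
    # Figures (0.5C rate, 4.2 V/cell, C/20 termination, 45 degC cut-off) are
    # typical textbook values, not from any specific phone or charger IC.
    CAPACITY_AH   = 3.0
    CC_CURRENT_A  = 0.5 * CAPACITY_AH      # constant-current phase
    CV_VOLTAGE    = 4.2                    # per-cell limit
    TERM_CURRENT  = CAPACITY_AH / 20.0     # stop when current tapers to C/20
    MAX_TEMP_C    = 45.0

    def charge_step(cell_volts, cell_temp_c, taper_current_a):
        """Return (target_current_a, done) for one control-loop iteration."""
        if cell_temp_c > MAX_TEMP_C:
            return 0.0, True                  # fault: stop charging
        if cell_volts < CV_VOLTAGE:
            return CC_CURRENT_A, False        # CC phase: hold 0.5C
        if taper_current_a > TERM_CURRENT:
            return taper_current_a, False     # CV phase: hold 4.2 V, current tapers
        return 0.0, True                      # tapered below C/20: finished

    print(charge_step(3.9, 30.0, CC_CURRENT_A))   # -> (1.5, False): still in CC
    print(charge_step(4.2, 30.0, 0.1))            # -> (0.0, True): charge complete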

Reply to
Joe

Yes but I'm talking about how much current can come in at 5V from the source outside the device, and how the device and the source negotiate this.

Actually they're very simple, akin to lead acid. No nonsense about sensing delta-V like nickel ones. I charge lithium with a bench charger: you set the voltage to 4.2 and the current to 1C (as in 3 amps for a 3 Ah battery). There's no need to go slower than a 1 hour charge.

No, 100%.

Nah, you just need the current limited to 1C and the voltage limited to 4.2 V.

Reply to
Commander Kinsey

Ohm's law?

Reply to
JNugent

Nothing to do with it. For some reason Samsung have decided the input to the phone shall not exceed 1.1A. If the CPU and screen are using 750mA, the battery charges very slowly.

Reply to
Commander Kinsey

Really?

How did they manage to get Ohm's Law repealed?

Internal resistance / impedance (depending on whether it is an AC or DC circuit).

Reply to
JNugent

Why do you believe Ohm's law prevents it from drawing 2 amps? 1 amp for the battery to charge and 1 amp for the screen and CPU?

You know perfectly well it's DC. Why would they limit the socket to only be able to power the phone or the battery?

Reply to
Commander Kinsey

Do you understand Ohm's law?

What do you understand it to say?

Reply to
JNugent

The answer is likely to be that the phone electronics are powered only by the battery. There's no provision for them to be powered from the external power supply while the battery charges from the same source.

Not Ohm's law as such, just a constant-current feed to the battery, from which any current used by the phone is subtracted.
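
Either way, the arithmetic comes out the same: whatever the phone's electronics draw is taken off the fixed input budget before the battery sees any of it. A toy version using the figures quoted earlier in the thread:

    # Toy arithmetic for a fixed input-current budget: whatever the phone's
    # electronics draw is no longer available to charge the battery.
    # Figures are the ones quoted in this thread (1.1 A cap, ~0.75 A load).
    input_limit_a = 1.1        # Samsung's input cap, per the earlier post
    system_load_a = 0.75       # screen + CPU running flat out

    battery_charge_a = max(0.0, input_limit_a - system_load_a)
    print("current left for the battery:", round(battery_charge_a, 2), "A")  # -> 0.35 A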

Reply to
Joe

piss off

Reply to
Jim Stewart ...

Intelligent semiconductor circuits simply don't obey it.

Reply to
The Natural Philosopher

Do they reduce the supplied voltage, such that V, I and R stay in balance?

Reply to
JNugent

Why would Ohm's law limit anything to 1 amp if they've designed it properly?

Reply to
Commander Kinsey

ARGH!

How absurd. If the CPU is drawing half an amp, that half an amp obviously isn't going into the battery chemistry. It's being supplied by the external supply. They should be measuring the charging current right at the battery.

Reply to
Commander Kinsey
