replying to Dave Plowman (News), tahrey wrote: The method by which they produce light is very different from how tungsten filaments do it (direct photon emission, rather than having to heat a piece of wire until it glows), so I'd be surprised if their light output vs input power strays far from linear. If anything, they probably get a bit less efficient as you turn the brightness *up* (the greater dissipated power and greater current make more heat that has to be sunk, which then doubles down by making the semiconductors less efficient), and the LED+driver system as a whole may lose efficiency at low brightness depending on how the dimming is achieved (PWM switching vs a simple inline resistance) and on losses in the switchmode supply at marginal power draws (which is roughly analogous to the losses in a traditional dimmer switch itself, rather than in the bulb).
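To make the PWM point concrete, here's a minimal sketch; the 12W draw is an assumption, as is the near-lossless switching:

  # Minimal sketch: PWM dimming runs the LED at full current in short
  # bursts, so average power scales with duty cycle and, ideally, almost
  # nothing is burned off in the dimmer itself. The 12 W figure is assumed.
  full_power_w = 12.0
  for duty in (1.00, 0.50, 0.25):
      avg_w = full_power_w * duty
      print(f"duty {duty:4.0%}: average LED power {avg_w:5.2f} W")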
The thing with tungsten lamps is that once you start reducing the voltage below what they're rated for on the mains, you reduce the filament temperature and shift the emitted spectrum towards longer wavelengths; in other words, the ratio of light to heat swings further towards the heat end. The filament comes to no harm because the total emitted power is so much less, so less heat has to be dissipated even though it makes up a greater proportion of the total. (Similarly, attempts to improve efficiency and raise the colour temperature by increasing the voltage can only be taken a little way before the bulb fails: although the proportion of power dissipated as heat falls, the absolute amount still rises and ends up melting the filament.)
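To put rough numbers on that spectral shift, here's a sketch using Wien's displacement law (peak wavelength = 2.898e-3 m.K / T); the filament temperatures are assumed ballpark values, not measured ones:

  # Rough sketch of the spectrum shift via Wien's displacement law.
  # Filament temperatures are assumed ballpark values; visible light
  # spans roughly 380-750 nm, so both peaks sit in the infrared,
  # the dimmed one considerably deeper.
  WIEN_B = 2.898e-3  # Wien's displacement constant, m*K
  for label, temp_k in (("full voltage", 2700.0), ("dimmed", 2000.0)):
      peak_nm = WIEN_B / temp_k * 1e9
      print(f"{label}: filament ~{temp_k:.0f} K, peak emission ~{peak_nm:.0f} nm")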
So if you derate a 60W (consumption) incandescent bulb to 20W with a dimmer, it might go from producing 50W of heat and 10W of light (5:1, 16.7% efficient) to 17.5W of heat and 2.5W of light (7:1, 12.5% efficient; these aren't real-world figures, just plucked out of the air). Your 12W LED replacement starts out emitting 2W of heat and 10W of light; derating it by the same factor (i.e. to a third, 4W) should see it emitting 0.67W of heat and 3.33W of light, and to match the brightness of the dimmed incandescent you'd instead derate it to 3W (not counting transformer etc. losses), for 0.5W of heat and 2.5W of light.
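For anyone who wants to play with the arithmetic, here's a sketch that just reproduces those made-up splits:

  # Sketch reproducing the made-up figures above: each entry is
  # (description, total consumption in W, visible light output in W).
  cases = (
      ("60W incandescent",                60.0, 10.0),
      ("incandescent dimmed to 20W",      20.0, 2.5),
      ("12W LED",                         12.0, 10.0),
      ("LED derated to 4W (one third)",    4.0, 10.0 / 3),
      ("LED at 3W (matching brightness)",  3.0, 2.5),
  )
  for name, total_w, light_w in cases:
      heat_w = total_w - light_w
      print(f"{name}: {heat_w:.2f} W heat, {light_w:.2f} W light, "
            f"{light_w / total_w:.1%} efficient")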
The other clue is that the quality of light from an incandescent changes when you dim it: the colour shifts towards the red-orange end of the spectrum (and beyond that, deeper into the infrared). The colour of a dimmed LED stays constant, with no wavelength shift; the light just reduces in quantity. So no additional heat should be produced; in fact, as the current falls, the waste heat should drop in proportion to the useful output (i.e. visible light), with the remaining losses traceable more to the power supply.
There are efficient ways to convert mains power to something LEDs can use, and there are cheap, easy, compact ways to do it, and the overlap between the two isn't great, so efficiency should always be measured at the point of mains supply rather than at the output to the diodes. On the whole, though, even the iffiest dimmable LED should remain more efficient than an equivalent incandescent... otherwise you'd find an incandescent-like amount of heat coming off the electronics.
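As a last illustration of why the measurement point matters, here's a sketch extending the made-up numbers from earlier with an assumed driver efficiency:

  # Assumed figures only: 12 W reaching the diodes, 10 W of light out
  # (the made-up split above), plus an assumed 85%-efficient driver.
  diode_power_w = 12.0
  light_out_w = 10.0
  driver_eff = 0.85
  wall_power_w = diode_power_w / driver_eff
  print(f"at the diodes: {light_out_w / diode_power_w:.0%} efficient")
  print(f"at the wall:   {light_out_w / wall_power_w:.0%} efficient "
        f"({wall_power_w:.1f} W drawn)")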