Poll - dimmer switches

Having discussed this rather too much recently, has anyone got any real-life data?

1) Have you replaced an ordinary switch with a dimmer switch?

2) Did you notice a reduction in the maximum brightness?

3) Did you, because of said reduction or for any other reason, increase the bulb wattage after fitting the dimmer?

4) What is your usage pattern for your dimmer-controlled lights? (e.g. mostly full brightness, mostly dimmed, half and half, etc.)

Reply to
John Rumm

I think all the FAQ needs to say is that dimming a filament lamp gives a drop in light output but a much _smaller_ drop in power consumption; the lamp's efficiency drops considerably.
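For a rough feel of the numbers, a minimal sketch assuming the usual rule-of-thumb scaling laws for filament lamps (light output roughly proportional to V^3.4, power draw to about V^1.6; the exact exponents vary by source, and a phase-cut dimmer isn't a clean voltage reduction, so the figures are indicative only):

  # Rule-of-thumb incandescent scaling: light ~ V^3.4, power ~ V^1.6.
  # Rough approximation only; a triac dimmer chops the waveform rather
  # than cleanly reducing the RMS voltage.
  def dimmed_lamp(voltage_fraction, rated_watts=100):
      """Estimate power draw and relative light output at reduced voltage."""
      power = rated_watts * voltage_fraction ** 1.6
      light = voltage_fraction ** 3.4   # fraction of full light output
      return power, light

  for frac in (1.0, 0.9, 0.8, 0.7):
      power, light = dimmed_lamp(frac)
      print(f"{frac:.0%} volts: ~{power:.0f} W in, ~{light:.0%} of full light")

So at roughly 90% volts you get about 70% of the light for about 85% of the power; the lumens per watt have dropped noticeably.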

So from an energy-saving point of view, the need for occasional extra light is best met by just that: extra lighting, rather than having filament lamps dimmed most of the time and then turned up full occasionally.

Extra lighting can be more lights on another switch, uplighters, over cabinet lights, spot lights, whatever.

Though for some rooms like bathrooms, the lights themselves aren't used so much anyway and there's less scope for adding extra lighting too.

cheers, Pete.

Reply to
Pete C

I've got about 20 in various places.

Yes.

No; the reduction is far less than changing bulb size, at least in the common wattages. More like changing from a GLS to a softone, etc.

They all get varied; that's the whole point of fitting a dimmer. But I'd point out I don't have an example of a room with a dimmer which just has a central light.

Reply to
Dave Plowman (News)

Yes

Slight

No; we bought the dimmers because the main lights were generally too bright most of the time.

Mostly dimmed to create ambient light of an evening.

Reply to
R D S

I guess you do want to go there then.

There's no way you're going to get useful data here, as the sample size is far too small.

Your proposition is that if you knock 5% off the light output, no one will uprate their lamps.

If this is true, then logically it should continue to hold true time after time, i.e. another 5% and no one will notice, and so on. With a large population, what happens is that people uprate their bulbs at various points along that downward path. There is no one place or zone where this uprating occurs; it's spread out. As the amount you reduce it by gets ever smaller, the number of people swapping bulb power gets ever smaller. But you don't end up with any gain.
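A toy way to see it, with made-up numbers purely for illustration: suppose each person has their own threshold of light loss beyond which they'd fit a bigger bulb. Any given reduction then tips some fraction of the population into uprating; a smaller reduction just tips fewer of them, it never tips nobody.

  # Toy illustration only: the threshold distribution below is assumed,
  # not measured. Each person uprates once the light loss exceeds their
  # personal threshold.
  import random

  random.seed(1)
  # Assume uprating thresholds spread uniformly between 2% and 40% light loss.
  population = [random.uniform(0.02, 0.40) for _ in range(10_000)]

  for loss in (0.05, 0.10, 0.20):
      uprated = sum(1 for threshold in population if loss > threshold)
      print(f"{loss:.0%} light loss -> {uprated / len(population):.0%} uprate")

With that assumed spread, even a 5% loss still catches a few percent of people, so the claimed saving leaks away rather than vanishing at some clean cut-off.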

This isn't really specifically a lighting question; it's more one of statistics, and it applies to lots of life situations. The place to ask for an expertly explained response would be a maths newsgroup.

Since you want data, FWIW I used to have dimmers, but never really used them to dim. One of those things that once seemed like a good idea but actually was pointless.

  1. Yes.
  2. There was no opportunity for comparison; the dimmers went in from the beginning of the lighting scheme.
  3. Total wattage chosen was on the high side in one room, 600 W IIRC, in the knowledge that dimmers were going in. Elsewhere no change. So yes in one case, no elsewhere.
  4. Always on full. Many lights were on more than one switch, and operating it like a switchbank was simply easier than using the dimmer. It meant no need to stop while walking by. At the time I knew nothing about efficacy and dimming, switchbanks or anything like that.

NT

Reply to
meow2222

It will give a feel for whether the practice of uprating bulb power is common, though.

Yup...

No, not at all; that would be daft...

The first 5%, yup; the next, maybe; after that it's going to be a problem for ever more people.

I was not suggesting that one would ever get that far though. I have changed a switch for a dimmer, and did not notice any change in brightness. The lamp lasts longer, and when it fails I replace it. At no point in that time do I think "hey this is not bright enough"[1].

Assuming any of them swap the lamp for a larger one (and that their reason for fitting the dimmer in the first place was not a deliberate plan from the outset to have brighter lighting available when needed on odd occasions).

That's why I asked for anecdotal remarks. I wanted to know if anyone had actually uprated a bulb due to the slight loss of light when run flat out through a dimmer, or as a result of the light output tailing off as the bulb aged. I have never done it, and I have never heard anyone comment on doing it either. Hence the question, out of genuine curiosity, not any desire to score points.

So aside from your one room that you over-lit in anticipation of letting the dimmer sort the problem (which you could legitimately cite as an example of a dimmer costing you more than if you had selected more optimal lighting in the first place), in most rooms you fell into the category where the dimmers either had no real effect or saved you a small amount. This was kind of what I was saying at the outset.

(obviously that one bright room probably wiped out any net savings by a large margin)

[1] In fact the only time I have been conscious of particularly poor incandescent lamp performance is at a mate's house in the early evening. It turned out, when I measured his supply voltage, that it was dropping to about 205V without much load in his own installation. That did bathe everything in a noticeably golden/yellow glow. I think he eventually persuaded the supplier to come and shift him to a less loaded phase, and uprate the supply cable he shared with a couple of other properties.
Reply to
John Rumm

It seems fitting more bulb power than needed is more common than I thought.

NT

Reply to
meow2222
