That's right, but my point was that you can create one using the existing ballast by doing nothing more than adding a cap. It is of course entirely necessary to calculate the cap value correctly :) You merely add enough capacitive impedance to cancel the inductive impedance, plus the right amount for the capacitive ballast. This is a simple way to convert used magnetic ballast fittings or ones you already have lying around.
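For anyone wanting to try it, the sum itself is straightforward. Here's a rough Python sketch of the arithmetic; the inductance and target reactance figures are made-up placeholders, so substitute your own ballast's measured values before ordering a cap.

    # Rough sketch of the cap-value calculation described above. The
    # inductance and target reactance are illustrative placeholders,
    # not figures for any real fitting.
    import math

    f = 50.0          # mains frequency, Hz (assuming a 50Hz supply)
    L = 1.2           # ballast choke inductance, henries (placeholder)
    X_target = 400.0  # desired capacitive ballast reactance, ohms
                      # (placeholder - chosen to set the tube current)

    X_L = 2 * math.pi * f * L        # inductive reactance to cancel
    X_C = X_L + X_target             # cap must cancel X_L and then some
    C = 1 / (2 * math.pi * f * X_C)  # series capacitance required

    print(f"X_L = {X_L:.0f} ohm, X_C = {X_C:.0f} ohm, C = {C*1e6:.1f} uF")

With those placeholder numbers it comes out at roughly 4uF, which is at least in the right ballpark for a fluorescent fitting.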
Likewise. At around 75Hz viewed straight on, I am not so much aware of flicker as such but become tired more quickly. Above that it improves and I don't notice a difference between 82 and 100Hz.
I've read that the rod cells (most dense around the edges of the eye) are more sensitive and are faster than the cone cells.
I notice a similar effect to John. Up to around 70Hz I can see flicker clearly in central vision and it's worse in peripheral vision. Whether this has to do with the cell sensitivities or because the brain is processing central vision images differently to peripheral vision - which is more about movement detection - I don't know.
Around 75Hz I am aware that the light is not steady, but I don't perceive it as obvious flicker. I will tend to tire quickly, however.
It's irrelevant on an LCD - the only source of flicker is the backlight, which is driven from an HF oscillator. The light-gates are set and stay that way till the next time they're refreshed, unlike a CRT, where there's a single spot of light and the phosphors have to keep glowing for a while and then rely on persistence of vision to avoid flickering.
They used to say that about the VESA standard refresh rate of 72Hz (which IIRC is the legal lower limit in the workplace now), but I can still just about sense flicker in centre field of view at 75Hz, and noticeably in my peripheral vision.
I expect it is just me being "unusual" though - since I often notice flicker on things like LED seven segment displays that to most normal people are steady.
It would be interesting to see if there was a noticeable difference between the sexes as well, since male and female image processing is quite different (we take far more notice of centre field of view information and much less of peripheral vision, except for movement or strobe detection).
I did an experiment a few years ago on an Amiga A4000 with a Cirrus Logic graphics card. The standard screen mode control software allows every aspect of the screen raster to be controlled in very precise detail (line and field scan rates, pixel dimensions, sync polarities etc). So basically you can create any screen mode, resolution, and colour depth you like alongside all the "standard" PC style modes (within the limits of the maximum bandwidth of the GPU).
I found I could identify quite accurately the point at which I could no longer see the flicker by ramping up the vertical refresh 1Hz at a time from the mid 70s. 80Hz was acceptable, and 82Hz was enough to ensure there was no detectable flicker in any situation.
Hence of all the default scan rates I will usually opt for 85Hz. At 1600x1200 on a 23" CRT that seems to work nicely.
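The ramp procedure itself is trivial to script, by the way. Here's a rough Python sketch of it; the mode switching is left to whatever screen mode tool your platform provides (the screen mode software on the Amiga, in my case), and the rate range is just a placeholder.

    # Rough sketch of the 1Hz ramp: the observer sets the display mode
    # by hand at each step and answers the prompt. The range below is
    # a placeholder - widen it as needed.
    def find_flicker_threshold(start_hz=75, stop_hz=95):
        for rate in range(start_hz, stop_hz + 1):
            seen = input(f"Set the display to {rate}Hz. Flicker visible? [y/n] ")
            if seen.strip().lower().startswith("n"):
                return rate  # first rate with no detectable flicker
        return None

    threshold = find_flicker_threshold()
    if threshold:
        print(f"No visible flicker from {threshold}Hz upward")
    else:
        print("Flicker still visible at the top of the range")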
I guess we must be suffering brain bloat then... damn, Windows comes to us all!
I find 50Hz TV very distracting - although nowhere near as bad as a computer image (high contrast images like computer desktops are *much* more objectionable in flicker terms than continuous tone images like TV).
So my main TV is 100Hz, or I can watch on my computer screen at its default rate. (I try and avoid the small set in the bedroom for anything more than the odd half hour though).