US power system

Much depends on the circumstances, I find. 50Hz lights don't seem to bother me particularly, but CRT flicker at anything less than about 82Hz really bugs me (as do modern electronic cats' eyes, some brake lights, many LED displays, etc.).

Reply to
John Rumm

I found with large CRT screens that 75Hz vertical was not quite enough.

82Hz was where I stopped noticing it.

As seen in many buses and tube trains...

Reply to
John Rumm

Don't think so. I think that was just a manifestation of the fact that component drift with temperature was worse back then. More recent CRT tellies didn't have them - or they could lock on better.

Reply to
Tim Streater

For B&W, high stability wasn't needed.

Reply to
charles

The latter. That and better temperature control inside the box due to using solid state components instead of valves.

Reply to
John Williamson

Surely if you're slightly off, the picture would fall off the edge of the screen?

Reply to
Uncle Peter

I didn't have to adjust it with temperature. Once adjusted, it was fine when cold and warm. It would drift out after a few months.

Reply to
Uncle Peter

I wasn't referring to a valve telly.

Reply to
Uncle Peter

Not if the DC was smoothed well.

Reply to
Uncle Peter

Do the sums. To adequately smooth the many kilowatts of lighting used in the average studio in those days would take Farads of capacitance and many Henrys of inductance. A single lamp could well be using about 5 kilowatts, and up to a dozen were in use for most shots.
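Doing those sums with some illustrative figures (the 110V supply voltage, 60Hz US mains, and 5% ripple target are my assumptions, not from the thread) shows why capacitances of the order of a Farad come out:

```python
# Rough sizing of a smoothing capacitor for DC studio lighting.
# Assumed figures for illustration: a 5 kW lamp on a 110 V DC supply
# derived from full-wave rectified 60 Hz mains, with a target
# peak-to-peak ripple of 5% of the supply voltage.

P = 5000.0          # lamp power, watts (from the thread)
V = 110.0           # supply voltage, volts (assumed)
f_ripple = 120.0    # ripple frequency: full-wave rectified 60 Hz mains
ripple_frac = 0.05  # allowed ripple as a fraction of V (assumed)

I = P / V                      # lamp current, roughly 45 A
dV = ripple_frac * V           # allowed ripple, roughly 5.5 V
C = I / (f_ripple * dV)        # C = I * dt / dV, with dt = 1 / f_ripple

print(f"current per lamp:      {I:.1f} A")
print(f"capacitance per lamp:  {C:.3f} F")
print(f"for a dozen lamps:     {12 * C:.2f} F")
```

On those assumptions a single lamp already wants tens of millifarads, and a dozen lamps push the total towards a Farad, which supports the "Farads of capacitance" claim above.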

But it wasn't a problem, and the lack of use for the studio in question probably had more to do with the quality of the 525 line cameras than the lighting flicker.

It was also, IIRC, better to convert from 625 lines PAL to 525 line NTSC than the other way round.

Reply to
John Williamson

IME, that seems to be true. However, when I was testing refresh rates with a 19 inch CRT monitor, I could just about discern the flicker at 75Hz when observing from the near periphery of my field of vision, but it wasn't troublesome otherwise.

I think I chose 85Hz (when the choices included up to 120Hz for the 1280 by 960 screen resolution I was using) simply to save unnecessary stress on the LOPT circuitry. A 60Hz refresh rate was too ghastly to bear on a CRT but perfectly fine on an LCD based display panel (as would be true for 17 and 25Hz refresh rates if we leave aside the need to satisfy moving image requirements).

I'm sure our household pets appreciate the change from CRT to LCD displays even more than we do (especially the flies). :-)

Except for the use of DC in the case of high intensity discharge lamps (sans a phosphor coating), I'd already pointed this out in yesterday's afternoon posting.

In which case, the DC ballast ought to introduce a swift polarity reversal every couple of hours' runtime, even if the lamp is only run for a matter of a few hours (it depends on what defines 'a few hours', though).

Reply to
Johny B Good

Don't overlook the fact that the flicker rate of a lamp will be twice the mains frequency, so all those 50Hz lights are actually flickering at 100Hz. Fluorescent lamps can introduce a 50Hz flicker component when the cathodes start wearing out. AFAICR from my "Lamps and Lighting", the imbalance only needs to be a mere 3% for it to become observable by most people.
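The doubling follows because the light output of a resistive lamp tracks instantaneous power, i.e. the square of the mains waveform, whose AC component sits at twice the mains frequency. A quick FFT check (a sketch using numpy, with the 50Hz supply from the discussion):

```python
import numpy as np

f_mains = 50.0                    # mains frequency, Hz
fs = 10000.0                      # sample rate, Hz
t = np.arange(0, 1.0, 1 / fs)     # one second of samples

# Instantaneous power in a resistive lamp is proportional to sin^2,
# and sin^2(wt) = (1 - cos(2wt)) / 2, so the flicker sits at 2 * f_mains.
power = np.sin(2 * np.pi * f_mains * t) ** 2

# Strip the DC level, then find the dominant spectral component.
spectrum = np.abs(np.fft.rfft(power - power.mean()))
freqs = np.fft.rfftfreq(len(power), 1 / fs)
peak = freqs[np.argmax(spectrum)]
print(f"dominant flicker component: {peak:.0f} Hz")  # 100 Hz, not 50
```

The same reasoning is why a worn cathode matters: once the two half-cycles become unequal, a component at the mains frequency itself reappears on top of the 100Hz flicker.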

Multiplexed displays (whether gas discharge or LED based) tend to have very low refresh rates, and the effect is magnified by the very low duty cycle of each digit, which on an eight digit display will be less than 12.5% per digit. Digital oven clocks are a prime example of this deficiency.
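The duty-cycle figure is simple arithmetic: each digit gets at most 1/N of the scan period, and inter-digit blanking eats into even that. A minimal sketch (the scan rate and blanking time are assumed figures, not from the thread):

```python
# Per-digit duty cycle of a multiplexed N-digit display. Each digit is
# lit for only 1/N of the full scan, minus any inter-digit blanking
# time, so an 8-digit display stays below 12.5% per digit.

digits = 8
scan_rate = 100.0      # full-display scans per second (assumed), Hz
blanking = 0.0001      # blanking time per digit, seconds (assumed)

period = 1.0 / scan_rate            # one full scan: 10 ms
slot = period / digits              # time slot per digit: 1.25 ms
duty = (slot - blanking) / period   # actual on-time fraction per digit

print(f"duty cycle per digit: {duty:.1%}")  # below the 12.5% ceiling
```

At a 100Hz scan rate each digit also only refreshes 100 times a second, which is why eye movement breaks the image up into the separate 'dots' described below.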

As for LED rear brake lights and electronic cats' eyes, the problem is also exacerbated by very low duty cycles and by their mainly being observed in the near peripheral vision in very dark conditions.

Wobble of the image on the retina can produce a confusing mess of 'dots' instead of a smear if the refresh frequency is too low. Provided the refresh rate is high enough, the resulting series of dots produces a 'join the dots' effect, a far less disconcerting facsimile of the smear we'd happily accept with continuous illumination.

I'd say you're not alone in observing the worst evils of such illumination.

Reply to
Johny B Good

You're following a red herring with that line of thought. The reason for locking to a wandering 50 or 60 Hz grid supply reference was to stop the hum bars moving, which rendered them invisible unless you watched the TV screen at the end of the programme day to detect the vertical change of shading in the mid grey tone of the raster scan.

If the TV broadcasters hadn't locked the camera scan rates to the mains frequency, the resulting moving hum bars would have detracted noticeably from the picture content.

By the time colour transmissions were introduced, the LOPT supply smoothing had significantly improved, so the moving hum bars could only be observed at the end of the programming day; the change in level due to less than perfect smoothing was too slight to be observable during reception of programme content.

Colour broadcasting required the line and vertical scan frequencies to be precisely locked to the colour burst reference, and so could no longer take advantage of the masking effect of 'locked hum bars'.

The TV set manufacturers were effectively forced to upgrade the smoothing performance in the power supplies as a direct result (it was not merely coincidence that the supplies were much better smoothed with colour TV sets).

Reply to
Johny B Good

So what? The percentage cost of the capacitance would be no different than doing it on a smaller scale.

This was observable when we saw USA newsfeeds on the BBC.

Reply to
Uncle Peter

No TV sets ever used the mains for synchronisation purposes. A TV set was designed to lock onto the sync pulses in the over-the-air broadcast signal. Since the cameras were locked to the mains frequency in pre-colour systems, the TV set would end up locked to the mains frequency automatically.

As regards the absence of front panel horizontal and vertical hold adjustment controls in modern sets, this is largely down to modern (only two decades or so old) digital sync extraction techniques and (possibly) digital control of the analogue components in the final line and vertical scan circuitry. These controls would still exist, but only as internal 'trimmer' controls on the circuit board for factory alignment and maintenance adjustments to counter drift in component values with ageing.

Reply to
Johny B Good

The same cameras were used for both 625/50 and 525/60.

On the basis that there was more information in the 625/50 signal? But remember that the full picture electronic standards converter didn't appear on the scene until well after colour had started in the UK. The Mexico Olympics (1968) was its first outing and that was only in the 525 > 625 mode.

Reply to
charles

TC Studio 1 had an installed lighting load of around a quarter of a megawatt.

You weren't seeing degradation introduced by standards conversion, you were seeing that the incoming pictures weren't very good.

Reply to
charles

There's also the problem of being able to control the level of the lamps.

Reply to
Dave Plowman (News)

I'll bet line up took ages when that switch was thrown. ;-)

Reply to
Dave Plowman (News)

LOPT supply smoothing? What do you mean by that, Johny?

Reply to
tony sayer
