SOT: monitor out of range

I keep getting a message saying the monitor is out of range (when starting up PC). However, I have checked the display settings in the control panel and they are the same as the recommended settings (1920 x 1080 - 60 Hz). If I change the monitor to VGA then back again, it works fine. Any ideas?

Reply to
Scott

You're doing basically as well as I am on this issue :-)

I might shut down the computer and then get "Out of Range" at the next startup. In reality the signal is in range; it's the OSD microcontroller in the monitor that latches onto a transient condition and then refuses to re-analyze the perfectly good signal and bring the display back.

Obviously, power cycling the stupid thing (soft power button) or removing and replacing an input cable can fix it. But it shouldn't need us to be doing that.

VGA is RGBHV, with discrete horizontal-sync and vertical-sync pulses. VGA is analog, and your digital LCD monitor uses three ADCs to turn the R, G and B analog values into digital streams. In short order, RGBHV is fully digital inside the input connector chip.

HDMI is RGB plus CLK (four differential pairs), and the RGB is sent 8b/10b encoded (at least in the slowest flavour of the standard). Some of the codes are control codes, and a control code says "this is not a data byte, it's an out-of-band framing signal". That is how the RGB channels can carry sync without a separate wire for it. Once you get deep enough into the input connector chip, the semantic content of the signals is the same as VGA's. You would hope the out-of-range circuit checks the signal at this point in the circuit, and not before.
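To make that concrete, here is a toy model in Python (not the real TMDS bit patterns, just the idea) of how a receiver recovers DE, HSYNC and VSYNC from a single symbol stream, instead of needing separate sync wires the way VGA does:

from dataclasses import dataclass
from typing import List, Tuple, Union

@dataclass
class ControlSymbol:          # sent during blanking intervals
    hsync: bool
    vsync: bool

@dataclass
class DataSymbol:             # sent during the active picture area
    value: int                # one decoded 8-bit colour component

def decode_stream(symbols: List[Union[ControlSymbol, DataSymbol]]
                  ) -> List[Tuple[bool, bool, bool, int]]:
    """Return (de, hsync, vsync, pixel) per symbol, like a monitor's input chip."""
    out = []
    hsync = vsync = False
    for s in symbols:
        if isinstance(s, ControlSymbol):
            hsync, vsync = s.hsync, s.vsync       # sync carried "in-band"
            out.append((False, hsync, vsync, 0))  # DE low: blanking
        else:
            out.append((True, hsync, vsync, s.value))  # DE high: real pixels
    return out

# a blanking interval with an HSYNC pulse, then three pixels of a line
stream = [ControlSymbol(True, False), ControlSymbol(False, False),
          DataSymbol(0x80), DataSymbol(0x81), DataSymbol(0x82)]
for de, hs, vs, px in decode_stream(stream):
    print(f"DE={de} HSYNC={hs} VSYNC={vs} pixel=0x{px:02x}")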

What part of my model is wrong? I'm not sure. But since you and I are both seeing this, and we're not using the same monitor, it's a design/architecture problem of some sort.

Back in the WinXP era, the "legacy" NVidia control panel had an obscure setting which made the display "slightly too wide". Naturally this caused "Out Of Range", and for the life of me I could not figure out what purpose that setting was supposed to have. I've not seen that option on any newer setup. If you sat there with hands folded for 15 seconds, the interface would reapply the old settings and the screen came back.

There was one interval in time when the digital output of an NVidia card wasn't fast enough. The maximum pixel clock back in those days was supposed to be 165 MHz, but the output pooped out at around 135 MHz. NVidia initially put resolution restrictions in the driver to keep users from seeing their booboo, but the coder doing the driver made a hand-calculator mistake, and that did result in customers noticing shenanigans were involved. The real cause was silicon too slow for the application. Today the I/O is up around 20 Gbit/sec or so on some of these things, and you no longer see any sign of trouble like that. An FX5200, or maybe a 6200, was one of the last cards from that era with slow silicon. The limit prevented you from selecting 1920x1080 and had no other side effects. Maybe you could do 1600x1200 or so, max, on that DVI.
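The arithmetic behind those limits is simple enough: the pixel clock is horizontal total x vertical total x refresh rate, where the totals include the blanking intervals, not just the visible pixels. The numbers below are the common CEA-861 / VESA figures; whatever the old driver actually programmed may have differed slightly:

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    # totals include blanking, not just the visible pixels
    return h_total * v_total * refresh_hz / 1e6

# 1920x1080 @ 60 Hz, CEA-861 timing: 2200 x 1125 total
print(pixel_clock_mhz(2200, 1125, 60))   # 148.5 MHz
# 1600x1200 @ 60 Hz, VESA DMT timing: 2160 x 1250 total
print(pixel_clock_mhz(2160, 1250, 60))   # 162.0 MHz

So with standard blanking, 1920x1080 @ 60 Hz needs about 148.5 MHz, inside the 165 MHz spec but over a 135 MHz card. Reduced-blanking timings pull the requirement down (to roughly 130 MHz for 1600x1200), which is presumably how that mode could still squeak through on the slow silicon.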

For the most part, everything else has been working. Only the occasional Samsung or LG monitor stays black-screened when driven over HDMI. Whether that bug was ever fixed, I don't know.

Paul

Reply to
Paul

[snip]

Thanks for the detailed reply, which I missed a few days ago.

I wonder if it will eventually be resolved by a W11 update?

Reply to
Scott

PS Do you think an HDMI to VGA cable would solve the problem?

Reply to
Scott

I'm wondering if the initial video mode is one the monitor can't cope with. It gets its knickers in a twist and can't move on.

Does the monitor behave if you boot into BIOS? At what point does it give an "Out of Range" screen?

Reply to
Fredxx

Mine already uses an HDMI to VGA adapter, and it has the problem.

My video cards no longer have VGA, and if you see a DVI connector on a video card, it is now DVI-D rather than DVI-I. DVI-I used to include the analog VGA signals. But to please Hollywood by going digital-only, they can use HDCP on everything if they want. VGA was part of the older "analog hole" issue (it leaves the ability to copy a movie, oh my).

Paul

Reply to
Paul

Years ago I bought a micro motherboard that only had an HDMI output. I only had VGA screens. I bought a "so-called" VGA to HDMI cable, but it couldn't convert the analog VGA signal to the digital signal HDMI requires. Waste of money. eBay has several VGA to HDMI converters for under a fiver.

That would be my approach.

HTH

Reply to
pinnerite

Without going into the BIOS, I can see that it starts okay, with the basic text showing details of the machine, followed by the screen that shows the name of the manufacturer. The problem occurs when it gets to loading Windows 11. I have checked that the Windows settings are the same as the settings shown in the error message.

Reply to
Scott

On Wed, 24 May 2023 07:14:28 -0400, Paul wrote: [snip]

I thought that might be the case, as one of your earlier responses suggests the signal is the same anyway (AIUI). I think I'll just have to put up with the inconvenience - and hope it gets resolved by a Windows update - or buy a new monitor, which I am disinclined to do.

Reply to
Scott

I'm seeing this at BIOS level.

It's a side effect of the previous OS shutdown sequence.

It is just strange that the hardware reset does not clear stuff like this.

And one of the reasons I'm using HDMI is that the card has

HDMI DP DP DP DVI-D

and if I used a DisplayPort, when the system came up it insisted on driving the HDMI port and leaving the DisplayPort (with the monitor connected) black.

Once I saw that the card seemed to have its own preference, I switched to HDMI, and used an adapter after that to drive the VGA monitor.

I have HDMI to VGA and DP to VGA adapters (more than one), just for stuff like this (the "post-VGA era"). It is pretty hard to navigate all the shortcomings, and keep everything working.

The various options still rely on the EDID serial bus, which declares the supported resolutions and the native resolution. That information is passed straight back to the video card, which means the adapters have fewer things they can do to screw up the results.
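If you want to check what the monitor is actually declaring, the EDID is easy enough to dump and decode yourself. A minimal Python sketch, assuming a Linux box (the sysfs connector name below is just an example and varies per machine; the offsets are the standard EDID 1.3 layout):

from pathlib import Path

# example path only - the connector name (HDMI-A-1, DP-2, ...) varies
EDID_PATH = Path("/sys/class/drm/card0-HDMI-A-1/edid")

def preferred_mode(edid: bytes):
    assert edid[:8] == bytes.fromhex("00ffffffffffff00"), "not an EDID block"
    d = edid[54:72]                      # first 18-byte detailed timing descriptor
    clock_mhz = int.from_bytes(d[0:2], "little") / 100   # stored in 10 kHz units
    h_active = d[2] | ((d[4] & 0xF0) << 4)
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    return h_active, v_active, clock_mhz

w, h, clk = preferred_mode(EDID_PATH.read_bytes())
print(f"native mode: {w}x{h}, pixel clock {clk:.2f} MHz")

If the native mode reported there matches the 1920x1080 / 60 Hz the error screen complains about, that points the finger back at the card or the monitor firmware rather than the settings.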

The conversion from HDMI to VGA should be mostly mechanical: the HDMI link provides a pixel clock, and the converter just has to use three DACs to make the analog RGB; the trickier part is making H and V sync from DE. Still, making sync pulses is something just about any hardware designer should be able to figure out.

My conclusion, then, is that *something* is coming out of the video card which is not quite right. It's nothing that a $50,000 digital scope couldn't capture.

Paul

Reply to
Paul
