I am looking at buying a new monitor. The one I like is 75Hz but as far as I can see my graphics adapter only runs at 60Hz (Intel UHD Graphics 730). Can I assume any monitor will run at 60Hz? Am I missing something?
If you use EnTech Taiwan's MonInfo utility, you will find the monitor advertising all sorts of silly stuff.
The NVIDIA control panel, on the other hand, tends to toss out the less-than-direct configurations and sticks with the ones that have a reason to exist. This is a gamer monitor with 120Hz capability, and yet the NVIDIA panel only lists 60Hz and 120Hz as options. There's no 85Hz offered.
An "ordinary" monitor, won't go to 120Hz. Ordinary monitors would still be 60Hz today.
In the old days, 75Hz was achieved by throwing away 1 frame in five (60 = 4x15 versus 75 = 5x15, so one frame in five must be dropped to get down to 60Hz). To do 72Hz, another of the "bogus" values, you would toss 1 frame in six (60 = 5x12 versus 72 = 6x12, so one frame in six must be dropped to get down to 60Hz).
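That frame-dropping arithmetic can be sketched in a few lines of Python (a made-up illustration of the ratios, not any driver's actual code):

```python
from math import gcd

def dropped_frames(src_hz: int, dst_hz: int):
    """How many frames out of each group must be dropped to
    convert a src_hz stream down to dst_hz."""
    g = gcd(src_hz, dst_hz)
    group = src_hz // g           # frames per group at the source rate
    keep = dst_hz // g            # frames kept per group
    return group - keep, group    # (frames dropped, group size)

# 75Hz -> 60Hz: gcd is 15, groups of 5, drop 1 in 5
print(dropped_frames(75, 60))     # (1, 5)
# 72Hz -> 60Hz: gcd is 12, groups of 6, drop 1 in 6
print(dropped_frames(72, 60))     # (1, 6)
```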
Monitors also have two schemes (NVIDIA G-Sync and AMD FreeSync) for supporting the variable frame rates caused by performance variation in game rendering on a GPU. This allows better apparent smoothness. A video card supports one of the methods, so you have to match your video card to your monitor for the variable-refresh feature to work.
I would say "connect it up and it will work". Because, let us say that they were idiots, and it really was a 75Hz totally-inflexible panel. Think of the customer returns, and the expense for sellers, dealing with such a product. The product had bloody well work. It had better work with the all-too-common Intel GPU.
There have been a few brands which had "issues". You always check the reviews for a monitor to spot such a pattern. It might have been a couple of Samsung or LG monitors where, for some reason, the HDMI input did not seem to work. I expect most of those got returned, because nobody could figure out a "success formula" to trick it into working. I'm sure they tested it at the factory, and there's some "assumption" somebody got wrong along the way. But for the people who had the problem, returning it was the solution.
Normally, when debugging a DOA monitor, you connect two monitors, and use the good one for "steering". Configure the duff monitor as the "extension" one, while you play with the resolution and refresh choices.
Pretty well all will. Most are multiples of 30Hz now, and anything higher than 60Hz is generally gamer territory. For some reason this laptop (not a gamer one) does 90Hz; you can only tell the difference when rapidly slinging windows around with the mouse ...
LCD monitors tend to have a set of built-in profiles (width x height x refresh), unlike CRTs, where you could feed them anything and they would try to lock. The good news is that the computer reads the profiles the monitor says it can do and only offers you the ones both the computer and the monitor can handle.
So it is highly likely the monitor offers 1920x1080x60 (for example) as well as 1920x1080x75, and the computer can just pick the first one.
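The "offer only what both sides support" logic is just a set intersection. A sketch, with made-up mode lists rather than anything parsed from a real EDID:

```python
# Modes the GPU can generate (width, height, refresh) -- hypothetical
gpu_modes = {(1920, 1080, 60), (1920, 1080, 75), (3840, 2160, 60)}

# Modes the monitor advertises in its EDID -- hypothetical
monitor_modes = {(1920, 1080, 60), (1920, 1080, 75)}

# The OS offers only the intersection of the two lists
offered = sorted(gpu_modes & monitor_modes)
print(offered)   # [(1920, 1080, 60), (1920, 1080, 75)]
```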
Although it is quite possible the computer can actually generate the 75Hz profile when not in its maximum mode: e.g. if the computer can do 3840x2160x60, it may well also do 1920x1080x75. It's the total product (video bandwidth) that is the limiting factor, so the "up to 60Hz" limitation may only apply in the maximum mode the computer can generate.
But does your graphics card do 90Hz or is the monitor in reality operating at 60Hz? My present monitor is 60Hz. Could this be the reason that nothing above 60Hz is shown in the PC settings? You have certainly dissuaded me from buying the 165Hz model.
On Mon, 24 Jul 2023 16:19:46 +0100, Andy Burns snipped-for-privacy@andyburns.uk wrote: [snip]
Sorry to be a nuisance, but can I take it from this that for a 'standard' monitor anything up to 240Hz will be supported, and therefore 75Hz will be easy peasy?
For full-HD resolution you should be OK up to 120Hz, but don't you want higher than full HD? It depends on which interface (HDMI, DP, VGA) you're using, and on the specific spec version (1.4/2.0/2.1) of HDMI/DP.
240Hz is getting exotic and probably needs compression (Display Stream Compression).
I still think it's an odd-ball frequency; I'd want to see it listed by both the card and the monitor.
It can do *anything*, up to the metal limit of the cable.
You have to know the standard, like whether the DisplayPort version is 1.4 or 2.0. That's a thing you check on the GPU side.
If the standard is DP 1.4 and the spec on the side of the computer tin says "18Gbit/sec max datarate to cable", then that's a limit on the highest combination of settings the card can use.
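You can do that check with back-of-the-envelope arithmetic: pixels per second times three colour channels times bits per channel, compared against the cable rate. A rough sketch, using the 18Gbit/sec figure above as a hypothetical limit (the real calculation also includes blanking intervals, so actual figures run somewhat higher):

```python
def mode_bitrate_gbps(width: int, height: int, hz: int, bpc: int = 8) -> float:
    """Rough uncompressed datarate for a mode, in Gbit/s:
    pixels/sec x 3 colour channels x bits per channel.
    Ignores blanking intervals, so real figures run higher."""
    return width * height * hz * 3 * bpc / 1e9

LIMIT_GBPS = 18.0   # hypothetical cable-rate limit from the tin

print(mode_bitrate_gbps(3840, 2160, 60))    # ~11.9 -> fits under 18
print(mode_bitrate_gbps(3840, 2160, 120))   # ~23.9 -> does not fit
```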
A video card's horizontal resolution comes in multiples of 8. For example, 1360 and 1368 are "valid" divisible-by-8 values for screen width. Yet there are panels designed to 1366 width, and in that case the video card can generate 1360 or 1368.
Vertical is divisible by 2, which covers interlaced and progressive display options. The height could be 766, 768, or 770 as generated by the video card.
If a Silicon Image generator chip is used external to the GPU, its synthesizer can step in increments of one pixel in each direction and can do odd screen sizes like 1367x767. But hardly any company offers us such beauties.
The mode line is a series of registers in the video card. You can program any integer values in there and make any kind of display you want. But after the noise settles, you aim for your mode-line maths to be divisible by 8 horizontally and by 2 vertically.
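The divisibility rule amounts to snapping a requested size to the nearest values the card can generate. A hypothetical helper, for illustration only:

```python
def snap_mode(width: int, height: int) -> tuple:
    """Snap a requested mode to what a typical GPU can generate:
    width to the nearest multiple of 8, height to the nearest
    multiple of 2. (Hypothetical helper, not any driver's API.)"""
    w = round(width / 8) * 8
    h = round(height / 2) * 2
    return (w, h)

# The 1366x768 panel case from above: the card generates 1368 (or 1360)
print(snap_mode(1366, 768))   # (1368, 768)
print(snap_mode(1367, 767))   # (1368, 768)
```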
You can have any refresh you want.
Then, it is up to the monitor, to decide what response it will make, when you select something really really goofy. It can display "Out of Range" on the OSD for example. That's a way of saying "piss off with the goofy settings".
Summary: the video card is infinitely flexible (up to the wire limits of the standard it supports). EnTech Taiwan's PowerStrip program allowed entering custom mode lines into a GPU (in Windows).
LCD monitor has refresh limits. The monitor is likely to have a scaler inside (for non-native res).
Cable length and datarate selection can cause transmission errors on a too-long cable. Just because the video card can generate 18Gbit/sec does not mean the cable magically has to like it. This would not be an issue at 1920x1080.
When buying a 4K monitor, that's when you check your Intel GPU to see if it supports a 4K monitor at 30FPS or 60FPS. The stinky ones were only 30FPS. This would be embodied in the HDMI or DP standards version number, and that's one way to track it down. In general, they don't make this process easy.
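That version-number check can be distilled into a small lookup. The figures below are a rough guide drawn from the published HDMI/DisplayPort specs, assuming 8-bit colour and no compression; treat the table as an approximation:

```python
# Approximate max 3840x2160 refresh per port standard
# (8-bit colour, no compression; rough guide only)
MAX_4K_HZ = {
    "HDMI 1.4": 30,
    "HDMI 2.0": 60,
    "DP 1.2": 60,
    "DP 1.4": 120,
}

def supports_4k60(standard: str) -> bool:
    """True if the given port standard can drive a 4K monitor at 60Hz."""
    return MAX_4K_HZ.get(standard, 0) >= 60

print(supports_4k60("HDMI 1.4"))   # False -- one of the "stinky" 30FPS-only cases
print(supports_4k60("DP 1.2"))     # True
```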
Yes, it's all down to drivers, so I use the TV. I cannot see it, but people tell me it looks fine. It will connect via HDMI, but I use a little adaptor plugged into the VGA port, and amongst other things it seems to simulate what the PC is looking for and, on the other end, what the TV wants. Best of both worlds. Brian
I don't do gaming, so I don't have high demands. Most of the monitors at Argos seem to be 75Hz. I like the claims about eye protection (blue light), but maybe this applies to all decent monitors? I see my PC has a DP socket, but it seems that only the more expensive monitors support this feature. I would buy one if recommended, but it seems unnecessary for non-gamers with only one monitor.
You have something screwy with extensions? 'view source' should never be something a web page can open (unless it's a web page displaying text that happens to be HTML source, but that's not the browser's 'view source' function).
It opens normally for me in Firefox/Ubuntu. ASUS VA247HE is the model.
Apart from the problems with the Argos website, does this seem a reasonable choice of monitor for basic use (no gaming, minimal video editing)? My only concern is that it is HDMI not DP but I don't suppose this matters from what you said earlier. I like the idea of limiting the blue light (for eye care) and the three year warranty.