OT: 4k TV as PC monitor.

I recently 'fixed' my Dad's 40" TV and was testing it on my PC by using it instead of my two 24" LCD monitors (one vertical, one horizontal). It's a Panasonic TX-40CX680B TV.

Bloody hell, at 3840 x 2160 it's really pin sharp, and much better than my pair being both a larger area and continuous. I hadn't realised how good these were, and they're much cheaper than a similar sized dedicated monitor.

I don't do gaming, so speed isn't a problem, and I don't care about the TV part. I do electronics design and programming, so having some of layout, simulation, compiler, editor, datasheets and email etc open at the same time on a big screen is very useful.

He wants it back. I must have one.

So. Are they all pretty much equal? What's the best buy? 40" is good, is 43" worth the extra?

I've looked at what would seem to be more appropriate newsgroups, but they're full of s**te.

Cheers

Reply to
Clive Arthur

You say speed isn't important, but it's worth bearing in mind that some combinations of graphics card, interface cable and monitor will only do 4K at 30Hz. That's fine for a static image, but dragging windows isn't as smooth as 4K at 60Hz.

To get 60Hz, everything needs to support it: graphics card, cable and monitor, and you'll need HDMI 2.0 or DisplayPort.

One thing to watch out for is that not all applications support scaling properly. Even some of the Windows applications don't. This can make the text and dialogs look very small. I expect things will improve in this area over time, but they're not there yet.

Reply to
Caecilius

If the OP is getting "pin sharp" quality then that must mean that his computer is very capable of handling 4K.

Reply to
Bod

Also, there might be an argument for a screen with 4:4:4 chroma sampling, otherwise your text outlines won't be as smooth as they could be.

formatting link

Reply to
Adrian Caspersz

Well, I make it about 15.6% bigger in area.
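That figure is easy to sanity-check: for the same 16:9 aspect ratio, area scales with the square of the diagonal. A quick sketch:

```python
# Relative area of a 43" vs 40" screen (same aspect ratio).
# Area scales with the square of the diagonal.
increase = (43 / 40) ** 2 - 1
print(f"{increase:.1%}")  # -> 15.6%
```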

Reply to
Bob Eager

Whilst that is essentially true, the card in this PC is only HDMI 1.4 and *does* output 4k@60Hz (but only 4:2:0). Long discussion on various forums arguing whether it really is 60Hz or not - though that is what the TV reports the signal as.

Reply to
Lee

I only noticed this a couple of weeks ago. I'd filmed a friend's band using a Canon EOS M3. I'm not sure which track; I think it was this one, as the fringing was around the white shirts. It doesn't appear in the YouTube video, but I haven't watched it on my friend's TV, so I'm not sure.

formatting link

But I'd taken the SD card out and put it in a USB stick adapter to view it on the TV. It seemed OK: the sound was pretty good and the colour was OK, but I did notice a bit of fringing around the body where the lighting had hit the white shirt. I thought it was an interesting effect. But when I got home and viewed it on my Retina iMac, the fringing had all but disappeared and the colour was much better.

The TV was a Samsung, 40 or 43 inch, about 4 years old. I tried it on my own TV, an LG 43", which was better than the Samsung but not quite up to the iMac.

Of course, it could be that there isn't a suitable profile for whatever they had their TV set to (movies, sport or whatever), so maybe it could have been adjusted better.

If you have a good image (not you personally), a picture or video, it might be worth looking at it on a few different devices. I know there's quite a difference between computer monitors: my 24" LG at £170 looked pretty crappy for skin tones compared to even my old non-Retina iMac. The contrast just didn't seem right; I didn't adjust anything, but it was lacking that quality look you get from a proper monitor.

Another group I'm on, rec.photo.digital, is currently discussing wide-gamut monitors, sRGB etc.

Might be worth having a look; they don't all agree, of course.

Neither of the TVs was 4K, just HD.

As for size: I wanted a 40", but by the time I decided which one, it had gone out of production and there was a new 43" for around the same price. Now the Ultra HD 4Ks are out, I'm not sure how they'd fare compared to a proper monitor.

Reply to
whisky-dave

+1 The TV I'm currently using does 4:4:4 @60Hz but the card in the PC can only do 4:2:0 @60Hz over HDMI. Not usually too much of a problem except with red text on dark backgrounds. It's fine doing 4:4:4 @30Hz though, so I can always back it off when needed. A new card is on the list ;)
Reply to
Lee

I use my Hazro IPS PC monitor as my TV! It was an emergency arrangement in 2012 to watch the Olympics, and I have stuck with it. My Humax HD Fox T2 STB supplies the image, and my Philips 320 PC speakers give pretty good sound.

Most 4K TVs are TN screens; IPS is better. Support for deep colour is desirable.

The latest NEC monitors have a slot to take the arm3 card (the variety with just a backplane connector).

Reply to
Andrew

The problem I have is that widescreen is not suitable for PC use. I prefer one (preferably two or three side by side) 4:3 monitors. Since they no longer make those, an ultrawide is the only way forward (equivalent to two 4:3s side by side).

Reply to
James Wilkinson Sword

I would think someone who "does electronics design and programming, so having some of layout, simulation, compiler, editor, datasheets and email etc open at the same time" will have a good graphics card.

Reply to
James Wilkinson Sword

We've been using 40" 4K like this for a couple of years. They are very nice for CAD and general development work.

Caveats:

If you want 3840x2160 at 60Hz you need to pay careful attention to your GPU. Until recently AMD were better at this at the low end than NVIDIA. My current pick is the RX460, though I still have a few glitches under Linux.

30Hz is a lot easier.

Also make sure you have an HDMI 2.0 cable if doing 60Hz. You may need to enable DisplayPort 1.3 (monitors might have a setting that defaults to DP 1.2).

TVs have many kinds of picture-mangling - one Samsung has 'Football mode' that applies some horrible filtering, and it's still a mess even if you turn off all the settings. On Samsungs the trick is to rename the input to 'PC', which magically turns it into a dumb non-filtered display.

At the cheaper end, many of the panels are married with rubbish scaler electronics. That can be laggy and downsample chroma, which is horrible for coloured text. This image will test chroma downsampling:

formatting link
and how to test for lag:
formatting link
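If the linked test image isn't to hand, a similar pattern is easy to generate locally. This is a sketch (stdlib only, writing a plain binary PPM file; the filename is my own choice): alternating single-pixel red/black columns stay crisp at 4:4:4, but smear into pink-ish bands when the chroma channel is downsampled to 4:2:2 or 4:2:0.

```python
# Sketch: generate a chroma-subsampling test pattern as a binary PPM.
# Alternating 1px red/black columns: crisp at 4:4:4 chroma, visibly
# smeared when the display path downsamples chroma (4:2:2 / 4:2:0).
WIDTH, HEIGHT = 256, 64
RED, BLACK = bytes([255, 0, 0]), bytes([0, 0, 0])

row = b"".join(RED if x % 2 == 0 else BLACK for x in range(WIDTH))

with open("chroma_test.ppm", "wb") as f:
    f.write(b"P6\n%d %d\n255\n" % (WIDTH, HEIGHT))  # PPM header
    for _ in range(HEIGHT):
        f.write(row)
```

View it full-screen at native resolution (no scaling), otherwise the scaler itself will blur the columns regardless of the link's chroma mode.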

If it's a no-brand TV and only does Freeview, run away as those are some of the worst. For instance the US Seiki 39" was very popular but in the UK they put on a terrible scaler and SD Freeview tuner so it's awful. The low end of Bush and friends are likely similar. (ob d-i-y: I looked into it and it seems you can buy the better control PCB from China and swap boards. But I didn't think the price/risk tradeoff was worth it)

Lots of TVs come with smart-junk. You pay a bit more for a monitor, but it avoids the junk and usually gets more useful inputs (DisplayPort). Though lots of monitors have slightly awkward menu systems. Some have serial ports or remotes you could try to drive externally.

Our current model of choice is:

formatting link

- this looked pretty reasonable in our testing, though I haven't used it for long periods myself.

Theo

Reply to
Theo

This isn't an issue at 40", where the PPI is roughly the same as a 20" 1080p monitor. That's one of the benefits of going to a larger size - you don't need to adjust any scaling settings.

It's more of a pain on 27" 4K which can get uncomfortably small. The Mac/Raspberry Pi approach to this is better - render the screen at whatever resolution you're comfortable with, and then rescale with the GPU to the panel's native resolution. It's not a 1:1 mapping so not quite as sharp, but on high-DPI ('Retina') displays you don't notice.
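The pixel-density claim above can be checked directly: PPI is the diagonal pixel count divided by the diagonal size in inches. A quick sketch:

```python
import math

def ppi(w_px, h_px, diag_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(w_px, h_px) / diag_in

print(f'40" 4K:    {ppi(3840, 2160, 40):.0f} PPI')  # -> 110
print(f'20" 1080p: {ppi(1920, 1080, 20):.0f} PPI')  # -> 110
print(f'27" 4K:    {ppi(3840, 2160, 27):.0f} PPI')  # -> 163
```

So 40" 4K matches a 20" 1080p panel almost exactly, while 27" 4K is nearly 50% denser - hence the scaling pain.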

Theo

Reply to
Theo

Area yes, but no more pixels.

It all depends on your viewing distance. If you're going to sit it on a desk and use it as a monitor, the danger with a larger panel is you end up moving your head to see the corners.

For instance, I'm about 80cm away from the 40" panel in front of me and I can see the whole screen without turning my head, but some of it is using more peripheral vision. In contrast, turning to a 19" 4:3, it's entirely within my primary FOV. When using the 40" I tend to have my primary focus on only parts of the screen, and others are peripheral tasks (eg email).

In some cases there are apps (CAD etc) that need the full screen with important info at the extreme left/right edges. These are a bit tiring to work with.

A larger panel would make this worse - a smaller part of the panel would be in my FOV at any one time. I wouldn't want to do that unless there were more pixels to compensate.

Try it: cut out a piece of cardboard and pin some documents to it. Then see how you move your focus around.
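The head-turning point can be put in numbers: the horizontal angle a screen subtends at a given distance. A sketch (the function name is my own; assumes flat panels and the stated aspect ratios):

```python
import math

def horizontal_view_angle(diag_in, distance_cm, aspect=16 / 9):
    """Horizontal angle (degrees) subtended by a screen at a distance."""
    diag_cm = diag_in * 2.54
    width_cm = diag_cm * aspect / math.hypot(aspect, 1)
    return math.degrees(2 * math.atan(width_cm / 2 / distance_cm))

print(f'40" 16:9 at 80cm: {horizontal_view_angle(40, 80):.0f} deg')      # -> 58
print(f'19" 4:3 at 80cm:  {horizontal_view_angle(19, 80, 4 / 3):.0f} deg')  # -> 27
```

Roughly 58 degrees versus 27: the 19" sits comfortably inside central vision, while the 40" spills well into the periphery, which matches the experience described above.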

Theo

Reply to
Theo

Are there really graphics cards on sale today that can't do 4K @ 60Hz? My i7 3770 machine, which is now 4 years old, can manage 4K @ 30Hz with its own on-chip HD Graphics 4000. More recent CPUs will do 4K+ @ 60Hz.

Given it is digital signal transmission you either get all or nothing.

Corruption when a cable goes bad is random blocks/lines of wrong colour.

Why bother? You can get away with no graphics card these days and save a couple of hundred watts. The internal i5 & i7 graphics engine from HD 4000 onwards is perfectly adequate for electronics design and most other 2D work. It's clunky for gaming, but that is another matter.

formatting link

What you don't spend on the graphics card can be used to have a more potent CPU, ram or SSD according to taste.

Unless you want high end gaming and realtime 3D texture rendering a fancy graphics card is total overkill in a dedicated workstation for electronics design or programming. The only caveat is that it might be pushing it to drive a 4k display @ 60Hz on a 6th generation CPU. I think it may be interlaced 60Hz (ie. 30Hz) at the highest resolution.

The only question is would you want a 40" TV on your office desk. I honestly don't think I'd want anything bigger than 28". YMMV

You may also have to do something creative to make it adjustable (which is entirely in the spirit of UK DIY).

Reply to
Martin Brown

I have a 24 inch screen and when it is at a comfortable distance to read everything, I find I still have to move my head to see the corners.

One of the problems with a restricted field of vision!

Reply to
Bob Eager

Agreeing with most of what you said, but a 'better than onboard' video experience doesn't need a couple of hundred watts.

The per-watt performance of my latest Nvidia card at the bottom end is much, much better than the older cards.

It runs fanless.

Currently idling at 36°C.

Normally less than 20W...

GT710 or 720 - I forget which.

Reply to
The Natural Philosopher

The difficulty is in outputs. Either you need Displayport or HDMI 2.0. Only the latest generation of cards have HDMI 2.0. Displayport only features on mid-high end cards - most of the low end have VGA+DVI+HDMI 1.4, which will only do it with chroma downsampling.

Last time I looked the lowest NVIDIA with DP was a GTX750, which is heading towards hundred-pound territory.

I suspect the reason is that the lower end GPUs are old designs that filter down. Those old GPUs may be on an older process that doesn't support DP/HDMI 2.0. However CPUs are already on the latest process so it's not a problem to add them.

If your CPU supports 4K over DisplayPort and your display does too, you're fine. Only Kaby Lake CPUs support HDMI 2.0.

Theo

Reply to
Theo
