I’ve come to notice that many medical-use LCD monitors have RGB, composite, component, and s-video inputs. I know that historically 240p gaming sucks on LCD displays, but I’m wondering if a medical-use LCD would be different. My logic is as follows:
- Medical monitors should have high color accuracy and low input lag
- They should not stretch or distort images
- They need to be backwards-compatible with older input devices
Here are a few examples of models which have RGB inputs:
Has anybody experimented with this? What are the possible cons?
It wouldn’t be better, or worse, than a quality computer LCD.
Professional monitors are calibrated for a specific use, usually alongside specific companion hardware. There’s no guarantee they’d have the same characteristics (rendering fidelity) with other hardware.
As for the cons, they’re the same as any LCD really. Anything that isn’t at its native res gets scaled, there’s a processing delay from the scaling, square pixels mean some analogue images will have their aspect ratio distorted, etc.
Don’t go for it just because it has native RGB. Modern European TVs still have a Scart input, but I wouldn’t recommend them to anyone.
If you’re looking for a means to display RGB but SCART wasn’t a standard on your continent, I’d suggest looking at solutions that involve component. Properly encoded, it has the exact same quality as RGB, meaning an SD CRT TV with component inputs will have the same display quality as a European TV. And it’s damn good quality.
Thanks for your response. I’m already playing my games on a consumer CRT via a SCART to component converter and I love the quality. I was just curious if any benefit could be derived from a medical use LCD but I’m sure you’re right about the fixed aspect ratio being an issue. I’m sure the upscaling hardware in them still isn’t on par with an OSSC.
It depends on the panel. See www.panelook.com for details of any LCD panel.
I’ve read that the Sony BVM LCDs are not as great as you’d expect. I’d like to try one myself but it’s way down the list.
I’ve heard good things about the picture on newer monitors with black frame insertion to minimize motion blur. Not sure what lag is like on those though. High lag for a retro game (say 5-7 frames) is still pretty low if you are using it as a remote monitor for a camera or something, so I don’t know how much they prioritize it in the film/medical field.
It still astounds me how much lag modern consoles like the PS4 have in general; it’s jarring when you come from an old console or a MiSTer and a CRT.
At the low refresh rates that retro games run at (generally 60 Hz), BFI isn’t going to do you much good, because the monitor won’t be blurring the image much anyway.
The latency of most good modern monitors is well within the range of CRT speeds now. A good monitor will advertise 1ms response, but in reality it’ll be somewhere in the 4-8ms range on the best overdrive setting.
Considering a CRT will only be drawing the screen once every 16ms, it’s safe to say we’re finally at the point where CRTs are being outperformed in the latency dept.
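Since the thread mixes lag figures given in frames with figures given in milliseconds, here’s a quick back-of-envelope sketch of the conversion at a 60 Hz refresh (the helper names are just illustrative):

```python
# Back-of-envelope latency math: converting between frames and milliseconds.

def frame_time_ms(refresh_hz: float) -> float:
    """Time to draw one full frame at the given refresh rate."""
    return 1000.0 / refresh_hz

def frames_to_ms(frames: float, refresh_hz: float = 60.0) -> float:
    """Convert a lag figure expressed in frames into milliseconds."""
    return frames * frame_time_ms(refresh_hz)

print(f"{frame_time_ms(60):.1f} ms per frame at 60 Hz")  # 16.7 ms
print(f"5 frames of lag = {frames_to_ms(5):.0f} ms")     # 83 ms
print(f"7 frames of lag = {frames_to_ms(7):.0f} ms")     # 117 ms
```

So the “16ms” figure for a CRT is really one 60 Hz frame period, and a 5–7 frame lag works out to roughly 83–117 ms.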
That’s not to say that a CRT is worse though. For one, the way they draw the screen is completely different, and you also have to factor in things like bloom, image retention, black levels etc. I still much prefer a CRT for anything 4:3 ratio.
The other thing about black frame insertion is that for modern games, you generally cannot use it with variable refresh rate. You end up having to choose between them. The one exception I know of is the Asus TUF VG27AQ. Personally, I’d rather have variable refresh rate than BFI.
I think the inherent latency of modern consoles is slightly higher than PCs, but for the most part the lag you feel in comparison to a CRT is probably the television. Even on game mode, most LCD TVs have horrible latency. Even the best on the market are usually in the 15–25 ms range (which is fine), while the worst can be upwards of 100 ms!