
I've been a PC gamer for nearly 30 years, but I'd never seriously considered gaming on a TV until quite recently. Most of this, I think, is due to growing up in an era when TVs were flatly incompatible with monitors and vastly inferior to them in terms of resolution and overall image quality. Even before the advent of LCDs and flat-screen monitors, CRT-based displays outstripped the resolution of your average television.

Modern televisions are vastly more capable than the CRT-based sets of 20 years ago. Like PCs, they use HDMI, support the same resolutions (720p, 1080p, and 4K), and are often advertised as supporting refresh rates of 120-240Hz. While there are a handful of >60Hz monitors available on the market, these often command substantial premiums compared to regular old 60Hz displays. In theory, a 4K TV could make a great gaming display. The reality is considerably more complicated.

First, there's the issue of input lag. Input lag is the delay between pushing a button on a controller, mouse, or keyboard, and when the results of that action appear on-screen. While it's often discussed as an issue that impacts console gaming, anyone considering an HDTV for PC gaming will have to contend with this as well. One of the issues with HDTV gaming is that input lag on the best TVs is still higher than on top monitors. The fastest monitors add 9-10ms of input latency, while the best HDTVs are around 17-18ms.

To be clear, 17-18ms isn't bad at all (it rates as "Excellent" on DisplayLag.com's official ranking system), and if you aren't playing high-speed FPS or RTS titles, you might not notice higher input lag at all. Civilization doesn't exactly rely on fast-twitch gaming, after all. Plenty of TVs, however, don't even clear the 40ms bar that DisplayLag qualifies as "Great." Input lag can sometimes be improved by adjusting settings within the TV's various submenus, but this varies by model and manufacturer. The vast majority of manufacturers don't list or provide input lag information; it's something you typically have to check at third-party sites like DisplayLag.
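The tiers above can be sketched as a simple lookup. This is a rough illustration of the thresholds the article mentions (17-18ms rating "Excellent," 40ms as the cutoff for "Great"); the exact boundaries and tier names below are assumptions for the sake of the example, not DisplayLag's official table.

```python
# Rough input-lag tiers, loosely following the figures cited in the
# article: ~17-18 ms rates "Excellent," and 40 ms is the bar for
# "Great." The 20 ms boundary and the "Slower" label are illustrative
# assumptions, not DisplayLag.com's actual rating table.
def rate_input_lag(ms):
    if ms < 20:
        return "Excellent"
    elif ms < 40:
        return "Great"
    else:
        return "Slower"

print(rate_input_lag(17))  # a top-tier HDTV
print(rate_input_lag(45))  # misses the 40 ms bar
```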

Overscan

Next, there's the issue of overscan. Overscan refers to the practice of not displaying all of an available image on the actual screen. It's a holdover from the pre-LCD era, when there was no way to guarantee that every television would display content in precisely the same way. The solution to this problem was to zoom the final output slightly, creating a small border around the edges of the screen. Modern LCDs don't have much use for overscan, but it's still enabled by default on many displays. Whether or not you can disable it depends on what kind of TV you have; some older LCDs may not offer the option to disable overscan at all. Graphics cards from AMD and Nvidia can compensate for overscan in software, but this may result in less-than-ideal text and font rendering.
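To put a number on what overscan costs you: the zoom crops a fraction of the image off each axis. A commonly cited figure is around 5 percent, though the actual amount varies by TV, so the percentage in this sketch is an assumption for illustration.

```python
# How many pixels of the source image actually remain on-screen after
# overscan crops the edges. The default of 5% per axis is a commonly
# cited ballpark, not a universal value; real TVs vary.
def visible_resolution(width, height, overscan_pct=5.0):
    scale = 1 - overscan_pct / 100
    return round(width * scale), round(height * scale)

print(visible_resolution(1920, 1080))     # with ~5% overscan
print(visible_resolution(1920, 1080, 0))  # overscan disabled
```

At a nominal 5 percent, a 1080p signal loses roughly 96 columns and 54 rows of pixels, which is why desktop text rendered into the cropped area looks soft after the GPU rescales to compensate.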

Finally, while there are televisions that can actually achieve a 120Hz refresh rate, this varies by manufacturer. This article from CNET explains the rules of thumb for a number of companies and how to determine exactly what the refresh rate is.

No support for Adaptive Sync / FreeSync / G-Sync

This last point is aspirational, but if you've spent any time with a monitor that supports Adaptive Sync (that's the official VESA name for what AMD calls FreeSync) or G-Sync, you're aware of how useful the feature is for gaming, even when you're playing at 60 FPS. The lower the frame rate, the more FreeSync / Adaptive Sync helps, since ensuring smooth frame delivery becomes more important the longer the gap between each frame. For example, 30 FPS titles deliver one frame every 33.3ms, while 60 FPS titles deliver one frame every 16.6ms, assuming constant frame latency.
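The frame-time figures above follow from a one-line calculation: the interval between frames is just 1000ms divided by the frame rate. A quick sketch:

```python
# Frame interval in milliseconds for a given frame rate. At 30 FPS a
# new frame arrives every ~33.3 ms; at 60 FPS, every ~16.7 ms (often
# written as 16.6, truncating rather than rounding 1000/60).
def frame_interval_ms(fps):
    return 1000.0 / fps

print(round(frame_interval_ms(30), 1))  # per-frame budget at 30 FPS
print(round(frame_interval_ms(60), 1))  # per-frame budget at 60 FPS
```

The practical takeaway is that a single dropped or mistimed frame at 30 FPS produces a 33ms hitch, twice as long as the same hiccup at 60 FPS, which is why variable-refresh technology matters most at lower frame rates.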

An AMD slide explaining FreeSync / Adaptive Sync. Nvidia's G-Sync accomplishes the same thing.

I mention G-Sync since that's Nvidia's version of the same technology, though the company has never announced any interest in working with TV manufacturers to bring a G-Sync compatible panel to market. The AMD-backed VESA standard, Adaptive Sync, could theoretically be supported in future panels, but only if it's added to the HDMI specification. Right now, the latest version of HDMI, 2.0, doesn't support Adaptive Sync. HDMI 2.1 is still in the planning stage, which means TV sets that use this standard are still a few years away, best-case.

While supporting Adaptive Sync in HDMI 2.1 wouldn't solve input lag or overscan issues, it would improve the overall gaming experience on HDTVs that also addressed these problems. Both PCs and consoles would benefit from the feature, and it might even be possible to activate on current consoles, depending on exactly how Adaptive Sync was implemented and whether or not the GPUs inside the Xbox One and PlayStation 4 support it.

If you want to game on a big-screen TV, you'll need to plan your purchase carefully, and we recommend Googling specific model numbers of displays you're considering to see how others are getting on with the same hardware. Hopefully in the near future we'll see HDTVs adopting standards like Adaptive Sync / FreeSync, as well as some manufacturers explicitly courting the PC gaming crowd. Given that we're already seeing HDR support show up in early TVs and monitors, it'd be nice to see more cross-pollination between feature sets.

Next, read: How to buy the right video card. And check out our ExtremeTech Explains series for more in-depth coverage of today's hottest tech topics.