Why Your Next TV Needs HDMI 2.1

Image: iStock/Kameleon007

This article is sponsored by National Product Review.

With a new generation of consoles now just over the horizon, you may be hearing some chatter around HDMI 2.1, so what’s all that about?

If you’re super new to TV tech, HDMI stands for High-Definition Multimedia Interface, which is an audio/video interface for transferring data. For the layman, it’s the connection a TV or monitor uses to talk to consoles, DVD players, etc. The HDMI version in use dictates the maximum resolution and refresh rate you can view.

Most current TVs use HDMI 2.0, which tops out at 4K resolution at 60 frames per second (FPS). That’s fine for current-gen gaming, but it won’t deliver the full advantages of the Xbox Series X, PlayStation 5 or an RTX 3080 if you’re hooking up a PC.

With HDMI 2.1, you can access a 4K resolution at 120 FPS, along with some other handy features. It can automatically set your TV to game mode thanks to something called Auto Low Latency Mode (ALLM), which kicks in when a console or graphics card is detected and reduces input lag.
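
If you’re wondering why a new HDMI spec is needed for that jump, a quick back-of-envelope calculation helps. This is only a rough sketch, assuming uncompressed 10-bit colour and ignoring blanking intervals, link encoding and compression, but it shows how quickly the raw pixel data adds up:

```python
# Rough uncompressed data-rate estimate for a given resolution and refresh
# rate. Ignores blanking intervals, link-encoding overhead and compression
# (e.g. DSC), so treat these as ballpark figures only.

def data_rate_gbps(width, height, refresh_hz, bits_per_pixel=30):
    """Raw pixel data rate in gigabits per second (10-bit colour assumed)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 4K at 60 Hz sits comfortably within HDMI 2.0's roughly 18 Gbps of bandwidth...
print(f"4K @ 60 Hz:  ~{data_rate_gbps(3840, 2160, 60):.0f} Gbps")   # ~15 Gbps

# ...but 4K at 120 Hz needs about double that, which is where
# HDMI 2.1's 48 Gbps link comes in.
print(f"4K @ 120 Hz: ~{data_rate_gbps(3840, 2160, 120):.0f} Gbps")  # ~30 Gbps
```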

But most importantly, HDMI 2.1 supports Variable Refresh Rate, which is crucial when it comes to preventing screen tearing: a jarring visual defect that occurs when a display is out of sync with the video source.

The Variable Refresh Rate of HDMI 2.1 allows the TV to match the console or PC’s frame rate so you get smooth visuals with no tearing. Lastly, it also supports eARC, or Enhanced Audio Return Channel, which provides higher levels of audio bandwidth for soundbars or other audio setups.
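
To give a rough sense of what VRR is doing, here’s a toy sketch with made-up frame times (not how any TV actually implements it). On a fixed 60 Hz panel, a frame that isn’t ready when the screen refreshes gets repeated or torn; a VRR panel simply refreshes whenever the frame arrives.

```python
# Toy illustration of a fixed 60 Hz refresh versus VRR.
# The per-frame render times below are made-up numbers.

FIXED_REFRESH_MS = 1000 / 60  # ~16.7 ms between refreshes on a 60 Hz panel

hypothetical_frame_times_ms = [14.2, 18.9, 21.4, 15.1, 17.3]

for t in hypothetical_frame_times_ms:
    if t <= FIXED_REFRESH_MS:
        fixed = "ready in time, displayed cleanly"
    else:
        fixed = "missed the refresh: repeated frame or tearing"
    print(f"frame took {t:4.1f} ms | fixed 60 Hz: {fixed} | VRR: shown as soon as it's done")
```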

Of course, you’ll still be able to play next-gen games on TVs or monitors with older HDMI ports, but it might not look or feel as good as it’s meant to. There aren’t a lot of TVs with HDMI 2.1 on the market right now, but you can absolutely pick one up if you’re keen.

Samsung’s Q70T, Q80T and Q95T series all feature HDMI 2.1, but if you’re really looking to future-proof yourself, the Samsung Q800T 8K TV will sort you out for a good while. It features 8K AI upscaling that can restore picture detail not present in the source, Anti-Glare and Direct Full Array Elite technology for better contrast and more depth even in bright rooms, and Object Tracking Sound technology for accurate 3D sound placement.

A worthy upgrade for the next generation of gaming. You can read more about the TV here.

Comments

    • I’m kind of curious about what you’d suggest as an alternative. Outside of the pro space with things like SDI, everything supports optional over the wire encryption. Switching to DisplayPort isn’t going to change that, and any new non-DRM standard would need to have clear benefits over unencrypted HDMI or DisplayPort.

  • Since I only just got a new 4K TV to replace my old 1080p one that stopped working, my PS5 will just have to settle for 60fps for now.

  • This article seems a bit backwards. Presumably there are going to be television sets that have HDMI 2.1 inputs but don’t implement all of these features.

    If you want a television that supports variable refresh or automatic low latency mode, shop for TVs that advertise those features.

    • Atm it’s holding true in general that if a TV supports 2.1, it’ll support 4k/120 & VRR. ALLM is a bit more hit and miss, you’re right.

      Tbh, HDMI 2.1 is a fuckfest. Too open of a standard, with LG doing things like its CX panels not supporting 48 Gbps, Sony running into issues with its X9000H not having enough EDID space to support Dolby Vision @ 4k/120, and plenty of sets only coming with one or two ports that are 2.1.

  • Well I bought a Sony 4K TV 2 years ago and can’t realistically afford a new one.
    I don’t see many games going above 60fps at 4K anyway, so unless you’re some elitist PC user who spends 10 grand on a setup I don’t see this really affecting anyone.

    • 120 fps seems a bit crackers and pointless considering 4k 60 fps should be near perfection, and the difference is quite probably unnoticeable to the majority of people.

      And don’t get me started on the plethora of 8k 120 fps articles…

      • There are ways this can benefit lower frame rate content.

        As the display link has the capacity to transmit a lot more pixels, it also means that it will take less time to send all the pixels in a single frame, which in turn can reduce latency. This is the basis of HDMI 2.1’s “Quick Frame Transport”, where it runs the link at maximum bandwidth in a burst to send a frame as quickly as possible (rough numbers sketched below).

        The higher refresh rate can also correspond to more flexible variable refresh rate modes.
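
        To put rough numbers on the frame-transport point (a back-of-envelope sketch only, assuming an uncompressed 10-bit 4K frame and ignoring blanking and encoding overhead):

        ```python
        # Back-of-envelope: time to push one uncompressed 4K frame across the link.
        # 10-bit colour assumed; blanking and encoding overhead ignored.

        frame_bits = 3840 * 2160 * 30  # one 4K frame, 10 bits per colour channel

        for link_gbps in (18, 48):  # roughly HDMI 2.0 vs HDMI 2.1 bandwidth
            transfer_ms = frame_bits / (link_gbps * 1e9) * 1000
            print(f"{link_gbps} Gbps link: ~{transfer_ms:.1f} ms to send one frame")
        ```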

  • Unless you want to use your PSVR on the PS5, in which case there may not be a point, because you will lose the 2.1 when it’s passed through the PS5 (from what I understand). So either get used to cable swapping, or give up your 4K120 dreams until Sony releases PSVR2

  • Also be careful of what TV you get. A whole bunch said they support HDMI 2.1, but it was discovered it doesn’t actually work on a large number of them.

  • I have an older OLED and there’s no way I’d upgrade just for a new connection. If it is even possible to get 4k @ a steady 60fps I will be ecstatic. Not really one for throwing out predictions but I doubt in this gen that cutting edge games will do much better.

    120FPS is just…. the machines aren’t powerful enough to do it in any meaningful way. But I guess we’ll see, maybe I’ll be shown to be wrong in 5 or 6 years when they’re really tapping the entire thing to the last ounce of power.

    • I mean…they literally are. Dirt 5 runs at 1440p (might be 1080p) on the Series S at 120fps…and this isn’t even a game developed exclusively for next gen
