How The New Sony Bravia X90J Stacks Up As A HDMI 2.1 Gaming TV

Photo: Kotaku Australia

Despite the PS5 and Xbox Series X being released way back in November 2020, TV options for making the most out of both consoles are still limited. Very few mainstream, affordable TV sets currently support the HDMI 2.1 standard's 4K @ 120Hz output that enables smoother, higher frame rate gameplay on ‘next gen’ titles — and it can also be difficult to know whether now is the right time to upgrade, or which TV you should purchase.

One of the latest HDMI 2.1 TVs on the market is the new Sony Bravia X90J, a $1,995 mid-range smart TV and one of the few entry-level gaming options available. I recently got to spend an evening with the X90J, and came away impressed by the visual fidelity and contrast of the screen. While it doesn’t hit the heights of the leading HDMI 2.1 TV (the LG CX OLED is still king), it’s a great budget-friendly option for anyone looking to boost their gaming capabilities.

Here’s what you need to know about the new TV.

What’s the big deal about gaming on the Sony Bravia X90J?

Photo: Kotaku Australia

As one of the few affordable HDMI 2.1 TVs on the market, the Sony Bravia X90J is an important, budget-friendly option for everyone looking to upgrade their TV set-up in the wake of the PS5 and Xbox Series X.

For the uninitiated, HDMI 2.1 is a new connection standard that supports 4K output at up to 120Hz — basically, it provides the bandwidth for smoother images and less compressed video, reducing picture lag and artifacting while you game.

If you’ve ever noticed text starting to blur or jitter when you’re running around in a game world, you’ve likely run into a phenomenon called ‘screen tearing’, where the display’s refresh falls out of sync with the frames it’s being sent, so parts of two different frames appear on screen at once. HDMI 2.1 supports higher refresh rates, meaning the image you receive updates more frequently, creating smoother visuals and colours that blend seamlessly.

The visuals of the X90J are smooth as butter, and one of the primary reasons you’ll want to upgrade from your existing rig.

As described by representatives from Sony at the event I attended, the TV uses AI processing to enhance the picture based on a ‘focal point’. Rather than taxing the processor by enhancing the entire screen, the AI selectively focuses on the ‘action’ of the image you’re watching in the same way a human eye would. This way the processor concentrates on enhancing the visuals, colour and depth of specific objects and people, rather than wasting computing power on things ‘out of focus’.

In practical terms, that means whatever you should be focussing on (i.e. your player character while gaming) is naturally highlighted, looks more vivid, and moves smoothly throughout the game world.

Photo: Kotaku Australia

It also means that as you run, jump and slide through games, there’s no visual artifacting, screen tearing or blur. Instead, action is very consistent with no chopping or jumping, even for in-game text.

This processing technique also enhances the overall picture of the TV. It means darker colours are deeper and blacker, bright colours are more vivid and pop on-screen, there’s minimal colour blurring, and cel-shading is particularly crisp.

I noticed this most in Yakuza: Like a Dragon running on Xbox Series X, which is littered with colourful, pop-art style signs. Each of these were vibrant and clean, and really showed off the great colour palette of the TV. As mentioned, it’s not as crisp or smooth as OLED TVs (the X90J is a standard LED), but it still rocks great visual clarity for its price point and excels when it comes to movement on-screen.

This is aided by impressive directional sound which emanates from the screen where you’d expect it to – i.e. if somebody is speaking from a distance, the sound is naturally ‘far away’; if somebody is speaking to the right of the screen, that’s where the sound comes from. You will need a soundbar to get those really bassy, low-sounding notes, but the TV performs well on its own, and the directional speakers are a great way to make games feel more realistic and engaging.

It is important to note that the TV only has two HDMI 2.1 ports, and one of them doubles as the eARC port you may need for a soundbar. It means if you have a PS5, an Xbox Series X and a soundbar, you may have trouble juggling your inputs. It’s a minor issue, but you should consider it if you plan on forking out for the TV.

Despite this quibble, the X90J is still a worthy contender in the HDMI 2.1 market, and a solid TV all-rounder.

How the X90J goes with other forms of entertainment

Photo: Kotaku Australia

Beyond gaming, the X90J is a solid performer for all types of content.

Animation looks particularly good on the screen because the TV’s colour range and contrast are solid, and brighter colours are rich and vivid. Justice League Dark: Apokolips War was a blast because the action was so sharp and snappy, and the cel-shaded artwork looked very neat on the TV. The X90J was also able to handle all the high-speed action very well, making the animation style look dynamic and sleek.

The AI-driven focus that helps reduce stutter in gaming is also great when you’re watching live action entertainment because it gives greater clarity to images in the foreground and reduces jitter between scenes.

There’s also no colour bleeding to speak of, meaning ‘halos’ of light and artifacting on 4K content are practically non-existent. Colour gradients and light glows are consistently smooth without looking blocky, and darker scenes are well-illuminated, meaning the TV picture is steady and reliable even in scenes with minimal lighting.

It makes for a very pleasant viewing experience, and one that looks good in sunlight or darkness thanks to the hearty brightness settings of the TV.

Is it time to upgrade to a HDMI 2.1 TV?

Horizon Forbidden West
Image: Sony

Personally, I don’t see a need to upgrade to a HDMI 2.1 compatible TV just yet, particularly if you’re already gaming on a 4K TV. While a HDMI 2.1 TV will give you an extremely smooth gameplay experience with zero screen-tearing and superbly smooth action, the difference between mainstream HDMI standards and 2.1 isn’t massive.

The main things I noticed while gaming with the Sony X90J were brighter colours, smoother movement and crisper textures — but if you have no direct comparison, it’s hard to feel like you’re missing out on much.

Importantly, it also feels like we’re on the cusp of change for the TV market. There aren’t many HDMI 2.1-compatible TVs currently available, but as manufacturing slowly returns to normal it’s likely we’ll see more options pop up in the coming months. We’re still in a very similar place to where we were in 2020 in regards to new TV technologies. So at this stage, it’s better to wait and see what happens as the funk of coronavirus starts to ease back.

If you do still want to upgrade now, the Sony Bravia X90J is a solid option. At $1,995 it’s one of the cheaper HDMI 2.1 TVs on the market (its nearest rivals being the Samsung Q70T and Q80T) and it performs well as a gaming TV. For anyone currently suffering input lag, screen tearing or colour distortion, it’s a worthy upgrade option — but you should still consider your budget and whether it may stretch for a self-lit OLED like the LG CX or BX, both of which feature a more vibrant colour palette and crisper picture quality.

Smart TVs are becoming more affordable than ever, and the X90J is a step in the right direction.

While it’s not strictly necessary to upgrade right now, the X90J is a solid TV with great performance and features for its price range. Anybody looking to take full advantage of the 4K @ 120Hz output of the PS5 and Xbox Series X will find the X90J is a solid, reliable gaming companion that won’t stretch the budget too much.


  • The X90H was on my ‘to get’ list of TVs last year but alas finances and circumstances worked against me. I figured I would get its replacement, as the LG models were just too expensive (going from $2500 to $4000 is a big leap).

    If you can still get an LG BX or CX (they’re 2020 models) then I would definitely recommend as I suffer screen envy at my mate’s house every time, but they’re getting harder to find. The C1 is the replacement and again, the cost difference is considerable.

    • I have been looking at the x90H for about 6 months. My Hisense N7 just blew up and I have convinced the wife I need a new TV only to realise the X90H doesn’t support VRR so now I will be getting the Samsung Q70A in the next month or so. Currently $1695 at JB.

      • Seeing as their flagship console didn’t support VRR, obviously Sony is doing the same as they did originally with the vibration function on their controller, i.e. ‘we don’t want to do it, so we think no one wants/needs it’.

  • It’s a shame that it doesn’t support variable refresh rate. Sony still saying that it will come in a firmware update.

  • I might be incorrect here, but the advantage of HDMI 2.1 is the massive increase in bandwidth: 48 Gbps, as opposed to HDMI 2.0’s 18 Gbps. This basically means it can carry images up to 10K in resolution and deliver 120 fps (though not at the same time). Realistically, for games the new standard translates to:
    8k at 60fps <—- Latest Gen of RTX cards.
    4k at 120fps <—- Latest console Gen (series X and PS5)
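    Those figures check out with some quick back-of-the-envelope arithmetic. The sketch below (Python, ignoring blanking intervals and link-encoding overhead, so real requirements are somewhat higher) shows why 4K @ 120Hz with 10-bit colour overflows HDMI 2.0’s 18 Gbps but fits comfortably inside HDMI 2.1’s 48 Gbps:

```python
# Back-of-the-envelope uncompressed video bandwidth estimate.
# Ignores blanking intervals and link encoding, so real HDMI
# requirements are somewhat higher than these raw pixel rates.

def video_bitrate_gbps(width: int, height: int, fps: int, bits_per_pixel: int) -> float:
    """Raw pixel data rate in Gbit/s."""
    return width * height * fps * bits_per_pixel / 1e9

HDMI_2_0_GBPS = 18.0  # nominal HDMI 2.0 link rate
HDMI_2_1_GBPS = 48.0  # nominal HDMI 2.1 link rate

# 4K @ 120Hz with 10-bit colour (30 bits per pixel)
rate = video_bitrate_gbps(3840, 2160, 120, 30)
print(f"4K @ 120Hz, 10-bit: {rate:.1f} Gbit/s")   # ~29.9 Gbit/s
print("Fits in HDMI 2.0:", rate < HDMI_2_0_GBPS)  # False
print("Fits in HDMI 2.1:", rate < HDMI_2_1_GBPS)  # True

# By contrast, 4K @ 60Hz with 8-bit colour sits inside HDMI 2.0:
print(f"4K @ 60Hz, 8-bit: {video_bitrate_gbps(3840, 2160, 60, 24):.1f} Gbit/s")  # ~11.9
```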

  • The only thing rarer than a 2.1 TV is a 2.1 HDMI cable longer than 1 meter. Anyone got any recommendations?

  • > HDMI 2.1 is a new visual standard that enables 4K @ 120Hz and 60FPS per second

    Am I missing something here? Since when did screens get measured in FPS?

    • What you’re missing is that the article does not equate screen measurements with HDMI 2.1. The quote you use does not mean what you think it means.

        • It’s not clear to you that I’m replying to kapone? Maybe your comment is what I should have replied to them with.

          • Okay, I just noticed the redundant “per second” in the quoted sentence. That redundancy doesn’t support kapone’s reading of the quote as meaning that the screen is being measured in FPS.

            That quoted sentence could use an edit but it’s no biggie. I got what the author intended to say.

          • That’s good, because I don’t think the author knew what they were saying, as alluded to by the other replies: this TV does not “enable” 60fps beyond the fact that it refreshes at 60Hz or greater, like every other modern TV.

          • Look, I’m not a tech geek but since your criticism is clearer than kapone’s I’ll just say that what I took from the article is that if you have a PS5 game, say, and want to game at 60 fps on a 4k tv, you’ll need a TV with HDMI 2.1 to do it or at least do it consistently and at a very good standard. Yes, I know there’s the whole Hz issue, but I’m talking about wanting to game at 60 fps on a 4k resolution setting game on a matching TV.

            If you’re making some semantic point about using the word “enabling” in that context, that’s a real non-issue for me. It’s quite okay to say that an HDMI 2.1 TV “enables” gaming at 60 fps in that context, I think. If you’re arguing that 60 fps gaming at 4k is not a property of having an HDMI 2.1 TV and a 4k game then I’d say are you sure that it’s you and not the author who is confused here?

          • If there’s some technical reason you have an issue with the quote “enables 4K @ 120Hz and 60FPS per second”, spell it out.

            You can have a 240 Hz monitor but your fps may vary because you don’t have a GPU with enough power to ‘enable’ 60 fps rates, say. Hz and fps seem independent to me.

            So, it’s still not clear to me exactly what the issue/complaint is that is being raised here. I thought it was the use of a redundant “per second” in the quote but I suspected it might have to do with things which the negative repliers want to suggest, i.e. that the author is incompetent. That’s not a conclusion I drew and it seems a long bow to draw.

          • As far as the display goes they are interchangeable; the frequency in Hz is how many frames it can display in a second, so if the PS5/GPU or whatever is generating and pushing out 60 frames per second, as long as the TV refreshes at least 60Hz it will display those 60 frames.
            HDMI 2.0 has always supported 4K:60Hz.
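            To put that Hz/fps relationship in code, here’s a deliberately oversimplified toy model (it ignores frame pacing, and real panels repeat the previous frame rather than showing nothing):

```python
# Toy model of the refresh-rate point above: a panel refreshing at
# panel_hz can show at most panel_hz distinct frames per second, so a
# 60Hz TV already keeps up with a 60fps source. Higher HDMI bandwidth
# only matters once the source pushes past the panel's refresh rate.

def frames_shown_per_second(source_fps: int, panel_hz: int) -> int:
    return min(source_fps, panel_hz)

print(frames_shown_per_second(60, 60))    # 60: a 60Hz TV shows every frame
print(frames_shown_per_second(120, 60))   # 60: half the frames never appear
print(frames_shown_per_second(120, 120))  # 120: where HDMI 2.1 earns its keep
```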

          • Like I said, I’m not a tech geek, so I’d be willing to defer to someone who is known to know their stuff. As a non-tech guy, I’d say that my previous reply is not consistent with your comment, i.e. I’m saying that Hz and fps are independent of one another, in which case just because you’re playing a 4k resolution game on PS5 on a TV which has 60Hz, does not mean that you will get 60 fps. Maybe a HDMI 2 TV can play games at 60 fps but I think the argument goes that it doesn’t do that consistently or you’re not getting the best visuals possible, which is what HDMI 2.1 is offering.

            The quote used says that the TV is 120 Hz for 4k material. Since I’m saying that Hz and fps are independent, there’s no contradiction between a TV having a Hz number double that of the fps number. You will get 60 fps on a TV at 4k settings for a game if you have the grunt to do that with your PC or console and…HDMI 2.1.

            It might be that you could use a HDMI 2 TV for that purpose but I’m taking the article to be saying that if you want to get the best experience of playing 4k resolution games on a TV (all the bells and whistles, visually, in other words), go for a TV with HDMI 2.1, which will ‘enable’ you to do that.

            If some pedantic point is being made here about HDMI 2 being capable of doing this, it’s a pretty weak instrument to beat the author over the head with, I think. Sure, you COULD try and play Cyberpunk 2077 on console, at 4k resolution on an HDMI 2 TV…but…why?

          • At this point it seems like you’re obsessed with refusing to listen to the actual facts I’ve told you: a 60Hz TV will display 4K 60fps all day long if you feed it that data.
            You keep reiterating that it’s bad to call out this stuff, and yet ironically you’ve got people like yourself that clearly are bamboozled by the misrepresentation.
            The article can say if you want the best experience for 4K gaming on a TV look for HDMI 2.1 (variable refresh rate support, 4K 120fps support), but 60fps gaming is not one of the reasons and it’s misleading to suggest that.

          • Looks like we’ll have to agree to disagree unless, like I said, someone who I know knows what they are talking about says that you’ve got things right. Or maybe you can link to a reputable source which confirms what you say?

            I’ll take your position as being that if someone here wants to play a game at 4k resolution, at 60 fps and wants to know what the best tv for that is, your answer would be: any 4k TV which runs at 60 Hz. That would include HDMI 2.1 TVs but what about, hypothetically, a sub HDMI 2 equipped TV which displayed at 60 Hz?

            My position is that if you’ve got a 4k TV which displays at 60 Hz and you’re using, say, a PS4 Pro or XBox One X and you want to play games at 4k resolution and get 60 fps, you’re not going to have a good time of it. Even supposing the XBox One X can do it, will it be able to do it for new releases?

            Like I said, if you’ve got a 360 Hz capable 4k TV say, and you hook up a PC to it which is powered by a GeForce GTX 780 (Hell, let’s go crazy and splurge on the Ti version) and you load up a 4k resolution game…good luck getting 60 fps on recent releases. Sure, your TV will refresh that 1 fps that you’re getting playing the game 360 times a second.

            I see no problem at all with the article recommending getting a HDMI 2.1 TV if you want to play games at 4k resolution at 60 fps. These TVs are marketed as doing that and if that’s what you want to do, why on Earth would you put up with an inferior experience with a HDMI 2 equipped TV?

          • It would be interesting to see a test which ran a sample of 4k HDMI 2 TVs and 4k HDMI 2.1 TVs and connected them to the new consoles or PCs and ran games designed to display 4k resolution at 60 fps. If you enabled all the visual bells and whistles, on which variant of HDMI would you get the best result, as far as fps?

            If the HDMI 2 TVs can handle it, I think this line of criticism would be vindicated. If not, I really think that the criticism here is misguided. Anyone tried running games at 4k and 60 fps on an HDMI 2 TV? How did that go? If it worked, do you think it will work as well for future releases?

  • Every TV in my house is so old that NONE are smart TVs.
    I’m going to have to do a lot of research going forward, as I want to upgrade the living room TV by the end of 2022 and I want it to be able to give me at least 4K @ 60fps (I’d like 120, but it needs to be at least 60).

    • if you only care about 4k @ 60 then there’s literally a TONNE of TVs on the market and many of them would be sub $1500 for a 60 inch screen..

      If you want 120Hz or things like VRR (which will fix frame rate hiccups as best it can) then you will need to go for an HDMI 2.1 TV. Also look out for TVs which state HDMI 2.1 but have only 1 or 2 usable HDMI 2.1 ports. The above mentioned X90J has 2 HDMI 2.1 ports, but something like an LG CX or C1 has 4 HDMI 2.1 ports, plus VRR.

      Things like input lag are also important.. you can literally lose yourself in all the literature out there.. Check out HDTVTEST on YouTube for some good videos on TVs.. though he only covers more high end ones, he does occasionally give decent suggestions for low end. Based overseas, but good info nonetheless..

      • THANK YOU!
        My budget is capped at a hard $2500, so I’ve got a little wiggle room that might be able to get 120Hz/VRR.
        A lot of the HDMI 2.1 ones I’ve found very much have 1 port only; I’ll probably need 2…
        The other thing I’m going to have to research is burn-in. Those LG ones have a small risk of burn-in, and as I’ll have a computer connected to it with a mother who doesn’t understand the concept of burn-in, it’s a big risk hahha.
        The Black Friday sales at the end of this year are the opening of the purchase window.

          • Yeah, the old TV cost $999, it was a Sharp, and I got it 10 years ago.
            Sharp hasn’t been in the TV game for a long time, so an upgrade is needed, but it lasting 10 years has been good.

        • Recommend checking out, they do some pretty thorough testing on most tvs and you can do a direct comparison between two as well.

          • That’s one of the sites I’ve been using and it’s been a great help, so fantastic suggestion!

  • So what does the AI processing actually do here? When I last checked, you usually wanted to disable as much of the TV’s post-processing as possible to reduce latency.

    Why would this be different? Is it trying to correct for glitches in the output of the game console, or something else?

    • So I watched through a video of Sony explaining this “cognitive intelligence” thing. It’s really not clear why you’d want it, especially for interactive use where this kind of post-processing increases the latency.

      It sounds like the rationale for its development was for cases where TV screens are so large that people can’t concentrate on them all at once, so you want something to draw the viewer’s attention to the important part of the screen. It seems like a bit of a weird tech to push, especially when most of the recent work with things like HDR has been in trying to better reproduce the content creator’s intent.

  • Feels like this article could have done with some research on what the benefits of HDMI 2.1 are…

  • I find it surprising that the only TV that is arguably fully compliant with HDMI 2.1, LG’s C9/G9, hasn’t been on sale for two years.

    LG are the best when it comes to HDMI 2.1, with their OLED C and up lines supporting HDMI 2.1/120hz/VRR/Dolby across all four ports, but their CX and C1 models are stuck at 40Gbps, a regression from the C9’s 4 full 48 Gbps ports.

    How the TV industry has cocked this up so bad, with Samsung not supporting half the industry standards, Sony promising feature updates that still aren’t available a year later, and even LG not quite up to snuff is beyond me.

  • What doesn’t Samsung support? Only reason I ask is I’m looking at the Q70A, and the only issue I can see is that it has only one 2.1 port.

        • Different standards; Dolby Vision can be thought of as an upgrade over HDR10.

          HDR10+ confusingly is a Samsung backed ‘standard’ that is supposed to compete with DV for quality, but isn’t getting any traction.

          • Despite these standards being theoretical for me at the moment, as I’m not in the market for an upgrade yet, I would find having products which can make use of either Dolby Vision or HDR10+ to be an attractive proposition, e.g. a TV or blu-ray player which can run either.


            “It is considered to have most of the advantages of Dolby Vision over HDR10, despite being fee free”.

          • Given that HDR10+ is royalty free and listed as the default dynamic metadata format in the HDMI spec, I wouldn’t count it out yet.

            Dolby Vision certainly has name recognition and first mover advantage, but if it’s true they’re doing things like making the tech exclusive to one video game console manufacturer, they could piss away that advantage.

            With 20th Century Fox, Warner Bros and Universal backing HDR10+, you could end up seeing a lot of films mastered in the format in future. If you get to a point where most Dolby Vision metadata is generated by converting HDR10+ metadata, you may as well just go with the royalty free option.
