What Is FXAA, And Why Has It Made Anti-Aliasing As We Know It Obsolete?

Fast Approximate Anti-Aliasing may not be a name many gamers know, but they certainly recognise its good looks: it's used in such visually luxurious games as The Elder Scrolls V: Skyrim, Batman: Arkham City and Battlefield 3.

But what is it, really? It's an improvement on three earlier approaches to anti-aliasing, smoothing more of the image at a fraction of the performance cost. Here to explain is Jeff Atwood, who writes about programming at Coding Horror.

Anti-aliasing has an intimidating name, but what it does for our computer displays is rather fundamental. Think of it this way -- a line has infinite resolution, but our digital displays do not. So when we "snap" a line to the pixel grid on our display, we can compensate by imagineering partial pixels along the line, pretending we have a much higher resolution display than we actually do. Like so:

Anti-aliasing produces a superior image by using grey pixels to simulate partial pixels along the edges of a line. It is a hack, but as hacks go, it's pretty darn effective. Of course, the proper solution to this problem is to have extremely high resolution displays in the first place. But outside of tiny handheld devices, I wouldn't hold my breath for that to happen any time soon.
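To make the idea concrete, here's a minimal sketch in Python of coverage-based anti-aliasing. Everything in it (the line equation, thickness and sub-sample count) is an illustrative assumption, not how any particular renderer works: it estimates how much of each pixel an ideal line covers by sampling a sub-pixel grid, then uses that fraction as the grey level.

```python
def line_coverage(px, py, slope, intercept, thickness=1.0, subsamples=4):
    """Estimate what fraction of pixel (px, py) is covered by the line
    y = slope*x + intercept with the given thickness."""
    hits = 0
    for i in range(subsamples):
        for j in range(subsamples):
            # Sample at the centre of each sub-pixel cell.
            x = px + (i + 0.5) / subsamples
            y = py + (j + 0.5) / subsamples
            # A sample counts as covered if it lies within half the line's
            # thickness of the ideal, infinite-resolution line.
            if abs(y - (slope * x + intercept)) <= thickness / 2:
                hits += 1
    return hits / (subsamples * subsamples)

# Shade a small grid: 0.00 is background, 1.00 is fully covered, and the
# in-between values are exactly the grey "partial pixels" described above.
for row in range(4):
    print(" ".join(f"{line_coverage(col, row, 0.5, 0.0):.2f}" for col in range(8)))
```

Real rasterisers compute coverage analytically or with fixed sample patterns rather than a brute-force grid, but the output is the same idea: fractional coverage mapped to intermediate shades.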

This also applies to much more complex 3D graphics scenes. Perhaps even more so, as adding motion amplifies the aliasing effects of all those crawling lines that make up the edges of the scene.

But anti-aliasing, particularly at 30 or 60 frames per second in a complex, state-of-the-art game with millions of polygons and effects active, is not cheap. Per my answer here, you can generally expect a performance cost of at least 25 per cent for proper 4x anti-aliasing. And that is for the most optimised version of anti-aliasing we've been able to come up with:

1. Super-Sampled Anti-Aliasing (SSAA). The oldest trick in the book -- I list it as universal because you can use it pretty much anywhere: forward or deferred rendering, it also anti-aliases alpha cutouts, and it gives you better texture sampling at high anisotropy too. Basically, you render the image at a higher resolution and down-sample with a filter when done (see the downsampling sketch after this list). Sharp edges become anti-aliased as they are down-sized. Of course, there's a reason why people don't use SSAA: it costs a fortune. Whatever your fill-rate bill, it's 4x for even minimal SSAA.

2. Multi-Sampled Anti-Aliasing (MSAA). This is what you typically have in hardware on a modern graphics card. The graphics card renders to a surface that is larger than the final image, but in shading each "cluster" of samples (that will end up in a single pixel on the final screen) the pixel shader is run only once. We save a tonne of fill rate, but we still burn memory bandwidth. This technique does not anti-alias any effects coming out of the shader, because the shader runs at 1x, so alpha cutouts are jagged. This is the most common way to run a forward-rendering game. MSAA does not work for a deferred renderer because lighting decisions are made after the MSAA is "resolved" (down-sized) to its final image size.

3. Coverage Sample Anti-Aliasing (CSAA). A further optimisation on MSAA from NVIDIA [ed: ATI has an equivalent]. Besides running the shader at 1x and the framebuffer at 4x, the GPU's rasteriser is run at 16x. So while the depth buffer produces better anti-aliasing, the intermediate shades of blending produced are even better.
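As a concrete illustration of the SSAA item above, here's a minimal sketch (Python with NumPy; the 2x factor and toy image are assumptions for illustration, not the GPU's actual resolve path) of the core operation: render at a multiple of the target resolution, then box-filter each block of samples down to one output pixel.

```python
import numpy as np

def ssaa_downsample(hi_res, factor=2):
    """Box-filter a supersampled render down by `factor` in each dimension.

    hi_res: float array (H*factor, W*factor, C), the scene rendered at
    `factor` times the target resolution. Each output pixel becomes the
    average of a factor-by-factor block of samples, which is what turns
    hard edges into intermediate shades.
    """
    h, w, c = hi_res.shape
    blocks = hi_res.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

# A hard black/white vertical edge that does not line up with the output
# pixel grid comes out as 0.5 grey in the boundary column -- anti-aliased.
hi = np.zeros((4, 4, 1))
hi[:, 1:] = 1.0
print(ssaa_downsample(hi)[..., 0])   # [[0.5 1.] [0.5 1.]]
```

The "costs a fortune" point falls straight out of this: at factor=2 you shade and store four rendered samples for every pixel you actually display.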

Pretty much all "modern" anti-aliasing is some variant of the MSAA hack, and even that costs a quarter of your framerate. That's prohibitively expensive, unless you have so much performance you don't even care, which will rarely be true for any recent game. While the crawling lines of aliasing do bother me, I don't feel anti-aliasing alone is worth giving up a quarter of my framerate and/or turning down other details to pay for it.

But that was before I learned that there are some emerging alternatives to MSAA. And then, much to my surprise, these alternatives started showing up as actual graphics options in this season's PC games -- Battlefield 3, Skyrim, Batman: Arkham City, and so on. What is this FXAA thing, and how does it work? Let's see it in action:

FXAA stands for Fast Approximate Anti-Aliasing, and it's an even more clever hack than MSAA, because it ignores polygons and line edges, and simply analyses the pixels on the screen. It is a pixel shader program documented in this PDF that runs every frame in a scant millisecond or two. Where it sees pixels that create an artificial edge, it smooths them. It is, in the words of the author, "the simplest and easiest thing to integrate and use".
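The production FXAA shader is considerably more sophisticated (the PDF covers edge-end searches and sub-pixel aliasing handling), but the core idea can be sketched in a few lines. The version below is a toy illustration of that idea, not NVIDIA's algorithm; the contrast threshold and blend factor are made-up values.

```python
import numpy as np

def toy_fxaa(rgb, contrast_threshold=0.1, blend=0.5):
    """Toy FXAA-style post-process on an (H, W, 3) float image in [0, 1]."""
    # Work in perceptual luma, as FXAA does, rather than raw RGB.
    luma = rgb @ np.array([0.299, 0.587, 0.114])
    padded = np.pad(luma, 1, mode="edge")
    # Luma of the four axis-aligned neighbours of every pixel.
    n, s = padded[:-2, 1:-1], padded[2:, 1:-1]
    w, e = padded[1:-1, :-2], padded[1:-1, 2:]
    # Local contrast: where the luma range crosses the threshold, treat the
    # pixel as part of an artificial edge.
    local_range = (np.maximum.reduce([luma, n, s, w, e])
                   - np.minimum.reduce([luma, n, s, w, e]))
    edge = local_range > contrast_threshold
    # Smooth edge pixels toward the average of themselves and their four
    # neighbours; everything else is left untouched.
    pad_rgb = np.pad(rgb, ((1, 1), (1, 1), (0, 0)), mode="edge")
    avg = (rgb + pad_rgb[:-2, 1:-1] + pad_rgb[2:, 1:-1]
           + pad_rgb[1:-1, :-2] + pad_rgb[1:-1, 2:]) / 5.0
    out = rgb.copy()
    out[edge] = (1 - blend) * rgb[edge] + blend * avg[edge]
    return out
```

Because a pass like this only ever looks at finished pixels, it neither knows nor cares whether an edge came from a polygon, an alpha cutout or a shader effect -- which is exactly where the two advantages below come from.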

FXAA has two major advantages:

1. FXAA smooths edges in all pixels on the screen, including those inside alpha-blended textures and those resulting from pixel shader effects, which were previously immune to the effects of MSAA without oddball workarounds.

2. It's fast. Very, very fast. Version 3 of the FXAA algorithm takes about 1.3ms per frame on a $US100 video card. Earlier versions were found to be double the speed of 4x MSAA, so you're looking at a modest 12 or 13 per cent cost in framerate to enable FXAA -- and in return you get a considerable reduction in aliasing.

The only downside, and it is minor, is that you may see a bit of unwanted edge "reduction" inside textures or in other places. I'm not sure if it's fair to call this a downside, but FXAA can't directly be applied to older games; games have to be specifically coded to call the FXAA pixel shader before they draw the game's user interface, otherwise it will happily smooth the edges of on-screen HUD elements, too.
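In pipeline terms that integration is purely about ordering: apply FXAA to the finished 3D scene, then draw the UI on top. A hypothetical frame loop -- every function here is an invented stub standing in for a real engine call -- shows the shape of it:

```python
# Stubs standing in for real engine calls, purely to show ordering.
def render_3d_scene(scene): return f"scene({scene})"
def fxaa_pass(image):       return f"fxaa({image})"
def draw_hud(image, hud):   return f"{image} + hud({hud})"
def present(image):         print("presenting:", image)

def render_frame(scene, hud):
    image = render_3d_scene(scene)  # polygons, lighting, effects
    image = fxaa_pass(image)        # smooth edges while only the 3D scene exists
    image = draw_hud(image, hud)    # HUD text and icons keep their crisp edges
    present(image)

render_frame("dragon fight", "health bar")
```

Swap the middle two lines and FXAA will dutifully soften your crosshair and health bar along with everything else.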

The FXAA method is so good, in fact, it makes all other forms of full-screen anti-aliasing pretty much obsolete overnight. If you have an FXAA option in your game, you should enable it immediately and ignore any other AA options.

FXAA is an excellent example of the power of simple hacks and heuristics. But it's also a great demonstration of how attacking programming problems from a different angle -- that is, rather than thinking of the screen as a collection of polygons and lines, think of it as a collection of pixels -- can enable you to solve computationally difficult problems faster and arguably better than anyone thought possible.

Jeff Atwood has a programming background going back to the 1980s. He was most recently a programmer for Vertigo Software, and now devotes himself full-time to his blog, Coding Horror, and the site stackoverflow.com. He lives in Berkeley, California. Republished with permission.


    The FXAA image is a bit blurry compared to the sharper image on the left.

    Most gamers seem to dislike FXAA and much prefer MSAA.

      "I seem to dislike FXAA and much prefer MSAA." ~ Fixed

      My apologies for the necro, but speak for yourself.

        He's not wrong. And who made you King of Gamers? I sure as hell didn't vote you into office.

          I think it's only you who prefers MSAA. FXAA is used by most. But I guess to each his own. Although in some games I've heard it makes objects blurry.


            I'm joining this chat late, but FXAA is inferior. I never use it. I use 24x supersampling with edge detect. It's very taxing on my system, but my rig can handle it in most games, especially now that Mantle is becoming more popular. With my MSAA settings I get a crisp, clear image: no jagged lines, no FXAA blur. And I would agree with anon2. I've discussed it in forums, and the general consensus among most PC gamers is MSAA. FXAA is generally a good option for those whose systems can't handle full-on MSAA and supersampling.


        Actually, FXAA is the lowest tier of anti-aliasing for quality, and it isn't even the best performing.
        SGSSAA wins on quality and SMAA wins on speed,
        and TXAA is not far behind.

        And if you don't know what any of these stand for... please pour coffee into your PSU.

    I have found that the FXAA implementation in Skyrim makes everything just a little too blurry for my tastes; I usually just stick with 2-4x normal AA.

    If the texture blurring problem can be solved, though (maybe using a z-depth image so you can tell where the actual polygon and alpha edges are, and disregard all others), this is good news, and hopefully it will mean we start to see some AA used in console games as well as on PC.

      Agreed. I ended up uninstalling it myself.

    I find it interesting that there's no mention at all of MLAA, which made a fantastic debut in God of War 3 and worked really well.

    FXAA blurs out textures as well as polygon edges. I don't really like that.

    FXAA isn't the best at all; it makes things too blurry. But on the plus side, it doesn't seem to be as hardware-hungry as regular AA.

    I hated it in Skyrim; I didn't realise it was enabled the first time I played. Needless to say, I got quite a surprise when everything became sharper and clearer upon turning it off and bumping up normal AA.

    I'm more interested in full-scene V-sync @ 60fps+ at high detail! (On console, before someone says "get a PC then", hahaha.)

    Interesting that the three example games are all console-focused games... While I understand that graphics are not all-important in games (I mean, I love Minecraft...), it is sad that an industry has decided it is easier to just start aiming lower... Should I expect an article in the next few years about how I shouldn't mind a lack of story in games, as long as they have immersive multiplayer?

    I turn on AA because prettier is nicer, but I don't get many people's fear of jaggies.

    I don't see a performance hit with any type of AA.

    May as well just play on a CRT and let that provide your "anti-aliasing".

    To me FXAA is like smearing the lens with Vaseline to hide an actor's dodgy complexion

    Yeah, not a fan of FXAA; it makes everything way too blurry, in both Skyrim and BF3 at least.

    Since nobody's making games that can bring a high-end SLI rig to its knees unless the resolution is insanely high, I'm not seeing how this would have much if any effect on the midrange-up of PC graphics cards.

    I don't know what else to call it, but the FXAA 'haters' are seeing a placebo effect of unwanted blurriness that technically isn't bad. FXAA basically finds jagged edges and smoothes them. It is a much improved algorithm compared to MSAA. SSAA will always be the best, but is too expensive performance-wise and with most video drivers you cannot select it without a third party utility (like nvidia inspector).

    FXAA smooths the rough edges on everything, including alpha textures. It makes MSAA+MSTAA (transparency multisample) obsolete. It is cheaper performance-wise. It is especially cheaper than SSTAA (transparency super sampling). In fact, if you use 8xSSTAA, your screen will appear 'blurred' similar to FXAA (if there are many alpha textures). Gamers with the rigs to do this consider this a good thing. That perceived extra sharpness without FXAA is jaggies in alpha textures. If you like it, I don't know why. I prefer the smooth grass and foliage, but if you like the pixelated stuff, go nuts.

    HardOCP did a test and verified it. 3D game devs are aware of it. We gamers often have a herd-minded mentality: somebody says "it blurs the screen" and we see it and believe it. The blurring is good; no detail or texture resolution is lost. HardOCP says if you can handle it, enable your preferred level of AA, 8xSSTAA, and FXAA for the least amount of jaggies and aliasing with the best picture quality. I say if you play Skyrim with just 4xAA, turn off AA and enable FXAA. The level of jaggies smoothed is the same, and all alpha textures are smoothed too. It also has less of a performance hit, so if you get less than 60fps average, you will notice a boost.

    It's not a matter of opinion. It's the truth. Truth is not determined by majority opinion. The fact is people see what they want and hate change. So they stick with good ol' AA and think the smooth alphas are somehow less detailed. I guess if you prefer jaggies, that's up to you. I thought AA was all about smoothing out the rough edges. FXAA does a great job of this on its own with little impact on performance. Compare screenshots, do your own research, come to your own conclusion. Majority gamer consensus is generally determined by a few elitist aspergers pricks who have no idea what they are talking about. Don't follow the herd; try it yourself and compare. Don't just load it for two seconds and decide. Your eyes can deceive you.

      Ryan-o, I've done screenshot tests, and so have many other people who based their opinion on personal experience. Do an AA vs. FXAA comparison and you will see the texture quality has decreased, which is the usual complaint about FXAA blurriness. It is a fact that it causes blurriness, and most people do not like deteriorating texture quality.

      FXAA affects not only the edges or jaggies, but also the textures in the middle of the model, with a blurring effect.

      I love how you talk about "fact" (even though it's personal opinion still), and then go on to make a broad claim about the kind of people who say otherwise, insulting them. Sounds a lot like you've been proven wrong one too many times by these "elitist aspergers pricks" and now you have some sort of vendetta against them haha.

      But anyways, back on subject... FXAA is crap. Yes, it is good in concept, but the execution fell short. It clearly blurs the image, and it in no way guarantees that it will catch all the jaggies, as the system is just guessing based on what is basically a photo. And even when it does, it's still working with a limited amount of color information to anti-alias with. In real life, you would have additional color information from a higher resolution that would be combined to determine the edge color and shape. This is just a very rough guess-and-blur of the low-res image.

      But fear not, your boner for FXAA type algorithms is not dead. There is a slightly newer type called SMAA, made by a couple guys from Crytek and a friend. It does what FXAA does, but without the screen blurring, and better. And the full implementation of it is able to do so much more than ANY AA algorithm, even super sampling, because it does temporal sampling. And the performance hit is around the same as FXAA, negligible.

      Watch the video on this page to learn more about it. Hopefully developers start picking it up soon. There's an injector for it, like there is FXAA, but it's a very limited version of SMAA.

    Something's wrong in the second comparison (the one with three screenshots). Clearly the "No AA" image *does* have some AA going on (there *are* grey pixels at line edges), and "4x MSAA" is partially even worse than "No AA" (which shouldn't happen). The first comparison (the one with two screenshots) shows how it should look (minus JPEG artifacts).

    Jeff, your game is broken.

    FXAA is partially broken by design as well. For example, the "missing" beams to the right of the "No AA" screenshot in the first comparison cannot be reconstructed by any simple postprocessing. You need extra samples (or some other trick).

    FXAA is not anti-aliasing. It's complete BS and doesn't get rid of 90% of the aliasing I see in games at 1440p. 2x MSAA does a much better job, and it doesn't blur the shit out of the image. Your article title is stupid and I hate you.

    So, is FXAA anti-aliasing or not?

    It really boggles my mind how this article presents FXAA as 'the new big thing' for anti-aliasing technology, when really it's just a passable substitute for people whose computers can't handle the real thing.
    Now don't get me wrong, I find that it's pretty useful as a performance boost in a handful of circumstances, even with a relatively high-end notebook. But it doesn't do what real anti-aliasing can -- actually simulate a higher resolution -- so it will never really replace true AA.

    I used FXAA in a game with deferred shading a few years back. One of the downsides of deferred is that multiple render targets for diffuse, normals, lights, etc. make some AA methods rather challenging. FXAA operates in screen space so it's a nice and easy post-process on the final composition. It fits neatly into the pipeline and still looks better than no AA at all. There are definitely better looking solutions, but FXAA is a lovely fix that is almost free in terms of per-frame shader budget and fits neatly into any post-process stack. Overall it's a great tool to have in your arsenal.


    Honestly, SMAA combined with 1x supersampling is probably the best. Crysis 3 implemented it and it worked flawlessly: zero-blur supersampled transparency AA mixed with SMAA removes sub-pixel aliasing while costing a maximum of only 5-8 FPS!! It's pure genius.

      1x supersampling... Sorry but LOL.

      1x supersampling = no AA at all. The minimum for supersampling to actually do anything is 2x -- supersampling (SSAA) is quite literally doubling (2x) or quadrupling (4x) the resolution of a game, then downscaling it to your display resolution. Having said this, "1x SSAA" would be rendering at the same resolution as your monitor... and displaying it at the same resolution. Hence no AA.
