Here's Some Battlefield 1 Benchmarks To Ponder

Image: Supplied

The Frostbite engine has a pretty good track record of being well optimised, and that was something PC users saw with the release of Star Wars Battlefront last year.

But how is Battlefield 1 holding up so far? The closed alpha wasn't made available to Australians, but now we have a rough idea of how it will perform.

The first exhaustive benchmarks for the game, albeit using the older DirectX 11 renderer, have appeared online courtesy of the Russian tech website GameGPU.

The first graph is a collection of GPUs running the game at 1080p with the Very High Quality preset, which is a touch strange given that the Ultra preset is also available. The testing machine was an Intel i7-5960X running at 4.6GHz, but we'll get to that shortly.

Image: GameGPU

The two factors that stick out to me the most so far are the decision not to use the Ultra preset, and the i7-5960X. The latter is an Extreme Edition processor, which most gamers won't have.

Most gamers will have systems that are slightly less powerful. With that in mind, you'll probably want an NVIDIA GTX 970, Radeon RX 480 or R9 290X to guarantee 60fps performance in all situations. It's also worth remembering that AMD cards should get a nice performance bump when Battlefield 1 is released, as the full game will support DirectX 12 (something the alpha and beta aren't expected to do).

If Total War: Warhammer is any guide, I'd expect AMD cards to get at least a 10% to 15% bump under DirectX 12. Real world results have been better than that — and under Vulkan as well — but I'm being ultra conservative for now, at least until I can run my own figures.
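For anyone wanting to sanity-check that projection against their own numbers, the arithmetic is trivial. A minimal sketch (the function name and the 60fps baseline are illustrative, not figures from GameGPU's data):

```python
def projected_dx12_fps(dx11_fps, uplift_low=0.10, uplift_high=0.15):
    """Project a DirectX 12 framerate range from a DX11 baseline,
    assuming the conservative 10-15% uplift discussed in the text."""
    return dx11_fps * (1 + uplift_low), dx11_fps * (1 + uplift_high)

# e.g. an AMD card averaging 60fps under DX11:
low, high = projected_dx12_fps(60)
print(f"{low:.0f}-{high:.0f}fps")  # 66-69fps
```

Real-world Vulkan and DX12 results have sometimes exceeded that range, which is why the 10-15% figure here is deliberately a floor rather than a prediction.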

Also: small shout-out to the HD 7970 for putting up a respectable performance in those tests. Hell, it's not that far off the bloody GTX TITAN. Another thing to note is the non-existent support for SLI and CrossFire (the latter confirmed by the lack of difference between the R9 290X and the dual-GPU R9 295X2).

If you want to check out the rest of the figures, including how Battlefield 1's alpha ran at 1440p and 4K (again, under DX11), head over to GameGPU. I'll run a test with everything I have in the office once the closed beta becomes available to Australians, and I wouldn't be surprised if DICE optimise things a little further between now and then as well.

For everyone looking forward to Battlefield 1 on PC: what rig have you got right now, and how do you think it'll run? And do you have any plans for an upgrade any time soon?


    Getting a 1070 shortly, but I only have an i5 3570K which is a few years old now. What does the processor do in games like this?

      Not a huge amount, I wouldn't worry about upgrading for another year or two yet.

      I have a 6700K and a 2500K, both running an R9 290X, and the frame rates in most games are fairly similar. The 6700K is faster but not enough to be noticeable.

      I've got the same CPU and a GTX 1070 (Gainward GS GLH). I run it at 2560x1440 - not sure about the FPS since I have been unable to get it displayed (because of DX12, methinks).
      Rest assured, there's never been an FPS-related hiccup so far - I'd say it never dips below 60 in the closed alpha. A guesstimate would be between 70 and 90 FPS.

      I have the same CPU as well, paired with a 980 Ti, and it's still fine.
      Also don't forget that the K series CPUs were 'specifically' labelled as K as they allowed relatively easy and stable overclocking, so if in doubt you can always get a slightly bigger cooler and bump the CPU from 3.4GHz up to 3.8 or 4.
      It seems, at least for the last four years or so, that GPUs have much more bearing on FPS in games than CPUs. I'd say most CPUs from the last four years are still just about fast enough, as long as you keep up with your video card.

      Actually, an i5 has been shown to bottleneck the 1070/80s (more so the 1080s). The new Pascal architecture is utilising the CPU cores more efficiently and providing more overhead for the GPU. So say you're running an i5 with a 1070/80 on max settings in a game: your GPU will be hitting close to max constantly, whereas if you have an i7, you'll only be using about 70-80% max, as the extra threads on the CPU are handling a bit more of the load and helping the GPU run more efficiently. But don't take my word for it, there are plenty of videos and reviews on the internet if you know where to look. Bear in mind though that this will mainly affect anyone doing 1440p+ gaming, and won't really be an issue at all for 1080p gaming. I've just taken delivery of my EVGA GTX 1070 ACX 3 SC today and I'm pairing it with my i7-4770K + 16GB RAM. Should be decent.

      By games like this do you mean mostly multi-player shooters? Then not as much as say an RTS or a big MMO. That said, you might still see slow-downs when you get a ton of players all doing stuff in a small area. You can wind up with the scenario where the GPU is sitting waiting on the CPU.

    I'm unsure why people haven't replied yet, probably because of the weird CPU choice, the non-Ultra settings and the fact that this is information from the alpha (which in itself should be a huge disclaimer that the results are unreliable). In any case, here we go:

    According to this list, if you want a playable framerate of 60 avg you will need either a GTX 780 Ti or R9 290X...

    Both of these cards poop all over the XBone and PS4 GPUs in raw numbers (please correct me if I'm wrong).

    I am keeping in mind that they tested VHQ at 1920x1080, and I know the XBone will likely play at 720p/900p and the PS4 at 900p, both consoles on a custom medium/high quality setup. Even so, this still seems largely inaccurate to me as an FPS graph for VHQ.

    In time we will know, but I'm happy to take a punt right now and say that VHQ on a PC with an average latest-gen i5/i7 CPU and either of the two GPUs I've spoken of will get between 75 and 100fps in the released game, or ~80fps avg.

      I just find it weird that you'd benchmark an alpha

        Why wouldn't you? It's a good way to get a feel for upcoming game performance on different cards. It's also a good way to compare with existing games, since the engine (or at least a variant of it) is used in other games. So you can get a feel for relative performance and see how it compares to, say, the Star Wars game.

    The BF games are also known for having pretty dogpile performance in alpha/beta compared to the live game. Should be better at/shortly after release.

      I remember BF4 Beta didn't even have multi-threading CPU support. My poor CPU0, if only they'd let CPUs 1-5 help too :(

        lol yep... Couldn't get a solid 45fps on BF4 beta with the same machine that averaged 145fps on ultra settings in the release version.

    6700K with a 7970, which I want to upgrade. Seriously considering sticking with an AMD card for my next GPU instead of going the NVIDIA route, though I'm still a bit off from getting a new GPU.

      Thankfully the 7970 is the gift that keeps on giving. Over its lifespan, it's picked up somewhere in the region of 30%-40% extra performance.

      Man that card had shitty drivers on release...

        Am about to retire my XFX R7970 Black Edition. It is a sad day! As soon as I get home, my shiny new EVGA GTX 1070 is going in. It did start to have thermal issues in the last year or so though: I'd finish playing a game and, as the fans spun down too quickly while the GPU was still warm, it constantly caused one monitor to wig out, which was only fixed with a reboot. I need to take the fans/heatsink off and apply some new thermal paste. Still, at I think the $500 mark I bought it for about four years ago, it still plays Doom at 1080p with most things on Ultra. One of the best cards I ever bought. Weird thing though, it doesn't work with the Vulkan renderer in Doom! WTF?!

        I was extremely sad to see mine go, recently. It really was the little card that could. Every time a new game came out in the last 5 years I kept thinking, surely this will be the one that forces me to upgrade. That little bastard would spin his fans up and churn out the frames.

        I gave him a viking funeral when I replaced him with a watercooled 1080ti, Ryzen 1800x and X34. He's in Valhalla now.

    The 6950X is actually less powerful in gaming most of the time than the newer Skylake processors (6500, 6600K, 6700, etc.)

      It is a generation behind, so has fractionally lower IPC. But anything that runs slower on it just isn't sufficiently multithread-aware.

    ROG Maximus Formula MB, i7 [email protected] 4.2GHz, Corsair 110i CPU cooler, 16GB Vengeance Pro RAM, MSI GTX 1080 Sea Hawk X, 250GB SSD.
