EVGA GeForce GTX 1070 FTW ACX 3.0 Video Card

Benchmarks, Performance, Temperatures and Power Consumption

We are influencers and brand affiliates.  This post contains affiliate links, most of which go to Amazon and are Geo-Affiliate links to the nearest Amazon store.

Before we get into the benchmarks, here are my system specs; compare them with your own system to get an idea of what kind of performance you could expect.

Here are the clocks and specs reported by TechPowerUp’s GPU-Z.


These readings may come into play a little later in the review, so keep them in mind.  You can see that for these benchmarks I was using NVIDIA’s driver version 372.54 for Windows 10 64-bit.

EVGA also has its own utility that lets you see a ton of specs and do some tweaking.


From here you can see the active GPU clock, memory clock, voltages, GPU temperature and even the active fan RPM.  There is a bit more you can see from here, but I will get into that a little later.

For the benchmarks you are about to see, aside from the actual results, I provide temperatures and power consumption.  For power consumption, I measure minimum, average and maximum usage with P3 International’s “Kill A Watt” meter.


The programs I am using to benchmark are the following.




An awesome score; the end result is better than 92% of all reported results.  The lowest wattage reached while benchmarking was 228 Watts, the average power consumption was 333 Watts, and it peaked at 368 Watts.

Now, these cards run at 0dB until they hit 60°C, which scared me a bit when I first started benching.  I didn’t freak out right away because I knew about the feature, but while benchmarking I kept looking at the card and not seeing the fans spinning.  My blood pressure was steadily climbing when suddenly an amazing thing occurred: the fans actually started spinning.  The hottest this card got in 3DMark was 65°C, which is pretty sweet.

Now, one of the most grueling benchmarks is Metro Last Light, whether due to poor optimization of the game or because it is simply that demanding.  Let’s check out Metro Last Light’s benchmarks.


Throughout the benchmarks I keep the settings the same only changing the resolution.  Here are the presets.


As I mentioned, I keep the settings the same in all 3 of the presets, only changing the resolution.


This is some of the best performance I have seen in my time reviewing; I am impressed so far, but there is still a long road ahead.  We were able to get 45.26 frames per second here at 2560 x 1440, which I know is not great considering 60FPS is the sweet spot, but the game does not start to get choppy until the end, where the huge militia starts breaking through the wall and bullets are flying everywhere.  So it is still playable, but results are results and I will not stray from them: 45.26FPS is what it is, and it is presented for a reason.  You want at least 60FPS at all times, if not better.

So the 45.26FPS was at a resolution of 2560 x 1440, but if we tone it down a notch to the more common resolution of 1920 x 1080, we can see the card start to shine here in Metro Last Light.  At 1920 x 1080, we see 75.64FPS, a 50.26% difference from the 2560 x 1440 result.

At 2560 x 1440, we reached an average of 335 Watts, and at 1920 x 1080 the average was only slightly lower at 330 Watts.  Lowering the resolution only dropped the power consumption by 1.50%, so the two are still very close.
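For reference, the percentage figures in this review appear to be computed as a percent difference relative to the midpoint of the two values, rather than a simple increase over the lower value.  A quick sketch in Python (the function name is my own) reproduces the numbers from the text:

```python
def percent_difference(a: float, b: float) -> float:
    """Percent difference of two values, relative to their midpoint."""
    return abs(a - b) / ((a + b) / 2) * 100

# Metro Last Light averages from the text:
# 45.26 FPS at 2560 x 1440, 75.64 FPS at 1920 x 1080
print(round(percent_difference(75.64, 45.26), 2))  # 50.26

# Power draw: 335 W at 2560 x 1440 vs 330 W at 1920 x 1080
print(round(percent_difference(335, 330), 2))  # 1.5
```

The same formula also reproduces the Thief, Tomb Raider, Ashes of the Singularity and The Division percentages quoted later, so it is consistent throughout the review.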

As a gamer, I want FPS, but I also want to see as much as I can in a game.  While a lower resolution helps frame rates, I prefer 2560 x 1440, so at that point I would just dial down some of the eye candy and surely it would hit 60FPS, or even higher.

In the end, we got some of the best results I have seen in Metro Last Light; I think, though, that our thieves might be a little jealous and try to steal the show.  Let’s go see what THIEF shows.




Thief blew everything away, showing off even more of the card’s power, with a few oddities.  First off, at 2560 x 1440 we not only reached 60FPS but went well past it, with 81.4FPS on average; even the lowest frame rate was 67.6FPS.  Very impressive.  Here the average power consumption was 333 Watts.

If you were looking at the differences between 2560 x 1440, 1920 x 1080 and 1280 x 1024, you would have noticed something a little odd.  1920 x 1080 consumed more power on average than 2560 x 1440, only a 0.90% difference, and not only that, it hit 60°C, while at 2560 x 1440 the fans never even turned on.  Very strange, but I was in front of the PC during every benchmark and saw it with my own eyes.  No worries though, because if you can, you will be playing at 2560 x 1440.  If you can’t hit 2560 x 1440, the barely 1% power difference and maybe a little extra noise with the fans on won’t bug anyone.

Speaking of not bugging anyone, let’s have a chat with our friend Lara over at Tomb Raider.




The performance here is amazing; at 2560 x 1440 we see an average of 129.7FPS, more than double the acceptable frame rate.  At that FPS, we consumed only 335 Watts on average.  If you needed to tone down the resolution, you would see a 45.60% jump to 206.3FPS at only 330 Watts average.

All this action on Earth can get a little boring though, so let’s go out of this world with DX12 and Ashes of the Singularity.


As I mentioned, Ashes of the Singularity is a DX12 game, though actually getting it to run in DX12 is not as easy as going into the settings and selecting DX12, as most of us would have tried.  In Steam, instead of clicking on the game to start it, you right-click on the game like this.


In the game’s drop-down menu, you will find “Launch DirectX 12 Version (Windows 10 Only)”.  As you might have gathered, this will only work on Windows 10, because it is the only OS that supports DirectX 12.  Here are the settings I use in the benchmark, aside of course from changing the resolution.


Before you run the benchmark, you are greeted with a configuration menu; here you can verify that you are running DX12.


Ok, so let’s get to the benchmarks.


We can see here where the extra money comes into play.  On lower-end cards, the CPU picks up the slack where the GPU might fall short, but here we can see that the CPU is on par with the GPU.

At 2560 x 1440, the average frame rate for the GPU was 27FPS, and the average for 1920 x 1080 was 34.8FPS, a 25.24% improvement.  Now this can look very deceiving; many, including myself, would think that 27FPS or 34.8FPS is 100% unplayable, but that is actually where the CPU helps.  To show you what this looks like, I recorded a video showing gameplay of this game, as well as a few others that I will show a little later in this review, so keep your eyes out for it.

Let’s see what Mr. Clancy has to say with the Division.


Here are the settings I use in The Division; as with the previous benchmark results, I only change the resolution.



Like Ashes of the Singularity, Tom Clancy’s The Division is optimized to take better advantage of the CPU.  Here we can see that the average FPS is 56, with the CPU helping a tiny bit at 31%.  Running at 56FPS at 2560 x 1440 is very acceptable, especially at only 62°C and an average power consumption of 342 Watts.  At the standard 1920 x 1080 resolution, we were able to get an average of 79.5FPS, a 34.69% improvement, running at 59°C, which means the GPU fans never turned on, with an average power draw of 339 Watts.
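As an aside, the FPS and wattage pairs above also let you gauge efficiency as frames per watt.  A quick sketch (the function name is my own; the figures come from The Division results above):

```python
def fps_per_watt(fps: float, watts: float) -> float:
    """Frames rendered per second for each watt of measured system draw."""
    return fps / watts

# The Division: 56 FPS at 342 W (2560 x 1440), 79.5 FPS at 339 W (1920 x 1080)
print(round(fps_per_watt(56, 342), 3))    # 0.164
print(round(fps_per_watt(79.5, 339), 3))  # 0.235
```

Note these are whole-system wall readings from the Kill A Watt, not the card alone, so the ratio compares total platform efficiency between resolutions.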

Now, not everyone can visualize what an FPS figure means in a game, so I try to help you with this.  In this next chapter I play a few games to show you how they actually run.
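One way to make an FPS figure concrete before watching the gameplay is to convert it to a frame time, the number of milliseconds each frame stays on screen.  A small sketch, using averages quoted earlier in this review:

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds per frame at a given average frame rate."""
    return 1000.0 / fps

# Averages from the benchmarks above: Ashes at 1440p, Metro at 1440p,
# the 60 FPS sweet spot, and Metro at 1080p
for fps in (27.0, 45.26, 60.0, 75.64):
    print(f"{fps:.2f} FPS -> {frame_time_ms(fps):.1f} ms per frame")
```

The bigger the gap between frames, the more noticeable any stutter becomes, which is why 60FPS (about 16.7 ms per frame) is the sweet spot mentioned throughout.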
