We are influencers and brand affiliates. This post contains affiliate links, most of which go to Amazon and are geo-affiliate links to the nearest Amazon store.
As a gamer, and for all my fellow gamers, we are bombarded with promises of upgrades that will change our lives, and for the most part they don't do much. SSDs and memory do help a ton, but the one thing that really gets us as gamers is the video card, and they are ALWAYS so expensive. Today I have the pleasure of bringing you my review of the Sapphire Radeon NITRO+ RX 480 8GB GDDR5 OC 11260-01-20G.
Ohh, I know… I want to tear it open too, but let's check out the specs first.
Specs and Features
- Core Speed: 1208MHz
- Boost Engine Clock: 1342MHz (seems to always run at this speed, though)
- 8192MB GDDR5 256-bit RAM
- Memory Clock: 2000MHz
- 8000MHz Effective Memory Frequency
- Compute Shaders: 2304
- Supports Crossfire
- 5 Output Maximum
- 1 x DVI-D
- 2 x HDMI 2.0b
- 2 x Display Port 1.4
- Resolutions Supported
- DVI-D
- 2560 x 1600 (60Hz)
- HDMI 2.0b
- 3840 x 2160 (60Hz)
- Display Port
- 3840 x 2160 (120Hz)
- Quad 4K Display Support
- Supported API’s:
- OpenGL 4.5
- OpenCL 2.0
- DirectX 12
- Shader Model 5.0
- Vulkan API
- Supported Features
- FreeSync Technology
- AMD Eyefinity
- Dual Bios
- AMD Liquid VR Technology
- AMD Virtual Super Resolution (VSR)
- AMD TrueAudio Next Technology
- AMD Xconnect Ready
- DirectX 12 Optimized
- Radeon VR Ready Premium
- HDR Ready
- Frame Rate Target Control
- NITRO Fan Check
- Dual-X Fans
- Dual Ball Bearings
- NITRO Quick Connect System
- NITRO Glow RGB LED
- NITRO CoolTech(NCT)
- NITRO Free Flow
- Dual-X Cooling
- NITRO Boost
- Intelligent Fan Control III
- Black Diamond Chokes
- Power Consumption: 225Watts
- System Requirements
- 500Watt Power Supply
- 1 x 8 Pin AUX Power Connector (Yes, only 1)
- Windows 10, 8.1, 8 or 7
- Form Factor
- Length: 9.45in
- Width: 4.9in
- Depth: 1.6in
- Shipping weight: 2.35 pounds
I know, I almost fell asleep too while typing all of that; too much info. OK, let's get into an unboxing and open this Sapphire Radeon NITRO+ RX 480 8GB GDDR5 OC box up.
So yeah, you can hook up to 5 monitors, but what kind of ports are we talking about?
As I mentioned in the long list of specs and features, we can connect via 2 x DisplayPort, 2 x HDMI and a single lonely DVI-D port. Before you make fun of the DVI-D port, I still have a 27in LED monitor that uses one and it works fine, so there is no reason to get rid of it. Thankfully Sapphire keeps that port around, alongside a host of other ports, so your existing investments are still worth something.
Another benefit of this port selection is that you may not need adapters to connect your monitors, which matters because, as you saw in the unboxing, Sapphire does not include any cables or adapters; you have to fend for yourself there.
Looking around the card, we find the “LED Mode Switch/V BIOS Switch”.
The LED Mode Switch on this Sapphire card allows you to change the LED lights on the card; I will show you that a little later in the review.
A little closer, under the button, we find the V BIOS Switch. This switch allows you to switch to a different BIOS, or firmware, on the card: one might hold an overclock while the other keeps a stock setting. This comes in handy when working on overclocks, because you might push one a little too high and leave the system hung. At that point you can unplug the machine to let it discharge, flick the switch to the other position, plug the machine back in and you are good to go.
Moving more to the right, we come to the back of the card.
Here we find a single 8-pin PCI-E plug, yup, just one. This card is not a power eater, but we will get into that a little later in the review with some benches.
Turning the card around again, we come to the bottom of the card itself, where we find the PCI-e connector.
We also find 2 huge pipes to keep things nice and cool.
The top of the card has an aluminum backplate with Sapphire's own nice design, but it's a little more than decoration; there's some function here as well, which I will get into a little later in the review.
Alright, so now that we have seen the card, how do we install it? Well, I go over that in the next chapter.
[nextpage title=”Installing the Sapphire Radeon NITRO+ RX 480″]
Alright, some of my more advanced readers might think it's a little dumb that I am showing them how to install a card, but this video is not for them. It is for those of you who don't know how to install a card, to save you some money and, I hope, to give you the confidence to install and upgrade your video card on your own. That savings might even be enough to put towards another card, who knows.
Anyway, let’s get to the install itself.
It's a little bright, I know, but I took the side panel off so that I could get a crystal clear picture of the system with the card inside. The Anidees AI Crystal has tinted glass side panels, so it is not as bright with the panel on, but I wanted to get rid of the glass's glare. She looks beautiful, right?
That was a small chapter, but I think it's important enough to stand on its own, and hopefully helpful.
OK, let’s get into some benches, I think it’s time.
[nextpage title=”Benchmarks, Performance, Temperatures and Power Consumption”]
Before we get into the benchmarks, here are my system specs:
- Anidees AI Crystal Case: https://geni.us/6NAIJBN?ygdb
- Intel Core i7 5930K Processor: https://geni.us/6NAIJBN?4C8Itd
- EVGA X99 Classified Motherboard: https://geni.us/6NAIJBN?E9eamo
- Arctic Liquid Freezer 240MM CPU Liquid Cooling: https://geni.us/6NAIJBN?vEaJAf
- Kingston HyperX Predator 3000Mhz 16Gig: https://geni.us/6NAIJBN?w9kPe5
- Sapphire Nitro RX 480 Video card: https://geni.us/6NAIJBN?zeF3
- Samsung 850 EVO 500GB SSD: https://geni.us/6NAIJBN?1gf0fs
- Hitachi 1TB SATA 3G HD: https://geni.us/6NAIJBN?pU2QOo
- Patriot Ignite 480GB SSD: https://geni.us/6NAIJBN?eoPVsG
- Kingston HyperX 240GB SSD: https://geni.us/6NAIJBN?8leEDW
- Plextor 256GB PCIE SSD: https://geni.us/6NAIJBN?gVBR
- Cooler Master Silent Pro Gold 1200W Power Supply: https://geni.us/6NAIJBN?Umwm
- Microsoft Windows 10 Professional: https://geni.us/6NAIJBN?GYbBRY
Here are the specs of the card displayed in TechPowerUp’s GPU-Z, just to give you a little more visual perspective of it.
Sapphire also has their own very handy utility, TRIXX 3.0. I will go into that software a little later in the review, because we will need it for a few things.
You can find Sapphire's TRIXX 3.0 here: http://www.sapphiretech.com/catapage_tech.asp?cataid=291&lang=eng
The Sapphire Radeon NITRO+ RX 480 8GB card is based on AMD's Ellesmere GPU, part of the Polaris family. It is built on a new 14nm FinFET process, improving performance and lowering the power requirements of AMD's cards.
I had to cheat a little here and show you some of TRIXX 3.0.
GPU-Z was not letting me show you the number of compute shaders, but here you go, 2304 of them.
As you can see from GPU-Z's readout, I am using AMD's video driver version 16.300.2511.1001, which is Crimson version 16.8.2. Since AMD released Crimson about a year ago, their driver performance has been greatly improving.
Before I start discussing the benchmarking process, I wanted to let you know that I also check the wattage draw of the entire system with the card installed. I test for minimum, average and max usage using the “Kill A Watt” by “P3 International”.
The programs I am using to benchmark are the following.
- FutureMark’s 3DMark Fire Strike
- Metro Last Light
- Thief
- Tomb Raider
- Ashes of the Singularity
- Tom Clancy’s The Division
Ashes of the Singularity and Tom Clancy's The Division are actually new additions to my benchmarking suite. So let's start benchmarking Sapphire's latest and greatest.
A very decent score and, as it states, better than 82% of other results; pretty cool. The lowest wattage reached while benchmarking was 226Watts, the average was 350Watts and the maximum pulled was 399Watts. Also, the GPU reached a max of 83°C; this was without TRIXX 3.0's fan adjustment utility, all stock.
3DMark states that the drivers are not approved because Crimson drivers are typically beta drivers, but I grabbed the latest and greatest.
Alright, let's get to some Metro Last Light.
Here are my Metro Last Light presets. For each benchmark I will present the presets, with only the resolution changing.
The only difference between “Preset 2”, “Preset 0” and “Preset 1” is changing the resolution between 2560×1440, 1920×1080 and 1280×1024. Below are the results from each.
The performance here looks a little rough, I know, but Metro Last Light is a card killer; it can take down the toughest of cards. I have been using this game to bench for a few years and not many cards can make it budge, though there have been a few. One thing to note about Metro Last Light, and all of the other games I benchmark here, is that they are set to max settings (Very High, High or Ultra, depending on how far the individual settings go). That being said, if Metro Last Light is one of your favorites, you can turn down just a little bit of eye candy and get the performance you want at as high a resolution as you want.
There is a 6.94% decrease in performance going from 1920×1080 to 2560×1440, though oddly enough the average power consumption also goes down 2.33%; usually it would go higher, especially since the temperature rose from 79°C to 81°C, a 2.5% difference.
Ok, let's see if THIEF can silently make its way into the arena and take the cake.
Thief gets much more playable at 2560×1440 at 58.3FPS, so close to the magical and sometimes mystical 60FPS. At 1920×1080 we see 82FPS, a 33.79% improvement over 2560×1440, and even the wattage difference between the two is impressive at 21.57% in favor of 1920×1080. Being only 1.7FPS shy of 60FPS, I would still play at 2560×1440 myself.
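As a quick aside on the math: the percentage figures in this review appear to be computed as the gap relative to the midpoint of the two results (a symmetric percent difference), rather than relative to either result on its own. Here is a minimal Python sketch of that, using the Thief numbers above:

```python
# Symmetric percent difference: the gap taken against the midpoint of the two values,
# which is how the improvement figures in this review appear to be calculated.

def pct_diff(a: float, b: float) -> float:
    return abs(a - b) / ((a + b) / 2) * 100

# Thief at 1920x1080 (82FPS) vs 2560x1440 (58.3FPS), quoted above
print(f"{pct_diff(82, 58.3):.1f}%")  # ~33.8%, in line with the 33.79% above (input rounding covers the rest)
```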
Lara might have a different opinion though, let's check with her.
Lara joins the Thief guild in playability as the FPS jumps up to 95.1FPS at 2560×1440, and it gets even better at 1920×1080, jumping to 155.4FPS, a 48.14% increase from dropping the resolution. Mind you, if your monitor can handle 2560×1440, I would choose that resolution since it is 100% playable at 95.1FPS, with only an 8°C heat increase and actually a 1 Watt decrease. Very impressive, though these results may need more research; I believe the benchmarking technology in Ashes of the Singularity may be able to measure this more accurately.
Ashes of the Singularity is a DX12 game, if you did not know, but there is a little trick to get it to run in DX12. In Steam, when you are about to double-click the game to start it, right-click on the title instead.
There you will see a drop-down where you will find “Launch DirectX 12 Version (Windows 10 Only)”; as it implies, this will only work in Windows 10. Once it is loaded, here are my settings.
Also, you know you are benching in DX12 when you get this message after clicking “Benchmark”.
Notice under “API:” it reads “DirectX 12”.
Now, this game HEAVILY utilizes the CPU. As it was benchmarking I watched it run, and it was mostly smooth, with a few tiny spots where it chugged slightly; you can actually see that in the graph above. The settings I have are incredibly high, and “Use Compressed Textures” might help performance as well, but I wanted to make this card sweat a bit.
2560×1440 shows a GPU performance of 18.4FPS and 1920×1080 shows 22.6FPS, but the CPU offsets this a tiny bit.
The performance difference GPU-wise was in favor of 1920×1080, which shows a 20.49% improvement over 2560×1440. Here is where the CPU took charge a bit: it performed better under 2560×1440, with a 3.32% improvement over 1920×1080. This is definitely a very interesting benchmark; we can see that the CPU might be a little more important than the GPU here, though if you lowered some of the settings under “Video” you would surely get better performance.
OK, let’s jump to Tom Clancy’s The Division. I bought this game to do benchmarking and found it pretty cool. Now I just need time to actually play it.
Here are the settings I defaulted to, afterwards changing only the resolution. There are a ton of settings.
The Division does show you how much CPU it utilizes, and you can see here that utilization is relatively low, yet the benchmarks are a bit surprising. I watch these benchmarks as they run to check for artifacting, and I saw none; everything felt smooth. From 2560×1440 to 1920×1080 there is a 32.94% improvement, simply from dropping the resolution. The game did play very well, though there were a few spots where it chugged a tiny bit.
It is difficult to describe what I saw beyond what I already have, so I think it might be best to simply show you exactly what I found.
This next chapter will show you some game play.
[nextpage title=”Gameplay and Performance”]
There is nothing better than seeing things first hand, so let's check out some gameplay; this also gives me a great excuse to play some games.
Grand Theft Auto V at Ultra (though I did change settings midway through the video; I show you everything)
OK, not bad, but since I did have this card during the open beta, let's check out some Battlefield 1 beta gameplay.
OK, I really like this game, so here is a little more.
And well, since I do still love Battlefield 4, here is some Battlefield 4 gameplay.
And here is Tom Clancy's The Division.
You can see the frame rates did not get into the 60’s, but it still ran very well.
OK, now that we have all that, we can come to a conclusion of the card… well not really. Let’s benchmark the card itself overclocked.
[nextpage title=”Overclocking Performance, Benchmarks, Temperatures and Power Consumption”]
OK, so before I provide the results, let me show you how I got there. First off, this is not a quick thing; to overclock correctly you have to spend a few hours and have some patience, plus paper and a pen, because you will be there for a while recording your previous attempts.
Below I will list before and after benchmark results, of course, along with reports from Sapphire's TRIXX 3.0 and GPU-Z.
Before After
So with this overclock I was able to gain 79MHz on the GPU clock and 30MHz on the memory clock. With that, I was able to raise my bandwidth, pixel fillrate and texture fillrate. I didn't spend a ton of hours doing this since I had to get this review out, but it looks like I could have squeezed out some more performance.
OK, so let’s check it out under TRIXX 3.0 itself.
Before
After
I modified the GPU clock
Then also raised the “Power Limit”
Also the “GPU Voltage”
The “Memory Clock”
And finally the “Current Fan speed”
This program does more than just overclock the card, but I will get into that a little later in the review. Let’s get into the comparisons.
So with this overclock you can see that performance improved 4.98%, from the stock score of 11,517 to its overclocked counterpart at 12,105. Of course, power usage increased from a 350 Watt average to a 388 Watt average, a 10.30% increase. With that said, since I now had control of the fans, the temperature on the overclock was actually lower, 73°C specifically. The stock-clock temperature was 83°C, so the drop to 73°C is a 12.82% improvement, though of course it was a little louder.
OK, let’s jump to Metro Last Light.
1280×1024
From this overclock we can see the average frame rate increased from 47.08 to 48.44, a 2.85% increase in performance. With the increase came a decrease in power consumption of 6.75% and a cooling improvement of 9.40%. You might wonder why average power consumption decreased: if the card is cooler it draws less power, since heat increases power consumption, so this is a great example of why cooling is so important. Now onto 1920×1080.
1920×1080
This overclock provides an average frame rate increase of 3.62%, an increase in performance from 33.11FPS to 34.33FPS. As we saw before, the increase came with a tiny decrease in power consumption of 0.77% and a cooling improvement of 10.67%. Now onto 2560×1440.
There was a 10°C improvement in cooling thanks to the custom fan profile, though wattage was higher on both average and max; you can't win them all. As for FPS, there was a measly 3.84% improvement from the base clocks to the OC, from 20.97FPS to 21.79FPS.
Metro Last Light shows a very small improvement across the board in terms of FPS. I have seen this benchmark bring even very powerful cards to their knees; aside from being a game that requires a lot of resources, I think it's also a very poorly optimized one. It is a great game to keep in my suite though; who knows when a card will move the needle again.
Next up is Thief.
1280×1024
While performance is slightly better on average here with the extra OC at 92.4FPS, up from 92.1FPS on stock, the temperature went up a bit too, from 68°C to 70°C. Power usage went up from 331 Watts to 376 Watts, a 12.73% increase. Thankfully this is only 1280×1024, and there still was an FPS improvement; hopefully no one is using that resolution anymore, but it can happen. OK, let's check out 1920×1080 and hope for a better improvement there.
1920×1080
Performance nudged itself up just a bit, to 84.4FPS from 82FPS, a 2.88% increase. The temps went up 2.74% and the average power consumption went up as well, by 8.43%. Another nice increase was the minimum frame rate; minimums are never fun to look at, but rising to 56.3FPS from 45.6FPS shows that even when the game dips, it is 100% playable, very nice. Now let's take a look at 2560×1440.
2560×1440
While not a huge improvement, the extra overclock allows Thief to be played over the magical 60FPS mark at 61.5FPS, a 5.34% improvement. The temperatures stayed the same, but average power consumption went up 32.92%, a 119 Watt increase.
Thief showed more improvement than Metro Last Light did, but what will the overclock do for Tomb Raider? Let's have a chat with Lara.
1280×1024
Not sure an overclock is really needed here, but would you say no to extra performance? I think not.
On average FPS we can see a 2.59% improvement, a 5.5FPS gain, but once again the power monster strikes. From the base clock speeds, average power consumption went from 380 Watts to 405 Watts, an 11.49% increase, even though cooling improved by 6.80%. Again, this is 1280×1024, a resolution I would think not many use nowadays, but it is still worth mentioning. Let's jump to a more commonly used resolution, 1920×1080.
1920×1080
I will tell you, since I did increase the wattage some, it looks like the OC really does need it. Before I get into the performance, we can see that average power consumption went up from 360 Watts to 406 Watts, a 46 Watt increase in power draw, but the heat only went up 2°C. Performance also increased on average, from 155.4FPS to 163.5FPS, a 5.08% increase. Let's see how much we gain at 2560×1440.
2560×1440
Not that the game needed any more FPS thrown at it (I threw up in my mouth a little writing that), but it still benefited from the OC. At stock clocks it scored a nice 95.1FPS, but under the OC it rose a bit to 103.1FPS, an 8.07% increase. The temperature stayed the same between the two, but the average wattage jumped up 17.53%, from 359 Watts to 428 Watts. Let's check out Ashes of the Singularity.
1280×1024
Here we can see a slight improvement from 28.7FPS to 30.2FPS, a 5.09% increase in performance. The temperature did drop from 75°C to 73°C, a 2.7% improvement, but the power comes back to haunt us: a 13% increase in power draw, from the 353 Watt average on the stock clocks to the 405 Watt average on the extra OC clock. It is understandable that the wattage would rise, since we have increased voltages on the GPU, but I didn't realize how significant those increases would be. Mind you, it's a small increase in absolute terms; it just looks relatively large because we are dealing with such low wattage overall, all below 500 Watts, but it is worth mentioning.
Let’s jump to 1920×1080.
1920×1080
As we saw before, there is a slight increase in FPS between the stock clocks and the extra OC clocks; to be more precise, a 4.76% increase, from 22.6FPS to 23.7FPS. With that improvement came a 5.33% temperature dip, from 77°C to 73°C, and an 8.63% rise in wattage consumption; the overclock ended up consuming 32 Watts more than the base clock speeds. Now let's check out the difference at 2560×1440.
2560×1440
Only a fraction of an FPS was gained here, and the percentage somehow sounds more meaningful than the actual FPS: a 2.15% increase, but really just 18.4 to 18.8, or 0.4FPS. That 0.4 actually consumed less power on average though, 1.68% less. Best of all, the card was cooled a little better too, 2.70% better, 75°C versus the overclock's 73°C.
Let’s go over to Tom Clancy’s The Division.
1280×1024
Here we can see the overclock actually did help quite a bit. It took what was almost 60FPS up to 71.7FPS, a very nice 21.47% increase from 57.8FPS to 71.7FPS, while cooling improved 44.67%. With the added performance came a slight hit in power consumption: we were at 372 Watts on average at stock, but with the overclock we jumped up to 380 Watts, a 2.13% increase, though at least we can see where the power went. Let's go to a much more realistic resolution.
1920×1080
Another perfect example of overclocking doing what it's supposed to do. Here we can see the stock speeds originally gave 49.5FPS, decent but with an occasional noticeable chug, while on the OC we jumped all the way to 58FPS, a 15.81% improvement; there was also a 4.08% improvement in cooling, 75°C to 72°C. Following the previous examples, average wattage consumption was raised 6.65%, from 378 to 404 Watts. 2560×1440 is next, my preferred resolution.
2560×1440
I thought we were seeing a trend here, but while the improvement in performance was not significant, there was still a slight gain. We see a meager 3.32% improvement, and even though it was not at 60FPS, it was still very playable, as you can see in the video I posted previously and have also linked here.
I would prefer to hit 60FPS, but as you saw in the video, if you turn a few things off, it does not affect too much of the visual quality but it does improve gameplay significantly.
So let’s jump back into how we were able to obtain these overclocks and a few other features as well.
[nextpage title=”TRIXX Overclocking and Card Utility”]
As I mentioned before, I used TRIXX 3.0, an in-house utility Sapphire created for overclocking, though it can do much more than just overclock. For those of you learning, I will quickly breeze over it.
The way I recommend doing it is to start from the bottom, at stock; you don't know where you're going unless you know where you've been. Please take note that this is my approach; everyone has their own approach to overclocking.
So the rule with overclocking here is the same as overclocking anywhere: you have to keep a log of all your work and benchmark results.
To start, raise your GPU Clock 5MHz, click “Apply”, then run a round of 3DMark and a benchmark of one of your most graphically intense games; for me that game is Metro Last Light, as it catches almost everything wrong with an overclock.
When you get the results, write them down next to the settings you recorded, then raise the clock 5 more MHz and repeat the process. If during your testing the machine freezes or you notice artifacting or tearing (spots appearing on the screen, missing or stuck textures), it's time to raise the “Power Limit” bar, and if that fails, also raise the GPU Voltage and test again. For every change you make, be sure to click “Apply” and write it down.
Be careful with the GPU Voltage: raise it too much and you can potentially damage the card, and at the very least you will eat up a lot more power needlessly.
As you are raising the GPU Clock, you will also want to work on the Fan Speed. I would recommend clicking “Custom”, which will open up the “Custom Fan Speed” section where you can raise/lower the fan bar.
Once you have reached a stable GPU speed and achieved adequate cooling, I recommend you start working the same process on the Memory Clock.
I would also recommend saving your work; you can do this with the profiles.
You can click on any one of those numbered profiles and click “Save” to store the settings.
You will want to use the same methods for Memory Clock that I mentioned for GPU Clock.
Here you will raise the “Memory Clock” slider
And like before, click “Apply” to apply the settings.
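To tie the whole raise-test-record cycle together, here is a rough sketch of the logic in Python. To be clear, TRIXX 3.0 has no scripting interface that I am aware of, so run_benchmarks() and the other names below are purely hypothetical stand-ins for the manual clicks and the notes on your paper; the sketch only illustrates the method, it does not actually touch the card.

```python
# Hypothetical sketch of the raise-test-record overclocking loop described above.
# run_benchmarks() stands in for the manual pass: click Apply, run 3DMark plus a
# demanding game (Metro Last Light for me), and watch for freezes, artifacting or tearing.

from dataclasses import dataclass

@dataclass
class Attempt:
    gpu_clock_mhz: int
    power_limit_pct: int
    stable: bool

def run_benchmarks(gpu_clock_mhz: int, power_limit_pct: int) -> bool:
    # Stand-in for the manual testing pass; answer honestly.
    answer = input(f"Tested {gpu_clock_mhz}MHz at +{power_limit_pct}% power limit. Stable? (y/n) ")
    return answer.strip().lower() == "y"

def find_stable_gpu_clock(base_clock: int = 1342, step: int = 5, max_power_limit: int = 50):
    log: list[Attempt] = []            # this is the paper and pen
    clock, power_limit = base_clock, 0
    while True:
        candidate = clock + step
        stable = run_benchmarks(candidate, power_limit)
        log.append(Attempt(candidate, power_limit, stable))
        if stable:
            clock = candidate          # keep the gain, go up another 5MHz
        elif power_limit < max_power_limit:
            power_limit += 10          # first remedy: raise the Power Limit and retest
        else:
            break                      # next step would be GPU Voltage (carefully); the sketch stops here
    return clock, log

if __name__ == "__main__":
    best, history = find_stable_gpu_clock()
    print(f"Highest stable GPU clock this session: {best}MHz over {len(history)} attempts")
```

In practice you are the loop: nudge, Apply, test, write it down, repeat.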
This is what I came up with; let me know what you get.
One thing to mention about overclocking with Sapphire's TRIXX 3.0: if or when you restart your computer for any reason, the saved “GPU Voltage” will at times appear as “0” (zero), as will the “Power Limit” setting. I am assuming this occurs because it is still a beta; it's not perfect yet, but it is good, and this is just a quirk. In that case, just click on the “Profile” you saved your overclock to and click “Apply”, and this will set everything back to how you last saved it.
So aside from the ability to overclock, TRIXX gives you a few more features, which I will list here.
FanCheck: a utility built into Sapphire's TRIXX 3.0 that lets you check the life of your fans, which is handy since you are actually able to easily replace these fans.
This will individually check each fan
And when complete, it will let you know the status of your fans.
Sapphire TRIXX 3.0 also gives you a nice feature called NITRO Glow.
Nitro Glow allows you to change the LED lights on your card. It's pretty cool actually, but neither of these features really says much unless you actually see them work, so I recorded a video for you. Check it out.
A few more features of this software can be found under the “Settings” button itself.
Settings allows you to show the effective memory clock, synchronize CrossFire cards, set clocks on change, save fan settings with the profile and disable ULPS (Ultra Low Power State, a sleep state that lowers the frequencies and voltages of primary and non-primary cards to save power; it can also cause instabilities with CrossFire and single-card configurations). The other settings let you “Load on Windows Startup”, start TRIXX minimized and restore clocks… maybe I should enable that last one?
There is also “Graphics Card Info”, which shows you information about the card. Most of it is static, though of course if anything changes it updates; for example, when I overclocked, it showed the new clock speeds.
Here are the stock settings:
Here are the overclocked settings:
This also allows you to “Save the BIOS”, so you can store it for your own purposes, give it to a friend, share it with the community, or perhaps adjust it with another piece of software and reflash it back to the card.
Hardware Monitor allows you to see all of the card's vitals in real time as you are overclocking and running through games and benchmarks. Here you can see where the voltages are if a benchmark fails and make adjustments, or if the voltages are needlessly high, you can dial them back based on what you see here as well.
And “Log Now” allows you to create a log file to save your sensor metrics to. Its output is similar to that of GPU-Z.
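If you do save a log, a few lines of Python can summarize a sensor column for you. The file name and column header below are assumptions on my part, so open your own log first and adjust them to match whatever TRIXX (or GPU-Z) actually writes:

```python
import csv
from statistics import mean

# Assumed file name and column header; check your own log and adjust to match.
LOG_FILE = "trixx_sensor_log.csv"
TEMP_COLUMN = "GPU Temperature [C]"

def summarize(path: str, column: str):
    # Pull min / average / max for one sensor column out of a CSV-style log.
    with open(path, newline="") as f:
        values = [float(row[column]) for row in csv.DictReader(f) if row.get(column, "").strip()]
    return min(values), mean(values), max(values)

if __name__ == "__main__":
    low, avg, high = summarize(LOG_FILE, TEMP_COLUMN)
    print(f"GPU temperature: min {low:.0f}C, avg {avg:.0f}C, max {high:.0f}C")
```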
Benchmarking builds up heat, so I used thermal imaging to see how each part of the card reacted. I used the Seek Compact thermal camera, which I will be reviewing soon, to get this information.
This is the top of the card; the center is the GPU. You can see from the legend on the left that the white parts are the hottest, reaching 56°C while running 3DMark. The lowest temperature of 9°C was only along the sides of the card, closest to the bottom, away from the GPU. Aside from the GPU itself, the rear of the card was the hottest point.
The bottom of the card shows the centers of the fans themselves as the hottest spots, which makes sense; that's where the ball bearings constantly generate friction, along with the fan motor itself.
With the card close to idle for a few seconds, we find the temps dropped 15°C, and while the image looks almost the same, the legend updates accordingly to show the correct temperatures. The hottest spot was 41°C, at the center of the GPU and at the rear, one of the exhaust points. You may have noticed that the card looks more red than in the previous image, but again, it's all relative to the legend; focus on the legend.
Seek's Compact thermal camera is amazingly handy; the one I have attaches to an Android smartphone, but they have a version that attaches to iPhones as well. You can click on the links to check them out: Android / iPhone.
The bottom of the card shows much cooler, of course, because this is where the fans are. A max of 36°C can be seen here, where the fan motors are, and the cooler parts, at 23°C, can be found towards the outer edges of the card itself.
While the thermal images look hot, the temperatures are put in context by the legends on the left-hand side, so you can see they are actually quite low.
This next image is borrowed from Sapphire's website.
It shows a little of how the vents on the sides of the card work. I don't entirely agree with the rear of the card venting hot air into the case, where it somehow cools and gets recycled back into the card. It seems to do the job well enough, however, and the top of the card venting into the portion of the case that usually has some sort of exhaust seems like a great idea from our friends at Sapphire.
You can see in the thermal image below where the vents are actually doing their job; I have it circled in blue.
So with all of that, what do you think of the card and its accessories? Well, I will let you know in the next chapter, “Final Thoughts and Conclusions”.
[nextpage title=”Final Thoughts and Conclusions”]
Let’s check out the Pros and Cons of this card and its contents.
Pros
- Tons of ports to fit almost any monitor
- Supports 4 x 4K displays
- FreeSync Support
- Supports DX12
- Fan Check is a nice touch
- Ability to easily replace the fans
- 0DB Fan mode
- Quick Connect Fan Replacement
- Aluminum Backplate (For Better Cooling)
- NITRO Glow RGB is a nice little toy
- LED BIOS Mode Switch in case you don’t want to install Sapphire TRIXX 3.0.
- Great GPU/Memory Speeds
- Relatively affordable for the speeds provided
- Sapphire TRIXX 3.0 is a nice Utility
- Dual BIOS support
Cons
- Does not include any adapters or adapter cables
- Could be important for people that have 2 x DVI Monitors, 3 x HDMI Monitors, 3 x DP monitors, etc…
- While a cute touch, the RGB on the cards seems a little gimmicky, though some people like that. Everyone is on that bandwagon.
This card is packed full of features and some nice accessories, though they are virtual accessories, software packages that you can get on a CD or just download from http://www.sapphiretech.com/Default.asp?lang=eng; there are no actual adapters, cables or things of that nature. One can say they are not really needed since all the outputs are available on the card, but what if you have 2 x DVI monitors, or 3 x HDMI monitors, etc.? Then they might be needed.
The card stays cool and quiet, though you can raise the fan speed and even the card's clocks to better suit your gaming or professional needs. It cannot handle the highest-end games with everything maxed out at 60FPS or higher, but for a relatively affordable card it does a good job; maybe I didn't overclock it enough. It surely has plenty of room to overclock more.
The card is priced amazingly well, and the price can only go down, which is the best way for a price to go. There is little bad here that can't easily be fixed; as for the missing adapters, you may even have some lying around. That being said, if you don't, you can find yourself in a predicament where you need to go to a store and buy one for around $50 extra, or buy one cheap online… and wait.
Because of all of this, I have no choice but to give this card a 5 out of 5, Editor's Choice. With this performance at this price, you just can't do any better.
Great job Sapphire.
I have spent many years in the boutique PC space as a Product Development Engineer for Alienware, later Dell through Alienware's acquisition, and finally Velocity Micro. During those years I spent my time developing new configurations, products and technologies with companies such as AMD, Asus, Intel, Microsoft, NVIDIA and more. The arts, gaming, and new and old technologies drive my interests and passion. These days my day job is as an IT Manager, but I do reviews on my own time and my own dime.