Benchmarks, Performance, Temperatures and Power Consumption
Contents
We are influencers and brand affiliates. This post contains affiliate links, most which go to Amazon and are Geo-Affiliate links to nearest Amazon store.
So that you can compare the results with your own system, below you can find a list of all the components I am using in this review.
- Viotek GN35DR 35″: http://geni.us/6LrGJ
- Asus ROG Strix XG438Q: https://geni.us/BBMh3P
- Intel Core i9 9900K Processor: https://geni.us/0PrCbaY
- Arctic Liquid Freezer II 360: https://geni.us/CmNtJA
- EVGA Z390 Dark Motherboard: https://geni.us/UmV6tf
- ASRock Challenger D Radeon RX 5500 XT: https://geni.us/TvdErm
- WD Black SN750 1TB: https://geni.us/8sqXFs
- Montech Air 900 ARGB White Mid-Tower PC Chassis: https://geni.us/kcq3
- Patriot Viper Gaming RGB DDR4 DRAM 3200MHz 16GB Kit: https://geni.us/URlI8
- Samsung 850 EVO 500GB SSD: https://geni.us/QTA2
- WD Black PCI-e NVMe 512GB SSD: https://geni.us/mMNLde
- Windows 10 Professional: https://geni.us/TbOpq
- EVGA SuperNOVA 1000 G+ Gold Power Supply: https://geni.us/IVoE
Here are the specs on the ASRock Challenger D Radeon RX 5500 XT brought to you by GPU-Z.
You can see here that I used the AMD Adrenalin 20.4.2 driver set. I had originally finished the review using 20.4.1, but I was testing on the Viotek GN35DR because my previous 4K monitor had died. The Viotek GN35DR is a great monitor, but it is not a 4K monitor, so I bought the Asus ROG Strix XG438Q, a 4K monitor that also offers FreeSync 2 and 120 Hz… so I had to re-review EVERYTHING.
I used GPU-Z to gather temperatures of the card. Since the 5 series was released, AMD has added a new temperature sensor reading named HotSpot. I will include both readings in my findings.
I’ve read different theories online and even received information from a board partner on what the HotSpot temperature was, but I wanted something directly from AMD. When I asked AMD about the HotSpot, they provided this information.
The junction temperature on modern AMD Radeon graphics cards such as the RX 5000 Series use a sophisticated network of on-die sensors to accurately report the hottest spot across the entire GPU die. Hence, the Junction temperature is also known as the ‘Hotspot’ temperature as Iggy from ThisBytesforYou is asking. Edge temperatures are usually reflective of the average temperature around the edge of the die.
The “GPU Hotspot” readout in GPU-Z is reflective of the junction temperature. The “GPU Temp” reflects the Edge temperature of the die. Older graphics cards that feature a single temperature sensor report this ‘edge’ value shown in GPU-Z. However, we continue to allow applications to read and publish this legacy ‘edge temperature’ to provide a fair comparison point vs. older Radeon and competitive GPU’s that don’t have the ability to report (and act on) the Junction or Hotspot temperature.
AMD Radeon graphics cards utilize the junction temperature to continually optimize gaming, thermal and acoustic performance in real time, enabling higher levels of performance and efficiency over basing these optimizations on ‘worst case’ edge temperature alone. The maximum safe operating junction temperature for the current lineup of 7nm Radeon graphics products is 110C.
I hope this helps clear the air a little on exactly what it is; I know it did for me. Thank you, AMD. So, if this HotSpot (junction) temperature has you a little worried, you can relax a little and just focus on the GPU (edge) temperature.
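To make the two readings concrete, here is a small, hedged Python sketch of the relationship AMD describes. The sample values are illustrative placeholders, not live sensor reads: the Hotspot (junction) figure runs above the edge figure, and what matters is how much headroom is left before the 110°C junction ceiling AMD quotes.

```python
# Illustrative sketch only: GPU-Z exposes "GPU Temp" (edge) and
# "GPU Hotspot" (junction). These sample values are assumptions.
JUNCTION_LIMIT_C = 110  # max safe junction temp AMD quotes for 7nm Radeon parts

def junction_headroom_c(junction_c: float) -> float:
    """Degrees Celsius of headroom before the junction ceiling."""
    return JUNCTION_LIMIT_C - junction_c

edge_c = 64      # "GPU Temp" reading (edge of the die)
junction_c = 82  # "GPU Hotspot" reading (hottest on-die sensor)

print(f"Edge {edge_c}C, Hotspot {junction_c}C, "
      f"headroom {junction_headroom_c(junction_c)}C")
```

The gap between the two readings is normal; it is only when the junction value approaches 110°C that the card would start throttling.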
To measure wattage I use the Kill A Watt by P3 International; it works great and is very affordable. In my benchmarks, when you see power consumed, this is how much power the entire system is consuming, including the video card, not just the video card by itself.
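Since the Kill A Watt reports whole-system draw at the wall, one rough, hedged way to ballpark the GPU's share is to subtract an idle baseline and scale by an assumed PSU efficiency. This is a sketch with made-up placeholder numbers, not data from this test bench:

```python
# Rough estimate only: wall power minus an idle baseline, scaled by an
# assumed PSU efficiency. All values here are hypothetical placeholders.
def approx_gpu_watts(load_w: float, idle_w: float, psu_eff: float = 0.90) -> float:
    """Crude GPU-only estimate from two wall readings."""
    return (load_w - idle_w) * psu_eff

# e.g. a hypothetical 218 W load reading against an assumed 95 W idle
print(round(approx_gpu_watts(218, 95), 1))
```

Treat the efficiency factor as a guess; without a meter on the GPU's power rails this is only a ballpark, which is why the charts report whole-system numbers instead.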
Also worth mentioning: the ambient temperature in my office is 66°F/18.9°C.
Here are the games and programs I use for benchmarking.
- FutureMark’s 3DMark Fire Strike
- FutureMark’s TimeSpy
- Metro Exodus
- Assassin’s Creed Odyssey
- Shadow of the Tomb Raider
- Far Cry 5
- Tom Clancy’s Ghost Recon Wildlands
Alright, let’s get started.
The overall 3DMark FireStrike 1.1 score was 12,477, and the hottest the card reached was 64°C on the GPU and 82°C at the HotSpot while consuming 218 watts on average. The card did pretty well.
This is a budget card, and it still scored better than 66% of all other results while keeping a decent temperature. While it holds its own here, let’s see how this card handles DX12 in TimeSpy, since it does tout DX12 support.
TimeSpy, while a little rougher on cards, fared pretty well here. The ASRock Challenger D Radeon RX 5500 XT is not meant for 2K or 4K, yet it pumped out enough to score better than 35% of all other results. The 5500 XT is meant to be a decently priced powerhouse for 1080p, and with that it earned an overall 3DMark score of 5,185.
The card kept its cool at 65°C on the GPU while this packed system consumed 228 watts on average.
Enough with synthetics though, let’s get to some gaming.
For all games in this review, I keep the settings the same, changing only the resolution. I raise the settings to Ultra or the highest possible changing the resolutions to 1920×1080, 2560×1440 and finally 3840×2160.
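For reference, the resolution-to-resolution comparisons in the results are plain percent-change calculations. A minimal sketch, with illustrative FPS values rather than figures from this review:

```python
def pct_change(new: float, old: float) -> float:
    """Percent change from old to new; positive means an increase."""
    return (new - old) / old * 100.0

# Illustrative only: e.g. 48 FPS at one resolution vs 60 FPS at a lower one
print(pct_change(60.0, 48.0))  # 25.0
```

The same helper works for power and temperature deltas; a negative result means the new reading came in lower.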
At 3840×2160, Metro is a slide show. The Challenger D 5500 XT came in at 19.06 FPS, with the system consuming 243 watts on average while keeping a cool 68°C. At 2560×1440 it came in at 30.75 FPS at 67°C, drawing only 240 watts on average: 1.24% lower power consumption while delivering over 60% more frames per second, and here is where you start seeing the benefits of FreeSync. At 1920×1080 there is a further 26% improvement, going up by 8 frames per second, but FPS-wise it is still unplayable.
Prior to having the Asus ROG Strix XG438Q, a FreeSync 2 capable monitor, this would have been totally unplayable, but with FreeSync 2 even unplayable FPS becomes much more tolerable. While Metro itself is a GPU slayer all the way up the chain, FreeSync’s benefits become much more apparent as we progress through this review.
Continue with me on this adventure as we jump over to Shadow of the Tomb Raider and see what Lara has in store for us.
Again, I keep the same settings here, changing only the resolution.
At 3840×2160 it came in at 21 frames per second, which on paper is 100% unplayable, running at 69°C and consuming 219 watts on average. Watching this benchmark told a different story: it was not a flip book. It looked close to 60 FPS and was very smooth, with only slight stuttering.
At 2560×1440 we reached 43 FPS, just over double the 21 FPS of 3840×2160, at a cooler 67°C, a 2.94% improvement, though the card now ate up 228 watts. With the card able to manage its resources a bit better, it consumes more power, in this case 4.11% more. Here FreeSync again paints a different picture than raw FPS alone can convey: 43 FPS with FreeSync 2 is 100% playable, and no stutters were observed.
At 1920×1080, even with FreeSync off, we come in at a 100% playable 66 FPS. Here the card gobbled up 232 watts on average while keeping a nice, cool 66°C, 1.50% lower than at 2560×1440 and 4.35% better than at 4K.
Lara showed us the benefits of a decent card and opened our eyes a little to FreeSync, but will Assassin’s Creed Odyssey show us the same?
Assassin’s Creed Odyssey, like Metro, is a GPU killer. At 3840×2160 it came in at a totally unplayable 13 FPS; not even FreeSync could save it. While it did not look quite as bad as 13 FPS sounds, it was bad. At this resolution the system sucked up 238 watts on average while the card kept a nice, cool 66°C.
At 2560×1440 there was a sizable jump over 4K, to 21 frames per second. While not optimal, you could squeeze in some gameplay here, but I would not recommend it. The card did run 1°C hotter than at 4K and consumed 10 more watts on average. 2K here was better, but not ideal.
At 1920×1080 the Challenger D came in at 46 frames per second. FreeSync again saves the day here, making it very playable, but numbers don’t lie: it was not perfect. The card did come out 3°C cooler than at 2K but took power consumption to the next level, drawing 11.41% more than at 2K. For this game, and for Metro, I am thinking of keeping the settings at High instead of Ultra. Let me know what you think in the comments.
Coming from a bleak future, let’s come down to a simpler time with Far Cry 5.
Far Cry 5 was night and day in performance. At 4K we came in at 28 frames per second, and yet with FreeSync it was totally playable; if there was a hiccup, I did not notice it. Coming in at a chilly 67°C, the card nibbled away at 252 watts on average.
Stepping down to 2560×1440, we came in at a very respectable 55 FPS, nearly double the 4K performance, and even 1°C cooler. The card did consume 2.38% more power than at 4K, coming in at 258 watts on average. At 1920×1080 the card jumped again, to 81 FPS. It once more dropped by 1°C, and while power consumption rose by 2.99% over 2K, with that performance it is forgivable.
Down south with Far Cry 5 we saw how performance fared, but let’s see what going further south, to Bolivia in Tom Clancy’s Ghost Recon Wildlands, brings us.
While not as harsh, Wildlands can still beat up a GPU. At 3840×2160 the card on paper chugged along at 19.2 frames per second; on screen, however, it ran a lot more smoothly than that sounds. It was not 100% smooth, and a few pauses were noticed, but with FreeSync it felt like anything but 19.2 FPS. Here the GPU kept a cool 67°C, consuming 237 watts on average.
At 2560×1440 there was an 84.38% improvement over 4K, coming in at 35.4 frames per second, and while you and I would assume that was not playable, there were no noticeable skips or lags; it was very smooth. At 2K, power consumption was only up 1 watt over 4K, and the card actually came in 4°C cooler.
At 1920×1080 we came in at 40.57 FPS, a 14.60% improvement over 2K. Here it actually consumed the same amount of power as 2K did and was 2°C cooler, a nice trade-off. As you might have guessed by now, while 40.57 FPS is not ideal, the performance was 100% smooth. AMD really hit the nail on the head with FreeSync and getting all you can from your card, in this case the ASRock Challenger D Radeon RX 5500 XT 8GB OC.
So I have been talking up the wonders of FreeSync, though you may not have it yet; I get it. This is a 1080p card, and many of you play at 1920×1080, so I thought it would be best to show you first hand how well this card handles 1920×1080. In the next chapter, Gameplay and Performance, we go over some gameplay to see how well it does.
Continue To: GamePlay and Performance
I have spent many years in the PC boutique name space as Product Development Engineer for Alienware and later Dell through Alienware’s acquisition and finally Velocity Micro. During these years I spent my time developing new configurations, products and technologies with companies such as AMD, Asus, Intel, Microsoft, NVIDIA and more. The Arts, Gaming, New & Old technologies drive my interests and passion. Now as my day job, I am an IT Manager but doing reviews on my time and my dime.