Asus Strix Radeon R9 Fury
July 14, 2015
₹48,500 + Taxes (MSRP)
Fury. A rather interesting choice of name by AMD for its new flagship GPU series. We wonder whether the fury is over being caught napping by NVIDIA's Maxwell architecture, whether it simply ties in with the red that represents all AMD GPUs, or whether it's a hint at the card's operating temperatures. Asus sent us a sample of the Strix Radeon R9 Fury featuring its new DirectCU III cooling design, and we find out whether the Red team can finally beat the Green team with its new parts.
Specification of the Strix Radeon R9 Fury
The card has a ROP count of 64 and comes with 4GB of Hynix HBM as the VRAM. The core clock is set to 1000MHz, while the VRAM is clocked at 500MHz, which is an effective 1000MHz thanks to double data rate. The stream processor count is cut down from the Fury X's 4096 to 3584, though both cards carry the same 4GB of VRAM. The GPU is the new Fiji chip based on the GCN 1.2 architecture, which, unlike most of AMD's 3xx line-up, is genuinely new silicon.
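Those memory figures look low next to GDDR5 clocks, but HBM makes up for it with an enormously wide interface: Fiji's published specs list a 4096-bit bus (a figure assumed here from AMD's spec sheet, not measured by us). A quick back-of-the-envelope calculation shows why the effective 1000MHz is nothing to worry about:

```python
# Back-of-the-envelope HBM bandwidth for Fiji. The 4096-bit bus
# width is assumed from AMD's published specs; 500MHz at double
# data rate gives an effective 1000MHz, i.e. 1 Gbps per pin.
bus_width_bits = 4096
effective_clock_hz = 1000 * 10**6

bandwidth_gbps = bus_width_bits * effective_clock_hz / 8 / 10**9
print(bandwidth_gbps)  # -> 512.0 GB/s
```

For comparison, a 256-bit GDDR5 card would need a 16Gbps effective data rate to match that figure, which is why the low clock is a non-issue.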
The rear panel of the card sports a DVI-D port, an HDMI port and three DisplayPort connectors. Eyefinity allows all of them to be used at once in a multi-monitor setup. CrossFire is handled through the motherboard itself and no bridge connection is required.
Build Quality and Packaging
Asus does a good job of packaging the GPU in layers of Styrofoam. The outer box is quite standard, with a Strix-branded black cardboard box inside. Opening this reveals another, smaller black box containing the manual, driver disc, power connector adapter cables and a Strix sticker to show off your purchase. Below this is a slim Styrofoam layer that you remove to reveal the GPU safely packed in its anti-static cover.
The card boasts a custom heatsink based on the DirectCU III cooler design, with twin 10mm heatpipes that carry heat from the GPU die to the large cooling fin array. The fins are cooled by a trio of 0dB Wingblade fans that only spin up when the temperature crosses the 60-68°C range. AMD's stock Fury X comes with a liquid cooler, so an air-cooled design is quite an interesting proposition.
Asus have moved their manufacturing to a new process called Auto-Extreme, which automates the entire assembly line and allows for much tighter quality control. Combined with the high-quality components used on the card, this goes a long way towards premium build quality, better overclocking headroom and fewer faulty cards.
Asus have thoughtfully angled the edge of the cooler near the screw holes so that tool-less mounting mechanisms can be used with ease. In a first for the Strix series, the GPU comes with a glowing red side panel carrying the Strix logo in white. The cooler's colour scheme of black, red and grey looks more like older Asus cards than the owl-like appearance of earlier Strix cards. The top of the PCB also has a black covering with red highlights near the GPU chip. Asus have also provided lights atop the recessed dual 8-pin power connectors to indicate whether they are plugged in properly.
The cooling vents at the back are rather small and rectangular. HBM, or High Bandwidth Memory, is a new technology that offers greater performance per watt than GDDR5, which AMD says translates into 50% power savings. However, it is currently limited to just 4GB, which is what we get with this card. The card measures 30cm × 13.7cm × 4cm, which can make for a tight fit in some small cases.
We tested the Strix Radeon R9 Fury not only for the average frames per second (FPS) but also for the 99th percentile frame time, which tells us how the GPU performs within each second. This "within the second" testing helps uncover micro-stutter, which can render a game unplayable even when the FPS figure is high. Fraps 3.5.99 allowed us to record both.
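To illustrate why the 99th percentile matters, here is a minimal sketch of how both metrics can be derived from a list of per-frame render times in milliseconds (the kind of data a Fraps frametimes log provides; the function name and sample numbers are our own, purely for illustration):

```python
# Sketch: average FPS and 99th percentile frame time from a list
# of per-frame render times in milliseconds.

def frame_stats(frame_times_ms):
    total_ms = sum(frame_times_ms)
    avg_fps = len(frame_times_ms) / (total_ms / 1000.0)
    # 99th percentile frame time: the time that 99% of frames beat.
    # Spikes here reveal micro-stutter even when average FPS is high.
    ordered = sorted(frame_times_ms)
    idx = min(len(ordered) - 1, int(round(0.99 * (len(ordered) - 1))))
    return avg_fps, ordered[idx]

# Example: mostly 16.7ms frames (~60 FPS) with three 40ms stutter spikes
times = [16.7] * 97 + [40.0] * 3
avg_fps, p99 = frame_stats(times)
print(round(avg_fps, 1), p99)  # -> 57.5 40.0
```

The average FPS barely budges from 60, but the 99th percentile frame time of 40ms exposes the stutter, which is exactly what this metric is for.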
Since the card is a top-end GPU, we decided to use the highest possible settings in our benchmark games and compare it with other top-end GPUs. We tried to disable CPU-dependent settings or minimise their impact where possible. VSync and frame buffering were disabled for testing. All tests were run at 1920×1080 on a single-monitor configuration.
CPU: Intel i7-4790k @4GHZ (4.4GHz Turbo)
Motherboard: Asus Maximus VII Hero
RAM: 2x8GB G.Skill TridentX 2666MHz 12-13-13-36
PSU: Corsair TX650 650W
SSD: Samsung 840 Evo 1TB (For OS and Benchmarks)
HDD: Seagate Barracuda 7200.12 1TB, Seagate Barracuda 2TB, 2x Western Digital Red 3TB (Storage)
OS: Windows 8.1 x64
NVIDIA Driver: 347.09 for G1 Gaming GTX 980
NVIDIA Driver: 347.88 for GTX Titan X
AMD Driver: 15.7 for Strix Radeon R9 Fury
While this system may not look like a purpose-built test rig, we decided to use a normal usage PC so as to better reflect real-world scores of the card. The games were also tested with a few applications like an antivirus, browser and VOIP tool running in the background to get a realistic usage scenario. All HDDs were thoroughly defragmented prior to usage and the SSD was optimised for maximum performance. Due to lack of equipment, we were unable to conduct acoustic and power measurements. Since this is a new test rig, we lack a comparative benchmark database for it; one will be built up as we receive more cards for testing.
3DMark is a synthetic benchmarking tool whose Fire Strike test is very thorough on DirectX 11 cards powering high-end PCs. The full Fire Strike run includes two GPU-only tests, a CPU-dependent Physics test and a combined Graphics and Physics test. The tool is also useful for stress testing a GPU when run on a loop.
Since we are interested in the performance of the card itself, one should look at the Graphics score and the FPS for Graphics tests 1 and 2. We also ran the Fire Strike Ultra preset, which tests 4K performance. The Physics and Combined tests are CPU-dependent, which is the limiting factor of our test rig. We have found that 3DMark has a bug with some AMD cards where it fails to run the GPU at its full clock speed or detect overclocking, but this was not an issue with the Strix Radeon R9 Fury.
Battlefield 4
Battlefield 4 uses the Frostbite 3 engine to push the visual processing boundaries of current hardware. Since Mantle is exclusive to AMD cards, we ran the DX11 version on all cards with the highest possible settings. The game offers no benchmark tool, so we used areas from the first single-player campaign mission for the FRAPS run.
Given that this game was optimised for AMD GPUs, we were quite surprised to find the Strix Radeon R9 Fury coming last in all criteria when tested under DX11. It seems NVIDIA has done a better job of optimising for older games than AMD.
Company of Heroes 2
Relic’s Company of Heroes 2 is a tough nut to crack for quite a few GPUs, though it depends on the CPU to a great degree as well. We used the highest possible settings with Physics turned off and low AA for the short 45s benchmark run the game offers.
Far Cry 4
Far Cry 4 adds even more visual effects to Far Cry 3’s Dunia engine, along with NVIDIA-specific effects like God Rays and realistic fur. This makes for an absolute visual treat when all settings are cranked up to the max, as in our 2-minute benchmark run.
The Strix Radeon R9 Fury seemingly beat all the NVIDIA GPUs at their own game in Far Cry 4, but the key difference lies in its rendering of GameWorks technologies, which cause a lot of stuttering when they appear in the scene. Also note that the AMD GPU could only use SMAA while the NVIDIA GPUs used 4xTXAA, so the numbers aren’t directly comparable.
The spikes show stuttering when the game engine rendered NVIDIA GameWorks effects like soft shadows across a large area of the scene. There were corresponding FPS dips, though not as large as one might expect.
GRID Autosport
GRID Autosport is the next game in Codemasters’ GRID series, and since its exclusive effects are reserved for Intel GPUs, it provides an even playing field for judging the performance of NVIDIA and AMD cards. We used the highest settings and ran the game’s built-in benchmark for 2 minutes.
Once more, the Strix Radeon R9 Fury surprises us with starkly poor performance in this game. We used the same 8xMSAA setting for all GPUs, but for some reason the AMD GPU was stuck in the 80s while all the NVIDIA GPUs exceeded 100 FPS. This is likely the result of a lack of driver optimisation.
Metro Last Light Redux
4A Games have really cranked up the eye candy in Metro Last Light, and the Redux version comes with even more improvements to visuals. The game looks beautiful in its cramped corridors as well as its open outdoor environments, and cranking up the settings can easily bring a GPU to its knees in the menu screen itself. We used the highest settings available except for SSAA, which was set at 2x to accommodate a few other cards we tested. The game has no inbuilt benchmark, so we chose a particular area and conducted our test run there for 2 minutes.
Middle-earth: Shadow of Mordor
Monolith have used some fancy effects to add visual appeal to the dreary land of Mordor. Shadow of Mordor also uses PhysX for particle effects. The rain, though, doesn’t quite look natural, and the game requires up to 6GB of VRAM for its Ultra HD textures. Curiously, it scales the game based on the screen’s native resolution instead of offering resolution options. We used the highest available settings for the inbuilt benchmark, which runs for less than the standard 2 minutes of our other benchmarks.
Other than the large spike at the start, which was caused by activating benchmarking in FRAPS, there are no noticeable spikes and the game did not stutter at all. The FPS curve is relatively smooth.
Ryse: Son of Rome
Crytek claims that Ryse pushes the newest iteration of CryEngine to its limits on the PC. It doesn’t have a benchmark mode, so we ran a fixed scenario for 2 minutes with settings at the highest possible. Like Shadow of Mordor, this game scales rendering as per the screen’s resolution instead of offering resolution options.
Thief (2014)
Thief (2014) is a game that boasts support for AMD’s Mantle and its TrueAudio tech. It is a graphical showcase and has a built-in benchmark, though the run time is less than our standard 2 minutes. We set all settings to the highest possible.
The frametime graph is tightly packed for much of the run, though there is some spiking early on; it wasn’t noticeable as stuttering, however. The FPS graph doesn’t show huge fluctuations except for a couple of dips.
Tomb Raider (2013)
Tomb Raider (2013) introduced us to a new Lara with fabled TressFX hair that behaves a lot more realistically than the pre-rendered mop we were used to. However, the card does not seem to cope well with this technology, with frame rates taking a noticeable drop in our test run. We used the highest possible settings with TressFX on. The test run was shorter, since the benchmark tool offered by the game runs for less than our 120s target time.
Total War: Rome II
Total War: Rome II is another CPU-heavy game that offers significant visual goodness. It offers a benchmark tool that focuses more on GPU power, though, and that is what we used for our 120s test run. We set the game to Extreme and Unit Size to Small so as to reduce the impact of the CPU. Unlimited Video Memory was off, so the game could scale down visual settings if it hit a bottleneck.
This CPU-heavy game is no threat to the Strix Radeon R9 Fury, which retains its performance advantage over the G1 Gaming GTX 980, beats the GTX 980 Ti in the 99th percentile frametime and has the highest minimum FPS of all the cards that were tested.
Our acoustics testing consisted of trying to determine how noticeable the noise output from the card was when kept in a case at 1m distance with the side panel closed, as it might be in a real-world scenario. Noise is a very relative characteristic that depends not only on the person hearing it, but also on the background noise of their surroundings. During the course of our testing, we found that the card barely went over 50% fan speed even at full load, which was silent to our ears. However, setting the fans to 100% makes a noticeable din, sounding something like an industrial blower.
At idle the card hovered in the 48-52°C range with an ambient temperature of 33°C, which is rather warm. The card easily reached 84°C under load, and while overclocked it plateaued at the same 84°C as the fan raised its RPM. We were quite surprised by this, as it is a step above the sub-80°C temperatures we have been accustomed to with Asus’s DirectCU II cooler designs, though we can’t be sure whether it’s simply the Fiji chip running hot.
The card comes with an 8-pin + 8-pin power connector configuration and is rated at a 250W TDP, so a good PSU is required for overclocking. The extra power input compared to the stock design increases the card’s overclocking headroom; curiously, overclocking tools allow a +50% power target. However, AMD’s GCN 1.2 isn’t quite as power-efficient as NVIDIA’s Maxwell, though the use of HBM does improve efficiency to some extent.
Overclocking the Strix Radeon R9 Fury
Using MSI Afterburner, we were able to push the core clock to 1060MHz, a total overclock of just 6% over the default value; the HBM could not be overclocked at all. This was borderline stable and was achieved by raising the power limit to 150%. Asus’s own utility overclocked the card by only 20MHz. To keep temperatures in check, it is a good idea to set the fan to 60-70% from the outset, especially when ambient temperatures are higher; with the fan set this way, we found the card peaking at 77°C while overclocked, though we didn’t run it at those settings for extended periods. All our tests, however, were conducted in the stock configuration.
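For clarity, the headroom figure quoted above works out as simple arithmetic over the stock and Afterburner clocks (variable names are ours, for illustration only):

```python
# The overclock headroom quoted above, as simple arithmetic.
stock_core_mhz = 1000
afterburner_core_mhz = 1060

headroom_pct = (afterburner_core_mhz - stock_core_mhz) / stock_core_mhz * 100
print(headroom_pct)  # -> 6.0, i.e. a mere 6% overclock
```

Compare that to the 15-20% many Maxwell cards manage, and the disappointment is easy to quantify.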
We also ran the benchmark offered by Thief to get a real-world idea of the performance gains, using the Very High settings for runs that conclude in less than our standard 120 seconds. One must note that Thief is slightly unreliable as a benchmark due to inconsistent results across runs. That said, we saw an improvement of about 4-5 FPS on average, with a very slightly higher 99th percentile frametime and a lower minimum FPS, which is worth taking note of.
Asus has overhauled its GPU Tweak utility; it now looks much sleeker and integrates XSplit for streaming, and the card comes with a one-year premium subscription to XSplit. AMD provides gameplay tracking, streaming and in-game overlay services through its AMD Gaming Evolved app, which can optimise NVIDIA cards as well, given that it is derived from Raptr. AMD has sidelined its Mantle technology in favour of DirectX 12, so any performance gains on that front will have to be seen once Windows 10 releases.
The Strix Radeon R9 Fury packs raw performance that should easily beat its rival, the GTX 980. However, we found that it fails to do so in a few games from our benchmark suite. The difference between the two GPUs becomes more pronounced as the resolution increases beyond 1080p thanks to the capabilities of the HBM, so it certainly makes for a good future-proof bet. AMD needs to work on improving performance in older games through its drivers, though the arrival of DirectX 12 should help greatly.
Asus’s cooler design seems to be pretty efficient, but it’s hard to judge in the absence of a stock cooler solution from AMD. It does a good job of keeping the temperatures down when it matters. Given that this is the only air-cooled Fury chip, the peak temperatures indicate that the Fury is quite a hot chip and thus it is a testament to Asus’s thermal design expertise that they managed to keep it under 85°C during the entirety of our testing period, even when ambient temperatures soared to 35°C. The reimagining of the Strix branding comes off as rather attractive while remaining pretty functional when it comes to assembling the PC.
The overclocking performance of the card is a major downer as is its power efficiency. Both of these factors contribute to a reduction in its Value for Money. However, the pricing point is quite competitive in India given that the GTX 980 didn’t get any price drop with the launch of the GTX 980 Ti, thus making it a viable choice in the top end GPUs. We award this card the iLL Gaming Silver based on its performance and relative future proofing.
We are extremely grateful to Asus for providing us with a test sample for reviewing.
- Mediocre power efficiency