
Valve VR Test: AMD Wins Against NVIDIA?

Valve Steam VR

Update March 17th, 2016 16:52 GMT – The "AMD R9 Nano x2" score was in fact achieved by a Radeon Pro Duo board. We have discussed the SteamVR benchmark with Valve, and they are working on expanding the benchmark score to incorporate more frames. The Fidelity test will stay just that ("Crank it up to 11"), but the benchmark will be enhanced to keep pace with developments in graphics hardware. We will follow up as more developments around SteamVR happen.
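To illustrate why that matters, here is a minimal sketch in Python. SteamVR's actual scoring formula is not public, so every number below is made up: the point is simply that two cards can both report the maximum fidelity of 11 while rendering very different frame counts, which is why a frames-tested figure is needed to separate them.

```python
# Illustrative sketch only: SteamVR's real scoring formula is not public.
# Two GPUs can both hit the 11.0 fidelity ceiling while rendering very
# different frame counts; the frames-tested number tells them apart.

def summarize(frames_tested: int, avg_quality: float, cap: float = 11.0):
    """Return (reported fidelity, frames tested), clamping fidelity at the cap."""
    return (min(avg_quality, cap), frames_tested)

gpu_a = summarize(frames_tested=12000, avg_quality=11.8)  # faster card
gpu_b = summarize(frames_tested=9800, avg_quality=11.2)   # slower card
print(gpu_a)  # (11.0, 12000)
print(gpu_b)  # (11.0, 9800) -- same fidelity score, fewer frames
```

This is the same effect commenters below run into when they report "12" or "13 points": once the fidelity index saturates, only the raw frame count can rank the fastest cards.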

Original Article:

Over the past three weeks, two VR vendors released tools to check whether your machine can deliver an optimal VR experience. As it turns out, one vendor (Oculus) took entirely the wrong approach by releasing a tool that blindly checks specifications, while the other (Valve) did a better job by releasing an actual VR benchmark.

The Oculus Rift Compatibility Tool checks whether your system meets the following specs, and that's that:

  • Intel Core i5-4590
  • 8GB RAM
  • AMD Radeon R9 290 or NVIDIA GTX 970

Now, if we compare Facebook/Oculus' approach to virtual reality with how Valve and HTC are developing the SteamVR platform, it is clear that Valve/HTC are doing a better job. First and foremost, Valve's SteamVR Performance Test is a real benchmark. It is based on the Aperture Science Robot Repair demo and will run regardless of whether you have a VR setup installed on your system.

Valve's Aperture Robot Repair Demo running on a conventional display.


Now, Valve's initial VR benchmark shows that its authors probably belong to the Spinal Tap fan club, as the benchmark score currently peaks at 11. As you can see from the video below, going to 11 must be as important here as it was for that Marshall amp – or for the volume dial inside Tesla's electric cars.

AMD reached out to us and sent us their initial benchmark results, which show AMD parts performing better than NVIDIA.


According to AMD, the best VR experience is offered by two Radeon R9 Nano cards. We would disagree with the AMD-provided slide above, since we don't think anyone will buy two Nanos to run in CrossFire. Secondly, the benchmark does not work properly on all multi-GPU configurations, which is why we are discarding that result. AMD has yet to launch its long-delayed R9 Fury X2, i.e. the Gemini dual-Fiji board, which should debut around the time the Oculus Rift and HTC Vive reach store shelves.


We ran the benchmark ourselves on a similar system and got almost identical results to AMD's for the GeForce GTX 980 Ti (11) and the Radeon R9 Fury X (9.9). We also tested the GeForce GTX Titan X and a three-year-old Radeon R9 290X. Our system features more memory and an M.2-based SSD, all from Kingston's HyperX division. If we remove the multi-GPU configurations, which are not supported on all platforms, this is the kind of VR PC performance you should expect:

Top 11 GPUs for Valve SteamVR Benchmark

  1. NVIDIA GeForce GTX Titan X (11.0) – VRWorld Tested
  2. NVIDIA GeForce GTX 980 Ti (11.0) – VRWorld Tested
  3. AMD Radeon R9 Fury X (9.6) – AMD in-house / VRWorld Tested
  4. AMD Radeon R9 Fury (9.2)
  5. NVIDIA GeForce GTX 980 (8.1)
  6. AMD Radeon R9 Nano (8.0)
  7. AMD Radeon R9 390X (7.8)
  8. AMD Radeon R9 290X (7.6) – VRWorld Tested
  9. AMD Radeon R9 390 (7.0)
  10. NVIDIA GeForce GTX 970 (6.5)
  11. AMD Radeon R9 380 (5.5)

Take a look at #8. The AMD Radeon R9 290X launched in September 2013 and still easily beats the GeForce GTX 970. More worrisome for AMD, it also beats the R9 390 and R9 380 without breaking a sweat (almost 10% faster than the 390!). Naturally, take these results with a grain of salt, as we don't have all the boards on hand, but you can follow the fast-developing thread on Reddit. We will be including this benchmark in our test suite, but we hope to see the SteamVR benchmark fully developed, with support for multi-GPU configurations from both vendors.
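For reference, the percentage gaps quoted above can be checked directly against the table's fidelity scores. A quick sketch (scores copied from the list above, card names shortened; on these numbers the "almost 10%" works out to roughly 8.6%):

```python
# Fidelity scores copied from the "Top 11 GPUs" list above. Note the
# scores cap at 11.0, so any Titan X / 980 Ti gap is invisible here.
scores = {
    "GTX Titan X": 11.0, "GTX 980 Ti": 11.0, "R9 Fury X": 9.6,
    "R9 Fury": 9.2, "GTX 980": 8.1, "R9 Nano": 8.0,
    "R9 390X": 7.8, "R9 290X": 7.6, "R9 390": 7.0,
    "GTX 970": 6.5, "R9 380": 5.5,
}

def relative_gain(a: str, b: str) -> float:
    """Percentage by which card `a` outscores card `b`."""
    return (scores[a] / scores[b] - 1.0) * 100.0

print(f"R9 290X vs R9 390:  {relative_gain('R9 290X', 'R9 390'):.1f}%")   # 8.6%
print(f"R9 290X vs GTX 970: {relative_gain('R9 290X', 'GTX 970'):.1f}%")  # 16.9%
```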

  • fturla

    Whether AMD or NVIDIA top line video cards is best is irrelevant to more than 95% of the consumers and businesses eyeing VR consumer devices. The sweet spot that would be acceptable for the market would have VR equipment run compatibly with AMD R9 290 or lower and NVIDIA’s GTX 950 or higher video cards with test scores over 7. The virtual reality experience must be graded down to acceptable costs and higher perceived quality than these current tests provide. At this point the majority still will not stomach VR costs over 100 and with the sweet spot around 200 these VR tests for ultra level video cards over 300 only serve to attempt to make sales for the video card manufacturers for their premium products and not realistic attempts for VR design development.

    • Biky Alex

      VR is still in development stage. It’s an enthusiast market now. When the hardware has caught up, we will see more consumers adopting VR. But don’t assume people that buy GTX 950 can afford to buy a VR headset that it’s probably more expensive than their whole rig.

      In 3-4 years 980Ti class GPU will be a low-end one (something like GTX 1460 will be the equivalent of a 980Ti). Also, R9 Fury X will be comparable to a R9 870X.

      Hope the last part wasn’t too confusing with name schemes.

  • Socius

    Really shitty article. You can’t even compare the FPS performance of SLI vs Single Card right now as when 1 GPU Per Eye rendering happens, you’ll get almost 100% scaling. I believe both Nvidia and AMD have this planned. Misleading click-bait article title ftl.

    • Shitty article, or shitty benchmark? Unlike a verbatim copy of the AMD PR material, we took the effort of actually running the benchmark on the several GPUs we had on hand. Multi-GPU scaling can never be 100%; there are always losses involved. Sorry your computer obviously doesn't cut the mustard. Perhaps your next one will. Attitude, however, is something you can't buy.

      • Crytek Engine 5

        Bah just ignore him, he just another butthurt Nvidia fanboy who can’t accept that AMD had major lead in VR and DX12 segment, hell, maybe he’s using over priced 980 right now XD, and i’m glad choosing good old R9 290x this year, didn’t know it beat gtx 970 in DX12 and even on par with GTX 980 ti can you believe it!

        • What are you even talking about? a 290x is not on par with a gtx 980ti in VR or otherwise. As you can see the standard 980 beats the 390x, the 290x is not going to beat the 980ti. And considering how badly the dual Nanos beat the 980ti its a horrible price/performace

          • Alexander Yordanov

            The GTX 980 is priced against the Fury Non X. The 390X is much cheaper… *sigh*

            People, this is not hard.

      • Domaldel

        Well, in a VR setting you're more likely to get close to 100% scaling, since you can, at least theoretically, use one GPU for each eye – making the situation similar to running two virtual machines that share the same CPU but each have their own GPU driving one screen.
        While in a single-monitor setup you have to render frames serially, with one GPU making frame 1 and the other frame 2, and so on and so forth.
        Of course it all depends on the code running on the hardware.

        • We'll take a good look at GameWorks VR and LiquidVR. It's not so much about different content as about latency itself. At least these are the issues we are hearing from IHVs.

      • Socius

        Hah…my computer is more powerful than any you’ve used in your life. Multi-GPU actually can hit close to 100% scaling. AMD has demonstrated this before. Also SLI and alternate frame rendering is different from VR rendering where each card is running a full/identical workload. Essentially operating as a single card in the way it handles the workload.

        And yes. This is a shitty article. Saying that AMD beat Nvidia when the top 2 spots are Nvidia cards, which is still a useless metric as the bench mark caps out at 11, and you need to compare relative performance based on the rendered frame count. But at least you’re keeping consistent in the quality of your article and comments.

          • I would really, really refrain from 'talking smack' about people's machinery, especially given that you cannot know what computers people run. As for the 'most powerful' computer I've ever built, it was a 31-GPU setup with Quadro and Tesla boards. Sadly, NVIDIA drivers did not scale beyond 16 GPUs in a single OS installation.
            The story had a question mark because VRWorld does not dance as AMD PR (or any other PR team) wishes us to.

          • Socius

            And discussing a system that can’t game, on a gaming benchmark article. Only you. Kudos. And there are a few people with faster computers than myself. But I know you’re not one of them based on this article. Your title is misleading. Yes, there’s a question mark. But the real question is how you even came to posing that question. There’s nothing here showing AMD beating Nvidia. Not even from AMDs own numbers. With the exception of them showing that 2 of their GPUs can beat 1 of NVIDIAs.

            You made a mistake with your article name. Or maybe you did it because click bait. Either way, don’t be surprised that you got called out on it.

          • Actually, the goal of that project was to see whether that system could run a game. And again, may I remind you of your own words: "Hah…my computer is more powerful than any you've used in your life."

            AMD's PR firm sent out an email claiming they're the best. We installed the benchmark on the boards we have and got the results we published here. That's it. Better than a verbatim copy of an email, eh?

          • Socius

            See, if you knew anything about bottlenecks in gaming, you’d realize that 31 GPUs would do nothing to improve gaming performance. Even if we ignore the fact that SLI scaling is terrible, even at Quad-SLI. You’d be hamstrung by CPU single core performance and the graphics API.

            Get on a PC that scores over 15,000 frames tested on this VR benchmark with a single GPU, then come talk to me. Otherwise my statement still stands. You’ve not used a computer more powerful than mine. Feel free to challenge what I said by posting your best gaming benchmarks. If I can’t surpass it, then I’ll accept defeat. But I’m thinking even if I have to push another 0.2v into my cards, I can beat you. It’s a good thing my cards are hard modded for on-the-fly voltage tweaking.

          • Ados8000

            You sound like a 7 year old kid in the school yard. ‘No mine is better than yours’. With a mentality like yours the real question is where you came across money to buy anything.

          • Socius

            Found a very profitable street corner to work.

          • Ados8000

            Lol. As long as there is good WIFI.

          • He’s Canadian. Prolly too offended he lives above a great party. Robin Williams.

          • Ados8000

            I know some good Canadian friends so I have no problems ?. Do you think we’ll see top end titles for VR? At the moment they are all simulators and I know the technology is just getting started but it will be the proceeding AA titles that will push set ups to the limit and be worth playing.

  • Joe Joejoe

    An R9 290 doesn’t beat a GTX 970. An R9 290X doesn’t even beat a GTX 970. This is a 100% solid fact that has existed from the start.

    To give the GTX 970 a 6.5 and the R9 290X a 7.5 is just a joke, clearly inaccurate.

    The 980ti in the test got a score of 11. A 980ti is faster than even two GTX 970’s. So the 970 gets a 6.5, two 970’s should be getting a score of ~13, and a 980ti only gets a score of 11!?! Clearly there is some bias and manipulation going on. On top of that, the 980ti is faster than a Titan X too, yet they’re scored the same.

    • Tiberiu Lupescu

      Wow. Just wow.

      • Ashley Gann

        don’t judge him to harshly, he’s just doing his job – schillin like a villain 😉

    • NV_Zen

      hahaha lol, i can’t stop laughing after reading ur comment.
      what a but hurt, amd created freesync on top of displayport standard , ya thats true and they didn’t hide it or cost oem for that, if ur GSYNC is better then why freesync going to hurt your nvidia. if u cant sell your GSYNC then give it for free

    • Erm… not exactly true. AMD has spent countless R&D hours developing open standards from which everyone could benefit. AMD developed the GDDR memory standard (a small team around Joe Macri), and we've seen countless times in the past that NVIDIA would ship a part with higher GDDR utilization than AMD's own hardware. Mantle became Vulkan – with Google standing to benefit the most (in a market where AMD does not compete) – while FreeSync was given to VESA, which in turn adopted it as an open, industry-wide standard. NVIDIA has also developed parts that became industry standards, like the MXM PCIe slot for mobile devices (and later industrial use, a la Tegra and other chips in cars).

      I don't see how your point contributes to the conversation – the benchmark came out, these are initial results, and that's about it. This benchmark tests VR performance, nothing else. Is it relevant? We will see once EVE: Valkyrie and other VR titles launch.

      • Joe Joejoe

        “AMD has spent countless R&D effort on developing open standards where all could benefit.”

        Nope. It’s all to say ‘hey we give this away for quote free unquote but Nvidia doesn’t so hate them’.

        As I said, their marketing strategy revolves around creating and fostering fanboyism.

        Hows Vulkan working out for people who bought AMD GPU’s to support it and AMD CPU’s to save money because mantle promised lower CPU overhead…..Oh wait….only a handful of games ever used mantle, then it was functionally declared dead. So a large selling point that sold AMD a lot of hardware…..never really came to fruition.

        “I don’t see how your point is contributing to the conversation”

        My point is the benchmarks are clearly manipulated and biased. There is no ‘increased VR performance advantage’ that can be had from one brand to the next. So when VR performance is inconsistent with known performance ratings for cards, there is clearly something fishy going on.

    • 200380051

      So much butthurt..

    • disqus_GB8lUuziuG

      Put the cool aid bowl down and slowly step away…

    • GPUnit


      How much of your life did you waste coming up with that

    • Bryan_S

      Pats on head, I own a SSC 970, and a Gamers edition 290x… The 290x is faster. Sorry bud.

      • Joe Joejoe

        Nope. GTX 970 consistently beats even a 290x…..and overclocks better too, while using significantly less power. Just better in virtually every way.

        • Bryan_S

          Ignoring tomskew… using reference 290x not on ubermode…
          The 3.5GB problem is real… I have both and I am sorry, I had to lower settings going from a 290x to a gtx970… It is not faster that is a straight lie… I was attracted by the lower power draw… but it is a broken card.

    • Orion4tech

      2 970’s are generally faster than a single 980Ti.
      You have to highly OC the 980Ti and will only be faster if SLI support is not on point.

      • Loque Kane

        micro stuttering, heat, power, better to buy the 980 ti imho for actual gaming experience

        I’ve run dual card setups and for the X that I saved, it was not worth it.

    • Ashley Gann

      you should have returned your 3.5GB card when you had the chance :/

  • Zon

    Already the NVidia fan boys are getting butt hurt over this!

  • ананимас

    Don’t forget that 290 was 780’s counterpart and where is 780 now? Clearly a win for AMD.

    • The 780 Ti scores about 5-5.5. So yes, a clear win for AMD, even though the benchmark was apparently developed using LiquidVR. Still, we shall see how things look once we get finalized drivers. It's kind of disappointing to see the Fury X score so low against the 980 Ti. Still, seeing the Fury and the Nano above the 980 is a good omen.

      • “Still, we shall see how things will look once we get finalized drivers”

        No, we will look at the current beta-scores and proclaim AMD’s “clear” victory! Because its not the final score that matters, its the ones where we win! /s

    • Ashley Gann

      The 290 was actually closer to the price of a 770 at launch, even after ngreedia dropped the prices on their keplers 😉

      $100 difference with 780 / $70 difference with 770…

  • Prosp3ctus

    Ehh I think the Author is confused and should probably stay away from PC hardware.

    “More worrisome for AMD, it also beats R9 390 and R9 380 without breaking a sweat (almost 10% faster than 390!).”

    380<290<390<290X<390X – All in the PC industry know this.

    More concerning for Nvidia however is that the 780Ti flagship is beaten by a much smaller mid-range chip with much less horsepower, not so much due to improved technology as the nefarious practices over at Nvidia regarding the Kepler (GTX600&700) driver performance.

    Comparable hardware from AMD has soared above its Nvidia counterparts, when comaring AMD’s 200 range Vs. Nvidia’s Kepler range the Nvidia cards are nowhere to be found during benchmarks due to Nvidia’s planned obsolescence and anti’ consumer practices.

    It is a known fact that the Nvidia games library known as ‘GameWorks’ which is also to be used in VR is unoptimised on last Gen Nvidia cards in order to widen the performance gap vetween this and last Gen.

    • FatAmerica

      Yet, All these NVidia users claim their hardware is better than AMD’s, and AMD has taken DOOM title from nvidia where a 380x beats a 980ti.

      • “380x beats a 980ti”

        Except, that’s just not possible. Literally wrong, there’s just not enough power in the 380X to beat a 980ti, unless we’re talking about a very unoptimized comparison like a 980ti running without proper drivers and updates. There’s no reason to do benchmarking tests like that because it doesn’t show any relevant information on how the games will run on those GPU in the real world.

        • FatAmerica

          My bad it beats a non TI 980 still in different performance brackets get rekt nvidia shitty ass company.

        • Gate Overvoltage

          King of the North
          Is that U 😀

    • phaethon

      ” Nvidia’s planned obsolescence and anti’ consumer practices” exactly why after selling for a good price my GTX690 I will never ever ever EVER buy an NVIDIA again:)

    • 1pp1k10k4m1

      I <3 this. Thank you for bringing sanity to a very silly article.

  • godrilla

    I've seen a hybrid 980ti get 13 points with OC, my 980 ti amp extreme @ factory clocks gets 12 points with 6 year old i7 980 xe @ 4.3 ghz. This test is very gpu bound.

    • It only goes up to 11..

      • I don't think he was born when Spinal Tap came out. 🙂

        Joke aside, I have had a lot of fun with people who say they're getting scores above 11.0…

    • FloridaOJ

      Here’s the attention you were looking for.

  • Prosp3ctus

    This Valve test is ridiculous.
    There are only 3 relevant factors to VR:
    This test is a joke and in no way represents a latency free ‘enjoyable’ VR experience.

  • Jinrui Zhang

    lol, my overclocked 970 G1 can do 8.2

  • Jinrui Zhang

    8.2 with 970 G1

  • Shamz

    Who gives a flying f about vr? It’s just a fad like last time.

    • FloridaOJ

      Hmm… sure.

    • There are several billion reasons why VR is not a fad, but everyone is entitled to his or her opinion. Have you experienced VR so far?

  • Prosp3ctus

    Meanwhile on my 390X.

    • e92m3

      Yeah, I get 8.2-8.3 depending on driver with a single 290x 4gb, 10.2-10.6 in crossfire. Some drivers have been even higher.

    • john smith


  • FatAmerica

    NVidia users are going to end up reasoning that both hardware is equally good, and not their hardware is the best over AMD blah blah blah, you see a company that doesn’t have much tends to try harder at development and you can see that now AMD will destroy Nvidias bullshit! Pascal with 16nm FinFET against the 14nm FinFET Polaris which is already a given 5-15% in performance over pascal.

    • Biky Alex

      Polaris might be manufactured by TSMC as well, so that means Polaris will be 16nm. Though that’s not a bad thing. By looking at Apple’s iPhone 6S Chipgate, we see that TSMC’s 16nm A9 chips use less power than Samsung’s 14 nm A9 chips.

      And my guess is TSMC will still be AMD’s GPU manufacturer, while Global Foundries (which uses Samsung’s 14nm FinFet process) will be making their CPU/APU.

      And nm doesn’t mean better performance. We can see that 28 nm AMD APU can compete with some low-end i5 22nm Ivy Bridge. Not all nm are “equal”.

      But AMD hardware has been known to have more processing power than nVidia, but nVidia had better drivers, that’s a fact. Not to mention their GameWorks scandal (developers could’ve used already open source TressFX). A low level API like Vulkan will show the true power of AMD GPU.

      Also, VR takes advantage of 2 GPU, one GPU per eye, unlike “combined” frames which don’t scale correctly. It’s just like screen mirroring.

      • Polaris 10 and 11 are both 14nm FinFET at GlobalFoundries. Got it confirmed yesterday at the Capsaicin event. Chips will be "Made in USA" or "Diffused in USA".

  • siriq

    My GTX 570 has got 5.9 fidelity score.

  • Ace66696

    Fine and all, but this test is in DX11, not in DX12. Then the outcome is in terms of Price performance for AMD but also RAW power AMD on spot 1

    • I'm meeting Valve tomorrow to discuss the benchmark and its future. I know they are developing a DirectX 12 version, and that there will be a second performance index that goes past 11. But 'crank the volume to 11' will stay, as it measures VR fidelity, frame drops, jumps, etc.

      • Ace66696

        You better ask Valve when it will be possible to stop NV forcing Gameworks(not) to developers. And by that also stop gimping AMD

  • Andrew Brooks-Davis

    This is my result from running a 295×2 with a 290 disabled (no trifire)

  • RyviusRan

    Weird my 970 gets 7.9. There has to be something wrong with yours.

  • RyviusRan

    Weird my GTX 970 gets 7.9 in single mode.
    In multigpu mode my 2 970s hit the max 11.

  • Darko Danichic

    I have a better result with an XFX Fury X (1080 MHz OC), i7-6700K (4.64 GHz OC), 16GB Corsair 2666 MHz (2800 OC with XMP), Sabertooth Mark 1 Z170… Average fidelity 10, frames tested 9507. From 9.6 to 10 is a big difference…

    • Darko Danichic

      XFX Fury X (1150 Mhz oc without voltage scaling) 10.3 9676 frames

  • Darko Danichic

    I7-6700K @4.65 Ghz 1.392V

    16 GB DDR4 @ 2666Mhz

    XFX Fury X 4GB HBM 1100/500Mhz

    ASUS Sabertooth Mark 1 Z170

    256 Gb SSD

    2TB Seagate HDD

    EVGA G2 750W

    Windows 10 PRO 64-bit

    VR TEST Average fidelity 10,2 (Very high)

    • Everything is in the hardware configuration. That's a really nice CPU overclock, and the memory is working nicely. Have you tried overclocking the HBM to 550 or 600 MHz to see what the GPU-side performance boost would be?

      • Darko Danichic
        I just OC'd to 1146 MHz without changing voltage. I tried to OC the memory to 550, but that needs a voltage change, and the system went down… The Fury X water cooling doesn't cover the HBM, and after you change voltage and OC to 550 or more, you get insanely high temperatures.

  • Pingback: AMD Polaris 10 Unveiled – VR World

  • Sean Forste

    I am a little confused here. I don’t see amd beating nvidia in these test. From what I see the 980ti scored a 11. That is better than any amd card. Yea the 2x nano gets and 11 but that is two cards. From what I hear 2 cards doesn’t do any good. Also everyone I know has there 980ti’s at like 1500mhz. Yea that is nearly a 40% oc. My titan x is sitting at 1548mhz core 7.9ghz memory.

    I am hoping amd does good over the next couple of years. They need it. I am just starting to see a trend. The whole fury x is a titan x killer. It really isn’t. I also see all these benchmarks for dx12 where nvidia isn’t doing so hot. I am pretty sure nvidia will come up with something with there massive R&D fund. I am not saying that amd isn’t gonna do better than nvidia.

    I am actually rooting for amd at this point. They need some market share before nvidia starts charging 600$ for the gtx x60 cards. I have been seeing alot of threats and claims from amd about how there next gen gpus are gonna destroy nvidia. If they don’t they will destroying what little market share they have left. That isn’t good for nvidia buyers or amd.

    My thoughts are both pascal and polaris will trade blows. Amd will have better prices and has a 50/50 shot at taking the crown. Now it may lean towards amd if all these dx12 issues are true about nvidia. Worst case, nvidia’s maxwell isn’t gonna do will with dx12. I know with there r&d this will change. Nvidia can afford to lose a fight or two. Amd can’t afford to lose that much more.

    I am a little worried about dx12 performance on maxwell with owning a titan x. Still I was planning on holding out until volta for my next gpu purchase. If performance doesn’t stay great over the next 2 years til then I will upgrade to whoever has the top performing card when I need the power. I have bought Nvidia for my last 4 cards because they have had better day 1 performance. I don’t buy cards that end up being faster than the competition 2 years later. By then I am upgrading.

    Don’t take any of this as a shot at amd. They have been headed in the right direction it seems. If they come out on top with the next gen of games, I will buy an amd card. I do see them throwing threats and Nvidia left and right tho. They had better back em up because they don’t have the market share to spare. Only time will tell. Nvidia has been pretty quite as of late. That means they don’t have any ammo against amd atm or they are about to unleash something that is gonna rock everything amd. I am pretty sure the chances are 50 50 tho.

    • I believe the "R9 Nano Crossfire" was actually a Radeon Pro Duo board – and that was the reason for the question mark. Truth be told, it is hard to understand why NVIDIA did not work with Valve to get SLI working, especially with VR SLI being pushed by the company.

      Competition is good for the consumer, and we're now really starting to see DX12 – and soon Vulkan – unlock the performance of older products. Investing in a Hawaii-class GPU is starting to pay off, as the cards are getting a second wind, especially if you were caught up in the Bitcoin mining craze and have two or more Hawaii boards (sell two and keep two).

      If you own a Titan X, you have a great product. It is more compact than the R9 Fury and offers better performance than both the R9 Fury X and the 980 Ti – which both featured in our first Virtual Reality Computer. The key thing is that hardware goodies don't matter much if the software doesn't follow, and LiquidVR and VRWorks are doing a lot of good for developers. If Vulkan really delivers, the market will see a lot of benefit from the direct approach in engines such as Unreal Engine 4 and CryEngine V.

      We live in exciting times, and I, for one, love seeing what NVIDIA and AMD come up with. Also, let's not forget Intel, whose Skylake processor kicks ass compared to 3x more expensive processors from the same company (6700K vs. 5960X is a no-brainer, really – Z170 is turning out to be a faster platform than X99, unfortunately for the customers who went with more expensive parts).

    • dude, no offense, but most of your post is a shot at AMD 🙂

  • Badelhas

    Great article!

    • neli



    meanwhile my GTX 980 gets

  • trajan2448

    Unfortunately for AMD benchmarks don’t translate into sales. The 980ti has outsold FuryX and Fury combined by almost 20 to one.

  • Well, the article's numbers for the R9 390 are just a bit off, but not by much. My MSI R9 390 (not OC'd) nets me a 7.2, which according to the SteamVR Test means I will be rocking VR titles at high settings. Oh yeah!!!

  • Jeff

    Strange results. My MSi GT72 Dominator LAPTOP with a 970m 16GB DDR4 Ram and an i7 4720HQ scored 7.3. Any chance your PC with the 970 was hindered during the benchmark? My desktop with a 980Ti/i5 6600K scored a 10.9.

  • Harvey Maginnis

    The numbers in that table are very very wrong. My ASUS Strix DCUIII 390x @ 1100MHz paired with a 6600k @ 4.5GHz achieves 8.8. Overclocking the 390x even further to 1180 (at 80% fanspeed) just to see how far i could go achieves 9.7. Now i’m not sure about anyone else but that 7.8 smells like bullsh*t to me….