
DOOM Alpha Benchmark: AMD Dominates over NVIDIA

When AMD introduced its Fiji GPU family, the company said the innovations put into the GPU would give solid performance in today's applications, but would 'kick ass' as soon as future applications began to appear. Indeed, two high-ranking AMD executives we talked to openly discussed their focus on DirectX 12 and Virtual Reality. As Fiji approached its launch date, NVIDIA countered by launching the GeForce GTX 980 Ti, i.e. a $1,000 Titan X card with half the memory – for 35% less.

The overall winners were gamers, who got an incredible choice of high-performing parts at (more) affordable prices. Depending on whom you prefer, you could build an almost future-proof system today. How will it perform? Russian website GameGPU released benchmark results for the upcoming 2016 edition of DOOM, a hit title in the making from id Software / Bethesda Softworks. The site got hold of the DOOM Multiplayer Alpha build, released to crash-proof the multiplayer servers.

The game will launch in just a bit over two weeks, and id Software promises that this time it will 'crank the volume to 11'. In a series of blogs, the developers promised to deliver bleeding-edge tech and utilize everything the OpenGL/Vulkan-based idTech 6 engine can deliver.

“We want players to wonder how DOOM and idTech 6 games can be so visually stunning at 60 frames-per-second at 1080p on all platforms, when other titles cannot even achieve a similar look at 30 frames-per-second,” says Lead Project Programmer Billy Khan. “Our goal is to be the best-looking game at 1080p at 60fps.”

“Many on our tech team worked at Crytek before, so we’ve known and worked with each other for some time,” says Lead Render Programmer Tiago Sousa, who’s been with id for a year-and-a-half. “It’s been an interesting and productive venture, learning about idTech foundations and researching where and how we will take it to new levels. My hope is not a very humble one, but I’d like to help bring idTech back to the forefront of technology once again.”

Typically, we would refrain from commenting on Alpha benchmark results, but given that all hardware vendors regularly quote such performance figures in their email blasts to us, from today we will be covering as many game titles as we can.

Onto the benchmark results. GameGPU tested a variety of hardware configurations, including an Intel Core i7-5960X (Haswell-E) and AMD's FX-8350. Do note that the CPU was overclocked to 4.6 GHz in order to remove any potential CPU-related bottlenecks.

DOOM Multiplayer Closed Alpha – Full HD, 1920×1080

DOOM Multiplayer Alpha Benchmark in 1080p

As you can see in the results above, the framerate is capped at 60 Hz / fps. Quite surprisingly, the majority of AMD hardware simply flies through the benchmark, achieving 60 fps for both minimum and average frames per second. What is interesting is that the AMD Radeon R9 290 and R9 290X – almost three-year-old hardware – are capable of achieving 60 fps. When we look at the difference between the R9 290 and its competitor of the time, the GeForce GTX 780 Ti, the gap probably comes not just from the maturity of the OpenGL drivers, but also from the efficiency of the internal architecture.

On the other hand, do not expect multi-GPU scaling, as it only creates overhead with this build.

Do note the memory-bus width on the tested parts. The R9 290 and R9 290X score substantially higher than contemporary R9 300-series cards, probably because their integrated 512-bit memory controller can feed the units faster than the 256-bit memory controllers on most of the 300 series (the R9 390 and 390X, which retain Hawaii's 512-bit bus, were not tested). Do note that there is an apparent error in this graph, where the R9 280X 3GB scores 61 fps (average) in a benchmark where the maximum is 60.
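To put the bus-width argument in numbers: peak memory bandwidth is simply the bus width in bytes multiplied by the effective data rate. A minimal sketch – the data rates below are nominal reference-spec assumptions, and board partners ship variations:

```python
# Peak memory bandwidth in GB/s:
# (bus width in bits / 8 bits per byte) x effective data rate in Gbps.
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

# Nominal reference specs (assumed; partner boards may clock higher):
# Hawaii (R9 290X): 512-bit bus, 5.0 Gbps GDDR5
# Tonga  (R9 380):  256-bit bus, 5.5 Gbps GDDR5
print(peak_bandwidth_gbs(512, 5.0))  # Hawaii
print(peak_bandwidth_gbs(256, 5.5))  # Tonga
```

Even before delta color compression is factored in, Hawaii's 320 GB/s comfortably outpaces Tonga's 176 GB/s, which lines up with the gap seen in the chart.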

DOOM Multiplayer Closed Alpha – 2560×1440

Doom Multiplayer Alpha Benchmark 1440p

The difference really starts to show at 2560×1440, a resolution which, alongside 3440×1440, is becoming increasingly popular among gamers worldwide. Regardless of your GPU vendor of choice, it is clear that NVIDIA has a lot of work to do with its drivers. A single, three-year-old R9 290X beats a contemporary GeForce GTX 980 Ti in both single- and multi-GPU (SLI) configurations. Based on our experience with Bethesda, Crytek, and id Software, it is hard to believe that NVIDIA did not work hand in hand with the AAA-level developers, who have often released titles that perform superbly on NVIDIA hardware. The question is: did someone rest on their laurels?

DOOM Multiplayer Closed Alpha – 4K, 3840×2160

DOOM Multiplayer Alpha Benchmark in 4K, 3840x2160

4K is the new frontier for gamers who want ultimate fidelity. Pushing over eight million pixels per frame is a challenge for any computer out there, let alone at the performance levels we see from these cards. The Radeon R9 Fury X in CrossFire mode pushes just over 340 million pixels per second – but so does a single card. AMD is in need of some serious multi-GPU optimization, and this is perhaps one of the reasons why a single board with two Fiji GPUs (R9 Fury X2, R9 'Gemini') hasn't seen the light of day. Perhaps the most impressive board is the R9 Nano, which for $499 delivers serious performance, single-handedly beating a $1,339.98 configuration from NVIDIA. Given how good the board is, the R9 Nano may well be the best gaming card on the market.

Top 10 Graphics Cards for DOOM – Who Delivers the Most?

To calculate the ranking below, we took the worst-case scenario – the 4K resolution – and computed the number of pixels each GPU was capable of pushing through the display output.

  1. AMD Radeon R9 Fury X – 340.07 MPixel/sec
  2. AMD Radeon R9 Nano – 298.59 MPixel/sec
  3. NVIDIA GeForce GTX 980 Ti – 282.01 MPixel/sec
  4. AMD Radeon R9 295X2 – 273.72 MPixel/sec
  5. AMD Radeon R9 290X – 273.72 MPixel/sec
  6. AMD Radeon R9 290 – 248.83 MPixel/sec
  7. AMD Radeon R9 280X – 215.65 MPixel/sec
  8. NVIDIA GeForce GTX 980 – 199.07 MPixel/sec
  9. ATI (AMD) Radeon HD 7990 –  199.07 MPixel/sec
  10. ATI (AMD) Radeon HD 7970 –  199.07 MPixel/sec
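The MPixel/sec figures above can be reproduced directly from the chart's average framerates: throughput is average fps multiplied by the pixels in one 4K frame (3840 × 2160 = 8,294,400). A quick sketch, where the fps values are assumptions read back off GameGPU's 4K chart:

```python
# MPixel/sec = average fps x pixels per 4K frame, divided by one million.
PIXELS_PER_FRAME = 3840 * 2160  # 8,294,400 pixels at 4K

avg_fps_4k = {  # assumed values, derived back from the published figures
    "AMD Radeon R9 Fury X": 41,
    "AMD Radeon R9 Nano": 36,
    "NVIDIA GeForce GTX 980 Ti": 34,
    "AMD Radeon R9 290X": 33,
    "AMD Radeon R9 290": 30,
}

for gpu, fps in avg_fps_4k.items():
    print(f"{gpu}: {fps * PIXELS_PER_FRAME / 1e6:.2f} MPixel/sec")
```

Working backwards, the Fury X's 340.07 MPixel/sec corresponds to a 41 fps average, while the 60 fps cap at 4K would be roughly 497.66 MPixel/sec – a ceiling no card here approaches.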

The results above are a direct vindication of the recommendations I have made since I joined the industry. I have always firmly believed that when you build your system, you should buy the top-end graphics card first, followed by as much system memory and the fastest storage you can afford.

The ATI Radeon HD 7970 debuted in December 2011, more than four years ago. Yet this card offers better performance than hardware you can buy in 2016 – note that AMD's 300 series disappeared from the top 10, i.e. it was incapable of reaching 200 million pixels per second. The same story goes for NVIDIA – unless you have a GeForce GTX 980 or 980 Ti, you're SOL when it comes to DOOM. Given that the authors at GameGPU did not have access to the TITAN X or the regular Fury, we can only guess how the Top 5 would have ended up looking, but it is a safe bet that the TITAN X and Fury would push the 295X2 and 290X out of the Top 5.

DOOM is coming out on March 15, 2016, and we'll follow up with our in-house testing when the game gets released. The clock is ticking, and let's see what can change in two weeks. NVIDIA and AMD both need to fix their multi-GPU scaling (if possible), and NVIDIA needs to ramp up performance across the board. Otherwise, it could be a second big loss in as many weeks (after losing to AMD in the DirectX 12 benchmark Ashes of the Singularity).

  • 200380051

    It is weird that the R9 390s did not make it; they are 8GB versions of the R9 290s with slightly higher clocks.

    Also, factoring price into the list makes the good ol' 290s look even better: one can be had for around $300, used.

    It will be interesting to see the picture change once SLI/CFX support is added. If anything, Radeons will shine even more, as their XDMA engine has shown better scalability than current GTXs.

    • Renzoe

      Do note the width of memory controller on all tested parts. R9 290 and R9 290X score substantially higher than contemporary R9 300-series cards, probably due to integrated 512-bit memory controller being able to feed the units faster than the 256-bit memory controller on the 300 series (doesn't apply to the 390 and 390X; both are 512-bit)

      • 200380051

        I see that, and my comment pertained only to the 390s series indeed. In fact, only Hawaii has a 512-bit memory bus; no other GCN chip has that.

        I believe the discrepancy stems from the Tonga chip that replaced Tahiti from the 280 to the 380 series; Tahiti has a 384-bit bus, and Tonga has a 256-bit bus with delta color compression. (This may suggest that Doom uses memory bandwidth for something other than raster. Compute, maybe? That, or the compression doesn't make up for the performance deficit.)

        In the lower tiers, there has been no such decrease in bus width. All the 300 series except the 380s use revisions of their 200 series predecessors (Trinidad = Curaçao = Pitcairn, Oland is a revised Cape Verde). Some of the 260s did ship with a 128-bit config although their core could handle 256.

    • Chiang Ken Lui Ken

      I don't get why the R9 390X and R9 390 are not being put up in the benchmarks; every single benchmark I have seen so far doesn't include the R9 390. But when I do the test myself with my CFX R9 390, or a single R9 390, it does as good a job as a Fury X, LOL. I think that's why they don't put it up – because it's basically a good card. I do all my testing in 4K; 1080p always hits 60 or more fps with a single card.

  • PublicStaticVoidMain

    I always see AMD as the one who trumps benchmarks when it comes to raw performance. However, features- and drivers-wise, no thanks. I've always had bad experiences with their drivers. The only way I'd get one is when Nvidia polishes their DX12 support, so I can combine the two.

    • Moravid

      Care to elaborate on these features and drivers?

      • PublicStaticVoidMain

        Features like GameStream, and even its driver updater (a feature in itself). When it comes to features, Nvidia always comes first, like Shadowplay and G-Sync ("comes first", not saying that AMD does not have an equivalent). Google will yield you better results. When it comes to drivers, AMD always gave me issues – my worst was it essentially bricking one of my laptops.

        • onstrike112

          In other words you like the proprietary, locked down, have to buy more expensive hardware versions of things that AMD hardware does for less….. Sad.

          • kachunkachunk

            That’s not what he’s saying at all. It doesn’t matter if your card’s faster, if a feature set you need is broken or missing altogether.

            That said, even as an Nvidia user, I prefer seeing AMD come up preferentially from time to time; Nvidia needs to stay on their toes. Competent competition is a good thing.

          • onstrike112

            Nuts to that, when the Linux boss gives the finger to Nvidia.

          • Paul

            Onstrike does have a point though: the more people that share Void's view, the more likely Nvidia becomes a monopoly. I'm quite sure that, given the chance, Nvidia would lock the GPU market down so games only work on their hardware, making it next to impossible for a rival to enter the market – what do you think PhysX and GameWorks are about? Lock-in. And the funny thing is, they are quite close to getting that dream, and if they did, expect GPU prices to skyrocket and progress to plummet.

            I don't favor AMD, Nvidia or Intel when it comes to buying, but could you imagine how bad it would be for the PC, as well as consoles, if AMD were to go under? Intel and Nvidia would have a lock-in with no competition, and seeing as they already keep trying to push prices up, we really do need AMD to get back in the game. Anyhow, about drivers: I have never had any problems with AMD drivers, and rarely ever did with Nvidia drivers, apart from the odd blue screen of death.

          • onstrike112

            Thank you, that’s what I’m trying to say! If their positions were reversed, I’d be an Nvidia user.

          • Paul

            Same here, I don't have any brand loyalty and usually buy what gives the best bang for the buck, but my last two cards have been from AMD, a 280X and a 6950, and I never really had any driver issues on either. The card before that was an Nvidia 8800 GT, I think it was called; I didn't really have any driver issues there either, but I do recall getting more blue screens of death on that than on AMD hardware – it was rare, but it happened. Just like Pururun, I find it hard to justify buying a graphics card from Nvidia because of the shady things they've been doing the last few years, so given the choice, if AMD has a card of similar performance and price to Nvidia hardware, I'll choose AMD every time until Nvidia changes their ways. I would be the same with Intel CPUs and go AMD if they were competitive, but it's kind of hard to justify when the performance and power consumption are worse on the AMD CPUs. Here's hoping Zen can change all that so I can dump Intel as well.

            I'm amazed AMD has been as competitive as they have been, considering that AMD is smaller than both Intel and Nvidia and is competing with both of them on its own. But I do feel AMD could do with a big cash infusion from other big companies to help them compete better; apart from that, they are doing better than expected considering the resources they have.

            Onstrike, you're welcome.

          • Pururun

            I've been using both companies' GPUs for years (back to AMD only because of Nvidia's shady shit nowadays; I can't support that) and have had no issues with drivers while using my GTX 590, 780 Ti, 270X, 280X, and 390X cards.

            I think people that like to blame companies for shitty drivers are people that don't know how to properly configure their PCs/games on a per-game basis, or just suck at using their hardware to begin with.

            Both companies are equally guilty with drivers though, and it's mostly down to people just liking to kick AMD while they're down, when the more recent Nvidia drivers of the past 2 years have been just as bad as, or worse than, AMD's worst over the past 5 years.

            People really need to get off the Nvidia dick riding bandwagon or we will have a monopoly on our hands in a year or two and then you’ll be crying while wishing we had AMD around still.

            Moral of this story: don't just stick with one company's brands, and try switching it up sometimes. You might be pleasantly surprised.

          • Corey

            Nah, devs won't jump on board like that. If Nvidia forced devs not to support AMD at all, a lot of devs would jump to Havok or do their own physics engine. I think soon Nvidia will be forced to support AMD better, especially if GPUOpen is worth anything. I hope it becomes a massive hit. AMD needs support to go with their great hardware.

          • Paul

            Pururun, I agree that the drivers for both have been a joke, but I don't think it's their fault. The idea of needing new drivers for a lot of new games on release is a complete joke that shouldn't happen. AMD came up with Mantle, with a lot of push from developers, to sort that problem out. With OpenGL and older DirectX, developers had a hard time knowing what they were developing for because of drivers, and AMD and Nvidia had a hard time having to release drivers just to fix problems with games. It was a mess that Mantle, Vulkan and DirectX 12 are supposed to sort out by trimming the driver stack, giving more control to developers and making development on PC more predictable. Now we'll see if this pans out when games start coming out using these new APIs; if it does, games should run faster, be less buggy and need far fewer driver fixes.

            I also agree that if your PC isn't set up properly you should expect problems, but in this day and age the PC should be pretty much setting itself up; drivers and games' graphics settings should be pretty much automated by now. I blame Microsoft for neglecting the PC in favor of the Xbox console, something they seem to be changing in the last year or two.

            Speaking of drivers, Nvidia has just had a stinker of a driver release, with The Division giving many users blue screens of death.

            Corey, I wouldn't bet on it; it's amazing how many publishers and developers are willing to take bribes. Also, it's not that hard for Nvidia to force the issue – there are many tactics they can use on developers that don't toe the Nvidia line, and making their games run like garbage is one of them, to hurt game sales. This is the problem with monopolies: they can force the issue whether we like it or not. So it's in all our interest that AMD sticks around and is competitive; otherwise PC and console gamers will cop it.

            I should point out, if Nvidia had a monopoly, you know how easy it would be for them to cripple the performance of Havok at a driver level to force developers to use PhysX? Developers would think: use Havok and performance tanks, use PhysX and it works well. This is how they could force the issue, and the scary thing is, Nvidia has already shown signs of doing this in the last few years. God forbid what they would be like if they had full control.

            The moral of the story is that we need AMD back in the game, otherwise we're all going to pay the price. Better yet, another competitor like ARM enters the PC CPU and GPU market – a three-horse race is better than two. I remember the days, about 15 years ago, when you could get a high-end GPU for around £130; today a high-end card is around £500. There was much more competition in the olden days.

          • PublicStaticVoidMain

            Thank you. A lot of people react like you are some kind of Nvidia fanboy the moment you say something not good about AMD, even if you also admitted something good. I never even said Nvidia was perfect and I even openly admitted that they have flaws (like DX12 implementation).

          • PublicStaticVoidMain

            On the contrary, no. I develop software/firmware for a living. I used to give AMD a lot of love before, but I got a lot of frustrations in return. I am speaking strictly on my own experience. If you had a better experience, good for you. I am an engineer and I always, always try to justify any purchase on a hardware perspective. You would not see me using any Apple device, nor drive any luxury car. My point is, AMD kind of lost me as a customer because of the issues I had with them.

            I am not a fanboy of Nvidia. A lot of people are (even more so with AMD). This is the reason I said I'd get one when Nvidia gets better DX12 support: because if AMD decided to fail on me again, I'd have a backup. On the other hand, Nvidia never failed me. Not once.

          • onstrike112

            It sounds like to me you’re just unwilling to give a chance to the smaller one because the guy with the bigger stick is more entrenched.

          • Corey

            I know we can only go on the experiences we have had in the past, but I have owned a lot of AMD cards and lightly overclocked them all. I have had an HD 4850/4870/4890 (different machines), a 5450 in my server, 2×6870 in CrossFire, and now an R9 290, and never had any fail for any reason. Before the HD 4850 I had Nvidia cards like the 8600 and 9600. I tend to go for a few series and swap and change, but AMD has done me good for a long time – not to mention the AMD CPUs I have had in the past, although I admit Intel is king in this area now, unless Zen turns out to be good. I could not recommend an AMD CPU at the moment unless you actually don't want a dGPU. Like I have said, we can mostly only go on what we have experienced; mine has been good. Sounds like yours, not so much. I don't mean for this to sound like an attack, so I hope it doesn't. Guess I just want to say I have been lucky in the "Silicon Lottery". Sorry, rant over.

          • Andy White

            Care to point me in the direction of a cheaper AMD version of the Nvidia Shield and Gamestream?

          • Christopher Lennon

            The Shield? 'Cause that's a mass-market device, and AMD has streaming software; in fact, the latest iteration was released and in the news a few days ago... Care to point me to Nvidia's DX12 support/performance?

          • Andy White

            So I'm still waiting for you to explain the AMD alternative to the Nvidia Shield and GameStream. No? Maybe that's because there isn't one.

            As for Nvidia’s Dx12 support:

        • Corey

          Hmm, sure, you have G-Sync and GameWorks, but they hardly compare to the implementation and development of Mantle (which we all know is mixed into DX12 and Vulkan). Mantle was heralded as the most important PC improvement since DX9 – massive gains for all manufacturers (except Nvidia, but that is mostly due to them not supporting it fully yet), not just one company trying to gimp everyone else out of the market.

    • R Valencia

      How about the BSODs from the SWTOR MMO on Maxwell v2 cards?

  • David Curtis

    This is what happened with Black Ops 3: it had performance issues, but then new drivers came out and the game ended up performing fine.

  • John Pombrio

    Alpha, Beta, Gamma. This game and Ashes of the Singularity are not finished by the developers, nor do they have release-day drivers, yet AMD fans are using them to justify buying cards from the company that is failing miserably. Let's wait for the release of the games before making "I gotta crow" remarks.

    • Christopher Lennon

      Sounds like you're bitter. Nvidia and their fans do and would do the exact same thing, so don't make it seem like it's a behavior of one side. Also, what exactly does AMD's financial status have to do with benchmarks, their hardware, or the way it performs? Oh yeah, absolutely nothing! You know how you can tell when an Nvidia fan realizes their hardware has been beaten by AMD hardware? Easy: they'll bring up Nvidia's market share.

      • App4that

        Do you know how you can tell when an AMD fan is trying to validate buying a lesser-quality product with a fraction of the support? When they use Alpha benchmarks that just got made irrelevant.

        • It's interesting to see that the table is listed alphabetically. More so, it is really interesting to see the Radeon R9 290 beating an old TITAN. What matters is the performance offered for your dollar, and from the table you're linking, it's obvious that some graphics cards from both parties ended up in the 'wilderness', offering sub-par performance for their asking price.

        • From the looks of it, it seems they used alphabetical order, and I would be quite worried if a $499.99 part performed the same as a part that is $100 more expensive. Also, multi-GPU systems seem to have a detrimental effect on performance, as they score lower.

          Glad to see my recommendations for R9 290X and GTX 980 withstanding the test of time.

          • App4that

            The game is capped at 60 fps; that's why you see the grouping at the top. If you go to the 1440p *beta* benchmark you'll see the 980 Ti pull ahead. And though I'm a fan of the 290X, it does create considerably more heat. I actually had to sell my Vapor-X for that reason.

          • Medi Zerovan

            No, I don't see that, actually. I see even 980 Ti SLI losing to Fury and not reaching the 60 fps cap.

      • Love how you're criticising John, one of the most rational and knowledgeable members of the Disqus tech community, as being 'bitter' for stating that we should base our opinions on good data rather than jumping to conclusions.
        Pretty sure he's owned AMD cards in the past, and pretty sure he would swap back if AMD released a competitive product for a change. But let's look at their latest offering, the Fury series: unavailable for months due to awful yields, plagued by coil-whine issues, 'an overclocker's dream' being a 5% overclock at best, and still nowhere near as feature-rich as an Nvidia card. Yet whenever a benchmark of an unfinished game shows a performance advantage for AMD, there's always the same people jumping on the "lol, phew, I didn't waste my money buying AMD!" bandwagon.
        Sorry guys, but I'm afraid you did. I cancelled my preorder for a Fury X and got a 980 Ti instead, and I'm very glad I did. It overclocks 12% further and runs faster in nearly every game. AMD seriously needs to do better with Polaris / Vega. Let's hope they do!

        • PuiuCS

          And yet, after a few driver updates and a few months, the products suddenly became much better and started to outperform the 980 Ti in benchmarks where it had been ahead.
          Let's face it, the Fury line will continue to get better over time, while Nvidia will do what it has always done: forget about the old-gen cards as soon as it launches new ones.

  • Andrew_Kirk

    I did not see results like this, or anywhere near them, with my 970; in fact it was a smooth 60 the whole way. And my processor is a lower-end i5, not an i7. Not sure of the validity of these results.

  • PiGood

    AMD cards are used in consoles, and sadly this game seems to be more of a console-targeted game... so for right now, the AMDs should and will triumph.

    • Corey

      Your comment would be more correct if it were using DX12/11, but it is not. It is using Vulkan, which essentially is Mantle. When utilizing low-level APIs, AMD is king at the moment, at least until Nvidia releases new hardware to support them. AMD has been pushing the market towards this ever since GCN was released. Besides, Nvidia hasn't really had much of a chance to bring out drivers for Vulkan.

      • iTile

        At the time, the Alpha version was not using Vulkan; it was still running old-school OpenGL. Conversion to Vulkan only started in March 2016.

        Nvidia GPUs have never been the better performer in OpenGL.

        • Corey

          Fair enough. I stand corrected 🙂

  • Jayson Neal

    Clearly they didn't get their facts straight, as DOOM comes out on May 13th, not March 15. C'mon, guys.

  • Katana Man

    The article states that an AMD FX-8350 was also used in benchmarking. Where are the figures? I hope they're not being suppressed to show how worthless an overpriced Intel CPU is.

    • A link to the original article was shown multiple times in the article. Unfortunately, they held the AMD results back, probably keeping them for the print version. As for your comment on 'worthless' – that's really offensive and completely incorrect. These products are manufactured using bleeding-edge technology. AMD's processors are manufactured on 28nm, while Intel uses 14nm. However, ever since Bulldozer debuted, AMD's weak spot in L3 cache performance has dragged them down. We will see what the result will be once Zen debuts.

  • Christopher Lennon

    It's obvious that the Alpha has no multi-GPU support for either AMD or Nvidia, so why is the author actually citing and referring to the CrossFire/SLI results? The fact that there is zero multi-GPU support is borne out by the 295X2 performing exactly the same as the 290X at 4K.

  • 1pp1k10k4m1

    “The results above show a direct vindication of my recommendations since I joined the industry. I have always firmly believed that when you build your system, you need to build the top end graphics card, followed by as much system memory and fastest storage you can buy.”

    I don’t mean to be overly critical here, but how can you expect anyone to take that comment seriously? Why not just throw in “and the fastest processor”, so that you can encompass the whole system? Then you could say “that just vindicates my recommendations that you should buy the best system you can”. It’s not a mystery that higher end hardware sticks around longer than lower-end hardware. If you’re going to spend a pseudo-paragraph writing something, you should actually say something and not try and “vindicate” the obvious. It’s great that you’re self-affirming, but that statement was the equivalent of “I think you should buy the best house you can, or the best car you can”…of course you should, and everyone already knows that. You didn’t vindicate anything…you just made an obvious statement and agreed with yourself to try and bolster your own position. If that’s vindication to you, then you need to stop asking yourself easy questions and answering them, and go find some questions you don’t know the answer to.

    • PuiuCS

      pretty sure he meant the fastest you can fit for your budget 😀

      • 1pp1k10k4m1

        That’s probably true. 😀

        I still don't understand why bother saying that. Isn't every gamer building a custom system (or even reading benchmarks and comparisons like this) already trying to do that?

        It just seems like a bit of preaching to the choir.

  • Iluv2raceit

    Uhhh, completely invalid results. AMD 'tailored' drivers were used to optimize performance, yet stock Nvidia drivers were used that have no optimization for DX12. This article is purely clickbait for AMD fanboys and nothing more. Oh, and we're talking about an "Alpha" version? THEO IS AN IDIOT.

    • PuiuCS

      This is OpenGL, and AMD has generally done well in such benchmarks compared to Nvidia.

  • Robdarian

    You're telling me a single Nano is faster than CrossFire Fury Xs? Bullshit!