3D, AMD, Business, Gaming, Graphics, Hardware, Intel, Nvidia, Opinion

AMD Moves to Justify Radeon 300 Series Rebrand

AMD Radeon R9 290X Board showcases the Hawaii GPU and 4GB GDDR5 memory.

Just as in the world of sports, there are a lot of personal, below-the-belt attacks on the companies we report about. Quite often the heated exchanges concern Sony, Microsoft, or Nintendo consoles, or, if we go down to the chip level, Intel, AMD, or Nvidia (interestingly, we don’t see Qualcomm receiving much flak).


The 2015 AMD Radeon lineup will feature seven products: the R7 and R9 300 Series and the new Fury line, which comes in three forms: the liquid-cooled Fury X, the yet-unannounced air-cooled Fury, and the compact R9 Nano.

Recently, we saw a lot of heated media coverage and comments criticizing the company’s decision to rebrand some of its silicon yet again, this time under the Radeon 300 Series moniker. As part of its 2015 lineup, some Radeon 300 Series parts are based on silicon we first saw a couple of years ago. Naturally, the focus of the attacks is not the R7 Series parts but the R9 300 Series (even though the R7 carries the older GPUs). Thus, AMD felt it needed to clarify the differences between the 200 Series (launched in September 2013) and the 300 Series, which are as follows:

“AMD is pleased to bring you the new R9 390 series which has been in development for a little over a year now. To clarify, the new R9 390 comes standard with 8GB of GDDR5 memory and outpaces the 290X.

Some of the areas AMD focused on are as follows:

1) Manufacturing process optimizations allowing AMD to increase the engine clock by 50MHz on both 390 and 390X while maintaining the same power envelope

2) New high density memory devices allow the memory interface to be re-tuned for faster performance and more bandwidth

· Memory clock increased from 1250MHz to 1500MHz on both 390 and 390X

· Memory bandwidth increased from 320GB/s to 384GB/s

· 8GB frame buffer is standard on ALL cards, not just the OC versions

3) Complete re-write of the GPUs power management micro-architecture

· Under “worse case” power virus applications, the 390 and 390X have a similar power envelope to 290X

· Under “typical” gaming loads, power is expected to be lower than 290X while performance is increased”
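The bandwidth figures AMD quotes follow directly from the GDDR5 signaling scheme, which transfers data at four times the memory clock, combined with Hawaii’s 512-bit memory interface (the bus width is not stated in AMD’s quote, but is known from the 290X). A quick sketch of the arithmetic:

```python
# Peak GDDR5 bandwidth: (bus width in bytes) x (effective per-pin data rate).
# GDDR5 is quad-pumped, so the effective rate is 4x the memory clock.
BUS_WIDTH_BITS = 512  # Hawaii's memory interface (assumed unchanged for the 390X)

def gddr5_bandwidth_gbs(memory_clock_mhz: int) -> float:
    """Peak memory bandwidth in GB/s for a GDDR5 interface."""
    per_pin_gbps = memory_clock_mhz * 4 / 1000   # effective data rate per pin
    return BUS_WIDTH_BITS / 8 * per_pin_gbps     # bytes per transfer x rate

print(gddr5_bandwidth_gbs(1250))  # R9 290X at 1250 MHz -> 320.0 GB/s
print(gddr5_bandwidth_gbs(1500))  # R9 390X at 1500 MHz -> 384.0 GB/s
```

The 320 GB/s and 384 GB/s figures in AMD’s statement line up exactly with the quoted 1250 MHz and 1500 MHz memory clocks on a 512-bit bus.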

There you have it: this is what AMD believes is difference enough to justify a rebrand. Now, if we moved this rebrand/architectural-improvement comparison over to the car industry, would it draw the same number of attacks from automotive media and end users?

In all honesty, how many readers get heated up when a 2015 or 2016 model changes little compared to the original launched in, say, 2013? If we were talking about cars, nobody would give a d*** that a manufacturer did not redesign the complete car 12 months after launch. For example, my 2013 Cadillac ATS actually fares better than the 2015 Coupe version of the same car, since its rear-view mirrors were shrunk from a usable, mandatory safety feature to a mere ‘here they are.’ And what to say about the 2014 Tesla Model S, which had an August 10th switchover between “your car can drive automatically” and “it cannot,” or this year’s “from April 2015, the Model S has a Tegra 4 and supports 4G LTE”? If you bought before those dates, you were SOL. Where are the droves of negative comments about car manufacturers that openly lied about their cars to a useless organization called the NHTSA, lies which ultimately led to the deaths of hundreds of people, a far graver matter than a few frames per second?

At the end of the day, what matters is that performance is good enough for the intended tasks. And AMD says that the R9 390 and 390X are ideal for 4K gaming and good enough to take the fight to Nvidia’s GeForce GTX 900 Series graphics cards, where the GTX 970 found itself in hot water (Nvidia first denied the issues, then admitted them, and the CEO apologized), resulting in refunds and an ambulance-chasing class-action suit.

Even though AMD left us out of Radeon 300 and Fury sampling, we do not care about that. It is each company’s decision whether or not to support media organizations, analysts, and customers, and to deal with the self-inflicted consequences of those decisions. Thus, if you want (somewhat) justified criticism of AMD’s recent media moves, you are free to read the good articles from our British colleagues at eTeknix and KitGuru.

We will just leave it to the market to decide whether AMD made a good move with the 300 Series or not. And we all know what the market said about the Radeon 200 Series.

  • danglingparticiple

    From what I understand, the next driver enables Frame Rate Target Control, which will reduce average power consumption significantly. The silicon may not be brand new, but the software is adding value! Looking forward to trying FRTC on my 290 as it should work, being essentially the same silicon.

    • Theo Valich

      Agreed. AMD is making the best moves they can with the hardware they have. Let us know how your experience with FRTC goes; it looks like a great addition to the lineup.

    • (>_<)

      Frame Rate Target Control is an Nvidia feature. AMD has Dynamic Frame Rate Control. The purposes are just about the same, but the use is a bit easier.

      I’d give you a link, but this site doesn’t like to direct its readers away from its clickbait.

      • danglingparticiple

        Thanks for the correction. So many acronyms these days. DFRC… A synopsis by Thomas Ryan at Semiaccurate titled “Frame Rate Limiting with AMD’s Fury X” shows the potential power savings.

  • Hung Doan Tran

    Is there a source for this statement?

    • (>_<)

      Of course not. It’s just the author’s opinion.

  • Gabest

    I bet there will be models that still don’t support OpenCL 2.0, because they are based on the R9 270X or 280X.

  • (>_<)

    What nobody is taking into consideration is that ALL of the performance benchmarks are based on an OBSOLETE API: DX11.

    By Christmas this year, MOST if not all top-tier games will be ported to DX12. The Witcher 3 and Arkham Knight, two controversial games, have announced this intention.

    Arkham Knight just does not run on DX11, and this is the reason it was pulled from retail sales.

    Port it to DX12 and it will fly.

    Yet NO writers have benchmarked Fury or ANY of AMD’s 300 Series using DX12.

    Would you consider spending $800 on a dGPU based on benchmarks from 3dfx, DX9 or DX10? Of course NOT.

    Console APIs are very close to DX12 in performance.

    Why are the tech media writers ignoring the DX12 benchmarks, Star Swarm and the 3DMark API Overhead test?

    • philconl

      MOST IF NOT ALL= a few and the vast, vast majority – NO

      Stop the insane spin, please

    • dfirday

      deja vu indeed

      In 2011, someone said Windows 8 would bring improvements that would make Bulldozer crush Sandy Bridge, and we all know what really happened.
