Breaking, Graphics, News, Virtual Reality (VR)

GDDR5X Memory Shows Better Than Expected Results

2016 will be marked by the arrival of two memory standards, which should spread across the mainstream and high-end / enthusiast line-ups like wildfire. First, there is HBM2, an improved version of the HBM memory that debuted with (and so far ships only inside) AMD's R9 Fury family of cards. HBM2 promises a four-fold increase in capacity and double the memory bandwidth – meaning a single card can go from 4GB and 512GB/s to 16GB and 1TB/s. Given the low volume of HBM and HBM2 memory, the two will probably remain exclusive to enthusiast graphics cards, such as the recently renamed Greenland, the high-end Polaris graphics processor from AMD
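The 512GB/s-to-1TB/s jump can be sanity-checked with a quick sketch, assuming the publicly quoted HBM figures: a 1024-bit interface per stack, roughly 1 Gbps per pin for first-generation HBM and 2 Gbps for HBM2, and four stacks on a card (these spec values are assumptions, not taken from the article itself):

```python
def stack_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Per-stack bandwidth in GB/s: pin count x per-pin rate (Gb/s) / 8 bits per byte."""
    return bus_width_bits * pin_rate_gbps / 8

# Four stacks per card, 1024-bit interface per stack (assumed spec values).
hbm1 = 4 * stack_bandwidth_gbs(1024, 1.0)  # first-gen HBM at ~1 Gbps/pin
hbm2 = 4 * stack_bandwidth_gbs(1024, 2.0)  # HBM2 at ~2 Gbps/pin

print(hbm1, hbm2)  # 512.0 1024.0 -> matches the 512GB/s and ~1TB/s figures
```

Doubling only the per-pin rate while keeping the same 4096 bits of total interface width is exactly what yields the "double the bandwidth" claim.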


3D, AMD, Business, Companies, CPU, Intel, Microsoft, Software Programs

Microsoft WARP proves that Intel’s current graphics suck

Thanks to Thomas from, I learned that Microsoft released a document explaining how WARP10 works. WARP stands for Windows Advanced Rasterization Platform, or “The Return of the Software Rasterizer”. According to the document, this software rasterizer will come bundled with DirectX 11 and Windows 7. What makes the matter important are the performance scores. Microsoft states that the company tested Crysis in DX10 mode at 800×600 and saw better performance with WARP than with Intel's graphics subsystem. The company compared the G45 graphics subsystem against a Core 2 Extreme QX9650 (3.0 GHz) and found that WARP10 delivers a framerate of 5.69 fps (341 frames per
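The truncated "341 frames per" figure appears to be the same 5.69 fps restated per minute (that unit is an inference from the arithmetic, not stated in the excerpt):

```python
# The quoted WARP10 result: 5.69 frames per second.
fps = 5.69

# Converting to frames per minute reproduces the 341 figure in the excerpt,
# assuming the cut-off unit was indeed "minute".
frames_per_minute = fps * 60
print(round(frames_per_minute))  # 341
```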


3D, AMD, Business, Companies, Graphics, Hardware, Intel, Memory & Storage Space

ANALYSIS: Why will GDDR5 rule the world?

This memory standard will become pervasive over the next four years in many more fields than “just” graphics. Just like GDDR3 ended up in all three consoles, network switches, cellphones and even cars and planes, GDDR5 brings a lot of new features that are bound to win more customers from different markets.

Background

The reason for the development of radical ideas inside GDDR5 lies in the fact that ATI was looking at future GPU architectures and concluded that the DRAM industry had to take a radical step in design and offer an interface more flexible than any other memory standard. Then, ATI experienced huge issues with