AMD, Augmented Reality (AR), Breaking, Game Developer Conference (GDC), News, Virtual Reality (VR)

AMD Powers Sulon Q, Microsoft HoloLens Competitor

The holy grail of the VR and AR experience is a headset with sufficient computational power to deliver a seamless VR experience, yet in a compact form and untethered from the computers or smartphones we are forced to use today. The first development product to offer such capabilities is Microsoft HoloLens, which recently started accepting pre-orders for Development Kits. At $3,000, HoloLens is a steep buy-in to Microsoft’s vision of holographic computing.

Enter Sulon, a Toronto startup located not far from AMD’s Canadian HQ (formerly known as ATI Technologies). At the 30th Game Developers Conference in San Francisco, Sulon unveiled the Q, the world’s first high-performing, stand-alone, all-in-one, tether-free, ‘wear and play’ head-mounted device. The company is promoting its product as the ultimate platform for “VR, AR and spatial computing.”

Unlike Microsoft’s HoloLens, which uses Intel’s 32-bit Atom processor, the Sulon Q packs more performance. The current ‘Q’ prototype is powered by the FX-8800P, a 64-bit AMD Carrizo APU:

  • AMD FX-8800P 4-core CPU
  • Radeon R7 512-core GPU
  • Sulon Spatial Processing Unit (SPU)
  • 8GB DDR3L
  • 256 GB SSD
  • 2560×1440 OLED display at 90 Hz, 110° FOV
  • GenAudio AstoundSound 3D spatial audio
  • Dual noise-cancelling microphones
  • Accelerometer
  • Gyroscope
  • Magnetometer
  • Spatial mapping and tracking
  • Microsoft Windows 10
Just like Microsoft with the HoloLens, Sulon designed its own custom ASIC, named the Spatial Processing Unit (SPU), to drive the Augmented Reality / spatial computing workload:
“Our Spatial Processing Unit (SPU) has been built with developer scalability and future applications in mind to ensure not just amazing spatial computing experiences today, but also encourage developer experimentation to help develop significant advances in computer vision, AR and VR tomorrow. The Sulon Q™ headset is a consumption platform, but also a developer platform as well. We want people to create using it. The SPU receives information from the inertial measurement unit (IMU) which includes an accelerometer, gyroscope, and magnetometer and combines this with revolutionary computer vision and machine vision advancements that follow the user and their environment to augment the user’s visual perspective with virtual content to help make for a more immersive experience.” 
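Sulon has not published how the SPU actually fuses the IMU data, so the following is only an illustrative sketch of the general idea the quote describes: combining a fast-but-drifting gyroscope with a noisy-but-stable accelerometer. A complementary filter is a common baseline technique for this in head-tracking; the function name and parameters here are our own, not Sulon’s.

```python
# Illustrative sketch only -- Sulon's real sensor-fusion pipeline is
# proprietary. This shows a textbook complementary filter blending
# gyroscope integration with an accelerometer-derived tilt angle.

def fuse_pitch(prev_pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """Estimate head pitch (degrees) by blending two sensor sources.

    prev_pitch  -- previous pitch estimate (degrees)
    gyro_rate   -- gyroscope angular velocity (degrees/second)
    accel_pitch -- pitch derived from the accelerometer's gravity vector
    dt          -- time step (seconds)
    alpha       -- trust in the gyro; (1 - alpha) corrects its drift
    """
    gyro_pitch = prev_pitch + gyro_rate * dt  # integrate angular velocity
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

# Example: head rotating at 10 deg/s while the accelerometer,
# being slow and noisy, keeps reporting roughly 1 degree of tilt.
pitch = 0.0
for _ in range(100):  # simulate 1 second at a 100 Hz sample rate
    pitch = fuse_pitch(pitch, gyro_rate=10.0, accel_pitch=1.0, dt=0.01)
```

The accelerometer term continuously pulls the estimate back toward a gravity-referenced angle, which is why the fused value settles between the two sources instead of drifting off with the integrated gyro reading.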
The following bullet points come from a Q&A document:
  • Proprietary environment mapping and tracking developed by Sulon
  • Developer access to real-time, synchronized vision system with highly tuned cameras
  • Developer access to camera images directly from high speed memory taking advantage of the APU HSA memory
  • A high-definition, beautifully enhanced Augmented Reality imaging system for seamless visual AR and VR experiences
  • Developer access to high-speed robust camera pose tracking which uses the environment to map and track oneself
  • Inside-out mapping and tracking of the user with minimal setup, and environment-based inside-out tracking
  • Dynamic, real-time virtualization which dynamically reconstructs and displays a virtual version of the real world to the user and to applications
  • Our goal is to beautify and accessorize the environment where it can seamlessly blend the real world and virtual content together
  • A robust and continually improving gesture library
We do not know how close AMD and Sulon are, but with Microsoft HoloLens going from an Intel CPU, an Nvidia GPU and a Microsoft HPU to an Intel + Microsoft design, AMD (and its spin-off, the Radeon Technologies Group) probably decided to help a small Canadian startup that likely employs former ATI/AMD engineers. In any case, we look forward to testing the Sulon Q headset and seeing what all the fuss is about.
  • theflew

    If nothing else, MS’s HoloLens wins in the looks category, but this does look powerful. Granted, the things that immediately come to mind are weight, battery life, and the fact that for AR this will be limited because you’re not seeing the “real” world. You’re seeing a digital image of it.

    • I don’t think we will be able to seamlessly handle, in real time, the merger between the real world and digital assets. I saw the “Magic Beans” demo and it is very cool, without any doubt. But physically correct interactions will probably need ray tracing for light tracking, and then high-resolution asset rendering.

      So far, a lot of games are less than satisfactory in terms of fidelity, but everyone is afraid of dropping under 90 fps.

      And yes, they need to work on the design. But hardware-wise, can you imagine a ‘Q2’ with a Zen APU and built-in HBM memory?