Have you ever wanted 360-degree videos in the palm of your hand? Have you wanted to watch anything from gameplay videos and TV shows to documentaries and more? You can with Littlstar, a content distribution network delivering immersive VR and 360-degree video powered by the NVIDIA DesignWorks Video Codec SDK.
This week at the GPU Technology Conference, NVIDIA launched new updates to DesignWorks, helping developers take advantage of an entire suite of tools and technologies designed for cutting-edge 360-degree video and VR. With DesignWorks, Littlstar streams video from major brands like Sony Music, The Economist, CNN and Showtime. Individuals can also post their own 360-degree videos to its platform for people to experience from every angle. Whether you’re using a VR headset, a mobile app, the NVIDIA SHIELD TV or a laptop, Littlstar opens the world of 360-degree content to everyone. And NVIDIA technology helps make it happen.
Littlstar uses dual Pascal architecture-based NVIDIA Tesla GPU accelerators and the NVIDIA DesignWorks Video Codec SDK to compress source footage into multiple formats, resolutions and bitrates. NVIDIA GPUs with hardware video encoders make fast work of massive amounts of footage. And Littlstar operates at the bleeding edge: it uses the NVIDIA DesignWorks Video Codec SDK to deliver both H.264 and HEVC (H.265) video content through the FFmpeg multimedia framework.
“Using the DesignWorks SDK via FFmpeg, we’ve been able to encode media for MPEG-DASH delivery using only our GPUs – cutting CPUs out of the picture entirely,” said Andrew Grathwohl, director of media technology at Littlstar. “Our entire encoding stack for DASH, from decoding to scaling and filtering, all the way to the eventual encodes and transcodes, is performed on our NVIDIA Tesla GPUs, dramatically increasing our encoding efficiency and guaranteeing timebase-synchronized outputs.”
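A pipeline like the one Grathwohl describes can be sketched with FFmpeg's NVDEC/NVENC integration. This is a hypothetical example, not Littlstar's actual command line; it assumes an FFmpeg build compiled with CUDA hardware-acceleration support (`--enable-nvenc`, `--enable-cuda` and the CUDA filters) and a compatible NVIDIA GPU. The file names and bitrates are illustrative.

```shell
# Hypothetical sketch: keep decode, scaling and encode entirely on the GPU.
# -hwaccel cuda            : decode on the GPU's NVDEC engine
# -hwaccel_output_format cuda : keep decoded frames in GPU memory (no CPU copy)
# scale_cuda               : resize on-device instead of on the CPU
# hevc_nvenc               : encode with the GPU's NVENC hardware encoder
ffmpeg -hwaccel cuda -hwaccel_output_format cuda -i source_360.mp4 \
  -vf scale_cuda=3840:1920 \
  -c:v hevc_nvenc -b:v 20M \
  -c:a copy rendition_4k_hevc.mp4
```

Running this once per target resolution/bitrate produces the set of renditions that an MPEG-DASH packager (for example FFmpeg's own `-f dash` muxer) can then segment for adaptive delivery, with the CPU never touching raw frames.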
New enhancements in DesignWorks enable developers to significantly expand their VR and 360-degree video quality and performance capabilities. Updates include:
- GVDB Voxels: This first public release enables rendering complex simulations, such as particle systems, as sparse voxels for medical, manufacturing, scientific and even film applications. GVDB is 10–30x faster than CPU-based rendering. In addition, GVDB can intelligently create, render and generate complex interior structures for 3D printing, providing stability while saving on materials.
- Video Codec SDK 8: Now supports 12-bit decode, giving developers the ability to accept High Dynamic Range (HDR) content for brighter, more immersive VR and 360-degree video.
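As a rough sketch of how an application might exercise the new high-bit-depth decode path, the command below uses FFmpeg's NVDEC integration to hardware-decode HDR HEVC content. This is an illustrative assumption, not an SDK sample: it requires an FFmpeg build with NVDEC support and a GPU/driver combination that exposes high-bit-depth HEVC decode; the input file name is hypothetical.

```shell
# Hypothetical sketch: hardware-decode high-bit-depth HDR HEVC on the GPU.
# -hwaccel cuda routes decoding to NVDEC; -f null - discards the output,
# which makes this a quick check that the GPU can decode the stream at all.
ffmpeg -hwaccel cuda -i hdr_hevc_source.mp4 -f null -
```

In a real player or transcoding pipeline, the decoded frames would instead feed a tone-mapping or encode stage rather than being discarded.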