Everything posted by Dario

  1. Today we are announcing to developers an early access release of the Vive Hand Tracking SDK for the Vive, Vive Pro and the Vive Focus (Wave platform). This SDK provides the ability to track your hands, recognize gestures and, on the Vive and Vive Pro, track your fingers as well (21-point tracking). For more info, please attend the sessions at GDC on Vive Developer Day, Monday, March 18, or just try out the SDK, available now here: https://developer.vive.com/resources/
  2. Getting Started with VIVE Wave™ for Developers

First, download the Wave SDK: https://developer.vive.com/resources/knowledgebase/wave-sdk/

The five components of the Wave Platform SDK:

Wave Native (Android) SDK
https://hub.vive.com/storage/app/doc/en-us/GettingStarted.html

Wave Unity SDK (Plugin) -- see the getting started guide below --
https://hub.vive.com/storage/app/doc/en-us/UnityPluginGettingStart.html

Wave UE4 SDK (Plugin)
https://hub.vive.com/storage/app/doc/en-us/UnrealPlugin/UnrealPluginGettingStart.html

Wave PluginKit SDK (for 3rd-party accessories like controllers)
https://hub.vive.com/storage/app/doc/en-us/Pluginkit_SDK_Tutorial.html

Wave OEM SDK (for 3rd-party headsets)
https://hub.vive.com/storage/app/doc/en-us/VROEMService_Tutorial.html

Note: Porting to the VIVE Wave platform
A case study of porting from the Vive (PC) to the Wave (mobile) platform, including tips for optimizing for a mobile GPU, is available here. Additional porting guides, from Daydream (mobile) or from the VIVE to Wave, are available here: https://hub.vive.com/storage/app/doc/en-us/PortingGuide.html

If your existing application uses the SteamVR APIs directly, note that most of the Wave APIs have a one-to-one correspondence to SteamVR. So if you have a 3rd-party framework that wraps SteamVR, you should also be able to support the Wave SDK by mapping the APIs as shown in the VIVE porting guide.

A Quick Start Guide for developing in Unity:

The following are the steps for setting up a scene with the Wave SDK; also see the alternative below for using the VIU toolkit along with the Wave SDK for cross-platform support.

1) Launch Unity, create a New Project, and make sure you Switch Platform to Android in File->Build Settings... (see the getting started guide to set up Android: https://hub.vive.com/storage/app/doc/en-us/UnityPluginGettingStart.html)

2) Import wavevr.unitypackage (Assets->Import Package->Custom Package...)
3) From the Project Assets window, drag and drop the WaveVR and ControllerLoader prefabs into your Scene Hierarchy window to create objects in your scene (delete the existing Camera object; there's one already included in WaveVR)

4) To future-proof your application for more than one controller, duplicate the ControllerLoader in your scene (or drag another one in) and select its Type as the left controller in the Inspector window as shown above. At this point it's up to you how to handle a second controller's inputs (or simply handle it the same as the first)

5) From File->Build Settings..., select Build and Run (make sure the device is attached via USB and the Developer settings on the device allow USB debugging)

Note: if at any time you are prompted by a WaveVR plugin popup window to accept preferred settings, simply accept unless you have a good reason not to. You can safely dismiss the AndroidManifest-related popup for now, until you are ready to publish (it indicates 3DOF vs. 6DOF support; supporting both is recommended).

At this point you should be able to see your empty scene with your controller on your Wave device!

Alternative Quick Start Using the VIU (Vive Input Utility) Unity plugin:

There is also an additional Unity SDK for developing VR projects that can target multiple platforms and devices. It is highly recommended, especially for new projects or projects that don't require extensive use of the Wave SDK APIs (although you can access both).

The Vive Input Utility: This is a Unity plugin that can support the Vive, Vive Pro, Rift, Daydream, Go and the Wave SDK (e.g. Focus), in addition to Unity's UnityXR APIs, which in turn can support Windows MR and more. It is an abstraction that wraps other SDKs like the Wave SDK, creating a common code base for many platforms.
It's available on the Unity Asset Store (search for VIU) or at https://github.com/ViveSoftware

Steps to create the same application but using the VIU plugin:

1) Launch Unity, create a New Project, and make sure you Switch Platform to Android in File->Build Settings... (see the getting started guide to set up Android)

2) Import wavevr.unitypackage (Assets->Import Package->Custom Package...) and the Vive Input Utility from the Unity Asset Store.

3) Drag and drop the ViveCameraRig (or the ViveRig for additional features) into your scene and remove the existing Camera object (there is a camera already included in ViveRig)

4) Build and Run

VIU Note: Since these prefabs also support other platforms, you already get support for two controllers (in addition to falling back to a single controller). The ViveRig adds controller support for teleporting, grabbing and toggling controller models, and can be easily modified in the Inspector when ViveRig->ViveControllers is selected in the scene.

Here's what you'll see in your Vive Wave HMD after additionally adding a Plane and a Sphere to your scene using GameObject->3D Object. This screenshot is of a Wave build (using the VIU plugin); if you simply switch the build target to Windows, it'll work on a Vive/Vive Pro, and you'd see different controllers for the same scene with no changes. You can then develop with a Vive and only switch the build target back to Android to create a build for the Vive Focus.

You can use a simulator in Unity for testing a VIVE Focus in your Unity Editor: https://hub.vive.com/storage/app/doc/en-us/Simulator.html

And if you have a VIVE Focus but are waiting on 6DOF controllers, you can also simulate a 6DOF controller. More info here: https://github.com/ViveSoftware/ViveInputUtility-Unity/wiki/Wave-VR-6-DoF-Controller-Simulator

Support is provided at the official developer forum for the Wave SDK: http://community.viveport.com/t5/Vive-Wave-SDK/bd-p/vive-wave-sdk
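Once the VIU plugin is imported, input handling in your scripts stays device-agnostic across all the platforms above. A minimal sketch of what that looks like, assuming VIU's ViveInput/VivePose APIs (the class name and logging behavior here are illustrative, not from the SDK samples; check the plugin's wiki for exact signatures):

```csharp
using UnityEngine;
using HTC.UnityPlugin.Vive; // Vive Input Utility namespace

// Illustrative sketch: polls the right-hand trigger through VIU's
// device-agnostic HandRole abstraction, so the same script runs on
// a Vive, Rift, Daydream or a Wave device such as the Focus.
public class TriggerLogger : MonoBehaviour
{
    void Update()
    {
        // Fires once on the frame the trigger is first pulled
        if (ViveInput.GetPressDown(HandRole.RightHand, ControllerButton.Trigger))
        {
            // Read the tracked pose of the same hand
            RigidPose pose = VivePose.GetPose(HandRole.RightHand);
            Debug.Log("Trigger pulled at " + pose.pos);
        }
    }
}
```

Because the script only talks to HandRole and ControllerButton, switching the build target between Android (Wave) and Windows (SteamVR) requires no code changes, which is the point of the abstraction.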
  3. HTC VIVE 3DSP SDK

HTC VIVE 3DSP is an audio SDK that provides applications with spatial audio, a key factor for an immersive VR experience. With the HTC VIVE 3DSP SDK, spatial perception is simulated by specific functions and features, such as head-related transfer function recording and refinement, higher-order ambisonic simulation of sound direction, room audio simulation, a background noise floor, real-world acoustic distance modeling, geometric and raycast occlusion, Hi-Res audio support, and more.

Many factors influence the human perception of audio spatialization: interaural time difference, interaural level difference, body factors (pinna, head, shoulder and torso), environment factors (room reflections and reverberation), the distance from the sound source to the user, and obstacle occlusion. Based on these factors, HTC VIVE 3DSP generates immersive and realistic audio with the following key features:

Higher Order Ambisonics (HOA) with very low computing power.
Head-Related Transfer Functions (HRTF) based on refined real-world modeling (horizontal and vertical), resulting in a better algorithm that is applied to all sound filters and effects.
Room Audio that simulates the reflection and reverberation of a real space.
Hi-Res audio source files and playback.
A Distance Model based on real-world modeling.
Geometric occlusion that requires no Unity collider; the coverage area is calculated analytically.

Higher Order Ambisonics (HOA)
Ambisonics is a full-sphere surround sound technique used to simulate spatial sound. A third-order ambisonics model is implemented in the HTC VIVE 3DSP SDK.

Room Audio
This simulates room acoustics with early reflections, late reverberation, background noise, environment materials and so on.

Distance Model
Sound level decreases as sound travels through space in the real world.
However, the attenuation differs under different conditions, so the HTC VIVE 3DSP SDK provides several distance-decay models.

Occlusion
The occlusion effect accurately simulates what happens to sound when it encounters an obstacle in its transmission path. Both mono and binaural occlusion modes can be configured in the HTC VIVE 3DSP SDK.

Geometric Occlusion: calculates the covering ability of an obstacle with analytical geometry techniques, so there is no need for a Unity collider.

Raycast Occlusion: calculates the covering ability of an obstacle by casting many rays into space and counting how many of them are blocked.

The VIVE 3DSP SDK supports room effects, room reverberation and reflection, and acoustic occlusion. It also has spatial-effect optimization for the VIVE Pro headphones; however, the original VIVE and other HMDs and headphones are also supported.

For more information, and to request a preview of the upcoming ambisonic decoder, please visit the VIVE Audio SDKs community forum: http://community.viveport.com/t5/Vive-Audio-SDKs/gp-p/devgroup3
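The baseline behind any real-world distance model is the free-field inverse-square law: level drops by about 6 dB per doubling of distance from a point source. A minimal illustration in plain Python (this is standard acoustics, not SDK code; the SDK's actual decay curves are configurable and more sophisticated):

```python
import math

def free_field_attenuation_db(distance, reference_distance=1.0):
    """Sound level drop (dB) for a point source in free field,
    relative to the level at reference_distance (inverse-square law)."""
    if distance <= reference_distance:
        return 0.0  # clamp: apply no extra gain closer than the reference
    return 20.0 * math.log10(distance / reference_distance)

# Doubling the distance costs roughly 6 dB:
print(round(free_field_attenuation_db(2.0), 2))  # 6.02
print(round(free_field_attenuation_db(4.0), 2))  # 12.04
```

Indoors, reflections and reverberation raise the level above this free-field curve, which is why a room model and a distance model are listed as separate features.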
  4. VIVE SRWorks SDK

With the launch of the VIVE Pro, developers now have access to the front-facing stereo cameras to create new experiences that mix the see-through stereo camera view with their virtual worlds. This enables developers to perform 3D perception and depth sensing with the stereo RGB sensors, opening new worlds for more creative, interactive experiences.

In addition to the updated OpenVR camera APIs, which can now handle more than the mono camera of the original VIVE, the VIVE Software team is also providing developers the VIVE SRWorks SDK. With this SDK you can access more than just the raw camera images:

Depth
Spatial Mapping (static and dynamic meshes)
Placing virtual objects in the foreground or background
Live interactions with virtual objects and simple hand interactions

These features are provided by three service modules (Depth, See-through and 3D Reconstruction), allowing developers to focus on the content. The SDK includes support for native development, with plugins for Unity and Unreal.

The following videos illustrate some of the features. Here's a portal for passing between the real and virtual worlds. Note: the project code for the portal example above is included in the VIVE SRWorks SDK Unity package. Here's an example from developer Jonathan Schenker from Alvios using a screen-filter effect to mix the realities, and here's another example from developer Vladimir Storm using the 3D Reconstruction module.

VIVE Audio

We are also announcing two VIVE audio SDKs, available for Unity, with support for UE4 coming soon.

- VIVE 3DSP SDK
The VIVE 3D Sound Perception SDK provides a Unity-compatible audio spatialization plugin with the following features:

Higher Order Ambisonics
HRTFs based on refined real-world modeling (horizontal and vertical)
Support for Hi-Res audio source files and playback
Acoustic distance effects with real-world modeling
The VIVE 3DSP plugin supports room effects, room reverberation and reflection, and acoustic occlusion tuned for the VIVE Pro; however, the VIVE and other HMDs and headphones are also supported.

- VIVE Pro Audio Mode
Since the VIVE Pro has dual microphones with support for alert and conversation modes, APIs are provided so that you can toggle between the three audio modes. This gives applications the ability to listen to foreground audio, background audio, or a mix of both. Additionally, a USB Type-C high-power mode setting (on/off) is also available.

These are initially available as early access (beta) SDKs; you can find the downloads and join the developer support forums at http://developer.vive.com/resources