Getting Started with VIVE Wave™ for Developers
First, download the Wave SDK:
The five components of the Wave Platform SDK:
Wave Native (Android) SDK
Wave Unity SDK (Plugin) (see the Quick Start Guide below)
Wave UE4 SDK (Plugin)
Wave PluginKit SDK (for 3rd party accessories like controllers)
Wave OEM SDK (for 3rd party headsets)
Note: Porting to the VIVE Wave platform
A case study of porting from the VIVE (PC) to the Wave (mobile) platform includes tips for optimizing for a mobile GPU. Additional porting guides, covering Daydream (mobile) and VIVE to Wave, are available at https://hub.vive.com/storage/app/doc/en-us/PortingGuide.html
If your existing application uses the SteamVR APIs directly, note that most of the Wave APIs have a one-to-one correspondence with SteamVR APIs. So if you have a 3rd-party framework that wraps SteamVR, you should also be able to support the Wave SDK by mapping the APIs as shown in the VIVE porting guide.
A Quick Start Guide for developing in Unity:
The following are the steps for setting up a scene with the Wave SDK. Also see the alternative below for using the VIU (Vive Input Utility) toolkit alongside the Wave SDK for cross-platform support.
1) Launch Unity, create a New Project, and make sure you Switch Platform to Android in File->Build Settings... (see the getting started guide for Android setup: https://hub.vive.com/storage/app/doc/en-us/UnityPluginGettingStart.html)
2) Import wavevr.unitypackage (Assets->Import Package->Custom Package...)
3) From the Project Assets window, drag and drop the WaveVR and ControllerLoader prefabs into your Scene Hierarchy window to create objects in your scene (delete the existing Camera object; a camera is already included in the WaveVR prefab)
4) To future-proof your application for more than one controller, duplicate the ControllerLoader in your scene (or drag another one in) and set its Type to the left controller in the Inspector window. At this point it’s up to you how to handle a second controller’s inputs (or simply handle them the same as the first)
5) From File->Build Settings..., select Build and Run (make sure the device is attached via USB and the Developer settings on the device allow USB debugging)
Note: if at any time you are prompted by a WaveVR plugin popup window to accept preferred settings, simply accept unless you have a good reason not to. You can safely dismiss the AndroidManifest-related popup for now until you are ready to publish (it indicates 3DOF vs. 6DOF support; supporting both is recommended).
At this point you should be able to see your empty scene with your controller on your Wave device!
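To handle the second controller from step 4 in script, you can poll each controller by device type. The sketch below follows the naming conventions of the legacy WaveVR Unity plugin (the `wvr` namespace, `WaveVR_Controller`, and the `WVR_DeviceType`/`WVR_InputId` enums); verify the exact names against the SDK version you imported, as they may differ.

```csharp
using UnityEngine;
using wvr; // namespace provided by wavevr.unitypackage

// Sketch: poll the trigger on both controllers every frame.
// Attach this to any GameObject in the scene.
public class TwoControllerInput : MonoBehaviour
{
    void Update()
    {
        var right = WaveVR_Controller.Input(WVR_DeviceType.WVR_DeviceType_Controller_Right);
        var left  = WaveVR_Controller.Input(WVR_DeviceType.WVR_DeviceType_Controller_Left);

        if (right.connected && right.GetPressDown(WVR_InputId.WVR_InputId_Alias1_Trigger))
            Debug.Log("Right trigger pressed");

        // Handle the left controller the same way, or give it its own behavior.
        if (left.connected && left.GetPressDown(WVR_InputId.WVR_InputId_Alias1_Trigger))
            Debug.Log("Left trigger pressed");
    }
}
```

Because each controller is addressed by its device type, the same code path works whether one or two controllers are connected; a disconnected controller simply reports `connected == false`.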
Alternative Quick Start Using the VIU (Vive Input Utility) Unity plugin:
There is also an additional Unity SDK for developing VR projects that can target multiple platforms and devices. It is highly recommended, especially for new projects or projects that don’t require extensive use of the Wave SDK APIs (although you can access both).
The Vive Input Utility (VIU) is a Unity plugin that supports the Vive, Vive Pro, Rift, Daydream, Go, and the Wave SDK (e.g. the Vive Focus), in addition to Unity’s UnityXR APIs, which in turn support Windows MR and more. It is an abstraction layer that wraps other SDKs, such as the Wave SDK, creating a common code base for many platforms. It’s available on the Unity Asset Store (search for VIU) or at https://github.com/ViveSoftware
Steps to create the same application but using the VIU plugin:
1) Launch Unity, create a New Project, and make sure you Switch Platform to Android in File->Build Settings... (see the getting started guide for Android setup)
2) Import wavevr.unitypackage (Assets->Import Package->Custom Package...) and the Vive Input Utility from the Unity Asset Store.
3) Drag and drop the ViveCameraRig (or the ViveRig for additional features) into your scene and remove the existing Camera object (there is a camera already included in ViveRig)
4) Build and Run
VIU Note: Since these prefabs also support other platforms, you get two-controller support out of the box (in addition to falling back to a single controller). The ViveRig adds controller support for teleporting, grabbing, and toggling controller models, and can easily be modified in the Inspector when ViveRig->ViveControllers is selected in the scene.
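With VIU, the same input code works for both hands across every platform the plugin supports (Wave, SteamVR, and so on), which is what makes the cross-platform workflow above possible. This is a minimal sketch using VIU's `ViveInput` API from the `HTC.UnityPlugin.Vive` namespace; confirm the names against the version of the plugin you imported.

```csharp
using UnityEngine;
using HTC.UnityPlugin.Vive; // Vive Input Utility namespace

// Sketch: device-independent input polling via VIU's role-based API.
// Attach this to any GameObject in the scene.
public class ViuTwoHandInput : MonoBehaviour
{
    void Update()
    {
        // HandRole abstracts away which physical controller is which,
        // so this code is unchanged between Wave and PC builds.
        if (ViveInput.GetPressDown(HandRole.RightHand, ControllerButton.Trigger))
            Debug.Log("Right trigger pressed");

        if (ViveInput.GetPressDown(HandRole.LeftHand, ControllerButton.Trigger))
            Debug.Log("Left trigger pressed");
    }
}
```

Role-based input is the design choice that lets you develop on a Vive and only switch the build target when you need a Focus build, as described below.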
Here’s what you’ll see in your Vive Wave HMD after additionally adding a Plane and a Sphere to your scene using GameObject -> 3D Object:
This screenshot shows a Wave build (using the VIU plugin). If you simply switch the build target to Windows, it will work on a Vive/Vive Pro, and you would see different controllers for the same scene with no other changes. You can then develop with a Vive and switch the build target back to Android only when you need a build for the Vive Focus.
You can use a simulator in Unity for testing a VIVE Focus in your Unity Editor:
And if you have a VIVE Focus but are waiting on 6DOF controllers, you can also simulate a 6DOF controller. More info here: https://github.com/ViveSoftware/ViveInputUtility-Unity/wiki/Wave-VR-6-DoF-Controller-Simulator
Support is provided at the official developer forum for the Wave SDK: