Oculus Rift Dev: Day #2

It seems my Oculus Rift finally arrived! I'm currently on vacation, staying in my hometown, and itching to give it a try!!

In the meantime, there is still a lot to learn. As I did before the holiday craziness set in, I will be reading the documentation and writing down some notes...

7.1 Head tracking and sensors

Oculus Rift sensors: gyroscope, accelerometer and magnetometer. Since the DK2, there is also an external camera to track the headset's position. Sensor fusion combines the information from all these sensors to calculate the motion of the head and synchronize the user's virtual view in real time.
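As a toy illustration of what "sensor fusion" means (this is NOT the SDK's actual algorithm, just the classic complementary-filter idea on a single axis, with names of my own):

```cpp
#include <cmath>

// One-axis complementary filter: the gyro gives fast but drifting angle
// deltas, the accelerometer gives a noisy but drift-free absolute reference.
// Each step we integrate the gyro and pull slowly towards the accelerometer.
// alpha close to 1 trusts the gyro short-term; (1 - alpha) corrects drift.
inline float fuse(float angle, float gyroRate, float accelAngle,
                  float dt, float alpha = 0.98f)
{
    return alpha * (angle + gyroRate * dt) + (1.0f - alpha) * accelAngle;
}

// Usage: call once per sensor sample, e.g.
//   angle = fuse(angle, gyroRate, accelAngle, 0.01f);
```

The real fusion also handles three axes, magnetometer yaw correction and bias estimation, but the blending idea is the same.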

First of all, initialize the sensors:

ovrBool ovrHmd_ConfigureTracking( // returns false when the required capabilities are not present
     ovrHmd hmd,
     unsigned int supportedTrackingCaps, // flags with the tracking capabilities supported by the app
     unsigned int requiredTrackingCaps); // flags with the capabilities the app requires in order to work properly
Poll sensor fusion for head position and orientation:
// Start the sensor which provides the Rift’s pose and motion.
ovrHmd_ConfigureTracking(hmd, ovrTrackingCap_Orientation |
     ovrTrackingCap_MagYawCorrection |
     ovrTrackingCap_Position, 0);
// Query the HMD for the current tracking state.
ovrTrackingState ts = ovrHmd_GetTrackingState(hmd,
     ovr_GetTimeInSeconds()); //Cool, oculus has its own delta time management...
if (ts.StatusFlags & (ovrStatus_OrientationTracked | ovrStatus_PositionTracked))
{
     Posef pose = ts.HeadPose;
     //note this code is not using the camera, so it's valid for dk1 and dk2
}
ovrTrackingState includes orientation, position, and their first and second derivatives. If we pass a time in the future, the result is a prediction. For production it is better to use one of the real-time computed values returned by ovrHmd_BeginFrame or ovrHmd_BeginFrameTiming.
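Since the state includes those derivatives, predicting a pose a small Δt into the future is conceptually just extrapolation. A first-order, orientation-only sketch in plain C++ (names are mine; the SDK's actual predictor is more sophisticated):

```cpp
#include <cmath>

struct Quat { float w, x, y, z; };

// Extrapolate orientation q by angular velocity (wx, wy, wz) in rad/s over
// dt seconds: q' = q * dq, where dq rotates by |w|*dt about the axis w/|w|.
Quat predict(Quat q, float wx, float wy, float wz, float dt)
{
    float mag   = std::sqrt(wx * wx + wy * wy + wz * wz);
    float angle = mag * dt;
    Quat d = { 1.0f, 0.0f, 0.0f, 0.0f };        // identity when barely rotating
    if (angle > 1e-8f) {
        float s = std::sin(angle * 0.5f) / mag; // sin(half angle) / |w|
        d = { std::cos(angle * 0.5f), wx * s, wy * s, wz * s };
    }
    return { q.w*d.w - q.x*d.x - q.y*d.y - q.z*d.z,   // Hamilton product q*d
             q.w*d.x + q.x*d.w + q.y*d.z - q.z*d.y,
             q.w*d.y - q.x*d.z + q.y*d.w + q.z*d.x,
             q.w*d.z + q.x*d.y - q.y*d.x + q.z*d.w };
}
```

For the few tens of milliseconds between tracking sample and scanout, a first-order step like this already removes most of the perceived latency.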

The orientation is expressed in a right-handed coordinate system, with the x-z plane aligned with the ground regardless of camera orientation. The rotation is a quaternion, but we can extract yaw, pitch & roll values this way:

Posef pose = trackingState.HeadPose.ThePose;
float yaw, eyePitch, eyeRoll;
pose.Rotation.GetEulerAngles<Axis_Y, Axis_X, Axis_Z>(&yaw, &eyePitch, &eyeRoll);
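The same extraction can be written out by hand. A sketch for the Y-X-Z (yaw-pitch-roll) order in a right-handed, Y-up system, which should match the convention above (verify the signs against OVR_Math.h before trusting it):

```cpp
#include <cmath>

struct Quat { float w, x, y, z; };

// Yaw around +Y, pitch around +X, roll around +Z, applied in Y-X-Z order,
// right-handed coordinates. Assumes q is a unit quaternion.
void quatToYawPitchRoll(const Quat& q, float* yaw, float* pitch, float* roll)
{
    *pitch = std::asin (2.0f * (q.w * q.x - q.y * q.z));
    *yaw   = std::atan2(2.0f * (q.x * q.z + q.w * q.y),
                        1.0f - 2.0f * (q.x * q.x + q.y * q.y));
    *roll  = std::atan2(2.0f * (q.x * q.y + q.w * q.z),
                        1.0f - 2.0f * (q.x * q.x + q.z * q.z));
}
```

Note that asin clamps pitch to ±90°, which is exactly the gimbal-lock range you want for a head pose.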

7.1.1 Position Tracking

To retrieve the camera frustum:
ovrHmd hmd = ovrHmd_Create(0);
if (hmd)
{
      // Extract tracking frustum parameters: HFov, VFov, NearZ, FarZ
      float frustumHorizontalFOV = hmd->CameraFrustumHFovInRadians;
      float frustumVerticalFOV   = hmd->CameraFrustumVFovInRadians;
      float frustumNearZ         = hmd->CameraFrustumNearZInMeters;
      float frustumFarZ          = hmd->CameraFrustumFarZInMeters;
}
Tracking origin: one meter away from the camera in the direction of the optical axis, at the same height as the camera. Origin orientation: a yaw angle of zero corresponds to the user looking towards the camera. ovrHmd_RecenterPose resets the tracking origin and yaw to the headset's current location and yaw.

Another thing returned by ovrHmd_GetTrackingState: LeveledCameraPose, the pose of the camera relative to the tracking origin but with roll and pitch zeroed out. This can be used as a reference point to render real-world objects in the correct place.

StatusFlags: contains three status bits

  • ovrStatus_PositionConnected: is the camera connected and working?
  • ovrStatus_PositionTracked: is the headset being actively tracked?
  • ovrStatus_CameraPoseTracked: has the calibration taken place? (requires the headset to be stationary within the frustum for a second)
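These are bits in a mask, so they combine with bitwise AND. A small sketch of interpreting them (the numeric bit values here are placeholders of mine; the authoritative ones are defined in OVR_CAPI.h):

```cpp
#include <cstring>

// Placeholder bit values -- check OVR_CAPI.h for the real ones.
enum {
    ovrStatus_OrientationTracked = 0x0001,
    ovrStatus_PositionTracked    = 0x0002,
    ovrStatus_CameraPoseTracked  = 0x0004,
    ovrStatus_PositionConnected  = 0x0020,
};

// Walk the position-tracking bits from "is there a camera at all" down to
// "is the headset visible right now".
const char* describeStatus(unsigned int statusFlags)
{
    if (!(statusFlags & ovrStatus_PositionConnected))
        return "camera not connected";          // orientation-only (DK1 style)
    if (!(statusFlags & ovrStatus_CameraPoseTracked))
        return "camera not calibrated yet";     // hold still inside the frustum
    if (!(statusFlags & ovrStatus_PositionTracked))
        return "position tracking interrupted"; // occlusion, out of frustum, ...
    return "fully tracked";
}
```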
Several conditions may cause position tracking (ovrStatus_PositionTracked) to be interrupted (after which it normally resumes quickly):
  • The headset moves outside the camera frustum
  • The headset adopts an orientation that is hard to track
  • The headset is occluded (hair, hands, ...)
  • Excessive velocity
