Thursday, October 29, 2015

Pirates: Treasure Hunter has finally been released

After 7 years, Virtual Toys, a Spanish company I worked for, has released this awesome MOBA.

In this project, I worked on gameplay features and physics for more than a year.


Well done guys!

Monday, September 21, 2015

Resources Manager

Working 15 hours a day and keeping a blog updated are not quite compatible. Anyway, the other day I saw tons of resource manager code and decided to play a bit with templates to tidy things up:

#include <cstddef>
#include <map>

template <typename T>
class ResourcesManager
{
public:
    ResourcesManager() {}
    virtual bool Init() = 0;

    // Unload every cached resource through the supplied callback.
    void DeInit(void (*UnloadImpl)(T*))
    {
        for (typename std::map<const char*, T*>::iterator it = m_ResourcesMap.begin(); it != m_ResourcesMap.end(); ++it)
            UnloadImpl(it->second);
        m_ResourcesMap.clear();
    }

    // Return the cached resource, or load and cache it on first request.
    void Load(T*& pIn, const char* szName, void (*LoadImpl)(T*&, const char*))
    {
        typename std::map<const char*, T*>::iterator it = m_ResourcesMap.find(szName);
        pIn = NULL;
        if (it == m_ResourcesMap.end())
        {
            LoadImpl(pIn, szName);
            if (pIn != NULL)
                m_ResourcesMap.insert(std::pair<const char*, T*>(szName, pIn));
        }
        else
            pIn = it->second;
    }

private:
    // Note: const char* keys compare pointers, not contents; pass stable,
    // identical strings (or switch to std::string keys).
    std::map<const char*, T*> m_ResourcesMap;
};

Implementing LoadImpl and UnloadImpl for your specific resources should be enough to use it.
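For instance, a minimal usage sketch (Texture, LoadTexture, and UnloadTexture are hypothetical stand-ins for your engine's types and loaders):

class TextureManager : public ResourcesManager<Texture>
{
public:
    virtual bool Init() { return true; }
};

// Hypothetical callbacks for the engine's Texture type.
void LoadTexture(Texture*& pOut, const char* szName) { pOut = new Texture(szName); }
void UnloadTexture(Texture* p) { delete p; }

void Example()
{
    TextureManager manager;
    Texture* pTex = NULL;
    manager.Load(pTex, "hero.png", &LoadTexture);  // first call loads and caches
    manager.Load(pTex, "hero.png", &LoadTexture);  // second call returns the cached pointer
    manager.DeInit(&UnloadTexture);
}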

Monday, August 10, 2015


When you are trying to go gold with a game after months of crunch and you find tons of leaks, bad designs, and sloppy code, you look back trying to figure out how you got here.

But that's futile; you have to deliver, and that's when these tools are really useful:


Thanks to the Dr. Memory and Cppcheck teams, I owe you one.

Tuesday, January 27, 2015

Oculus Rift Dev: Day #4

8. Rendering to the Oculus Rift

The Oculus Rift requires split-screen stereo rendering with distortion correction for each eye to cancel the distortion introduced by the lenses. There are two ways of doing it:

  • SDK distortion rendering approach: library takes care of timing, distortion rendering, and buffer swap. Developers provide low level device and texture pointers to the API, and instrument the frame loop with ovrHmd_BeginFrame and ovrHmd_EndFrame.
  • Client distortion rendering: distortion rendered by the application code. Distortion rendering is mesh-based. The distortion is encoded in mesh vertex data rather than using an explicit function in the pixel shader. The Oculus SDK generates a mesh that includes vertices and UV coordinates used to warp the source render target image to the final buffer. The SDK also provides explicit frame timing functions used to support timewarp and prediction.
8.1 Stereo Rendering concepts
IPD: interpupillary distance, 65 mm on average.
The reprojection stereo rendering technique (left and right views generated from a single fully rendered view) is usually not viable with an HMD because of significant artifacts at object edges.

Distortion: The lenses in the Rift magnify the image to provide a very wide field of view (FOV). To counteract the resulting pincushion distortion, the software must apply an equal and opposite barrel distortion, plus chromatic aberration correction (color separation at the edges caused by the lens).

The Oculus SDK takes care of all necessary calculations when generating the distortion mesh, with the right parameters (they depend on lens characteristics and eye position).
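For client distortion rendering, generating that mesh looks roughly like this (a sketch against the 0.4 C API; eye and fov are assumed to come from your rendering setup):

ovrDistortionMesh meshData;
ovrHmd_CreateDistortionMesh(hmd, eye, fov,
     ovrDistortionCap_Chromatic | ovrDistortionCap_TimeWarp, &meshData);
// Copy meshData.pVertexData / meshData.pIndexData into API-specific
// vertex and index buffers, then release the SDK-side copy.
ovrHmd_DestroyDistortionMesh(&meshData);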

When rendering for the Rift, the projection axes should be parallel to each other, and the left and right views are completely independent of one another. The two virtual cameras in the scene should point in the same direction (matching the real-world head direction) and be separated by the IPD. This is done either by adding the ovrEyeRenderDesc::HmdToEyeViewOffset translation vector to the translation component of each view matrix, or by using ovrHmd_GetEyePoses, which performs this calculation internally and returns the eye poses.
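A minimal sketch of the second option (assuming eyeRenderDesc[2] was previously filled in by ovrHmd_ConfigureRendering or ovrHmd_GetRenderDesc):

ovrVector3f hmdToEyeViewOffset[2] = { eyeRenderDesc[0].HmdToEyeViewOffset,
     eyeRenderDesc[1].HmdToEyeViewOffset };
ovrPosef eyePoses[2];
ovrHmd_GetEyePoses(hmd, 0, hmdToEyeViewOffset, eyePoses, NULL);
// eyePoses[ovrEye_Left] and eyePoses[ovrEye_Right] now hold the per-eye
// head poses; apply them to each eye's view matrix.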

8.2 SDK distortion rendering

With SDK distortion rendering, developers render the scene into one or two render textures and pass them into the API; beyond that point, the Oculus SDK handles the distortion rendering.


1. Initialization
  • Modify your application window and swap chain initialization code to use the data provided in the ovrHmdDesc struct, e.g. the Rift resolution.
  • Compute the desired FOV and texture sizes based on ovrHmdDesc data.
  • Allocate textures in an API-specific way.
  • Use ovrHmd_ConfigureRendering to initialize distortion rendering, passing in the necessary API specific device handles, configuration flags, and FOV data.
  • Under Windows, call ovrHmd_AttachToWindow to direct back buffer output from the window to the HMD.
2. Frame Handling
  • Call ovrHmd_BeginFrame to start frame processing and obtain timing information.
  • Perform rendering for each eye in an engine-specific way, rendering into render textures.
  • Call ovrHmd_EndFrame (passing in the render textures from the previous step) to swap buffers and present the frame. This function will also handle timewarp, GPU sync, and frame timing (see the sketch after this list).
3. Shutdown
  • You can use ovrHmd_ConfigureRendering with a null value for the apiConfig parameter to shut down SDK rendering or change its rendering parameters. Alternatively, you can just destroy the ovrHmd object by calling ovrHmd_Destroy.
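Putting the frame-handling steps together, the loop looks roughly like this (a sketch assuming eyeTextures[2] and hmdToEyeViewOffset[2] were prepared during initialization):

ovrHmd_BeginFrame(hmd, 0); // start frame processing and obtain timing

ovrPosef eyePoses[2];
ovrHmd_GetEyePoses(hmd, 0, hmdToEyeViewOffset, eyePoses, NULL);

for (int i = 0; i < ovrEye_Count; ++i)
{
     ovrEyeType eye = hmd->EyeRenderOrder[i];
     // Engine-specific: render the scene for this eye into its render
     // texture, using eyePoses[eye] for the camera transform.
}

// The SDK distorts the textures, applies timewarp, syncs, and presents.
ovrHmd_EndFrame(hmd, eyePoses, eyeTextures);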

Oculus Rift Dev: Day #3

After many tries, it seems the Oculus doesn't work on my laptop... fair enough. At least I had the chance to test it on my girlfriend's PC.

So I will keep taking notes from the documentation to help clarify how the Oculus works:

7.1.2 User input integration

7.2 Health and Safety Warning
The warning is displayed for at least 15 seconds the first time a new profile user puts on the headset, and for 6 seconds afterwards.

To support it:

  • ovrHmd_GetHSWDisplayState: reports the state of the warning described by the ovrHSWDisplayState structure, including the displayed flag and how much time is left before it can be dismissed.
  • ovrHmd_DismissHSWDisplay: called in response to a keystroke or gamepad action to dismiss the warning.
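A minimal sketch of that flow (HasInputActivity is a hypothetical engine-side input check; the two SDK calls are as described above):

ovrHSWDisplayState hswState;
ovrHmd_GetHSWDisplayState(hmd, &hswState);
if (hswState.Displayed && HasInputActivity())
     ovrHmd_DismissHSWDisplay(hmd); // returns false if it cannot be dismissed yet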

To use the Oculus Configuration Utility to suppress the Health and Safety Warning, a registry key setting must be added for Windows builds:

(32-bit)
HKEY_LOCAL_MACHINE\Software\Oculus VR, LLC\LibOVR\HSWToggleEnabled

(64-bit)
HKEY_LOCAL_MACHINE\Software\Wow6432Node\Oculus VR, LLC\LibOVR\HSWToggleEnabled

Saturday, January 10, 2015

Oculus Rift Dev: Day #2

It seems my Oculus Rift has finally arrived! I'm currently on vacation, staying in my hometown, itching to give it a try!!

In the meantime, there is still a lot to learn. As before, prior to the holiday craziness, I will be reading the documentation and writing down some notes...

7.1 Head tracking and sensors

Oculus Rift sensors: gyroscope, accelerometer, and magnetometer. Since DK2, there is also an external camera to track headset position. Sensor fusion combines the information from all of them to calculate the motion of the head and synchronize the user's virtual view in real time.

First of all, initialize the sensors:

ovrBool ovrHmd_ConfigureTracking( // returns false when the required capabilities are not present
     ovrHmd hmd,
     unsigned int supportedTrackingCaps, // flags for the tracking capabilities supported by the app
     unsigned int requiredTrackingCaps); // flags for the capabilities the app requires in order to work properly
Then poll sensor fusion for head position and orientation:
// Start the sensor which provides the Rift’s pose and motion.
ovrHmd_ConfigureTracking(hmd, ovrTrackingCap_Orientation |
     ovrTrackingCap_MagYawCorrection |
     ovrTrackingCap_Position, 0);
// Query the HMD for the current tracking state.
ovrTrackingState ts = ovrHmd_GetTrackingState(hmd,
     ovr_GetTimeInSeconds()); //Cool, oculus has its own delta time management...
if (ts.StatusFlags & (ovrStatus_OrientationTracked | ovrStatus_PositionTracked))
{
     Posef pose = ts.HeadPose.ThePose; // note ThePose: HeadPose is an ovrPoseStatef
     // note this code is not using the camera, so it's valid for DK1 and DK2
}
ovrTrackingState includes orientation, position, and their first and second derivatives. If we pass a time in the future, we get a prediction. For production, it is better to use one of the real-time computed values returned by ovrHmd_BeginFrame or ovrHmd_BeginFrameTiming.

The orientation is expressed in a right-handed coordinate system, with the x-z plane aligned with the ground regardless of camera orientation. The rotation is a quaternion, but we can extract yaw, pitch, and roll values this way:

Posef pose = trackingState.HeadPose.ThePose;
float yaw, eyePitch, eyeRoll;
pose.Orientation.GetEulerAngles(&yaw, &eyePitch, &eyeRoll); 

7.1.1 Position Tracking

To retrieve the camera frustum:
ovrHmd hmd = ovrHmd_Create(0);
if (hmd)
{
      // Extract tracking frustum parameters: HFov, VFov, NearZ, FarZ
      float frustumHorizontalFOV = hmd->CameraFrustumHFovInRadians;
}
Tracking origin: one meter away from the camera in the direction of the optical axis, at the same height as the camera. Origin orientation: a yaw angle of zero corresponds to the user looking towards the camera. ovrHmd_RecenterPose resets the tracking origin and yaw to the headset's current location and yaw.
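For example, this is typically bound to a key so the player can re-center the view (recenterKeyPressed is a hypothetical input flag):

if (recenterKeyPressed)
     ovrHmd_RecenterPose(hmd);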

Another thing returned by ovrHmd_GetTrackingState: LeveledCameraPose, the pose of the camera relative to the tracking origin but with roll and pitch zeroed out. It can be used as a reference point to render real-world objects in the correct place.

StatusFlags contains three status bits:

  • ovrStatus_PositionConnected: is the camera connected and working?
  • ovrStatus_PositionTracked: is the headset being actively tracked?
  • ovrStatus_CameraPoseTracked: has the camera calibration taken place? (It requires the headset to be stationary within the frustum for a second.)
Several conditions may cause position tracking (ovrStatus_PositionTracked) to be interrupted (it normally resumes quickly afterwards):
  • The headset moves outside the camera frustum.
  • The headset adopts an orientation that is hard to track.
  • The headset is occluded (hair, hands, ...).
  • Excessive velocity.
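These bits make it easy to tell the user why tracking dropped; a small sketch:

ovrTrackingState ts = ovrHmd_GetTrackingState(hmd, ovr_GetTimeInSeconds());
bool cameraConnected = (ts.StatusFlags & ovrStatus_PositionConnected) != 0;
bool positionTracked = (ts.StatusFlags & ovrStatus_PositionTracked) != 0;
if (cameraConnected && !positionTracked)
{
     // The camera is fine but the headset is outside the frustum, occluded,
     // or moving too fast: a good moment to show a "tracking lost" hint.
}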

Tuesday, January 6, 2015

Oculus Rift Dev: Day #1

Good news! Yesterday I received an email from the Oculus guys and it seems my Oculus Rift is on its way!

In the meantime I'm reading the documentation and taking some notes, just to highlight the most important points I've found in the Oculus Developer Guide.

LibOVR integration:

  • Initialize LibOVR.
  • Enumerate Oculus devices, create the ovrHmd object, and configure tracking.
  • Integrate head-tracking into your application’s view and movement code. This involves:
  1. Reading data from the Rift sensors through ovrHmd_GetTrackingState, ovrHmd_GetHmdPosePerEye, or ovrHmd_GetEyePoses. (So this stuff keeps track of the eye positions? Interesting...)
  2. Applying Rift orientation and position to the camera view, while combining it with other application controls.
  3. Modifying movement and game play to consider head orientation.
  • Initialize rendering for the HMD.

  1. Select rendering parameters such as resolution and field of view based on HMD capabilities.
  2. For SDK-rendered distortion, configure rendering based on system rendering API pointers and viewports.
  3. Or, for client-rendered distortion, create the necessary distortion mesh and shader resources.

  • Modify application frame rendering to integrate HMD support and proper frame timing:

  1. Make sure your engine supports multiple rendering views.
  2. Add frame timing logic into the render loop to ensure that motion prediction and timewarp work correctly.
  3. Render each eye’s view to intermediate render targets.
  4. Apply distortion correction to render target views to correct for the optical characteristics of the lenses (only necessary for client rendered distortion).

  • Customize UI screens to work well inside the headset.

// Include the OculusVR SDK
#include "OVR_CAPI.h"

void Initialization()
{
     if (ovr_Initialize()) // initialize LibOVR before creating any device
     {
          ovrHmd hmd = ovrHmd_Create(0); // index from 0 to the value returned by ovrHmd_Detect
          if (hmd)
          {
               // Get more details about the HMD.
               ovrSizei resolution = hmd->Resolution;
               // Do something with the HMD.
          }
     }
}
The ovrHmd handle is actually a pointer to an ovrHmdDesc struct that contains information about the HMD and its capabilities, and is used to set up rendering. It includes info on version numbers, frustum, FOV, near and far planes, distortion, tracking, resolution, ...
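For instance, a quick sketch dumping a few of those fields (field names per the 0.4 C API):

#include <cstdio>

void PrintHmdInfo(ovrHmd hmd)
{
     printf("HMD: %s %s\n", hmd->Manufacturer, hmd->ProductName);
     printf("Resolution: %d x %d\n", hmd->Resolution.w, hmd->Resolution.h);
     // Default per-eye FOV, expressed as tangents of the half-angles.
     printf("Left eye up/down FOV tangents: %f / %f\n",
          hmd->DefaultEyeFov[ovrEye_Left].UpTan,
          hmd->DefaultEyeFov[ovrEye_Left].DownTan);
}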