Monado OpenXR Runtime
t_hand_tracking.h File Reference

Hand tracking interfaces. More...

#include "xrt/xrt_defines.h"
#include "xrt/xrt_frame.h"
#include "xrt/xrt_tracking.h"

Data Structures

struct  t_image_boundary_circle
 Circular image boundary. More...
 
struct  t_camera_extra_info_one_view
 Information about image boundary and camera orientation for one view. More...
 
struct  t_camera_extra_info
 Information about image boundaries and camera orientations for all the cameras used in a tracking system (see the sketch after this list). More...
 
struct  t_hand_tracking_create_info
 Creation info for constructing a hand tracker. More...
 
struct  t_hand_tracking_sync
 Synchronously processes frames and returns two hands. More...
 
struct  t_hand_tracking_async
 

Enumerations

enum  t_image_boundary_type { HT_IMAGE_BOUNDARY_NONE , HT_IMAGE_BOUNDARY_CIRCLE }
 Image boundary type. More...
 
enum  t_camera_orientation { CAMERA_ORIENTATION_0 = 0 , CAMERA_ORIENTATION_90 = 90 , CAMERA_ORIENTATION_180 = 180 , CAMERA_ORIENTATION_270 = 270 }
 Logical orientation of the camera image, relative to the user's head. More...
 

Functions

struct t_hand_tracking_async * t_hand_tracking_async_default_create (struct xrt_frame_context *xfctx, struct t_hand_tracking_sync *sync)
 

Detailed Description

Hand tracking interfaces.

Author
Jakob Bornecrantz jakob@collabora.com

Enumeration Type Documentation

◆ t_camera_orientation

Logical orientation of the camera image, relative to the user's head.

For example, Rift S uses CAMERA_ORIENTATION_90 for the two front cameras.

Feel free to move this out of t_hand_tracking if this becomes more generally applicable.
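
Because each enumerator's integer value is its rotation in degrees, converting the logical orientation into an angle for remapping image coordinates is direct. The helper below is a small illustrative sketch and is not part of this header.

#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* Illustrative only: relies on each enumerator's value being its angle in
 * degrees (CAMERA_ORIENTATION_90 == 90, and so on). */
static inline float
camera_orientation_to_radians(enum t_camera_orientation o)
{
        return (float)o * (float)(M_PI / 180.0);
}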

Function Documentation

◆ t_hand_tracking_async_default_create()

struct t_hand_tracking_async * t_hand_tracking_async_default_create (struct xrt_frame_context *xfctx, struct t_hand_tracking_sync *sync)
Todo:
We came up with this value just by seeing what worked. With Index and WMR, we'd be around 40ms late by the time the camera frames arrived and were processed.

We really need a way to calibrate this live - something like an exponential filter that tracks the typical maximum gap between the time at which we were asked for a sample and the timestamp of the most recent processed sample.

References xrt_frame_sink::push_frame, and U_TYPED_CALLOC.
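
A minimal usage sketch for this function follows. The factory create_my_sync_tracker() is hypothetical and stands in for whatever constructs the t_hand_tracking_sync instance, and the include path is assumed; the wrapper is created inside the given xrt_frame_context, which presumably ties its lifetime to that context.

#include "xrt/xrt_frame.h"
#include "tracking/t_hand_tracking.h" /* include path assumed */

/* Hypothetical factory for a synchronous tracker, defined elsewhere. */
struct t_hand_tracking_sync *
create_my_sync_tracker(void);

static struct t_hand_tracking_async *
setup_async_hand_tracking(struct xrt_frame_context *xfctx)
{
        struct t_hand_tracking_sync *sync = create_my_sync_tracker();

        /* Wrap the synchronous tracker so camera frames can be pushed in
         * from the streaming thread while hand poses are sampled from
         * another thread. */
        return t_hand_tracking_async_default_create(xfctx, sync);
}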