Monado OpenXR Runtime
xrt::auxiliary::tracking::slam Namespace Reference

Namespace for the interface to the external SLAM tracking system. More...

Data Structures

struct  feature_count_sample
 
class  CSVWriter
 Writes a CSV file for a particular row type. More...
 
struct  TrajectoryWriter
 Writes poses and their timestamps to a CSV file. More...
 
struct  TimingWriter
 Writes timestamps measured when estimating a new pose by the SLAM system. More...
 
struct  FeaturesWriter
 Writes feature information specific to a particular estimated pose. More...
 
struct  TrackerSlam
 Main implementation of xrt_tracked_slam. More...
 

Typedefs

using Trajectory = map< timepoint_ns, xrt_pose >
 
using timing_sample = vector< timepoint_ns >
 

Functions

ostream & operator<< (ostream &os, const xrt_pose_sample &s)
 
ostream & operator<< (ostream &os, const timing_sample &timestamps)
 
ostream & operator<< (ostream &os, const feature_count_sample &s)
 
static void timing_ui_setup (TrackerSlam &t)
 
static vector< timepoint_ns > timing_ui_push (TrackerSlam &t, const vit_pose_t *pose, int64_t ts)
 Updates timing UI with info from a computed pose and returns that info. More...
 
static void features_ui_setup (TrackerSlam &t)
 
static vector< int > features_ui_push (TrackerSlam &t, const vit_pose_t *pose, int64_t ts)
 
static xrt_pose get_gt_pose_at (const Trajectory &gt, timepoint_ns ts)
 Gets an interpolated groundtruth pose (if available) at a specified timestamp. More...
 
static struct xrt_pose xr2gt_pose (const xrt_pose &gt_origin, const xrt_pose &xr_pose)
 Converts a pose from the tracker to ground truth. More...
 
static struct xrt_pose gt2xr_pose (const xrt_pose &gt_origin, const xrt_pose &gt_pose)
 The inverse of xr2gt_pose. More...
 
static void gt_ui_setup (TrackerSlam &t)
 
static void gt_ui_push (TrackerSlam &t, timepoint_ns ts, xrt_pose tracked_pose)
 
static bool flush_poses (TrackerSlam &t)
 Dequeue all tracked poses from the SLAM system and update prediction data with them. More...
 
static void predict_pose_from_imu (TrackerSlam &t, timepoint_ns when_ns, xrt_space_relation base_rel, timepoint_ns base_rel_ts, struct xrt_space_relation *out_relation)
 Integrates IMU samples on top of a base pose and predicts from that. More...
 
static void predict_pose (TrackerSlam &t, timepoint_ns when_ns, struct xrt_space_relation *out_relation)
 Return our best guess of the relation at time when_ns using all the data the tracker has. More...
 
static void filter_pose (TrackerSlam &t, timepoint_ns when_ns, struct xrt_space_relation *out_relation)
 Various filters to remove noise from the predicted trajectory. More...
 
static void setup_ui (TrackerSlam &t)
 
static void add_camera_calibration (const TrackerSlam &t, const t_slam_camera_calibration *calib, uint32_t cam_index)
 
static void add_imu_calibration (const TrackerSlam &t, const t_slam_imu_calibration *imu_calib)
 
static void send_calibration (const TrackerSlam &t, const t_slam_calibration &c)
 

Variables

constexpr int UI_TIMING_POSE_COUNT = 192
 
constexpr int UI_FEATURES_POSE_COUNT = 192
 
constexpr int UI_GTDIFF_POSE_COUNT = 192
 

Detailed Description

Namespace for the interface to the external SLAM tracking system.

See also
t_slam_tracker_config

Function Documentation

◆ filter_pose()

static void xrt::auxiliary::tracking::slam::filter_pose (TrackerSlam &t, timepoint_ns when_ns, struct xrt_space_relation *out_relation)

Various filters to remove noise from the predicted trajectory.

Todo:
Implement the quaternion averaging with an m_ff_vec4_f32 and normalization. Although it would be best to first have a way of generalizing the types, so as not to keep redundant copies of the ff logic.
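As a sketch of the position-smoothing half of this filter, a fixed-size sliding window can average the last N positions. The names below (`vec3`, `pos_window`) are hypothetical stand-ins, not Monado's `m_ff_vec3_f32` API, and the quaternion-averaging half noted in the Todo is omitted:

```cpp
#include <array>
#include <cassert>
#include <cstddef>

// Hypothetical stand-in for xrt_vec3; the real filters live behind
// m_ff_vec3_f32 and friends.
struct vec3
{
	float x, y, z;
};

// Minimal fixed-size sliding window that averages the last N position
// samples, the same idea filter_pose uses to smooth the predicted
// trajectory.
template <size_t N>
struct pos_window
{
	std::array<vec3, N> samples{};
	size_t count = 0;
	size_t head = 0;

	void
	push(const vec3 &p)
	{
		samples[head] = p;
		head = (head + 1) % N;
		if (count < N) {
			count++;
		}
	}

	// Mean of the samples currently held in the window.
	vec3
	average() const
	{
		vec3 sum{0, 0, 0};
		for (size_t i = 0; i < count; i++) {
			sum.x += samples[i].x;
			sum.y += samples[i].y;
			sum.z += samples[i].z;
		}
		float n = count > 0 ? float(count) : 1.0f;
		return {sum.x / n, sum.y / n, sum.z / n};
	}
};
```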

◆ flush_poses()

static bool xrt::auxiliary::tracking::slam::flush_poses (TrackerSlam &t)

Dequeue all tracked poses from the SLAM system and update prediction data with them.

References xrt::auxiliary::tracking::slam::TrackerSlam::tracker, and xrt::auxiliary::tracking::slam::TrackerSlam::vit.
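The drain-and-record loop can be sketched as follows; `pose_queue` and `flush_poses_sketch` are hypothetical stand-ins, since the real code pulls poses from the external SLAM system through the VIT interface (`t.vit` / `t.tracker`):

```cpp
#include <cassert>
#include <cstdint>
#include <deque>
#include <map>
#include <utility>

using timepoint_ns = int64_t;

// Hypothetical pose and queue types for illustration only.
struct pose
{
	float x, y, z;
};

struct pose_queue
{
	std::deque<std::pair<timepoint_ns, pose>> q;

	bool
	try_pop(timepoint_ns &ts, pose &p)
	{
		if (q.empty()) {
			return false;
		}
		ts = q.front().first;
		p = q.front().second;
		q.pop_front();
		return true;
	}
};

// Drain every pending tracked pose and fold it into the prediction
// history; returns whether anything was dequeued, like flush_poses.
inline bool
flush_poses_sketch(pose_queue &queue, std::map<timepoint_ns, pose> &history)
{
	timepoint_ns ts = 0;
	pose p{};
	bool dequeued = false;
	while (queue.try_pop(ts, p)) {
		history[ts] = p; // newer estimates overwrite older ones
		dequeued = true;
	}
	return dequeued;
}
```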

◆ get_gt_pose_at()

static xrt_pose xrt::auxiliary::tracking::slam::get_gt_pose_at (const Trajectory &gt, timepoint_ns ts)

Gets an interpolated groundtruth pose (if available) at a specified timestamp.

References xrt_pose::XRT_POSE_IDENTITY.
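A minimal sketch of this kind of timestamp-based interpolation over a `map<timepoint_ns, pose>` trajectory, assuming linear interpolation for positions and nlerp for orientations; the pose types here are simplified stand-ins for xrt_pose, not Monado's actual implementation:

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>
#include <iterator>
#include <map>

using timepoint_ns = int64_t;

// Simplified stand-ins for xrt_pose.
struct vec3
{
	float x, y, z;
};
struct quat
{
	float x, y, z, w;
};
struct pose
{
	quat orientation;
	vec3 position;
};
using Trajectory = std::map<timepoint_ns, pose>;

// nlerp: a cheap normalized-lerp substitute for slerp, adequate for
// close-together groundtruth samples.
static quat
nlerp(quat a, quat b, float t)
{
	float dot = a.x * b.x + a.y * b.y + a.z * b.z + a.w * b.w;
	if (dot < 0) { // take the short path around the sphere
		b = {-b.x, -b.y, -b.z, -b.w};
	}
	quat q{a.x + t * (b.x - a.x), a.y + t * (b.y - a.y),
	       a.z + t * (b.z - a.z), a.w + t * (b.w - a.w)};
	float len = std::sqrt(q.x * q.x + q.y * q.y + q.z * q.z + q.w * q.w);
	return {q.x / len, q.y / len, q.z / len, q.w / len};
}

// Interpolate the groundtruth trajectory at ts, clamping outside its range.
static pose
get_gt_pose_at_sketch(const Trajectory &gt, timepoint_ns ts)
{
	assert(!gt.empty());
	auto it = gt.upper_bound(ts);
	if (it == gt.begin()) {
		return it->second; // before the first sample
	}
	if (it == gt.end()) {
		return std::prev(it)->second; // after the last sample
	}

	auto prev = std::prev(it);
	float t = float(ts - prev->first) / float(it->first - prev->first);
	pose out;
	out.position = {
	    prev->second.position.x + t * (it->second.position.x - prev->second.position.x),
	    prev->second.position.y + t * (it->second.position.y - prev->second.position.y),
	    prev->second.position.z + t * (it->second.position.z - prev->second.position.z)};
	out.orientation = nlerp(prev->second.orientation, it->second.orientation, t);
	return out;
}
```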

◆ gt2xr_pose()

static struct xrt_pose xrt::auxiliary::tracking::slam::gt2xr_pose (const xrt_pose &gt_origin, const xrt_pose &gt_pose)

The inverse of xr2gt_pose.

◆ predict_pose()

static void xrt::auxiliary::tracking::slam::predict_pose (TrackerSlam &t, timepoint_ns when_ns, struct xrt_space_relation *out_relation)

Return our best guess of the relation at time when_ns using all the data the tracker has.
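The first step of such a prediction can be sketched as picking the newest tracked pose at or before `when_ns` to serve as the base that IMU prediction and filtering are layered on top of. `newest_base_at` is a hypothetical helper, not part of the tracker:

```cpp
#include <cassert>
#include <cstdint>
#include <map>

using timepoint_ns = int64_t;

// Find the newest entry in the history whose timestamp is <= when_ns.
// Returns nullptr (and leaves out_base_ts untouched) when no entry is
// old enough to serve as a base.
template <typename Pose>
static const Pose *
newest_base_at(const std::map<timepoint_ns, Pose> &history,
               timepoint_ns when_ns,
               timepoint_ns &out_base_ts)
{
	auto it = history.upper_bound(when_ns);
	if (it == history.begin()) {
		return nullptr; // no pose old enough
	}
	--it;
	out_base_ts = it->first;
	return &it->second;
}
```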

◆ predict_pose_from_imu()

static void xrt::auxiliary::tracking::slam::predict_pose_from_imu (TrackerSlam &t, timepoint_ns when_ns, xrt_space_relation base_rel, timepoint_ns base_rel_ts, struct xrt_space_relation *out_relation)

Integrates IMU samples on top of a base pose and predicts from that.

Todo:
Instead of using the same a and g values for each step, do an interpolated sample.

References xrt::auxiliary::tracking::slam::TrackerSlam::gyro_ff, xrt::auxiliary::tracking::slam::TrackerSlam::lock_ff, m_ff_vec3_f32_get(), and os_mutex::os_mutex_lock().
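A hedged sketch of the integration idea: semi-implicit Euler stepping over queued IMU samples on top of a base state. All names are hypothetical, and this omits the gravity compensation and body-to-world rotation of accelerometer readings that a real implementation needs:

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>
#include <vector>

using timepoint_ns = int64_t;

struct vec3
{
	float x, y, z;
};
struct quat
{
	float x, y, z, w;
};

struct imu_sample
{
	timepoint_ns ts;
	vec3 gyro;  // rad/s, body frame
	vec3 accel; // m/s^2, assumed gravity-compensated and in world frame
};

// Hamilton quaternion product a * b.
static quat
mul(const quat &a, const quat &b)
{
	return {a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
	        a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
	        a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w,
	        a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z};
}

// Semi-implicit Euler propagation: each sample covers the interval since
// the previous timestamp (the first covers the gap back to base_ts).
static void
predict_from_imu_sketch(quat q, vec3 pos, vec3 vel, timepoint_ns base_ts,
                        const std::vector<imu_sample> &samples,
                        quat &out_q, vec3 &out_pos, vec3 &out_vel)
{
	timepoint_ns prev = base_ts;
	for (const imu_sample &s : samples) {
		float dt = float(s.ts - prev) * 1e-9f;
		prev = s.ts;

		// Orientation: small-angle quaternion step from the gyro.
		float w = std::sqrt(s.gyro.x * s.gyro.x + s.gyro.y * s.gyro.y +
		                    s.gyro.z * s.gyro.z);
		if (w * dt > 1e-9f) {
			float half = w * dt * 0.5f;
			float sh = std::sin(half) / w;
			quat dq{s.gyro.x * sh, s.gyro.y * sh, s.gyro.z * sh,
			        std::cos(half)};
			q = mul(q, dq);
		}

		// Velocity first, then position (semi-implicit Euler).
		vel.x += s.accel.x * dt;
		vel.y += s.accel.y * dt;
		vel.z += s.accel.z * dt;
		pos.x += vel.x * dt;
		pos.y += vel.y * dt;
		pos.z += vel.z * dt;
	}
	out_q = q;
	out_pos = pos;
	out_vel = vel;
}
```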

◆ timing_ui_push()

static vector< timepoint_ns > xrt::auxiliary::tracking::slam::timing_ui_push (TrackerSlam &t, const vit_pose_t *pose, int64_t ts)

Updates timing UI with info from a computed pose and returns that info.

◆ xr2gt_pose()

static struct xrt_pose xrt::auxiliary::tracking::slam::xr2gt_pose (const xrt_pose &gt_origin, const xrt_pose &xr_pose)

Converts a pose from the tracker to ground truth.

Todo:
Right now this is hardcoded for Basalt and the ground truth of the EuRoC vicon datasets, and it ignores orientation. It applies a fixed transformation so that the origins and general motion of the tracked and groundtruth trajectories match. The usual way of evaluating trajectory error in SLAM requires first aligning the trajectories through a non-linear optimization (e.g. Gauss-Newton) so that they are as similar as possible. For this you need the entire tracked trajectory to be known beforehand, which makes it unsuitable for reporting an error metric in realtime. See this two-page paper for more info on trajectory alignment: https://ylatif.github.io/movingsensors/cameraReady/paper07.pdf
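Under a translation-only reading of the fixed transformation described above, the alignment and its inverse could be sketched as below. The names and simplified pose types are hypothetical, not Monado's actual implementation:

```cpp
#include <cassert>

// Simplified stand-ins for xrt_pose.
struct vec3
{
	float x, y, z;
};
struct quat
{
	float x, y, z, w;
};
struct pose
{
	quat orientation;
	vec3 position;
};

// Offset the tracked pose so both trajectories share an origin,
// ignoring the groundtruth origin's orientation (as the Todo notes
// the real code does).
static pose
xr2gt_sketch(const pose &gt_origin, const pose &xr_pose)
{
	pose out = xr_pose;
	out.position.x += gt_origin.position.x;
	out.position.y += gt_origin.position.y;
	out.position.z += gt_origin.position.z;
	return out;
}

// The inverse mapping, mirroring gt2xr_pose.
static pose
gt2xr_sketch(const pose &gt_origin, const pose &gt_pose)
{
	pose out = gt_pose;
	out.position.x -= gt_origin.position.x;
	out.position.y -= gt_origin.position.y;
	out.position.z -= gt_origin.position.z;
	return out;
}
```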