Tutorials - Live Motion Programming

This page presents links to lab guides and notes for learning to write programs that read data from eye-tracking and motion-tracking systems using Professor Bares’ Live Motion wrapper library.

Live Motion provides a uniform way for novice programmers to access motion data in a device-independent fashion. The wrapper is well suited to Web apps written in HTML/JavaScript. The Live Motion toolkit provides C++ server utilities that open a connection to a device and stream motion data in JSON text format over a WebSocket, so any client computing device (tablet, mobile phone, robot, etc.) can open a WebSocket to receive a motion data stream.

The desktop workstations in the Immersive Stories Lab all have the Live Motion server and client software installed under the users-public folder. The separate server and client programs may run on the same computer or on two different computers. If the client and server run on the same computer, the client connects to the server using the localhost Internet Protocol (IP) address. If the client and server are two separate computers, both must be joined to the same wired or wireless network. For example, an instructor could connect their computer directly to a motion device and run the Live Motion server utility to simultaneously stream motion data to all student computers joined to the same network. Optimal performance may require the instructor to use a local Wi-Fi router. Click this link to open the instructions for streaming data to a classroom of clients.
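For the curious, the sketch below shows roughly what the wrapper does underneath, using only the browser's built-in WebSocket API. It assumes a Live Motion server running on the same computer and streaming JSON text frames on port 8080, the default used in the listings below; the exact message format is not shown here.

// A minimal sketch of the raw transport, assuming a local server
// streaming JSON text frames on the default port 8080.
var socket = new WebSocket("ws://localhost:8080");
socket.onmessage = function (event) {
    // Each message is expected to carry one JSON-encoded motion update.
    var update = JSON.parse(event.data);
    console.log(update);
};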

The Live Motion wrapper provides server utilities for a range of motion devices, including Tobii consumer-level eye trackers, the Microsoft Kinect V2, marker-based trackers from Optitrack, PhaseSpace, and Vicon, inertial trackers from InterSense, Shadow, and XSens MT, and magnetic trackers from Polhemus. Touch and pressure data can be streamed from the Sensel Morph (discontinued).

Eye Tracking via Tobii Consumer-Level Devices

The Tobii Eye Tracker 5 provides a stream of normalized X, Y points that represent the current gaze target on a screen. Normalized X, Y coordinates are expressed as floating-point values between 0.0 and 1.0, which makes them independent of the pixel resolution of the monitor. Below is a minimalist JavaScript listing that opens a connection and receives eye gaze updates.

// Create LiveMotion WebSocket client using default localhost and port 8080.
var motionReader = new LiveMotion( );
// Open WebSocket connection to LiveMotion server program.
motionReader.makeConnection();
// Designate function to receive data updates.
motionReader.startTrackedEyes( doWhenReceiveEyeData );

// Eye track callback receives one object having x and y attributes.
function doWhenReceiveEyeData( eye ) {
console.log("X: " + eye.x + " Y: " + eye.y);
}
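
Because gaze coordinates are normalized, mapping them to a particular display is a single multiplication per axis. The helper below is a hypothetical sketch that converts a gaze point to pixel coordinates for the browser window; it assumes the normalized origin (0.0, 0.0) corresponds to the top-left corner of the screen.

// Hypothetical helper: map a normalized gaze point to window pixels.
// Assumes (0.0, 0.0) is the top-left corner and (1.0, 1.0) the bottom-right.
function gazeToPixels( eye ) {
    return { x: eye.x * window.innerWidth,
             y: eye.y * window.innerHeight };
}

For example, a gaze point of (0.5, 0.5) maps to the center of the window.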

Click this link to open the Google Drive folder that contains lab activity notes.

Rigid-Body Prop Tracking

Rigid-body prop tracking devices, including the InterSense IS-900 and IS-1200, Optitrack Motive, PhaseSpace, Polhemus, and Vicon Shogun, provide updates for each unique rigid-body prop. Each received Prop object has a unique id, pos, rot, and quat, where pos holds x, y, and z position coordinates, rot holds x, y, and z Euler angles, and quat holds x, y, z, and w quaternion rotation values.
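
As an illustration of that structure, a single received Prop object might look like the following sketch (all field values are invented for the example):

// Illustrative shape of one Prop update; all values are invented.
{
    id: 3,                                          // unique prop identifier
    pos:  { x: 0.25, y: 1.10, z: -0.40 },           // x, y, z position coordinates
    rot:  { x: 0.0,  y: 90.0, z: 0.0 },             // x, y, z Euler angles
    quat: { x: 0.0,  y: 0.707, z: 0.0, w: 0.707 }   // x, y, z, w quaternion rotation
}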

Below is a minimalist JavaScript listing that opens a connection and receives rigid-body prop updates.

// Create LiveMotion WebSocket client using default localhost and port 8080.
var motionReader = new LiveMotion( );
// Open WebSocket connection to LiveMotion server program.
motionReader.makeConnection();
// Designate function to receive data updates.
motionReader.startTrackedProps( doWhenReceivePropData );

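// Prop callback receives one object having id, pos, rot, and quat attributes.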
function doWhenReceivePropData( prop ) {
console.log("ID: " + prop.id);
console.log("Position X: " + prop.pos.x + " Y: " + prop.pos.y + " Z: " + prop.pos.z);
}

Click this link to open the Google Drive folder that contains lab activity notes.

Full-Body Tracking

Full-body tracking devices, including the Microsoft Kinect, Optitrack Motive Body, and Vicon Shogun, provide updates for each unique human body. Each received Body object has a unique id to distinguish one human body from another and an array of Joint objects. The indices into the joints array follow the numbering and naming conventions used in the NYU Kinectron project. Each Joint contains the same data found in a rigid-body Prop object: a unique id, pos, rot, and quat, where pos holds x, y, and z position coordinates, rot holds x, y, and z Euler angles, and quat holds x, y, z, and w quaternion rotation values.

Below is a minimalist JavaScript listing that opens a connection and receives full-body updates.

// Create LiveMotion WebSocket client using default localhost and port 8080.
var motionReader = new LiveMotion( );
// Open WebSocket connection to LiveMotion server program.
motionReader.makeConnection();
// Designate function to receive data updates.
motionReader.startTrackedBodies( doWhenReceiveBodyData );

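// Body callback receives one object having a unique id and an array of Joint objects.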
function doWhenReceiveBodyData( body ) {
console.log("ID: " + body.id);
let leftHand = body.joints[ kinectron.HANDLEFT ];
console.log("Left Hand X: " + leftHand.pos.x + " Y: " + leftHand.pos.y + " Z: " + leftHand.pos.z);
}
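
Named indices like kinectron.HANDLEFT select a single joint; the whole skeleton can also be visited in a loop. The sketch below prints the position of every joint in one received body, assuming only that body.joints is a plain array of Joint objects as described above.

// Print the position of every joint in one tracked body.
function printAllJoints( body ) {
    for (let i = 0; i < body.joints.length; i++) {
        let joint = body.joints[ i ];
        console.log("Joint " + i + " X: " + joint.pos.x +
                    " Y: " + joint.pos.y + " Z: " + joint.pos.z);
    }
}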

Click this link to open the Google Drive folder that contains lab activity notes.