Learning Three.js

or WebGL for Dummies

Move a Cube With Your Head or Head-Tracking With WebGL

This post is about head tracking and how to use it in 3D. It is surprisingly easy to do with suitable libraries. We will experiment with headtrackr.js and three.js. headtrackr.js is a nice library from auduno for doing head tracking in the browser. You will learn how to do head tracking in WebGL in only 20 lines of JavaScript. I love the web and how easy it is :)


WebRTC is great!

WebRTC is starting to get traction. I love that! We have seen WebRTC and getUserMedia several times in the past: in the “Punch a Doom Character in Augmented Reality” post, the “Fun With Live Video in WebGL” post and the “Augmented Reality 3D Pong” post. It is already in Chrome stable, and will be in Firefox real soon. They already talk to each other. Here we don’t need the network part of WebRTC; we only need to get the webcam video, so getUserMedia is enough. It is in Opera 12 too, as you can read here.
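To make the point concrete, here is a minimal, hedged sketch of what grabbing the webcam with getUserMedia alone looks like (vendor prefixes were still needed at the time of writing, and the <video> element is just for illustration). headtrackr.js handles this part for you.

// grab the webcam stream and display it in a <video> element
navigator.getUserMedia = navigator.getUserMedia ||
                         navigator.webkitGetUserMedia ||
                         navigator.mozGetUserMedia;

var video = document.querySelector('video');  // assumes a <video> tag in the page

navigator.getUserMedia({ video: true }, function(stream){
  video.src = window.URL.createObjectURL(stream);  // modern browsers use video.srcObject = stream
  video.play();
}, function(error){
  console.error('getUserMedia failed', error);
});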

auduno is part of the Opera team. He wrote headtrackr.js as a demo for the Opera 12 release, which contained getUserMedia. For more info on the library, auduno blogged about its internals; you can find details in his blog post. Additional info is available in the reference documentation. Some examples are already available, like targets or facekat.

Demo Time !!

As usual, we did a plugin for the tQuery API to make it easy to use in our environment. You can find 2 examples for it: an educational example where your head controls a box in 3D. For best results, make sure your face is well and evenly lit.

Another demo where the camera follows your head: the whole scene moves as you move your head, providing quite an immersive experience. You can play with it through jsFiddle too.

Let’s Get Started

OK, now let's see how to use this library with the tQuery API. First, include the tquery.headtrackr files in your code. tQuery plugins support require.js, which makes dependencies much easier to handle. tquery.headtrackr is no exception, so to include it you can do

require(['tquery.headtrackr'], function(){
  // Your code ...
});

Or if you use the good old <script>, do something like the following: first include headtrackr.js itself, the library which handles the head tracking, then include the plugin itself, and you are done.

<script src="headtrackr.js"></script>
<script src="tquery.headtrackr.js"></script>

Start Tracking Heads

First, you instantiate the object with the simple line below. You can pass various options to .createHeadtrackr(opts). Here, opts is an object with these properties:

  • opts.width : width of the image containing the face. Defaults to 320px.
  • opts.height : height of the image containing the face. Defaults to 240px.
  • opts.headtrackrOpts : options passed directly to headtrackr.js. Defaults to {}.
var headTracker = tQuery.createHeadtrackr();
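If you do want to override the defaults, a call with explicit options could look like the following sketch. The values shown are simply the defaults listed above.

var headTracker = tQuery.createHeadtrackr({
  width          : 320,  // width of the image containing the face
  height         : 240,  // height of the image containing the face
  headtrackrOpts : {}    // options passed directly to headtrackr.js
});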

Defaults are reasonable, so chances are you don't need to specify anything. To start tracking the head from the webcam, just do the following:

headTracker.start();

It is possible to stop it with .stop() or to reset it via .reset().
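As a purely illustrative sketch of how these calls fit together, here is a tiny example that toggles tracking with a button. The button element and its id are mine, not part of the plugin.

// toggle head tracking on a button click - the '#toggleTracking' button is hypothetical
var tracking = false;
document.querySelector('#toggleTracking').addEventListener('click', function(){
  if( tracking )  headTracker.stop();
  else            headTracker.start();
  tracking = !tracking;
});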

Debug View is Cool For User Feedback

If you wish, enable the debugView, aka the little visualisation of the head-tracker result. It gives feedback to the user on what is happening, so the user can move their head accordingly or change the lighting of the room.

headTracker.debugView(true);

Face Position Notified Through Events

When a face is found, events are dispatched to notify the detected positions.

headTracker.addEventListener("found", function(event){
  // Your code ...
});

event contains normalized coordinates of the detected face. They use the same axes as WebGL. If the head is in the center, event.x and event.y are 0, and if the head is vertical, event.angle is 0. More precisely (a small end-to-end sketch follows the list):

  • .x and .y : the center position of the face. Each varies within [-1,+1], from left to right and from bottom to top.
  • .width and .height : the width and height :) Equal to 1 when the face covers half of the whole image.
  • .angle : the Z rotation of the detected head, in radians as usual.
  • .headtrackrEvent : the original facetrackingEvent event from headtrackr.js (see the reference).
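Putting it all together, here is a minimal sketch that moves a cube with your head, in the spirit of the educational demo above. It assumes you already have a three.js scene containing a mesh called cube; the amplitude factors are arbitrary and only there for illustration.

// move an existing three.js mesh 'cube' according to the detected head position
var headTracker = tQuery.createHeadtrackr();
headTracker.debugView(true);   // optional: give the user feedback on the detection
headTracker.start();

headTracker.addEventListener('found', function(event){
  // event.x/.y are in [-1,+1], event.angle is in radians
  cube.position.x = event.x * 2;   // arbitrary amplitude
  cube.position.y = event.y * 2;
  cube.rotation.z = event.angle;
});

Swap cube for your camera (and make the camera look at the scene center) and you get the head-coupled, immersive feel of the second demo.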

Head Tracking… What Is It?

Head tracking is a well-known concept. One can find head tracking on the iPad. One can find head tracking on the Wii. They got impressive results using the information from the wiimote or even the device orientation. With the Kinect, they even track the features of the face itself (e.g. mouth, nose, eyes, etc.).

In our case, we use the image from the webcam. Unfortunately, face localisation from an image isn't exactly 100% accurate, to say the least :) See here: this is the same demo as the Wii one or the iPad one, yet the result isn't as convincing. With headtrackr.js and WebRTC, we use only the webcam in an uncontrolled environment, so the accuracy suffers accordingly.

You can improve accuracy by following a few simple pieces of advice: avoid hats or a too-crazy haircut. Being bald with a beard doesn't help :) Make sure your face is well and evenly lit and you should be fine.

Conclusion

In this post, we have seen that it is now possible to do head tracking in a web browser!! Impressive if you ask me! Even better, it is easy if you use suitable libraries. Coupled with three.js and the tQuery API, it is possible to provide a new immersive experience in 20 lines of JavaScript. I'm so excited. This kind of thing was academic research 5 years ago, and now everybody can easily use it. We will likely do more with headtrackr.js. It is a very nice library with lots of possibilities. For example, one can use the head as a game controller, or in an artistic exhibition. Stay tuned!

That’s all folks, have fun :)
