Setting up Kinect for programming in Linux (part 1)
The Kinect, a Microsoft device originally made for the Xbox 360 (a gaming console), has become incredibly popular among developers in the past few months; it allows easy tracking of predefined movements without having to wear special clothes or complicated sensors. Once you're set up, you'll be able to access functions such as hand tracking, scene analysis (counting how many people are in the room and where they are), and much more. The first part of this tutorial will guide you through all the required steps to set up the Kinect in your Ubuntu environment.
The Kinect at a glance
The device, with its two cameras, and the IR projector on the left.
How does it work? Using the structured light technique, the Kinect outputs a depth map: basically a matrix containing the distance of each pixel from the camera.
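The raw values in that depth map are not meters: they are 11-bit disparity values. A common way to convert them to an approximate metric distance is an empirical formula popularized by the OpenKinect community; the sketch below assumes that formula (the constants are community-calibrated estimates, not official specifications):

```python
import math

def raw_depth_to_meters(raw):
    """Approximate metric depth from an 11-bit Kinect disparity value.

    Empirical formula from the OpenKinect community; the constants are
    calibration estimates, not official specifications.
    """
    if raw >= 2047:  # 2047 marks "no reading" (shadow or out of range)
        return float('nan')
    return 0.1236 * math.tan(raw / 2842.5 + 1.1863)

# closer objects produce smaller raw disparity values
print(round(raw_depth_to_meters(500), 2))   # about 0.58 (meters)
print(round(raw_depth_to_meters(800), 2))   # about 1.2 (meters)
```

Note that the mapping is non-linear: resolution is much finer close to the sensor than far away.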
The device, as recognized by the computer.
The Kinect also comes with audio capabilities and a motorized tilt, so we end up with 3 sub-devices:
Bus 002 Device 003: ID 045e:02b0 Microsoft Corp. Xbox NUI Motor
Bus 002 Device 004: ID 045e:02ad Microsoft Corp. Xbox NUI Audio
Bus 002 Device 005: ID 045e:02ae Microsoft Corp. Xbox NUI Camera
- Xbox NUI Motor: handles the tilt motor and the status LED.
- Xbox NUI Audio: handles the microphone array.
- Xbox NUI Camera: handles the cameras.
Communicating with the Kinect
Warning: I first tried to define custom installation directories for the different libraries, but it turned out that they need the default paths (like /usr/bin) in order to work properly.
To operate the Kinect, we need two drivers:
To control the LED and the tilt motor, we will use the libfreenect library (open source, unofficial).
To control the video streams and get the depth map, we'll take the OpenNI drivers (open source, official).
We also need the sensor module from PrimeSense, the company behind the Kinect's 3D sensing technology (open source). Warning: the official PrimeSense driver is not compatible with the Kinect; we need to take a modified version.
- Git: https://github.com/avin2/SensorKinect (it's the modified version)
Finally, we use NITE, a higher-level library, to get hand tracking and predefined gestures (swipes, pushes, circles, etc.) without having to write the algorithms ourselves.
Warning: as this library is not open source and therefore only available as binaries, you will only be able to install/use it on Windows or Ubuntu.
- choose "OpenNI Compliant Middleware Binaries", then "Stable", then select your OS
1- Download needed libraries/software.
mkdir KinectLibs; cd KinectLibs
git clone https://github.com/OpenKinect/libfreenect
git clone https://github.com/OpenNI/OpenNI
git clone https://github.com/avin2/SensorKinect
sudo apt-get install cmake libglut3-dev pkg-config build-essential libxmu-dev libxi-dev libusb-1.0-0-dev python
- if you get “Unable to locate package libglut3-dev”, use this command instead:
sudo apt-get install cmake freeglut3-dev pkg-config build-essential libxmu-dev libxi-dev libusb-1.0-0-dev python
sudo add-apt-repository "deb http://archive.canonical.com/ lucid partner"
sudo apt-get update
sudo apt-get install sun-java6-jdk
sudo apt-get install doxygen mono-complete graphviz
2- Install openKinect (libfreenect)
# in the libfreenect directory, inside KinectLibs
mkdir build
cd build
cmake ..
make
sudo make install
sudo ldconfig /usr/local/lib64/
- Once libfreenect is installed, plug in the Kinect, then set read/write permissions on its USB devices (motor and camera). First find the bus and device numbers:

lsusb | grep Xbox

then set the permissions, replacing <bus> and <device> with the numbers lsusb reported (run it once per device):

sudo chmod a+rw /dev/bus/usb/<bus>/<device>

Without write access, the demos fail with:

libusb couldn't open USB device /dev/bus/usb/001/006: Permission denied. libusb requires write access to USB device nodes.
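The chmod trick has to be repeated every time the Kinect is replugged, since the device numbers change. A more permanent alternative is a udev rule; this is a sketch, where the vendor/product IDs are the ones from the lsusb output above and the file name (51-kinect.rules) is my own arbitrary choice:

```
# /etc/udev/rules.d/51-kinect.rules
# Xbox NUI Motor
SUBSYSTEM=="usb", ATTR{idVendor}=="045e", ATTR{idProduct}=="02b0", MODE="0666"
# Xbox NUI Audio
SUBSYSTEM=="usb", ATTR{idVendor}=="045e", ATTR{idProduct}=="02ad", MODE="0666"
# Xbox NUI Camera
SUBSYSTEM=="usb", ATTR{idVendor}=="045e", ATTR{idProduct}=="02ae", MODE="0666"
```

After saving the file, reload the rules with sudo udevadm control --reload-rules and replug the Kinect.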
- Now, let's see if everything is correctly set up: just run the glview demo. You should see the depth and camera views side by side.
Didn't get that? Try the library FAQ.
Tip: you can play a bit with the features with these commands:
‘w’-tilt up, ‘s’-level, ‘x’-tilt down, ‘0’-‘6’-select LED mode, ‘f’-video format
On the left there is an OpenGL representation of the depth map, where each pixel's color is set according to the point's distance from the sensor. On the right you get the regular RGB camera view, or the infrared one (so you can see the projected infrared pattern); switch between them with 'f'.
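The coloring idea can be sketched in a few lines. This is only an illustration of the principle (a simple linear near=red, far=blue ramp over the 11-bit raw range), not glview's actual palette:

```python
def depth_to_rgb(raw, max_raw=2047):
    """Map an 11-bit raw depth value to a color: near points red, far points blue."""
    t = min(max(raw, 0), max_raw) / max_raw   # normalize to [0, 1]
    return (int(255 * (1 - t)), 0, int(255 * t))

print(depth_to_rgb(0))      # nearest point: (255, 0, 0)
print(depth_to_rgb(2047))   # farthest point: (0, 0, 255)
```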
Let's now have a look at how to set up the gesture recognition libraries.
3- Install OpenNI
We just installed a perfectly working library that seems to handle all the Kinect's functions, so why would we need another one?
Because of the high-level library, NITE, which works only with the OpenNI drivers; but the OpenNI drivers (which are not Kinect specific) can't control the Kinect's motorized tilt or its LED. So we need both libraries to have full access to the Kinect.
- we will use libfreenect to control the tilt and the LED (the Xbox NUI Motor device, which also handles the LED)
- we will use OpenNI + the Sensor module to get the camera streams (the Xbox NUI Camera device)
- we will use the NITE libraries in concert with OpenNI to get the high-level API (gesture recognition, hand/skeleton tracking, and so on)
# in the OpenNI directory, inside KinectLibs
cd Platform/Linux/CreateRedist
chmod +x ./RedistMaker
./RedistMaker
cd ../Redist/OpenNI-Bin-Dev-Linux-x64-v<version>/
sudo ./install.sh
4- Install the PrimeSense Sensor module

The build mirrors the OpenNI one, this time from the SensorKinect directory (depending on the module version, the platform directory may be Platform/Linux or Platform/Linux-x86):

# in the SensorKinect directory, inside KinectLibs
cd Platform/Linux/CreateRedist
chmod +x ./RedistMaker
./RedistMaker
cd ../Redist/Sensor-Bin-Linux-x64-v<version>/
sudo ./install.sh

Note: it's Sensor-Bin-Linux-x64-v<version> for me, but it might be different for you; there is only one directory in Redist/ anyway, so just replace the name in case it's wrong.
5- Install NITE
- Download the library according to your system, then just run install.sh as root. That's it.
You're now all set to use the Kinect!
Discover the Kinect potential with the examples
Go into your NITE directory, then
cd Samples/Bin/x64-Release ; ls Sample*
These are the available examples; they cover pretty much all the high-level recognition handled by NITE.
You can find detailed documentation of these functions in the NITE/Documentation/ directory; here is just a "quick start" guide for each example.
- move your hand left<>right (quickly) in front of the sensor until the green slider appears on screen; you now control the slider with your hand, and the cubes are zones.
- move your hand left<>right (quickly) in front of the sensor until the window gets a green frame, then make circles with your hand; you now control the spoke.
- for this one, it's better if the Kinect can see your entire body. Get in front of the Kinect and you will turn blue; ask some friends to come, and they will be detected as well (in different colors). Then put your hands up, and the Kinect now tracks your skeleton. It seems to work for only one player, though.
- not sure about this one; it seems to be a hand tracker running in a separate thread.
- move your hand left<>right (quickly) or click (push toward the sensor) in front of the sensor until your hand gets tracked.
- seems similar to Sample-Players, but without skeleton tracking
- hold your hand steady in front of the Kinect, then gently push forward and come back; you'll get hand points and gesture tracking in the debug output (no OpenGL fanciness this time)
- move your hand left<>right (quickly) or click (push toward the sensor); once you get the focus, move your hand around and squares get selected according to the hand position.
The samples don't work; I get "Find generator failed Input pointer is null!"
The program needs XML files in order to set up its configuration:
- what we are going to use (hand tracking, gesture recognition)
- some device info (licenses and so on)
- and other settings; have a look here if you want to know everything about these files.
So the samples are configured to search for the XML files in ../../../Data/, and if they don't find them, you get this error.
Solution: recompile the samples with another path, or put the XML files where the samples expect them.
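For reference, here is roughly what such an XML configuration looks like. This is a sketch modeled on the stock OpenNI sample configuration, not an exact copy of any one sample's file; the license key shown is the publicly distributed PrimeSense key used by the NITE samples:

```xml
<OpenNI>
  <Licenses>
    <!-- publicly distributed PrimeSense key used by the NITE samples -->
    <License vendor="PrimeSense" key="0KOIk2JeIBYClPWVnMoRKn5cdY4="/>
  </Licenses>
  <ProductionNodes>
    <!-- the generators the sample asks for: depth stream + gesture/hand tracking -->
    <Node type="Depth">
      <Configuration>
        <MapOutputMode xRes="640" yRes="480" FPS="30"/>
        <Mirror on="true"/>
      </Configuration>
    </Node>
    <Node type="Gesture"/>
    <Node type="Hands"/>
  </ProductionNodes>
</OpenNI>
```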
Next month I’ll show you how to use the Kinect in your Qt applications...