Using the Kinect for fun and profit


TRANSCRIPT

Page 1: Using the  Kinect

Using the Kinect

for fun

and profit

Page 2: Using the  Kinect

About /me

• Tam HANNA – Director, Tamoggemon Holding k.s.

– Runs web sites about mobile computing

– Writes scientific books

Page 3: Using the  Kinect

Agenda

• Kinect – what is that?

• Streams

• Skeletons

• Facial tracking

• libfreenect

• OpenNI

Page 4: Using the  Kinect

Slide download

• http://www.tamoggemon.com/test/Codemotion-Kinect.ppt

• URL is case-sensitive

Page 5: Using the  Kinect

Kinect – what is that?

Page 6: Using the  Kinect

History - I

• Depth: PrimeSense technology – not from Redmond

• First public mention: 2007 – Bill Gates, D3 conference – "camera for game control"

Page 7: Using the  Kinect

Contrast detection

Where does the shirt end?

Page 8: Using the  Kinect

Dot matrix

Page 9: Using the  Kinect

Shadows / dead areas

Page 10: Using the  Kinect

Shadows / dead areas - II

Page 11: Using the  Kinect

History - II

• 2008: Wii ships – best-selling console of its generation

• 2009: E3 conference – announcement of "Project Natal"

• 2010: no CPU in sensor – takes 10% of the Xbox 360 CPU

Page 12: Using the  Kinect

History - III

• November 4, 2010 – first shipment – "We will sue anyone who reverse engineers"

• June 2011 – official SDK

Page 13: Using the  Kinect

System overview

Page 14: Using the  Kinect

Kinect provides

• Video stream

• Depth stream – (IR stream)

• Accelerometer data

• Rest: computed

Page 15: Using the  Kinect

Family tree

• Kinect for XBOX – normal USB

• Kinect bundle – proprietary Microsoft USB connector – needs PSU

• Kinect for Windows – costs more – legal to deploy

Page 16: Using the  Kinect

Cheap from China

Page 17: Using the  Kinect

Streams

Page 18: Using the  Kinect

Kinect provides "streams"

• Repeatedly updated bitmaps

• Push or pull processing possible – attention: processing time!

Page 19: Using the  Kinect

Color stream

• Two modes – VGA@30fps – 1280x960@12fps

• Simple data format – 8 bits per component – R / G / B / A components (see the sketch below)
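Since every pixel thus occupies four bytes, locating a pixel in the returned array is plain offset arithmetic. A minimal sketch (x and y are placeholders; myColorArray matches the buffer allocated in the initialization code later in the deck; component order as listed above):

// Sketch: byte offset of pixel (x, y) in the 640x480, 4-bytes-per-pixel color array.
int offset = 4 * (x + y * 640);
byte comp0 = myColorArray[offset + 0];   // first color component
byte comp1 = myColorArray[offset + 1];   // second color component
byte comp2 = myColorArray[offset + 2];   // third color component
byte alpha = myColorArray[offset + 3];   // alpha component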

Page 20: Using the  Kinect

Depth stream

• Two modes – unlimited range – reduced range, with player indexing

Page 21: Using the  Kinect

Depth stream - II

• 16bit words

• Special encoding for limited range:

– 16-bit word layout (MSB to LSB): Depth[12] … Depth[0], Player[2] … Player[0]

– upper 13 bits: depth value, lower 3 bits: player index
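A minimal sketch of unpacking one such word into its two fields (myArray is the depth buffer allocated in the initialization code later in the deck; the bitmask constants are the ones the v1.x managed SDK exposes on DepthImageFrame):

// Sketch: split one 16-bit depth word into depth (mm) and player index.
short word = myArray[i];                                         // i = pixel index
int player  = word & DepthImageFrame.PlayerIndexBitmask;         // lowest 3 bits, 0 = no player
int depthMm = word >> DepthImageFrame.PlayerIndexBitmaskWidth;   // remaining 13 bits, millimetres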

Page 22: Using the  Kinect

Depth stream - III

Page 23: Using the  Kinect

IR stream

• Instead of color data

• 640x480@30fps

• 16bit words

• IR data in the 10 most significant bits (see the sketch below)
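A short sketch of extracting the intensity from one such word (myIrArray is a hypothetical buffer filled from the IR stream):

// Sketch: the IR intensity sits in the 10 most significant bits,
// so shift the 16-bit word right by 6 to get a 0..1023 value.
short word = myIrArray[i];
int irIntensity = (word & 0xFFFF) >> 6;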

Page 24: Using the  Kinect

Finding the Kinect

• SDK supports multiple sensors per PC

• Find one

• Microsoft.Kinect.Toolkit

Page 25: Using the  Kinect

XAML part

<Window x:Class="KinectWPFD2.MainWindow"
        xmlns:toolkit="clr-namespace:Microsoft.Kinect.Toolkit;assembly=Microsoft.Kinect.Toolkit"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        Title="MainWindow" Height="759" Width="704">
  <Grid>
    <Image Height="480" HorizontalAlignment="Left" Name="image1"
           Stretch="Fill" VerticalAlignment="Top" Width="640" />
    <toolkit:KinectSensorChooserUI x:Name="SensorChooserUI" IsListening="True"
           HorizontalAlignment="Center" VerticalAlignment="Top" />
    <CheckBox Content="Overlay rendern" Height="16" HorizontalAlignment="Left"
              Margin="267,500,0,0" Name="ChkRender" VerticalAlignment="Top" />
  </Grid>
</Window>

Page 26: Using the  Kinect

Code - I

public partial class MainWindow : Window
{
    KinectSensor mySensor;
    KinectSensorChooser myChooser;

    public MainWindow()
    {
        InitializeComponent();

        myChooser = new KinectSensorChooser();
        myChooser.KinectChanged += new EventHandler<KinectChangedEventArgs>(myChooser_KinectChanged);
        this.SensorChooserUI.KinectSensorChooser = myChooser;
        myChooser.Start();

Page 27: Using the  Kinect

Code - II

void myChooser_KinectChanged(object sender, KinectChangedEventArgs e)
{
    if (null != e.OldSensor)
    {
        if (mySensor != null) { mySensor.Dispose(); }
    }

    if (null != e.NewSensor)
    {
        mySensor = e.NewSensor;

Page 28: Using the  Kinect

Initialize stream

mySensor.DepthStream.Enable(DepthImageFormat.Resolution640x480Fps30);
mySensor.ColorStream.Enable(ColorImageFormat.RgbResolution640x480Fps30);
myArray = new short[this.mySensor.DepthStream.FramePixelDataLength];
myColorArray = new byte[this.mySensor.ColorStream.FramePixelDataLength];
mySensor.AllFramesReady += new EventHandler<AllFramesReadyEventArgs>(mySensor_AllFramesReady);

try
{
    this.mySensor.Start();
    SensorChooserUI.Visibility = Visibility.Hidden;
}

Page 29: Using the  Kinect

Process stream

void mySensor_AllFramesReady(object sender, AllFramesReadyEventArgs e)
{
    ColorImageFrame c = e.OpenColorImageFrame();
    DepthImageFrame d = e.OpenDepthImageFrame();

    if (c == null || d == null) return;

    c.CopyPixelDataTo(myColorArray);
    d.CopyPixelDataTo(myArray);

Page 30: Using the  Kinect

Problem: Calibration

• Depth and Color sensors are not aligned

• The same array position does not refer to the same spot in both images

Page 31: Using the  Kinect

Solution

• CoordinateMapper class

• Maps between various frame types – depth and color – skeleton and color (see the sketch below)
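A hedged sketch of one such mapping, assuming the CoordinateMapper API of SDK 1.6 or later (x, y and depthMm are placeholders, e.g. taken from the depth-word decoding shown earlier):

// Sketch: project the depth pixel at (x, y) into color-image coordinates.
DepthImagePoint dp = new DepthImagePoint { X = x, Y = y, Depth = depthMm };
ColorImagePoint cp = mySensor.CoordinateMapper.MapDepthPointToColorPoint(
    DepthImageFormat.Resolution640x480Fps30, dp,
    ColorImageFormat.RgbResolution640x480Fps30);
// cp.X / cp.Y now index into the color image.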

Page 32: Using the  Kinect

On Push mode

• Kinect can push data to application

• Preferred mode of operation

• But: sensitive to processing time

• If the handler takes too long, the app stops (a pull-mode sketch follows below)
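For comparison, a minimal sketch of the pull variant – polling a stream directly instead of reacting to AllFramesReady (the 30 ms timeout is an arbitrary choice):

// Sketch: pull (poll) one depth frame; OpenNextFrame returns null on timeout.
using (DepthImageFrame d = mySensor.DepthStream.OpenNextFrame(30))
{
    if (d != null)
    {
        d.CopyPixelDataTo(myArray);
    }
}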

Page 33: Using the  Kinect

Skeletons

Page 34: Using the  Kinect

What is tracked?

• Data format – real-life (world) coordinates (see the sketch below)

• Color-mappable
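To make "real-life coordinates" concrete, a small sketch of reading one joint from a tracked skeleton (aSkeleton is the skeleton object used in the code below; positions are metres relative to the sensor):

// Sketch: real-world position of the head joint, in metres from the sensor.
SkeletonPoint head = aSkeleton.Joints[JointType.Head].Position;
Console.WriteLine("Head at X={0:F2}m Y={1:F2}m Z={2:F2}m", head.X, head.Y, head.Z);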

Page 35: Using the  Kinect

Initialize stream

if (null != e.NewSensor)

{

mySensor = e.NewSensor; mySensor.SkeletonStream.Enable();

Page 36: Using the  Kinect

Get joints

void mySensor_AllFramesReady(object sender, AllFramesReadyEventArgs e)
{
    ColorImageFrame c = e.OpenColorImageFrame();
    SkeletonFrame s = e.OpenSkeletonFrame();

    if (c == null || s == null) return;

    c.CopyPixelDataTo(myColorArray);
    s.CopySkeletonDataTo(mySkeletonArray);

    foreach (Skeleton aSkeleton in mySkeletonArray)
    {
        DrawBone(aSkeleton.Joints[JointType.HandLeft],
                 aSkeleton.Joints[JointType.WristLeft], armPen, drawingContext);

Page 37: Using the  Kinect

Use joints

private void DrawBone(Joint jointFrom, Joint jointTo, Pen aPen, DrawingContext aContext)
{
    if (jointFrom.TrackingState == JointTrackingState.NotTracked ||
        jointTo.TrackingState == JointTrackingState.NotTracked)
    {
    }

    if (jointFrom.TrackingState == JointTrackingState.Inferred ||
        jointTo.TrackingState == JointTrackingState.Inferred)
    {
        ColorImagePoint p1 = mySensor.CoordinateMapper.MapSkeletonPointToColorPoint(
            jointFrom.Position, ColorImageFormat.RgbResolution640x480Fps30);
    }

    if (jointFrom.TrackingState == JointTrackingState.Tracked ||
        jointTo.TrackingState == JointTrackingState.Tracked)

Page 38: Using the  Kinect

Facial tracking

Page 39: Using the  Kinect

What is tracked - I

Page 40: Using the  Kinect

What is tracked - II

Page 41: Using the  Kinect

What is tracked - III

Page 42: Using the  Kinect

AUs (Action Units)?

• Research by Paul EKMAN

• Quantify facial motion

Page 43: Using the  Kinect

Structure

• C++ library with algorithms

• Basic .NET wrapper provided – incomplete – might change!

Page 44: Using the  Kinect

Initialize face tracker

myFaceTracker = new FaceTracker(mySensor);

Page 45: Using the  Kinect

Feed face tracker

FaceTrackFrame myFrame = null;
foreach (Skeleton aSkeleton in mySkeletonArray)
{
    if (aSkeleton.TrackingState == SkeletonTrackingState.Tracked)
    {
        myFrame = myFaceTracker.Track(ColorImageFormat.RgbResolution640x480Fps30,
            myColorArray, DepthImageFormat.Resolution640x480Fps30, myArray, aSkeleton);

        if (myFrame.TrackSuccessful == true) { break; }
    }
}
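Once Track() succeeds, the frame exposes the AU values mentioned earlier; the .NET wrapper calls them animation units. A hedged sketch (treat the method and enum names as assumptions about the toolkit wrapper):

// Sketch: read two Ekman-inspired animation-unit coefficients from the tracked frame.
var aus = myFrame.GetAnimationUnitCoefficients();
float jawLower  = aus[AnimationUnit.JawLower];
float browLower = aus[AnimationUnit.BrowLower];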

Page 46: Using the  Kinect

Calibration

• OUCH!– Not all snouts are equal

• Maximums vary

Page 47: Using the  Kinect

libfreenect

Page 48: Using the  Kinect

What is it

• Result of Kinect hacking competition

• Bundled with most Linux distributions

• "Basic Kinect data parser"

Page 49: Using the  Kinect

Set-up

• /etc/udev/rules.d/66-kinect.rules

#Rules for Kinect ####################################################
SYSFS{idVendor}=="045e", SYSFS{idProduct}=="02ae", MODE="0660", GROUP="video"
SYSFS{idVendor}=="045e", SYSFS{idProduct}=="02ad", MODE="0660", GROUP="video"
SYSFS{idVendor}=="045e", SYSFS{idProduct}=="02b0", MODE="0660", GROUP="video"
### END #############################################################

Page 50: Using the  Kinect

Set-up II

• sudo adduser $USER plugdev

• sudo usermod -a -G video tamhan

• tamhan@tamhan-X360:~$ freenect-glview

Kinect camera test
Number of devices found: 1
Could not claim interface on camera: -6
Could not open device

Page 51: Using the  Kinect

Set-up III

Page 52: Using the  Kinect

Problems

• gspca-kinect – kernel module, uses the Kinect as a webcam – blocks other libraries – remove with: sudo modprobe -r gspca_kinect

• Outdated version widely deployed – API not compatible

Page 53: Using the  Kinect

Update library

• sudo foo

• sudo add-apt-repository ppa:floe/libtisch

• sudo apt-get update

• sudo apt-get install libfreenect libfreenect-dev libfreenect-demos

Page 54: Using the  Kinect

libfreenect - II

color stream

Page 55: Using the  Kinect

Implementing it

• libfreenect: C++ library

• Question: which framework

• Answer: Qt ( what else ;) )

Page 56: Using the  Kinect

The .pro file

QT += core gui

TARGET = QtDepthFrame

CONFIG += i386

DEFINES += USE_FREENECT

LIBS += -lfreenect

Page 57: Using the  Kinect

The freenect thread

• Library needs processing time – does not multithread itself

• This time should be provided outside of the main application thread

Page 58: Using the  Kinect

class QFreenectThread : public QThread
{
    Q_OBJECT
public:
    explicit QFreenectThread(QObject *parent = 0);
    void run();

signals:

public slots:

public:
    bool myActive;
    freenect_context *myContext;
};

Page 59: Using the  Kinect

QFreenectThread::QFreenectThread(QObject *parent) : QThread(parent)
{
}

void QFreenectThread::run()
{
    while(myActive)
    {
        if(freenect_process_events(myContext) < 0)
        {
            qDebug("Cannot process events!");
            QApplication::exit(1);
        }
    }
}

Page 60: Using the  Kinect

QFreenect

• Main engine module – contact point between Kinect and app

• Fires off signals on frame availability

Page 61: Using the  Kinect

class QFreenect : public QObject
{
    Q_OBJECT
public:
    explicit QFreenect(QObject *parent = 0);
    ~QFreenect();
    void processVideo(void *myVideo, uint32_t myTimestamp=0);
    void processDepth(void *myDepth, uint32_t myTimestamp=0);

signals:
    void videoDataReady(uint8_t* myRGBBuffer);
    void depthDataReady(uint16_t* myDepthBuffer);

public slots:

Page 62: Using the  Kinect

private:
    freenect_context *myContext;
    freenect_device *myDevice;
    QFreenectThread *myWorker;
    uint8_t* myRGBBuffer;
    uint16_t* myDepthBuffer;
    QMutex* myMutex;

public:
    bool myWantDataFlag;
    bool myFlagFrameTaken;
    bool myFlagDFrameTaken;
    static QFreenect* mySelf;
};
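One possible way of consuming these signals from the GUI side, sketched under the assumption of a MainWindow that holds a myFreenect pointer and has slots matching the signal signatures (the depthDataReady slot appears later in the deck):

// Sketch: wire the QFreenect signals to MainWindow slots (Qt 4 style syntax).
// Note: if the signals cross threads, the pointer types may need qRegisterMetaType().
myFreenect = new QFreenect(this);
connect(myFreenect, SIGNAL(videoDataReady(uint8_t*)),
        this, SLOT(videoDataReady(uint8_t*)));
connect(myFreenect, SIGNAL(depthDataReady(uint16_t*)),
        this, SLOT(depthDataReady(uint16_t*)));
myFreenect->myWantDataFlag = true;   // QFreenect only emits while this is set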

Page 63: Using the  Kinect

Some C++

QFreenect* QFreenect::mySelf;

static inline void videoCallback(freenect_device *myDevice, void *myVideo, uint32_t myTimestamp=0)

{ QFreenect::mySelf->processVideo(myVideo, myTimestamp);}

static inline void depthCallback(freenect_device *myDevice, void *myVideo, uint32_t myTimestamp=0)

{ QFreenect::mySelf->processDepth(myVideo, myTimestamp);}

Page 64: Using the  Kinect

Bring-up

QFreenect::QFreenect(QObject *parent) :
    QObject(parent)
{
    myMutex=NULL;
    myRGBBuffer=NULL;

    myMutex=new QMutex();
    myWantDataFlag=false;
    myFlagFrameTaken=true;
    mySelf=this;

    if (freenect_init(&myContext, NULL) < 0)
    {
        qDebug("init failed");
        QApplication::exit(1);
    }

Page 65: Using the  Kinect

Bring-up – II

freenect_set_log_level(myContext, FREENECT_LOG_FATAL);

int nr_devices = freenect_num_devices(myContext);
if (nr_devices < 1)
{
    freenect_shutdown(myContext);
    qDebug("No Kinect found!");
    QApplication::exit(1);
}

if (freenect_open_device(myContext, &myDevice, 0) < 0)
{
    qDebug("Open Device Failed!");
    freenect_shutdown(myContext);
    QApplication::exit(1);
}

Page 66: Using the  Kinect

myRGBBuffer = (uint8_t*)malloc(640*480*3);
freenect_set_video_callback(myDevice, videoCallback);
freenect_set_video_buffer(myDevice, myRGBBuffer);
freenect_frame_mode vFrame = freenect_find_video_mode(FREENECT_RESOLUTION_MEDIUM, FREENECT_VIDEO_RGB);
freenect_set_video_mode(myDevice, vFrame);
freenect_start_video(myDevice);

Page 67: Using the  Kinect

myWorker=new QFreenectThread(this);
myWorker->myActive=true;
myWorker->myContext=myContext;
myWorker->start();

Page 68: Using the  Kinect

Shut-Down

QFreenect::~QFreenect()
{
    freenect_close_device(myDevice);
    freenect_shutdown(myContext);
    if(myRGBBuffer!=NULL) free(myRGBBuffer);
    if(myMutex!=NULL) delete myMutex;
}

Page 69: Using the  Kinect

Data passing

void QFreenect::processVideo(void *myVideo, uint32_t myTimestamp)
{
    QMutexLocker locker(myMutex);
    if(myWantDataFlag && myFlagFrameTaken)
    {
        uint8_t* mySecondBuffer=(uint8_t*)malloc(640*480*3);
        memcpy(mySecondBuffer,myVideo,640*480*3);
        myFlagFrameTaken=false;
        emit videoDataReady(mySecondBuffer);
    }
}
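The emitted pointer is a malloc'ed copy, so the receiving slot owns it; a hypothetical consumer sketch (mirroring the depthDataReady slot shown later) that frees the buffer and re-arms the hand-off flag:

// Sketch: consumer side of the hand-off.
void MainWindow::videoDataReady(uint8_t* myRGBBuffer)
{
    // ... convert the 640*480*3 RGB bytes into a QImage here ...
    free(myRGBBuffer);                    // buffer was malloc'ed in processVideo()
    myFreenect->myFlagFrameTaken = true;  // allow the next frame to be emitted
}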

Page 70: Using the  Kinect

Format of data word

• Array of bytes

• Three bytes = one pixel

Page 71: Using the  Kinect

Format of data word - II

for(int x=2; x<640; x++)
{
    for(int y=0; y<480; y++)
    {
        r=(myRGBBuffer[3*(x+y*640)+0]);
        g=(myRGBBuffer[3*(x+y*640)+1]);
        b=(myRGBBuffer[3*(x+y*640)+2]);
        myVideoImage->setPixel(x,y,qRgb(r,g,b));
    }
}

Page 72: Using the  Kinect

libfreenect - III

depth stream

Page 73: Using the  Kinect

Extra bring-up

myDepthBuffer = (uint16_t*)malloc(640*480*2);
freenect_set_depth_callback(myDevice, depthCallback);
freenect_set_depth_buffer(myDevice, myDepthBuffer);
freenect_frame_mode aFrame = freenect_find_depth_mode(FREENECT_RESOLUTION_MEDIUM, FREENECT_DEPTH_REGISTERED);
freenect_set_depth_mode(myDevice, aFrame);
freenect_start_depth(myDevice);

Page 74: Using the  Kinect

Extra processing

void QFreenect::processDepth(void *myDepth, uint32_t myTimestamp)
{
    QMutexLocker locker(myMutex);
    if(myWantDataFlag && myFlagDFrameTaken)
    {
        uint16_t* mySecondBuffer=(uint16_t*)malloc(640*480*2);
        memcpy(mySecondBuffer,myDepth,640*480*2);
        myFlagDFrameTaken=false;
        emit depthDataReady(mySecondBuffer);
    }
}

Page 75: Using the  Kinect

Data extraction

void MainWindow::depthDataReady(uint16_t* myDepthBuffer)
{
    if(myDepthImage!=NULL) delete myDepthImage;
    myDepthImage = new QImage(640,480,QImage::Format_RGB32);
    unsigned char r, g, b;

    for(int x=2; x<640; x++)
    {
        for(int y=0; y<480; y++)
        {
            int calcval=(myDepthBuffer[(x+y*640)]);

Page 76: Using the  Kinect

Data is in millimetres

            if(calcval==FREENECT_DEPTH_MM_NO_VALUE)
            {
                r=255; g=0; b=0;
            }
            else if(calcval>1000 && calcval<2000)
            {
                QRgb aVal=myVideoImage->pixel(x,y);
                r=qRed(aVal); g=qGreen(aVal); b=qBlue(aVal);
            }
            else
            {
                r=0; g=0; b=0;
            }
            myDepthImage->setPixel(x,y,qRgb(r,g,b));

Page 77: Using the  Kinect

Example

Page 78: Using the  Kinect

OpenNI

Page 79: Using the  Kinect

What is OpenNI?

• Open standard for Natural Interaction – very Asus-centric

• Provides a generic NI framework

• VERY complex API

Page 80: Using the  Kinect

Version 1.5 vs Version 2.0

Page 81: Using the  Kinect

Supported platforms

• Linux

• Windows – 32-bit only

Page 82: Using the  Kinect

Want more?

• Book – German language – 30 Euros

• Launch – when it's done!

Page 83: Using the  Kinect

?!?

[email protected]

@tamhanna

Images: pedroserafin, mattbuck