This talk will show how to build a simple, fully open-source NUI (Natural User Interface) game with 3D sensors, combining PyOpenNI with PyGame and WebGL. OpenNI allows you to operate several 3D sensors, enabling hardware-independent game development (supported 3D sensors include the Microsoft Kinect, PrimeSense Carmine and Asus Xtion). It also runs on Linux, Mac OS X and Windows.
The talk will start with a brief introduction to 3D sensors and OpenNI. Then we'll dive into PyOpenNI and its features: skeleton, hand and gesture tracking, plus RGB and depth video. Every topic will be presented with practical demos. The talk will end with a demo integrating WebGL (THREE.JS), 3D sensors, Flask and ZMQ to produce a simple, fully open-source NUI game.
Attendees will learn not only about game-related technologies but also about innovative applications in home automation (domotics), cinema and art, interactive visualization, scientific research, education, and more.
3D sensors will be available for testing during the event - you can get your own for about 80 to 140 Euros, depending on the brand. Slides and demo code will be available on GitHub.
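To give a flavour of the final demo's plumbing, here is a minimal sketch of how skeleton data could flow from the PyOpenNI side to the web side over ZMQ. This is an illustrative sketch only: the PUSH/PULL pattern, the `inproc://skeleton` endpoint and the joint payload are assumptions for this example, not the demo's actual protocol.

```python
import zmq

# Sketch of the sensor-to-browser pipeline: a PyOpenNI loop would PUSH
# joint positions, and a Flask/WebSocket bridge would PULL them and
# forward each frame to the THREE.JS client. Pattern and payload are
# hypothetical, chosen for illustration.
ctx = zmq.Context()

sender = ctx.socket(zmq.PUSH)       # sensor side (PyOpenNI loop)
sender.bind("inproc://skeleton")

receiver = ctx.socket(zmq.PULL)     # web side (Flask bridge)
receiver.connect("inproc://skeleton")

# A hypothetical skeleton frame: joint name -> (x, y, z) in millimetres.
frame = {"head": [120.0, 340.0, 1500.0],
         "left_hand": [-80.0, 210.0, 1320.0]}
sender.send_json(frame)

received = receiver.recv_json()     # the bridge now has the joint data
```

In the real demo the two sockets would live in separate processes over TCP; `inproc` just keeps the sketch self-contained.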
- Introduction: the hardware, OpenNI goodies, and a tale of PCL (5’)
- Hands On PyOpenNI
  - RGB and depth cameras - basic concepts and a small demo (5’)
  - Skeleton tracking - basic concepts and a small demo (5’)
  - Hand and gesture tracking - basic concepts and a small demo (5’)
- Final Demo
  - What are we going to use? Flask, ZMQ, THREE.JS and PyOpenNI (6’)
- Q&A. (4’)
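As a taste of the depth-camera segment, here is a hedged sketch of a typical preprocessing step: scaling raw 16-bit depth readings (millimetres, as OpenNI depth generators report them) down to 8-bit grey levels before blitting them to a PyGame surface. The function name and the 10 m clipping range are illustrative choices for this sketch, not part of PyOpenNI's API.

```python
# Sketch: turn a raw depth map (16-bit millimetre samples, as delivered
# by an OpenNI depth generator) into 8-bit grey levels for display in
# PyGame. `depth_to_gray` and MAX_DEPTH_MM are illustrative, not API.

MAX_DEPTH_MM = 10000  # clip everything beyond ~10 m

def depth_to_gray(depth_map):
    """Map each depth sample in [0, MAX_DEPTH_MM] to a grey level 0-255.

    A reading of 0 means 'no data' in OpenNI, so it renders black too.
    """
    gray = []
    for d in depth_map:
        d = min(d, MAX_DEPTH_MM)            # clip far readings
        gray.append(d * 255 // MAX_DEPTH_MM)
    return gray

# no data, near, mid-range, and a clipped far reading:
pixels = depth_to_gray([0, 500, 5000, 12000])
```

In practice you would vectorize this with NumPy before handing the buffer to PyGame, but the scalar version shows the mapping clearly.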