Embedded systems are everywhere: more devices (computers) are built into our living environment than ever before, and there are new ways to interact with them. We are no longer limited to buttons and touchscreens. Instead, we build devices that understand our speech, facial expressions, body gestures and other intuitive forms of communication.

With this project we extended the passive TV experience with gesture sensing. SparkFun's gesture sensor recognises the user's gestures, and animation combined with low response latency makes the interaction feel immersive and intuitive. We used it for marketing purposes, but it can be much more than that!

Hardware: Raspberry Pi, SparkFun ZX gesture sensor
Software: Python, OpenGL, bash, Dropbox
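
To give a rough idea of how the gesture input side can work, here is a minimal Python sketch that polls the ZX sensor over I²C on the Raspberry Pi and prints detected swipes. This is not the original project code: the I²C address, register map and gesture codes below are assumptions based on the ZX Sensor datasheet as I recall it, so verify them against the official documentation before use.

```python
# Minimal sketch: poll the SparkFun ZX gesture sensor over I2C on a
# Raspberry Pi and print recognised swipe gestures.
# Register addresses and gesture codes are assumptions -- check the datasheet.
import time
import smbus  # python-smbus (sudo apt install python3-smbus)

ZX_ADDR = 0x10       # default I2C address of the ZX sensor (assumption)
REG_STATUS = 0x00    # status register; one bit flags a new gesture (assumption)
REG_GESTURE = 0x04   # last recognised gesture code (assumption)
REG_GSPEED = 0x05    # speed of the last gesture (assumption)

GESTURES = {         # gesture codes per datasheet (assumption)
    0x01: "right swipe",
    0x02: "left swipe",
    0x03: "up swipe",
}

bus = smbus.SMBus(1)  # I2C bus 1 on 40-pin Raspberry Pi models

while True:
    status = bus.read_byte_data(ZX_ADDR, REG_STATUS)
    if status & 0x04:  # swipe-available bit (assumption)
        gesture = bus.read_byte_data(ZX_ADDR, REG_GESTURE)
        speed = bus.read_byte_data(ZX_ADDR, REG_GSPEED)
        print(GESTURES.get(gesture, "unknown gesture"), "speed:", speed)
    time.sleep(0.02)  # short polling interval keeps response latency low
```

In the actual setup, a loop like this would feed gesture events to the animation code (OpenGL) so the on-screen content reacts as soon as a swipe is detected.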

This project was possible because MakerLab (maker.si) was kind enough to support it. I would also like to thank my mentor Dejan Križaj for all his support.

Video: YouTube