Homebuilt augmented reality touch table uses a Kinect sensor, lasers and a PlayStation Eye

Bastian Broecker, an NUI Group forum member, has built what highly paid engineers at big software companies have yet to attempt: a touch table that creates the illusion of 3D objects floating above its surface. To make projected objects appear to hover in space and respond to the user, the designer combined Microsoft's Kinect sensor, a projector and a touch table with four infrared lasers.

3D augmented reality coffee table

To make the projected scene appear to respond to the viewer, the Kinect tracks the user's head position relative to the table and the image is re-rendered from that viewpoint. Christened Augmented Reality Table v0.1, the build consists of a projector mounted under the table, a mirror angled to reflect its output onto the surface, a PlayStation Eye finger-tracking camera housed next to the projector, and an infrared laser mounted at each of the four corners of the table.
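Broecker has not published his rendering code, but the general technique behind this kind of head-coupled perspective is well understood: each frame, the projection matrix is recomputed from the tracked head position relative to the corners of the display surface, so the scene shifts with the viewer and appears to have real depth. The sketch below shows one common formulation (a generalized off-axis projection); all coordinates and names are illustrative assumptions, not details from the project.

import numpy as np

def off_axis_projection(head, pa, pb, pc, near=0.05, far=10.0):
    """Off-axis perspective projection for a tracked viewer.

    head       -- head position in world space (e.g. from the Kinect)
    pa, pb, pc -- lower-left, lower-right, upper-left corners of the
                  projected image on the table, in the same space
    """
    # Orthonormal basis of the table surface
    vr = pb - pa; vr /= np.linalg.norm(vr)           # right
    vu = pc - pa; vu /= np.linalg.norm(vu)           # up
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)  # normal toward viewer

    # Vectors from the eye to the screen corners
    va, vb, vc = pa - head, pb - head, pc - head
    d = -np.dot(va, vn)                 # distance from eye to screen plane

    # Frustum extents on the near plane
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard off-axis frustum matrix
    P = np.array([
        [2*near/(r-l), 0,            (r+l)/(r-l),            0],
        [0,            2*near/(t-b), (t+b)/(t-b),            0],
        [0,            0,           -(far+near)/(far-near), -2*far*near/(far-near)],
        [0,            0,           -1,                      0],
    ])

    # Rotate the world into screen space and move the eye to the origin
    M = np.eye(4)
    M[0, :3], M[1, :3], M[2, :3] = vr, vu, vn
    T = np.eye(4)
    T[:3, 3] = -head
    return P @ M @ T

Feeding the Kinect's per-frame head estimate into a function like this and handing the result to the renderer is what makes the projected scene "look around" the viewer, which is the core of the 3D illusion.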

To create the 3D augmented reality experience, the head-tracking software drives the 3D environment while the PlayStation Eye camera detects finger positions, letting the user's hand appear to interact with the projected image. Broecker has not provided details about the processing hardware or the software that makes the trick work.
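Since the actual software is unpublished, the following is only a minimal sketch of how laser-light-plane touch tracking is typically done: the corner lasers form a thin infrared plane just above the surface, a fingertip touching the table scatters that light, and the IR-filtered PlayStation Eye sees it as a bright blob that can be thresholded and tracked. The device index, threshold value and area cutoff below are assumptions.

import cv2

cap = cv2.VideoCapture(0)  # assumed device index for the PS Eye

while True:
    ok, frame = cap.read()
    if not ok:
        break

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blur = cv2.GaussianBlur(gray, (5, 5), 0)

    # Fingertips lit by the laser plane show up as bright spots
    _, mask = cv2.threshold(blur, 200, 255, cv2.THRESH_BINARY)

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    touches = []
    for c in contours:
        if cv2.contourArea(c) < 20:       # ignore sensor noise
            continue
        (x, y), _ = cv2.minEnclosingCircle(c)
        touches.append((int(x), int(y)))  # camera-space touch point

    # A real setup would map these camera coordinates into projector
    # coordinates (e.g. via a calibrated homography) before passing
    # the touch points to the 3D application.
    print(touches)

    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()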

Via: The Verge

