Multitouch table uses a Kinect for a 3D display

[Bastian] sent in a coffee table he built. This isn’t a place to set your drinks and copies of Make, though: it’s a multitouch table with a 3D display. Since no description can do this table justice, take a look at the video.

The build was inspired by the subject of this Hackaday post where [programming4fun] was able to build a ‘holographic display’ using a regular 2D projector and a Kinect. Both builds work on the principle of redrawing the 3D space in relation to the user’s head – as [Bastian] moves his head around the coffee table, the Kinect tracks his location and shifts the three-dimensional grid of boxes in the opposite direction. It’s extremely clever, and looks to be a promising user interface.
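If you want to play with the same trick, the core of it is an off-axis (head-coupled) perspective projection: the tabletop is treated as a fixed window, and the viewing frustum is rebuilt from the tracked head position every frame. Here is a minimal Python/NumPy sketch of that math; the head coordinates, window size, and clip planes below are made-up values for illustration, not anything pulled from [Bastian]’s or [programming4fun]’s code.

```python
import numpy as np

# Hypothetical head position from the Kinect tracker, in metres, relative to
# the centre of the tabletop (these numbers are assumptions, not from the build).
head = np.array([0.25, 0.10, 0.80])   # x: right, y: up, z: toward the viewer

# Half-width and half-height of the tabletop "window", plus clip planes (assumed).
half_w, half_h = 0.30, 0.20
near, far = 0.05, 10.0

# Off-axis frustum: the window corners stay fixed on the table while the eye
# moves, so the frustum edges are recomputed from the head position each frame.
left   = (-half_w - head[0]) * near / head[2]
right  = ( half_w - head[0]) * near / head[2]
bottom = (-half_h - head[1]) * near / head[2]
top    = ( half_h - head[1]) * near / head[2]

# Standard OpenGL-style asymmetric projection matrix built from those edges.
proj = np.array([
    [2*near/(right-left), 0,                   (right+left)/(right-left),  0],
    [0,                   2*near/(top-bottom), (top+bottom)/(top-bottom),  0],
    [0,                   0,                  -(far+near)/(far-near),     -2*far*near/(far-near)],
    [0,                   0,                  -1,                          0],
])

# The scene is also translated by -head into eye space; the combination is what
# makes the grid of boxes appear to sit "inside" the table as the viewer moves.
view_translation = -head
print(proj)
print(view_translation)
```

The important detail is that the window corners never move while the eye does, which is what sells the looking-into-a-box illusion.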

In addition to a Kinect, the coffee table uses a Microsoft Surface-like display; four infrared lasers are placed at the corners, and touches that break the laser plane are picked up by a camera next to the projector in the base.
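On the touch side, a laser-light-plane rig like this usually boils down to blob tracking: anything poking through the infrared plane lights up as a bright spot for the camera under the table. Purely as a rough illustration (the camera index, threshold, and blob size are guesses, and this is not the software [Bastian] actually runs), the detection loop could look something like this in Python with OpenCV:

```python
import cv2

# Hypothetical IR camera watching the underside of the table (index 0 is a guess).
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Fingertips breaking the laser plane scatter IR and show up as bright blobs;
    # the threshold of 200 is a placeholder that would need tuning per setup.
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    touches = []
    for c in contours:
        if cv2.contourArea(c) < 30:               # skip speckle and noise
            continue
        x, y, w, h = cv2.boundingRect(c)
        touches.append((x + w // 2, y + h // 2))  # blob centre = touch point

    print(touches)                                # would be handed to the UI layer
    cv2.imshow("ir mask", mask)
    if cv2.waitKey(1) == 27:                      # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```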

After the break you can see the demo video and a gallery of the images [Bastian] put up on the NUI Group forum.


DIY "Project Glass" clone looks almost too good to be true


By now we’re assuming you are all familiar with Google’s “Project Glass”, an ambitious augmented reality project for which they revealed a promotional video last week. [Will Powell] saw the promo vid and was so inspired that he attempted to rig up a demo of Project Glass for himself at home.

While it might seem like a daunting project to take on, [Will] does a lot of work with Kinect-based augmented reality, so his Vuzix/HD webcam/Dragon NaturallySpeaking mashup wasn’t a huge step beyond what he does at work. As you can see in the video below, the interface he implemented looks very much like the one Google showed off in their demo, responding to his voice commands in a similar fashion.
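For anyone curious how the pieces fit together, the software side of a rig like this is essentially a loop that turns recognized phrases into overlay updates. Purely as a hypothetical sketch (this is not [Will]’s code, and it swaps Dragon NaturallySpeaking for the off-the-shelf speech_recognition package, with made-up commands), it might look like:

```python
import speech_recognition as sr

# Made-up commands that mimic the Project Glass promo; not [Will]'s command set.
COMMANDS = {
    "take a picture": lambda: print("[overlay] camera icon + shutter"),
    "directions":     lambda: print("[overlay] navigation arrows"),
    "weather":        lambda: print("[overlay] weather card"),
}

recognizer = sr.Recognizer()

with sr.Microphone() as mic:
    recognizer.adjust_for_ambient_noise(mic)
    while True:
        audio = recognizer.listen(mic)
        try:
            # Any speech engine could sit here; Dragon is normally driven through
            # its own SDK, so this stands in with the free Google recognizer.
            phrase = recognizer.recognize_google(audio).lower()
        except sr.UnknownValueError:
            continue
        for trigger, action in COMMANDS.items():
            if trigger in phrase:
                action()   # in the real rig this would redraw the HUD frame
```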

He says that the video was recorded in “real time”, though plenty of people debate that claim. Our guess is that he recorded the video stream being fed into the Vuzix glasses rather than filming what was actually shown on the display, which would make the most sense.

We’d hate to think that the video was faked, mostly because we would love to see Google encounter some healthy competition, but you can decide for yourself.


Heads-up display mounts on brim of your cap

[Matt Kwan] says that coming up with a personal heads-up display wasn’t that hard. Well, that’s because he made design choices that make all the difference.

The goal here was to add some augmented reality to his field of vision. He went with a baseball cap because it’s a pretty easy way to strap something to your head. You can’t see it from this angle, but the setup requires you to cut a rather large hole in the brim. The image comes from a smartphone (an HTC Desire Z in this case) which sits with its screen pointing toward [Matt’s] forehead. The screen reflects off of a small mirror, guiding the image down through a Fresnel lens mounted in the hole in the brim. The image is reflected a second time by the plastic in front of his eyes, which is coated with a slightly mirrored material. Since the image is reflected twice it appears right-side up, and the Fresnel lens places the image about 20 cm out in front of his view. He tried to get some images of the effect, but we think you’ve got to see it in person before passing judgement.
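That 20 cm figure is consistent with plain thin-lens optics: put the phone screen inside the Fresnel lens’s focal length and you get an upright, magnified virtual image floating farther away than the screen itself. A quick sanity check in Python, using assumed round numbers rather than [Matt]’s actual measurements:

```python
# Thin-lens check: 1/f = 1/d_o + 1/d_i (real-is-positive sign convention).
# Both input numbers are assumptions for illustration, not measurements.
f   = 20.0   # Fresnel lens focal length, cm
d_o = 10.0   # optical path from phone screen to lens (via the small mirror), cm

# A negative image distance means a virtual image on the same side as the
# screen, i.e. one that appears to float out in front of the wearer.
d_i = 1.0 / (1.0 / f - 1.0 / d_o)
magnification = -d_i / d_o

print(f"image distance: {d_i:.1f} cm (virtual)")   # -20.0 cm
print(f"magnification:  {magnification:.1f}x")     # 2.0x, upright
```

With those invented numbers the screen appears upright, twice its size, and about 20 cm away, which lines up with what [Matt] reports.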

This does away with the need to track head movement (there are a few hacks for that out there, though). Augmented reality software is used to turn the view from the smartphone camera into overlay data for the display.

[Thanks Tom]