Kinect Hacks

Using a stereoscopic projector and the Kinect camera, real objects are rendered digitally in a 3-D space.

What humans can accomplish with a gesture is amazing. By holding out a hand, palm forward, we can stop a group of people from approaching a dangerous situation; by waving an arm, we can invite people into a room. Without a touch, we can direct the actions of others, simply through gestures. Soon, with those same types of gestures, we’ll be directing the operations of heavy pieces of machinery and entire assembly lines.

Manufacturing workers are on the verge of replacing the mouse-and-keyboard-based graphical user interface (GUI) with newer options. Already, touchscreens are making great inroads into manufacturing. And in many locations, the adoption of other natural user interfaces (NUIs) is expanding to incorporate eye scans, fingerprint scans and gesture recognition. These interfaces are natural and relevant spinoffs of the type of technology we find today in video games, such as those using Microsoft’s Kinect.

In the gaming world, gestures and voices are recognized by Kinect through an orchestrated set of technologies: a color video camera, a depth sensor that establishes a 3-D perspective and a microphone array that picks out individual players’ voices from the background room noise. In addition, Kinect has special software that tracks a player’s skeleton to recognize the difference between motion of the limbs and movement of the entire body.
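The skeleton-tracking idea above can be made concrete with a toy classifier. This is a loose sketch, not the actual Kinect SDK: the `Joint` type, the joint names and the distance thresholds are all assumptions chosen for illustration, but they show how a palm-forward "stop" could be read from tracked joint positions.

```python
# Hypothetical sketch: classifying a "stop" gesture from skeleton joints.
# Joint layout and thresholds are illustrative, not the real Kinect SDK API.
from dataclasses import dataclass

@dataclass
class Joint:
    x: float  # metres, right of the sensor
    y: float  # metres, above the sensor
    z: float  # metres, away from the sensor

def is_stop_gesture(hand: Joint, shoulder: Joint) -> bool:
    """A hand held at roughly shoulder height and pushed toward the
    sensor reads as the classic palm-forward 'stop'."""
    raised = abs(hand.y - shoulder.y) < 0.15   # near shoulder height
    extended = shoulder.z - hand.z > 0.35      # extended toward the camera
    return raised and extended

# Example: hand level with the shoulder, over half a metre closer to the sensor
hand = Joint(x=0.3, y=1.4, z=1.6)
shoulder = Joint(x=0.2, y=1.45, z=2.2)
print(is_stop_gesture(hand, shoulder))  # True
```

Because the skeleton model distinguishes limb motion from whole-body movement, a check like this fires on a deliberate arm extension rather than on someone simply walking closer to the sensor.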

The combined technologies can accurately perceive the room’s layout and determine each player’s body shape and position so that the game responds accordingly. One can expect to see NUI applications working in every industry imaginable—from health care to education, retail to travel—extending user interactions in multiple ways.


NUI technology is of particular interest to the manufacturing industry. For instance, when a worker logs on to a machine, instead of clicking a mouse and entering a personal ID and password on a computer screen, the user will look into a sensing device that will perform a retinal scan for identification. Then, just by using hand gestures, the identified worker can start a machine or, with an outstretched hand, stop it. The machine may ask the employee to confirm the requested action verbally, and a simple “yes” response will execute the command.
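The gesture-then-confirm flow described above is essentially a small state machine. The sketch below is a guess at how such a session could be structured; the class, state names and command strings are assumptions for illustration, not a real machine-control API.

```python
# Illustrative sketch of the confirm-before-execute flow described above.
# States and command names are assumptions, not a real machine-control API.
class MachineSession:
    def __init__(self, operator_id: str):
        self.operator = operator_id
        self.pending = None   # command awaiting verbal confirmation
        self.running = False

    def gesture(self, command: str) -> str:
        """A recognized gesture proposes a command; nothing runs yet."""
        self.pending = command
        return f"Confirm '{command}'?"

    def voice(self, reply: str) -> str:
        """Only a verbal 'yes' from the logged-in operator executes it."""
        if self.pending and reply.lower() == "yes":
            cmd, self.pending = self.pending, None
            self.running = (cmd == "start")
            return f"{cmd} executed"
        self.pending = None
        return "cancelled"

session = MachineSession("worker-42")
session.gesture("start")   # asks: Confirm 'start'?
session.voice("yes")       # reply executes the command
```

Requiring the two modalities in sequence, gesture to propose and voice to confirm, is what keeps a stray arm movement from starting a machine on its own.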

Avatar Kinect replicates a user’s speech, head movements and facial expressions on an Xbox avatar, and lets users hang out with friends in virtual environments and shoot animated videos to share online.

NUI technologies can improve ways to move products across assembly lines, as well as to build them on an individual line. For example, if a batch of partially assembled components must be transferred to a pallet or another machine, the worker can use a gesture to designate the subassemblies to be moved and the location of their destination.

Safeguards can be built into the NUI system so that unrelated movements or conversations in the plant do not accidentally initiate a command. Each machine will know who is logged in to it and will respond exclusively to that individual’s motions and voice. The computer could even be set to shut down automatically if its “commander” is away from the station for more than a selected period of time.
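The automatic-shutdown safeguard amounts to a presence timeout. Here is a minimal sketch under assumed names; the 120-second threshold is an invented setting, and the clock is passed in explicitly so the behavior is easy to reason about.

```python
# Sketch of the away-timeout safeguard described above. The class name
# and the 120 s default are assumptions, not a documented product feature.
class PresenceMonitor:
    def __init__(self, timeout_s: float = 120.0):
        self.timeout_s = timeout_s
        self.last_seen = 0.0

    def seen(self, now: float) -> None:
        """Called whenever the logged-in operator is detected at the station."""
        self.last_seen = now

    def should_shut_down(self, now: float) -> bool:
        """True once the operator has been away longer than the threshold."""
        return now - self.last_seen > self.timeout_s

monitor = PresenceMonitor(timeout_s=120.0)
monitor.seen(now=0.0)
print(monitor.should_shut_down(now=60.0))   # False: away only 60 s
print(monitor.should_shut_down(now=300.0))  # True: past the threshold
```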

The benefits of NUI technology specific to manufacturing will be extensive. Many of these examples are already in development:

• Employees who must wear gloves on the job no longer need to remove them to operate a keyboard, so they can carry out their work and react to situations more speedily, resulting in higher productivity, faster throughput and higher safety in the workplace.

• Those who work in areas that contain considerable dirt, dust and grease know that touchscreens can quickly become smudged and difficult to view. With gestures, the screen can remain clean. Using the gesture-based NUI in these situations also reduces the spread of contagion and therefore improves health and productivity on the job.

• When computers remain cleaner, because they are touched only infrequently, the manufacturer can cut costs significantly. The screen and other computer components require less maintenance and repair, and elements such as a keyboard are no longer required investments.

Microsoft Dynamics is taking a lead in incorporating NUI technologies into its offerings. The Microsoft Dynamics AX 2012 enterprise resource planning solution offers a touch-based user interface for the shop floor, and independent software developers are working on gesture-based interfaces to provide touchless commands.

The first generation of gesture-based equipment will soon be installed in plants that manufacture heavy equipment, such as cars and large machine tools. Dirt and grease in such facilities can cause substantial problems for conventional computer control units.

NUIs also are expected to become popular in such difficult environments as cold rooms, where workers must wear heavy gloves, and pharmaceutical and food-processing plants, which require exceptional levels of cleanliness.

In the near future, we might see systems that can track the eyes of workers to anticipate the next command. And, soon, NUIs will enter the office environment, where the productivity and cost-effectiveness they offer will be just as important as they are on the plant floor. With such widespread applications, voice- and gesture-based interfaces are certain to usher in an era in which interacting with technology becomes easier, faster and less costly.

by Rakesh Kumar in EE|Times

That’s right — not even CES can stop the endless wave of Kinect hacks. The latest, and one of the more impressive to date, is the so-called “Magic Mirror” developed by Tobias Blum from the Technical University of Munich, which bridges augmented reality with x-ray vision (of sorts). Of course, the “of sorts” is that it doesn’t actually peer through your body to reveal your skeleton (yet), but instead maps a random skeleton from a CT scan onto your frame to create a real-time freakout!!

Check out the video.

DaVinci is an app for the Microsoft Surface table. The creators have ported it to Kinect.

Gestures are used to create objects and control the physics of the environment. Your hands appear in the interface, allowing you to grab objects out of thin air and move them around the environment. Additional gestures let you adjust gravity, magnetism and attraction.
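To make the gesture-to-physics mapping concrete, here is a loose sketch; the gesture names and the parameter set are guesses for illustration, not DaVinci's actual implementation.

```python
# Loose sketch of gesture-driven physics toggles like those described.
# Gesture names and parameters are invented for illustration only.
PHYSICS_DEFAULTS = {"gravity": 9.8, "magnetism": 0.0, "attraction": 0.0}

def apply_gesture(params: dict, gesture: str) -> dict:
    """Return a new parameter set after an assumed gesture command."""
    updated = dict(params)
    if gesture == "zero_gravity":
        updated["gravity"] = 0.0
    elif gesture == "magnetize":
        updated["magnetism"] = 1.0
    elif gesture == "attract":
        updated["attraction"] = 1.0
    return updated

state = apply_gesture(PHYSICS_DEFAULTS, "zero_gravity")
print(state["gravity"])  # 0.0 -- objects now float freely
```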

Right now I think this is the best Kinect usage I have ever seen. Watch the video to understand.