'SixthSense' is a wearable gestural interface that augments the physical world around us with digital information and lets us use natural hand gestures to interact with that information.
We have evolved over millions of years to sense the world around us. When we encounter something, someone or some place, we use our five natural senses to perceive information about it, information that helps us make decisions and choose the appropriate action to take. But arguably the most useful information that can help us make the right decisions is not perceivable by our five natural senses: the data, information and human knowledge that has accumulated about everything and that is increasingly available online. Although the miniaturization of computing devices allows us to carry computers in our pockets, keeping us continually connected to the digital world, there is no link between our digital devices and our interactions with the physical world. Traditionally, information has been confined to paper or to digital screens. SixthSense bridges this gap, bringing intangible, digital information into the real world and allowing us to interact with this information via natural hand gestures. 'SixthSense' frees information from its confines by seamlessly integrating it with reality, and thus makes the entire world your computer.
The SixthSense prototype comprises a pocket projector, a mirror and a camera. The hardware components are coupled in a pendant-like mobile wearable. Both the projector and the camera are connected to a mobile computing device in the user's pocket. The projector projects visual information, enabling surfaces, walls and physical objects around us to be used as the interface, while the camera recognizes and tracks the user's hand gestures and physical objects using computer vision-based techniques. The software processes the video stream captured by the camera and tracks the locations of colored markers (visual tracking fiducials) on the user's fingertips using simple computer vision techniques. The movements and arrangements of these fiducials are interpreted as gestures that act as interaction instructions for the projected application interfaces. The maximum number of tracked fingers is constrained only by the number of unique fiducials, so SixthSense also supports multi-touch and multi-user interaction.
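The fiducial-tracking step described above can be sketched in a few lines of Python. This is a minimal illustration, not the prototype's actual code: it assumes a frame arrives as a 2D grid of RGB tuples, that each fingertip wears a marker of a distinct saturated color, and that tracking reduces to finding the centroid of pixels near each marker color. All names (`MARKER_COLORS`, `track_markers`) are illustrative; a real implementation would use a vision library and work in a more robust color space.

```python
# Minimal sketch of colored-marker (fiducial) tracking. Each frame is
# assumed to be a 2D grid of (r, g, b) tuples; each fingertip wears a
# marker cap of a distinct, saturated color. Names are illustrative.

MARKER_COLORS = {
    "index": (255, 0, 0),   # red marker cap
    "thumb": (0, 0, 255),   # blue marker cap
}

def color_distance(a, b):
    """Squared Euclidean distance between two RGB triples."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def track_markers(frame, threshold=3000):
    """Return the (row, col) centroid of pixels close to each marker color."""
    positions = {}
    for name, color in MARKER_COLORS.items():
        hits = [(r, c)
                for r, row in enumerate(frame)
                for c, px in enumerate(row)
                if color_distance(px, color) < threshold]
        if hits:
            positions[name] = (sum(r for r, _ in hits) / len(hits),
                               sum(c for _, c in hits) / len(hits))
    return positions
```

Feeding the tracker a mostly black frame with two reddish pixels at row 1, columns 1 and 2 yields a single "index" fingertip at the centroid `(1.0, 1.5)`; successive centroids across frames form the motion path that the gesture interpreter consumes.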
The SixthSense prototype implements several applications that demonstrate the usefulness, feasibility and flexibility of the system. A map application allows users to explore a map displayed on a nearby surface using hand gestures, similar to those supported by multi-touch based systems, letting users zoom in, zoom out or pan with intuitive hand movements. A drawing application lets the user draw on any surface by tracking the movements of the user's index fingertip. SixthSense also recognizes the user's freehand gestures (postures). For example, the SixthSense system implements a gestural camera that takes pictures of the scene by detecting the user's 'framing' gesture. Users can stop at any surface or wall and flick through the photographs they have taken. SixthSense also allows the user to draw icons or symbols in the air with the index finger and recognizes the symbols as interaction instructions. For example, drawing a magnifying glass symbol takes the user to the map application, while drawing an '@' symbol lets the user check email. The SixthSense system also augments the physical objects the user interacts with by projecting additional information about those objects onto them. For example, a newspaper can show live video news, or dynamic information can be provided on a sheet of plain paper. Drawing a circle on the user's wrist projects an analog clock there.
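The mapping from recognized in-air symbols to actions described above can be sketched as a simple dispatch table. This is an illustrative sketch only: the recognizer that turns a fingertip path into a symbol label is assumed to exist upstream, and every name here (`GESTURE_COMMANDS`, `handle_gesture`, the action stubs) is hypothetical rather than taken from the prototype.

```python
# Illustrative dispatch from a recognized in-air symbol to an action.
# The symbol recognizer (fingertip path -> label) is assumed upstream.

def open_map():
    return "map application opened"

def check_mail():
    return "mail opened"

def project_clock():
    return "analog clock projected on wrist"

GESTURE_COMMANDS = {
    "magnifying_glass": open_map,   # drawn magnifier -> map application
    "@": check_mail,                # drawn '@' -> email
    "circle_on_wrist": project_clock,
}

def handle_gesture(symbol):
    """Look up and run the action bound to a recognized symbol."""
    action = GESTURE_COMMANDS.get(symbol)
    if action is None:
        return "unrecognized gesture"
    return action()
```

A table like this keeps the vocabulary extensible: adding a new gesture means registering one more entry rather than editing the interpreter.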
Source: SixthSense