Called Skinput, the system is a marriage of two technologies: the ability to detect the ultralow-frequency sound produced by tapping the skin with a finger, and the microchip-sized "pico" projectors now found in some cellphones.
The system beams a keyboard or menu onto the user's forearm and hand from a projector housed in an armband. An acoustic detector, also in the armband, then calculates which part of the display you want to activate.
But how does the system know which icon, button or finger you tapped? Chris Harrison at Carnegie Mellon University in Pittsburgh, Pennsylvania, working with Dan Morris and Desney Tan at Microsoft's research lab in Redmond, Washington, exploits the way our skin, musculature and skeleton combine to make distinctive sounds when we tap on different parts of the arm, palm, fingers and thumb.
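In practice, that means the armband's sensors record a short burst of vibration for each tap, and a classifier trained on those recordings decides which location was struck. The sketch below (Python with NumPy and scikit-learn) illustrates the general idea only; the sensor count, the feature choices and the SVM settings are illustrative assumptions, not the inventors' actual implementation.

```python
# Illustrative sketch of acoustic tap classification (not the authors' code):
# each tap is captured as a short multi-channel window, reduced to simple
# per-channel features, and classified to a body location.
import numpy as np
from sklearn.svm import SVC

N_CHANNELS = 5   # assumed number of vibration sensors in the armband
WINDOW = 256     # assumed samples recorded around each detected tap

def tap_features(window: np.ndarray) -> np.ndarray:
    """Reduce one tap (shape: N_CHANNELS x WINDOW) to a feature vector:
    per-channel RMS energy plus the dominant frequency bin per channel."""
    energy = np.sqrt((window ** 2).mean(axis=1))      # RMS amplitude per channel
    spectrum = np.abs(np.fft.rfft(window, axis=1))
    dominant = spectrum.argmax(axis=1).astype(float)  # strongest frequency bin
    return np.concatenate([energy, dominant])

def train(tap_windows: list[np.ndarray], locations: list[str]) -> SVC:
    """Fit a classifier on recorded taps whose locations are known
    (e.g. "palm", "index finger", "forearm")."""
    X = np.stack([tap_features(w) for w in tap_windows])
    clf = SVC(kernel="rbf")   # a common classifier choice; settings are illustrative
    clf.fit(X, locations)
    return clf

def classify_tap(clf: SVC, window: np.ndarray) -> str:
    """At runtime, map a new tap to a body location, which in turn
    selects the projected icon or button at that spot."""
    return clf.predict(tap_features(window)[None, :])[0]
```

Because each body location filters the tap's vibration differently on its way to the armband, even simple features like these can separate the locations once the system has seen a handful of labelled examples from the wearer.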
Read the full article... See the inventors' full PDF: "Skinput: Appropriating the Body as an Input Surface".