Skinput
Skinput was a short-lived research project developed jointly by Carnegie Mellon University and Microsoft Research that aimed to turn human skin into a touchscreen-like interface for interacting with mobile devices. It paired a pico projector, which displayed a visual interface on the user's forearm, with a custom armband that detected the acoustic vibrations generated when the user tapped their skin.
The technology worked by recognizing the distinct acoustic signature produced when different parts of the arm and hand were tapped. A tap sends vibrations through soft tissue and bone; the armband's sensors captured these signals, and a trained classifier matched each one to a known input location, which was in turn mapped to a specific command or function on the paired device. Because bone structure, muscle density, and tissue composition vary along the arm, each location produces a measurably different signature, allowing the system to differentiate input locations with reasonable accuracy.
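To make the sensing step concrete, the sketch below shows one way such a pipeline could be built: spectral features are extracted from a short window of vibration data, and a support vector machine, trained on labeled taps, predicts the location. The sampling rate, feature set, and synthetic tap signals are illustrative assumptions and stand in for the project's actual hardware and training data.

```python
# A minimal sketch of tap-location classification in the spirit of
# Skinput: FFT-based features plus an SVM. All constants, the feature
# set, and the synthetic signals are assumptions for illustration.
import numpy as np
from sklearn.svm import SVC

SAMPLE_RATE = 5500   # Hz; assumed sensor sampling rate
WINDOW = 512         # samples captured around each tap event
LOCATIONS = ["wrist", "forearm", "palm", "pinky", "thumb"]

def extract_features(window: np.ndarray) -> np.ndarray:
    """Summarize a tap window as low-frequency spectrum magnitudes
    plus simple amplitude statistics (an assumed feature set)."""
    spectrum = np.abs(np.fft.rfft(window))[:64]   # bins up to ~690 Hz
    stats = np.array([window.max(), window.std()])
    feats = np.concatenate([spectrum, stats])
    return feats / (np.linalg.norm(feats) + 1e-9)  # scale-invariant

def synthetic_tap(loc: int, rng: np.random.Generator) -> np.ndarray:
    """Fake a tap as a noisy decaying sinusoid whose frequency depends
    on location, standing in for real bio-acoustic recordings."""
    t = np.arange(WINDOW) / SAMPLE_RATE
    freq = 60.0 + 40.0 * loc                       # arbitrary spacing
    return np.exp(-30.0 * t) * np.sin(2 * np.pi * freq * t) \
        + 0.05 * rng.standard_normal(WINDOW)

rng = np.random.default_rng(0)
X, y = [], []
for loc in range(len(LOCATIONS)):
    for _ in range(40):                            # 40 training taps per site
        X.append(extract_features(synthetic_tap(loc, rng)))
        y.append(loc)

clf = SVC(kernel="rbf").fit(np.array(X), y)

# Classify a fresh tap and report the recognized location.
probe = extract_features(synthetic_tap(2, rng))
print(LOCATIONS[clf.predict([probe])[0]])          # -> "palm"
```

In practice such a classifier would need per-user training data, since tissue composition, and therefore the acoustic signatures, varies from person to person.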
The envisioned use cases for Skinput included controlling music playback, answering phone calls, navigating menus, and interacting with other mobile applications without requiring the user to physically hold or look at their device. The goal was to create a more seamless and discreet interaction paradigm.
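Continuing the sketch above, routing a recognized tap location to one of these functions could be as simple as a lookup table; the action names and dispatch function below are hypothetical, not details of the original system.

```python
# Hypothetical mapping from classified tap locations to commands on the
# paired device; none of these action names come from the project itself.
ACTIONS = {
    "thumb": "play_pause",    # toggle music playback
    "pinky": "answer_call",   # pick up an incoming call
    "palm": "open_menu",      # bring up a menu overlay
}

def on_tap(location: str) -> None:
    """Dispatch a classified tap, if it is bound to an action."""
    action = ACTIONS.get(location)
    if action is not None:
        # Stand-in for a real transport (e.g., Bluetooth) to the device.
        print(f"send: {action}")

on_tap("thumb")  # -> send: play_pause
```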
While the Skinput project generated significant interest and media coverage, it never transitioned into a commercially available product. Likely obstacles included making the acoustic sensing accurate and reliable across different users, arm positions, and activity levels; user comfort concerns related to wearing the armband; and competition from existing touchscreen technologies.