You might think the touchscreen on your smartphone is pretty sweet, but somewhere deep inside your psyche, you know it's totally at a 2007 level of techie hotness by now. But if Carnegie Mellon University and Microsoft have their way, their joint project Skinput will take touch input off the device's screen entirely.
Skinput pairs images projected onto the user's arm or hand with an armband-style device that measures the acoustic signature of each tap on the skin to determine which command it's receiving.
It's certainly futuristic in its execution, but the jury is still out on whether this concept has the potential to bring something truly useful to the table.
I mean, texting on your arm sounds neat and all, but when you realize you're losing the use of one of your hands to do it, it kind of undercuts the whole idea, doesn't it?
Source: New Scientist