As sensors become cheaper and cheaper, uses for them are multiplying. With one new technology, users will be able to “paint” or project keyboards and other interfaces, such as a calendar, onto everyday objects and surfaces.
Researchers at Carnegie Mellon University’s Human-Computer Interaction Institute (HCII) are developing a new system that uses sensors to determine the origin of a sound by measuring how long the sound takes to reach each sensor. By triangulating across the sensors, the position of the sound’s source can be determined.
This system is similar to the motion tracking found in Microsoft’s Kinect for the Xbox 360 and Sony’s PlayStation Move.
Robert Xiao, a graduate assistant and doctoral student at HCII, and his professor, Scott Hudson, have developed WorldKit, a new system that converts hand movements into a touch-based computer interface that can be located anywhere in the environment around a device. Xiao describes this method of position determination as “passive time difference of arrival,” a technique already used frequently in military applications that is expected to show up in more and more consumer devices.
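The article does not include WorldKit’s actual code, but the general idea behind time-difference-of-arrival positioning can be sketched briefly. In the Python example below, the sensor layout, speed of sound, tap location, and choice of solver are all illustrative assumptions rather than details of the CMU system: it estimates where a tap occurred on a surface purely from the differences in when the sound reached each sensor.

```python
# Minimal sketch of time-difference-of-arrival (TDOA) positioning.
# Not the WorldKit/CMU implementation; sensor positions and values are hypothetical.

import numpy as np
from scipy.optimize import least_squares

SPEED_OF_SOUND = 343.0  # metres per second, in air at roughly 20 C

# Hypothetical positions (in metres) of four acoustic sensors on a tabletop.
sensors = np.array([[0.0, 0.0],
                    [0.6, 0.0],
                    [0.0, 0.6],
                    [0.6, 0.6]])

def arrival_times(source):
    """Time for sound from `source` to reach each sensor."""
    return np.linalg.norm(sensors - source, axis=1) / SPEED_OF_SOUND

def residuals(guess, measured_tdoa, ref=0):
    """Mismatch between predicted and measured time differences (relative to sensor `ref`)."""
    t = arrival_times(guess)
    return (t - t[ref]) - measured_tdoa

# Simulate a tap at (0.45, 0.25). A passive system never knows the absolute
# emission time, so only the *differences* in arrival time are kept.
true_source = np.array([0.45, 0.25])
times = arrival_times(true_source)
measured_tdoa = times - times[0]

# Solve for the position that best explains the measured time differences.
result = least_squares(residuals, x0=np.array([0.3, 0.3]), args=(measured_tdoa,))
print("estimated tap position:", result.x)  # approximately [0.45, 0.25]
```

With three or more sensors at known positions, the time differences constrain the source to the intersection of hyperbolic curves, which the least-squares fit above resolves numerically.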
Xiao’s background is in computer science and mathematics, and he has been working to create new technologies that improve human interaction with computers, especially through mobile touch screens, motion tracking systems, and other novel interfaces.
Related articles on IndustryTap:
- Virtually Explore All of the $1.2 Billion Atlanta Falcons Stadium Coming in 2017
- Using Virtual Space Station To Diagnose, Treat Psychosocial Problems
- Wunderkind Magic Leap, Blending Real & Virtual Worlds, Raises $542 Million