Creating Virtual Touch Screens to Control Mobile and Desktop Devices

By: David Russell Schilling | October 23rd, 2014

As sensors become cheaper, uses for them are multiplying. A new technology will let device users “paint” or project keyboards and other interfaces, such as a calendar, onto everyday objects and surfaces.

Researchers at Carnegie Mellon University’s Human-Computer Interaction Institute (HCII) are developing a new system that uses sensors to locate the origin of a sound by comparing how long the sound takes to reach each sensor. By triangulating among several sensors, the position of the sound’s source can be determined.

This system is similar to the motion-tracking systems found in Microsoft’s Kinect for the Xbox 360 and Sony’s PlayStation Move.

Bo (Richard) Xiao, a graduate assistant and doctoral student at HCII, and his professor, Scott Hudson, have developed WorldKit, a new system that converts hand movements into a touch-based computer interface that can be located anywhere in the environment around a device. Xiao describes this method of position determination as “passive time difference of arrival.” The technique is already common in military applications and is beginning to appear in consumer devices.
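To make the idea concrete, here is a minimal sketch of passive time-difference-of-arrival (TDOA) localization in two dimensions. Everything in it is an illustrative assumption rather than WorldKit’s actual implementation: the sensor layout, the speed of sound, and the use of SciPy’s least-squares solver are stand-ins chosen to show the principle.

```python
# Minimal 2D TDOA localization sketch. NOT WorldKit's code:
# sensor positions, speed of sound, and the solver are assumptions.
import numpy as np
from scipy.optimize import least_squares

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature (assumed)

# Three sensors at known positions (meters). Three sensors yield two
# independent time differences, matching the two unknowns of a 2D fix.
sensors = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]])

def tdoa_residuals(source, sensors, time_deltas):
    """Difference between measured and predicted arrival-time
    deltas, all taken relative to sensor 0."""
    dists = np.linalg.norm(sensors - source, axis=1)
    predicted = (dists[1:] - dists[0]) / SPEED_OF_SOUND
    return predicted - time_deltas

# Simulate a tap at (0.3, 0.4): compute true arrival times, then keep
# only the differences, which is all a passive system can measure.
true_source = np.array([0.3, 0.4])
arrivals = np.linalg.norm(sensors - true_source, axis=1) / SPEED_OF_SOUND
time_deltas = arrivals[1:] - arrivals[0]

# Recover the source position, starting from the sensor centroid.
result = least_squares(tdoa_residuals, x0=sensors.mean(axis=0),
                       args=(sensors, time_deltas))
print("Estimated tap position:", result.x)  # ~ [0.3, 0.4]
```

The key point of the “passive” approach is in the simulation step: the system never knows when the tap happened, only the differences between arrival times at the sensors. Real systems add more sensors and filtering to cope with noisy timestamps.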

Xiao’s background is in computer science and mathematics, and he has been working to create new technologies that improve human interaction with computers, especially through the application of mobile touch screens, motion-tracking systems, and other novel interfaces.

David Russell Schilling

David enjoys writing about high technology and its potential to make life better for all who inhabit planet earth.
