It seems this year is going to introduce us to some very different computing interfaces. First, Google Glass and its wearable technology gave hope to people who feel that staring at a mobile screen robs you of the real-world experience. Fujitsu, on the other hand, has decided it would be simpler to turn any surface into a touchscreen and enable data transfer between the real and virtual worlds using sophisticated optics and sensors.
The demonstration embedded below shows how data can be selectively manipulated using Fujitsu’s new tech. You can use it to scan content from physical media such as newspapers or printed sheets. The system works by accurately tracking your index finger, letting your hands manipulate the viewing dimensions directly. The tech is surprisingly smooth and could have great applications in the business and education sectors, in addition to building virtual models and manipulating them like Tony Stark in Iron Man.
Fujitsu says the tech is currently at the demonstration stage only, and consumer versions will probably start shipping in 2014.
Isn’t this pretty cool? Tell us what you think.
Source | Diginfo