Technology / Robotics
Hands-free everything
11 Jan 2011 at 07:51hrs
Tech visionaries have long dreamed of the day when PCs, TVs and phones can be controlled with a wave of the hand or blink of an eye.
"Natural user interface" technologies on display at the Consumer Electronics Show, held in Las Vegas last week, suggest this vision is inching closer to the mainstream, tearing down barriers between user and device, and dispensing with unwieldy keyboards and remote controls.
The technology, stoked in the public imagination by the sci-fi hit film Minority Report, could be approaching a tipping point, according to industry insiders.
"The idea of controlling everything without touching it - technology is moving that way faster than ever," said Janine Kutliroff, CEO and founder of Omek Interactive, an Israeli company that makes software for gesture recognition through 3-D sensors and cameras, so you can play games or manipulate a television just by moving your hands and body.
"Cameras are going to get smaller and cheaper. There's a lot of competitive technology out there," said Kutliroff, whose company was one of a handful showing off gesture-control technologies at the Vegas exhibition.
The opportunities for using sensors, cameras and voice recognition to make everyday objects "intelligent" are almost endless, promoters of the new technology say.
"We see a whole world where machines interact with you," said Uzi Breier, chief marketing officer for PrimeSense, an Israeli company that built some of the technology behind Microsoft's Kinect system, and that has teamed up with Taiwan's Asustek to bring gesture-controlled TV and computer functions to living room screens this year.
His company is providing sensors for iRobot Corp's latest robots, including a new Roomba automatic vacuum cleaner that can "see" dirt and head for it, rather than trundling aimlessly around the room.
He predicts that sensors will soon be used in homes so that heating and cooling systems can recognise who is in a room and set the temperature to predetermined levels, or in cars to adjust settings according to the driver.
Omek, for its part, is helping pioneer digital signage - in-store signs that interact with shoppers and can strike up a conversation when people walk close or linger nearby.
Kutliroff sees this technology reaching far into the commercial sphere, creating virtual salespeople and providing virtual clothes-fitting.
It could aid physical rehabilitation by sending 3-D images of a patient in real time to a remote therapist. Security is also a potential avenue as cameras learn to identify people based on the way their bodies move.
"These are the applications that are going to push this technology further than the use in game consoles alone," said Kutliroff.
So far, gaming has been the only mainstream use of gesture-recognition technology. Microsoft Corp's Kinect add-on for the Xbox - which allows gamers to move avatars on screen through body motions alone - has already sold 8 million units in just over two months on the market.
The logical next step is for Microsoft to bring the same technology to its Windows operating system, allowing users to manipulate documents or move photos around a screen or projection, as Tom Cruise does in Minority Report.
In the meantime, Microsoft's research labs are working on intriguing new possibilities, such as its "Skinput" project, a way of controlling devices just by touching your arm or hand in different places, and its LightSpace project for manipulating virtual documents, the closest it has come to the Minority Report scenario.
But not everyone is convinced that a touch-free revolution is nigh. Many people still like to touch objects, and the mouse will probably remain the tool of choice for precise jobs, from desktop publishing to graphic design, said IDC analyst Al Hilwa.
"These user interfaces will continue to be the most effective for many areas, but overall, technology marches towards more diversity and alternatives."
Other companies at the show demonstrated alternative approaches to hands-free interaction with computers. Norway's Elliptic Labs, for example, showed a way of manipulating an iPad with close-up hand gestures, using reflected sound waves, or ultrasonics, rather than light.
"We are moving from an era in which we think about computers as things with screens and keyboards to an era where the computer becomes invisible and pervasive in our lives," said Peter Haynes, a Microsoft director.
"Natural user interface" technologies on display at the Consumer Electronics Show, held in Las Vegas last week, suggest this vision is inching closer to the mainstream, tearing down barriers between user and device, and dispensing with unwieldy keyboards and remote controls.
The technology, stoked in the public imagination by the sci-fi hit film Minority Report, could be approaching a tipping point, according to industry insiders.
"The idea of controlling everything without touching it - technology is moving that way faster than ever," said Janine Kutliroff, CEO and founder of Omek Interactive, an Israeli company that makes software for gesture recognition through 3-D sensors and cameras, so you can play games or manipulate a television just by moving your hands and body.
"Cameras are going to get smaller and cheaper. There's a lot of competitive technology out there," said Kutliroff, whose company was one of a handful showing off gesture-control technologies at the Vegas exhibition.
The opportunities for using sensors, cameras and voice recognition to make everyday objects "intelligent" are almost endless, promoters of the new technology say.
"We see a whole world where machines interact with you," said Uzi Breier, chief marketing officer for PrimeSense, an Israeli company that built some of the technology behind Microsoft's Kinect system, and that has teamed up with Taiwan's Asustek to bring gesture-controlled TV and computer functions to living room screens this year.
His company is providing sensors for iRobot Corp's latest robots, including a new Roomba automatic vacuum cleaner that can "see" dirt and head for it, rather than trundling aimlessly around the room.
He predicts that sensors will soon be used in homes so that heating and cooling systems can recognise who is in a room and set the temperature to predetermined levels, or in cars to adjust settings according to the driver.
Another company, Omek, is helping pioneer digital signage - signs in stores that interact with shoppers and can initiate conversations when people walk close or hover in an area.
Kutliroff sees this technology reaching far into the commercial sphere, creating virtual salespeople and providing virtual clothes-fitting.
It could aid physical rehabilitation by sending 3-D images of a patient in real time to a remote therapist. Security is also a potential avenue as cameras learn to identify people based on the way their bodies move.
"These are the applications that are going to push this technology further than the use in game consoles alone," said Kutliroff.
So far, gaming has been the only mainstream use of gesture-recognition technology. Microsoft Corp's Kinect add-on for the Xbox - which allows gamers to move avatars on screen just through body motions - has already sold 8million units in just over two months on the market.
The logical step is for Microsoft to bring the same technology to its Windows operating system, allowing users to manipulate documents or move photos around a screen or projection, as Tom Cruise does in Minority Report.
In the meantime, Microsoft's research labs are working on intriguing new possibilities, such as its "Skinput" project, a way of controlling devices just by touching your arm or hand in different places, and its LightSpace project for manipulating virtual documents, the closest it has come to the Minority Report scenario.
But not everyone is convinced that the technological revolution of virtual controlling is nigh. Many people still like to touch objects, and the mouse will probably remain the tool of choice for precise jobs, from desktop publishing to graphic design, says IDC analyst Al Hilwa.
"These user interfaces will continue to be the most effective for many areas, but overall, technology marches towards more diversity and alternatives."
Other companies at the show showed alternative uses for hands-free interaction with computers, for example, Norway's Elliptic Labs demonstrated a way of manipulating an iPad with close-up hand gestures, based on reflected sound waves, or ultrasonics, rather than light.
"We are moving from an era in which we think about computers as things with screens and keyboards to an era where the computer becomes invisible and pervasive in our lives," said Peter Haynes, a Microsoft director.
Source - Reuters