Wednesday, 29 May 2013

The robot butler that can tend to your every need - even predicting when you ... - Daily Mail

Debarjun Saha | 06:16
  • The robot, developed at Cornell University, uses Kinect sensors, 3D cameras and a database of videos to work out what its owner wants
  • In tests, the robot correctly anticipated its owner's needs 82% of the time

By Victoria Woollaston


A beer-pouring robot that can read your body movements and anticipate when you want another drink has been developed by American students.

Researchers from Cornell University used Microsoft Kinect sensors and 3D cameras to help the robot analyse its surroundings and identify its owner's needs.

The robot then uses a database of videos showing 120 household tasks to identify nearby objects, generate a set of possible outcomes and choose which action it should take - without being told.
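The anticipate-then-act idea described above can be sketched in a few lines. This is a simplified illustration, not the Cornell team's actual system: the activity library, action labels and `anticipate` function are all hypothetical stand-ins for what the robot learns from its video database.

```python
# Hypothetical sketch of anticipate-then-act: score each candidate future
# action by how often it followed the observed context in a library of
# recorded household activity sequences, then pick the most likely one.

from collections import Counter

# Toy "database of videos", flattened into sequences of labelled actions.
ACTIVITY_LIBRARY = [
    ["hold_empty_bottle", "walk_to_fridge", "open_fridge", "take_beer"],
    ["hold_empty_bottle", "walk_to_bin", "drop_bottle"],
    ["hold_pot", "walk_to_fridge", "open_fridge", "place_pot"],
]

def anticipate(observed):
    """Return the action that most often followed `observed` in the library."""
    nexts = Counter()
    for seq in ACTIVITY_LIBRARY:
        for i in range(len(seq) - len(observed)):
            if seq[i:i + len(observed)] == observed:
                nexts[seq[i + len(observed)]] += 1
    return nexts.most_common(1)[0][0] if nexts else None

# Owner holds an empty bottle and walks to the fridge: the library suggests
# the fridge is about to be opened, so the robot can do it pre-emptively.
print(anticipate(["hold_empty_bottle", "walk_to_fridge"]))  # open_fridge
```

The real system scores many possible futures probabilistically rather than by simple counting, but the shape of the decision - match context against past examples, act on the most likely continuation - is the same.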


A robot developed by researchers from Cornell University uses Kinect sensors, 3D cameras and a database of household task videos to anticipate its owner's needs. For example, it scans the surrounding area for clues and when it spots an empty beer bottle, can open the fridge, pick up a full bottle of beer and hand it to its owner - without being told


The Cornell robot uses sensors and a 3D camera to analyse the depth of its surroundings (left). The view seen by the robot in the right-hand picture shows how it anticipates its owner's actions. It compares the actions against a database of household task videos and chooses what it thinks is the most appropriate response. The more actions the robot carries out, the more accurate its decisions become

BEER DRONE WILL DELIVER DRINKS TO FESTIVAL GOERS FROM ABOVE

Festival-goers in South Africa this summer will be able to order beer from their smartphones and have it delivered by a flying drone dropping a can attached to a parachute.

The drone has been developed by Darkwing Aerials and will be tested at the Oppikoppi music festival in the Limpopo province of South Africa this August.

Customers will be able to place their drink orders through an iOS app that will send their GPS coordinates to the drone operators.  

As the actions continue, the robot can constantly update and refine its predictions. 

As well as fetching drinks for thirsty owners, the robot can also work out when its owner is hungry and put food in a microwave, tidy up, make cereal, fetch a toothbrush and toothpaste, open fridge doors and more.

Ashutosh Saxena, Cornell professor of computer science and co-author of a new study tied to the research, said: 'We extract the general principles of how people behave.

'Drinking coffee is a big activity, but there are several parts to it.

'The robot builds a 'vocabulary' of such small parts that it can put together in various ways to recognise a variety of big activities.'
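The 'vocabulary' Saxena describes can be pictured as big activities defined as sequences of small sub-activity parts, with recognition narrowing down the candidates as more parts are observed. The sketch below is an illustrative toy, not the published model; the activity names and part labels are invented for the example.

```python
# Hypothetical sketch of the "vocabulary" idea: each big activity is a
# sequence of small sub-activity parts, and a big activity remains a
# candidate while the parts observed so far are a prefix of its sequence.

ACTIVITIES = {
    "drink_coffee": ["reach_cup", "lift_cup", "sip", "place_cup"],
    "refill_cup":   ["reach_kettle", "lift_kettle", "pour", "place_kettle"],
    "tidy_up":      ["reach_cup", "lift_cup", "walk_to_sink", "place_cup"],
}

def candidate_activities(observed_parts):
    """All big activities consistent with the sub-activities seen so far."""
    return [name for name, parts in ACTIVITIES.items()
            if parts[:len(observed_parts)] == observed_parts]

# After two observed parts, two big activities are still possible:
print(candidate_activities(["reach_cup", "lift_cup"]))
# A third observation disambiguates them:
print(candidate_activities(["reach_cup", "lift_cup", "sip"]))
```

Note how the same small parts ('reach_cup', 'lift_cup') appear in several big activities - that reuse is exactly why a vocabulary of parts covers many activities with little data.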

The Cornell robot can also help its owner tidy up. In this image, the robot scanned the area and noticed that its owner was carrying a pot of food and heading towards the fridge. The robot then automatically opened the fridge door. During tests, the robot made correct predictions 82% of the time when looking one second into the future, 71% correct for three seconds and 57% correct for 10 seconds

The robot was initially programmed to refill a person's cup when it was nearly empty.

To do this the robot had to plan its movements in advance and then follow this plan.

But if a human sitting at the table happened to raise the cup and drink from it, the robot was thrown off and could end up pouring the drink into a cup that was no longer there.

After extra programming, the robot was updated so that when it sees the human reaching for the cup, it anticipates the action and avoids making a mistake.
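That behaviour - following a pre-computed plan but re-checking observations before the risky step - can be sketched as below. This is a toy illustration under invented names (`execute_pour_plan`, the observation labels), not the robot's real control code.

```python
# Hypothetical sketch: a fixed pour plan is executed step by step, but the
# robot re-checks its observation of the human before the risky step and
# waits instead of pouring while the cup is being lifted.

def execute_pour_plan(plan, observe):
    """Run `plan`; before pouring, wait while the human is using the cup."""
    log = []
    for step in plan:
        # Anticipation check: don't pour while the human holds the cup.
        while step == "pour" and observe() == "human_reaching_for_cup":
            log.append("wait")
        log.append(step)
    return log

# The human grabs the cup just before the pour, then puts it back ("idle").
observations = iter(["human_reaching_for_cup", "idle"])
result = execute_pour_plan(
    ["approach_table", "align_with_cup", "pour"],
    lambda: next(observations, "idle"),
)
print(result)  # ['approach_table', 'align_with_cup', 'wait', 'pour']
```

Without the anticipation check, the plan would pour on schedule regardless of where the cup actually is - the failure mode described above.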

During tests, the robot made correct predictions 82 per cent of the time when looking one second into the future, 71 per cent correct for three seconds and 57 per cent correct for 10 seconds.

This image shows the robot anticipating its owner walking towards a fridge and automatically opening the fridge door for him. The first three images show the robot's view; the fourth is from the owner's viewpoint

'Even though humans are predictable, they are only predictable part of the time,' Saxena said.

'The future would be to figure out how the robot plans its action.

'Right now we are almost hard-coding the responses, but there should be a way for the robot to learn how to respond.'

Saxena and Cornell graduate student Hema S. Koppula will present their research at the International Conference on Machine Learning in Atlanta in June.

They will also demonstrate the robot at the Robotics: Science and Systems conference in Berlin, Germany, also in June.

VIDEO: Could this be the future? Robot learns how to pour you a beer 




via Science - Google News http://news.google.com/news/url?sa=t&fd=R&usg=AFQjCNERWSjhyQLLzqqYhinZHdzTGg3hjg&url=http://www.dailymail.co.uk/sciencetech/article-2332547/The-robot-butler-tend-need--predicting-want-beer-AND-pouring-you.html?ito=feeds-newsxml



