Pour Me a Glass
CMU researchers use AI to teach robots to see water

A horse, a zebra and artificial intelligence helped a team of Carnegie Mellon researchers teach a robot to recognize water and pour it into a glass.

Water presents a tricky challenge for robots because it is clear. Robots have learned how to pour water before, but earlier techniques, such as heating the water and using a thermal camera or placing the glass in front of a checkerboard background, don't transfer well to everyday life. An easier solution could enable robot servers to refill water glasses, robot pharmacists to measure and mix medicines, or robot gardeners to water plants.

Gautham Narasimhan, who earned his master's degree from the Robotics Institute in 2020, worked with a team in the institute's Robots Perceiving and Doing Lab to use AI and image translation to solve the problem.

Image translation algorithms use collections of images to train artificial intelligence to convert images from one style to another, such as transforming a photo into a Monet-style painting or making an image of a horse look like a zebra. For this research, the team used a method called contrastive learning for unpaired image-to-image translation, or CUT for short.
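For readers curious about the mechanics, the sketch below illustrates the core idea behind a CUT-style contrastive loss: a patch in the translated image is pulled toward the patch at the same location in the input and pushed away from patches elsewhere. This is not the CMU team's code; the tiny network, parameter values, and names here are illustrative assumptions, written in PyTorch for brevity.

```python
# Minimal sketch of a CUT-style patch contrastive (InfoNCE) objective.
# Illustrative only: a real CUT model uses a ResNet-based generator,
# an adversarial loss, and a small MLP projection head on the features.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyGenerator(nn.Module):
    """Toy encoder-decoder standing in for the image-translation network."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Conv2d(64, 3, 3, padding=1)

    def forward(self, x):
        feat = self.encoder(x)          # one feature vector per spatial location (patch)
        return self.decoder(feat), feat

def patch_nce_loss(feat_src, feat_out, num_patches=64, temperature=0.07):
    """Patches at matching locations are positives; all other sampled patches are negatives."""
    b, c, h, w = feat_src.shape
    feat_src = feat_src.flatten(2).permute(0, 2, 1)        # (B, H*W, C)
    feat_out = feat_out.flatten(2).permute(0, 2, 1)
    idx = torch.randperm(h * w)[:num_patches]              # sample patch locations
    q = F.normalize(feat_out[:, idx], dim=-1)              # queries from translated image
    k = F.normalize(feat_src[:, idx], dim=-1)              # keys from source image
    logits = torch.bmm(q, k.transpose(1, 2)) / temperature # (B, N, N) similarity matrix
    labels = torch.arange(num_patches, device=logits.device).expand(b, -1)
    return F.cross_entropy(logits.reshape(-1, num_patches), labels.reshape(-1))

# Toy training step: translate an image, then keep corresponding patches aligned.
gen = TinyGenerator()
opt = torch.optim.Adam(gen.parameters(), lr=2e-4)
x = torch.rand(2, 3, 64, 64)            # stand-in for input camera frames
y_fake, feat_src = gen(x)               # translated image + source patch features
_, feat_out = gen(y_fake)               # re-encode the translated image
loss = patch_nce_loss(feat_src, feat_out)
opt.zero_grad(); loss.backward(); opt.step()
```

Because positives are defined by spatial location rather than by paired "before/after" images, this kind of objective can learn a translation from unpaired collections of images, which is what makes the horse-to-zebra style of training possible.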