It is fairly easy for a person to judge the density and hardness of an object just by looking at it. Just as easily, we can describe what an object looks like merely by touching it with our eyes closed. These abilities would help robots interact with objects, but until now they have been unavailable to them. Researchers at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) have addressed this problem by equipping a KUKA robotic arm with a GelSight tactile sensor, allowing an artificial intelligence to study the relationship between visual and tactile information and combine the two.
The GelSight tactile sensor used in the project was developed in 2014 by a group of engineers led by Ted Adelson. Essentially, it is an electronic copy of a human fingertip: it builds three-dimensional maps of a surface using a camera and a sensitive rubber film. The device has been tested repeatedly in real-world conditions; for example, it once helped a robot correctly plug a USB cable into a port.
Artificial intelligence combines touch and sight
In the new project, the sensor was mounted on a KUKA robotic arm and combined with artificial intelligence: the arm learned to determine an object's surface relief by sight alone, and to recognize its shape by touch alone. The system was trained on a set of 12,000 videos featuring 200 objects, including fabrics, tools, and household items. The videos were broken into frames, and from these the robot learned to associate tactile and visual information.
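The idea of pairing the two modalities can be illustrated with a toy sketch. This is not the CSAIL model (which is a learned neural network trained on the video frames); it is a hypothetical, minimal stand-in in which each object stores one illustrative "visual" feature vector and one "tactile" feature vector, and cross-modal lookup is done by cosine similarity. All object names and feature values below are invented for illustration.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical paired dataset: for each object, a visual feature vector
# (e.g. summarizing video frames) and a tactile feature vector
# (e.g. summarizing GelSight surface maps). Values are illustrative only.
paired_data = {
    "fabric": {"visual": [0.9, 0.1, 0.2], "tactile": [0.8, 0.2, 0.1]},
    "tool":   {"visual": [0.1, 0.9, 0.3], "tactile": [0.2, 0.9, 0.2]},
    "mug":    {"visual": [0.3, 0.2, 0.9], "tactile": [0.2, 0.3, 0.9]},
}

def identify_by_touch(tactile_reading):
    """'Touching blindly': return the object whose stored tactile
    signature best matches the new reading."""
    return max(paired_data,
               key=lambda name: cosine(paired_data[name]["tactile"],
                                       tactile_reading))

def imagine_touch(object_name):
    """'Looking at the scene': given a recognized object, recall the
    tactile signature associated with its visual appearance."""
    return paired_data[object_name]["tactile"]
```

For instance, `identify_by_touch([0.8, 0.2, 0.1])` returns `"fabric"`, since that reading matches the stored fabric signature. The real system replaces these lookup tables with learned encoders, which is what lets it generalize beyond exact matches, though so far only to objects seen during training.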
For now, the robot can work only in a controlled environment, and only with objects it already knows. The developers want to extend its capabilities by giving the artificial intelligence more data to learn from.
"Looking at a scene, our model can imagine the feeling of touching a flat surface or a sharp edge. Touching blindly, it can identify the shape of an object from tactile sensations alone. Combining these two senses can expand the robot's capabilities and reduce the amount of data it needs for tasks that involve manipulating and grasping objects," explained Yunzhu Li, a CSAIL graduate student.
Robots are constantly evolving, and by now they even know how to work in teams. For example, VelociRoACH, a robotic cockroach developed at the University of California, Berkeley, recently learned to help a fellow robot get back on its feet. You can read more about this and watch the video in our earlier article.