Researchers at MIT's Computer Science and Artificial Intelligence Lab have developed a robot that can predict how an object 'feels' simply by looking at it.

The machine uses a novel AI algorithm that has been trained on a mix of tactile and visual data.

Ultimately, the researchers believe, this work could make warehouse robots more efficient at handling objects of different types.

Here's more.

MIT's machine imagines the feeling of touching an object

The robot developed by MIT researchers - a KUKA robotic arm - captures visual data representing an object.

It processes this information with a sophisticated AI engine and predicts how it would feel to touch the object it sees.

But here's the thing: the robot also works the other way around - it can touch an object and predict what it would look like.

But how does all of this happen?

The algorithm achieves this by learning the connection between tactile and visual data collected by the researchers (a rough illustration in code appears at the end of this article).

First, the team fitted the robotic arm with a special tactile sensor called GelSight and had it touch 200 household objects 12,000 times.

Then, the tactile data captured by the robot and the visual data representing the objects were fed into the AI algorithm.

This could make robots more efficient in real-world settings

As of now, the AI-powered robotic arm works only in controlled environments. However, the team hopes to expand the system's capabilities by training it on a wider set of data.

They hope the AI will be able to use these capabilities in real-world environments - such as a warehouse - to differentiate between different types of objects and handle them appropriately.
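To give a flavour of what "connecting tactile and visual data" could look like in practice, here is a minimal sketch of a vision-to-touch predictor trained on paired image/tactile examples. Everything in it - the encoder-decoder layout, the tensor shapes, the MSE loss, and the synthetic stand-in data - is an assumption made purely for illustration; it is not MIT's actual model, dataset format, or training procedure.

```python
# Illustrative sketch only: a tiny vision-to-touch regressor trained on
# paired (visual image, tactile reading) examples. All names, shapes and
# the network layout are assumptions for demonstration, not MIT's model.
import torch
import torch.nn as nn

class VisionToTouch(nn.Module):
    """Maps an RGB image of an object to a predicted tactile image
    (hypothetical 1x32x32 GelSight-style output)."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(          # compress the visual input
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(          # expand into a tactile map
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1),
        )

    def forward(self, rgb):
        return self.decoder(self.encoder(rgb))

# Stand-in for the paired dataset (200 objects touched ~12,000 times):
# random tensors here, purely so the sketch runs end to end.
images = torch.rand(64, 3, 32, 32)    # visual frames
touches = torch.rand(64, 1, 32, 32)   # matching tactile readings

model = VisionToTouch()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(5):
    pred = model(images)               # predict touch from sight
    loss = loss_fn(pred, touches)      # compare with the recorded touch
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```

The reverse direction described above (predicting what an object looks like from a touch reading) could be sketched symmetrically by swapping the inputs and outputs of such a model.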