How Can an Artificial Brain Enable Robots to Perform Complex Tasks?

Artificial brains could mimic biological neural networks and make robots smarter

Recent advances in robotics have made it possible to perform complex tasks that were previously risky for human workers. It is no wonder that robots are gaining unprecedented interest as companies seek to deliver a seamless customer experience. Robots today can pick up a can of soft drink or cook food much as humans do. Yet they still have much to learn: to grasp reliably, a robot must locate an object, infer its shape, determine the right amount of force, and hold the object without letting it slip. To perform such tasks with precision, robots need an extraordinary sense of touch equivalent to human skin.

To this end, a team of computer scientists and materials engineers from the National University of Singapore (NUS) has recently developed a sensory integrated artificial brain system. The system mimics biological neural networks and can run on a power-efficient neuromorphic processor, such as Intel's Loihi chip. It also integrates artificial skin and vision sensors, equipping a robot to draw accurate conclusions about an object it grasps from the visual and tactile data captured in real time.
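The article does not describe the system's software, but the core idea of fusing asynchronous touch and vision events into a single decision can be illustrated with a short sketch. Everything below is a hypothetical stand-in: the simulated event streams, the time-binned features, and the untrained linear classifier are assumptions standing in for the spiking network that runs on the Loihi chip.

```python
import numpy as np

rng = np.random.default_rng(0)

def read_tactile_events(n=64):
    """Simulated tactile events: (time_ms, taxel_id, pressure)."""
    return [(rng.uniform(0, 10), rng.integers(0, 16), rng.uniform(0, 1.0))
            for _ in range(n)]

def read_vision_events(n=64):
    """Simulated event-camera output: (time_ms, pixel_id, polarity)."""
    return [(rng.uniform(0, 10), rng.integers(0, 256), rng.choice([-1, 1]))
            for _ in range(n)]

def bin_events(events, n_bins=5, horizon_ms=10.0):
    """Accumulate event values into fixed time bins to form a feature vector."""
    feats = np.zeros(n_bins)
    for t, _, v in events:
        feats[min(int(t / horizon_ms * n_bins), n_bins - 1)] += v
    return feats

# Fuse the two modalities by concatenating their binned features, then apply
# a randomly initialized, untrained linear classifier as a placeholder for
# the spiking network described in the article.
tactile = bin_events(read_tactile_events())
vision = bin_events(read_vision_events())
fused = np.concatenate([tactile, vision])

weights = rng.normal(size=(3, fused.size))   # 3 made-up object classes
scores = weights @ fused
print("predicted class:", int(np.argmax(scores)))
```

The point of the sketch is the structure, not the classifier: both modalities arrive as sparse, asynchronous events and are merged before any decision is made, which is what allows a response within milliseconds.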

While the field of robotic manipulation has made great progress in recent years, fusing vision and tactile information to provide a highly precise response within milliseconds remains a technological challenge, according to Assistant Professor Benjamin Tee of the NUS Department of Materials Science and Engineering. He said, "Our recent work combines our ultra-fast electronic skins and nervous systems with the latest innovations in vision sensing and AI for robots so that they can become smarter and more intuitive in physical interactions."

Enabling a Sense of Touch in Robotics

Researchers are now exploring ways to give robots a human-like sense of touch, and recent years have seen great strides in that direction. For the new robotic system, the NUS researchers used the Asynchronous Coded Electronic Skin (ACES), an advanced artificial skin they developed in 2019. The skin has ultra-high responsiveness, is robust to damage, and can be paired with any kind of sensor skin layer to function effectively as an electronic skin.

Developed by a team led by Asst Prof Tee, ACES is made up of a network of sensors connected through a single electrical conductor, unlike current electronic skins whose interlinked wiring systems can make them sensitive to damage and difficult to scale up. The novel sensor detects touch more than 1,000 times faster than the human sensory nervous system and can identify the shape, texture and hardness of an object 10 times faster than the blink of an eye. ACES can also adapt and remain functional after physical damage, largely because all of its sensors connect to a common electrical conductor while each sensor operates independently.
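As a rough illustration of how many independently operating sensors can share a single conductor, here is a hedged sketch in which each sensor transmits a unique pseudo-random pulse signature and a receiver recovers which sensors fired by correlating the combined line signal against each known signature. The signature scheme, lengths, and threshold are illustrative assumptions, not the actual ACES encoding.

```python
import numpy as np

# Illustrative sketch (not the actual ACES protocol): many sensors share one
# conductor, each transmitting its own fixed pseudo-random pulse signature.
# Because the signatures are nearly orthogonal, the receiver can tell which
# sensors fired, even when their pulses overlap on the line.

rng = np.random.default_rng(1)
SIG_LEN = 128
N_SENSORS = 16

# Each sensor is assigned a unique +/-1 signature (assumed fixed at build time).
signatures = rng.choice([-1.0, 1.0], size=(N_SENSORS, SIG_LEN))

# Simulate three sensors firing at once on the shared conductor, plus noise.
fired = {2, 7, 11}
line_signal = sum(signatures[i] for i in fired) + rng.normal(0, 0.3, SIG_LEN)

# Decode: correlation with a sensor's signature is high only if it fired.
correlations = signatures @ line_signal / SIG_LEN
decoded = {i for i, c in enumerate(correlations) if c > 0.5}
print("fired:", sorted(fired), "decoded:", sorted(decoded))
```

Because each sensor's signature can be decoded independently, losing some sensors does not take down the others, which is one way to read the article's claim that ACES stays functional after physical damage.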

Scientists elsewhere are also exploring ways to give robots a human-like sense of touch. In a study published in the journal Science Robotics, Stanford scientists developed an electronic glove containing sensors that could give robotic hands a human-like sense of touch and dexterity. According to the researchers, the sensors in the glove's fingertips simultaneously measure the intensity and direction of pressure, two qualities essential to achieving manual dexterity.
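The study's signal processing is not described here, so the following is only a hedged sketch of how a small grid of pressure readings could yield both an intensity and a direction estimate. The 3x3 readings and the centroid-based direction proxy are invented for illustration and are not the Stanford glove's actual method.

```python
import numpy as np

# Hypothetical 3x3 taxel grid under a fingertip; values are simulated pressures.
readings = np.array([[0.1, 0.3, 0.6],
                     [0.2, 0.9, 1.2],
                     [0.1, 0.4, 0.7]])

intensity = readings.sum()               # total normal force (arbitrary units)

# Direction: offset of the pressure centroid from the grid center indicates
# which way the contact is skewed (a crude proxy for tangential loading).
ys, xs = np.mgrid[0:3, 0:3]
cx = (xs * readings).sum() / intensity - 1.0
cy = (ys * readings).sum() / intensity - 1.0
angle = np.degrees(np.arctan2(cy, cx))

print(f"intensity={intensity:.2f}, centroid offset=({cx:.2f}, {cy:.2f}), "
      f"angle={angle:.1f} deg")
```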
