New advances in artificial intelligence (AI) are enabling companies to develop robots with better features and more effective human interaction.
Figure AI has reportedly raised $675 million to develop AI-powered humanoid robots. Among the investors are Jeff Bezos’ Explore Investments, alongside technology firms such as Microsoft, Amazon, Nvidia, OpenAI and Intel. Experts say the investment reflects the growing excitement that AI has brought to the field of robotics.
“AI can enable robots to better understand their environments, allowing them to better detect objects and people they come across,” Sarah Sebo, an assistant professor of computer science at the University of Chicago, where she directs the Human-Robot Interaction (HRI) Lab, told PYMNTS in an interview.
“AI can allow robots to generate language to more fluidly communicate with people and respond more intelligently to human speech. Finally, AI can help robots to adapt and learn to perform tasks better over time by receiving feedback, making it more likely that the robot behaves in ways that receive positive feedback and less likely to behave in ways that receive negative feedback.”
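That feedback loop is, at heart, a simple reinforcement-learning idea. The sketch below is a deliberately minimal illustration of it, not any particular robot’s implementation: the action names, the +1/-1 feedback signal and the scoring scheme are all invented for the example.

```python
import math
import random

# Hypothetical action set; a real robot learns over a far richer behavior space.
actions = ["wave", "approach_slowly", "approach_quickly"]
scores = {a: 0.0 for a in actions}  # learned preference per behavior

def choose_action():
    """Sample a behavior, favoring higher-scoring ones (softmax weighting)."""
    weights = [math.exp(scores[a]) for a in actions]
    return random.choices(actions, weights=weights, k=1)[0]

def update(action, feedback, lr=0.5):
    """Nudge the score up on positive feedback (+1), down on negative (-1)."""
    scores[action] += lr * feedback

# One interaction loop: act, observe the human's reaction, adapt.
for _ in range(50):
    action = choose_action()
    feedback = 1 if action == "approach_slowly" else -1  # stand-in for a real human signal
    update(action, feedback)

print(max(scores, key=scores.get))  # "approach_slowly" ends up preferred
```

Over repeated interactions, behaviors that draw positive reactions become more probable and the rest fade, which is exactly the adaptation Sebo describes.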
Last March, Figure AI introduced the Figure 01 robot, a versatile humanoid robotic assistant designed for a range of tasks, from industrial duties to domestic chores. Equipped with AI software, this robot mimics human movements and interactions.
The company envisions the robot taking over risky, monotonous and dull tasks, freeing human workers to concentrate on roles that demand creativity and skill. The innovation aims not only to address labor shortages in the manufacturing sector but also to enhance safety in the workplace.
Figure AI is among several companies venturing into AI robotics, with various firms creating humanoid robots. For instance, 1X Technologies, a Norwegian startup backed by OpenAI, secured $100 million earlier this year to develop its humanoid robots. Similarly, Sanctuary AI, a Canadian startup, is developing a robot named Phoenix, touted as possessing human-like intelligence and emotions.
The field of robotics is moving fast. Machine learning is making it easier for people to train robots for their specific needs instead of depending on the manufacturer to program all the skills the robot might need for every situation, Dinesh Jayaraman, an assistant professor at the University of Pennsylvania who studies robotics, told PYMNTS in an interview.
For instance, an older person could show their robot assistant how to make a favorite dish once, and the robot could then help prepare it, or even make it on its own, the next time. Similarly, a child could teach a robot pet to clean up their room after playing, much as they would train a real pet.
“The key here is that the experts only develop a single underlying learning algorithm, and anyone afterward should be able to teach the robot to pick up a new skill using that algorithm,” he added.
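What Jayaraman describes is learning from demonstration: experts ship one generic algorithm, and users supply the demonstrations. As a toy illustration only, the nearest-neighbor policy below is about the simplest possible “single underlying learning algorithm”; the state vectors and skill names are made up for the example, standing in for real sensor readings and motor commands.

```python
import numpy as np

class DemoPolicy:
    """A generic teach-by-demonstration policy: store (state, action) pairs
    from a human demo, then replay the action whose recorded state is
    nearest to the current one."""

    def __init__(self):
        self.states, self.actions = [], []

    def record(self, state, action):
        self.states.append(np.asarray(state, dtype=float))
        self.actions.append(action)

    def act(self, state):
        dists = [np.linalg.norm(np.asarray(state) - s) for s in self.states]
        return self.actions[int(np.argmin(dists))]

# The same algorithm learns whichever skill the user demonstrates.
policy = DemoPolicy()
policy.record([0.0, 0.0], "open_cupboard")   # hypothetical kitchen states
policy.record([0.5, 0.2], "grab_pan")
policy.record([0.9, 0.4], "turn_on_stove")

print(policy.act([0.48, 0.22]))  # -> "grab_pan"
```

The cooking skill and a room-tidying skill would both flow through the same `record` and `act` interface; only the demonstrations differ.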
Large language models (LLMs) are one of the most notable recent developments in the use of AI for robotics, Sebo said. The models can generate executable code for robots, turning high-level human commands, such as “find the Coke,” into instructions that a robot or drone can execute.
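Published research systems such as Google’s “Code as Policies” work roughly this way: the model is prompted with the robot’s available functions and a natural-language command, and the code it returns is executed. The sketch below imitates that pattern with invented pieces throughout; `ask_llm()` stands in for any real LLM API, and the robot primitives are hypothetical, not any product’s actual interface.

```python
# All names here are hypothetical stand-ins, not a real robot's API.

def find_object(name):
    print(f"scanning for {name}")
    return (2.0, 1.5)  # pretend camera-derived location

def navigate_to(location):
    print(f"driving to {location}")

def pick_up(name):
    print(f"grasping {name}")

def ask_llm(prompt: str) -> str:
    """Stand-in for a real LLM API call; returns a canned plan so this runs."""
    return 'loc = find_object("Coke")\nnavigate_to(loc)\npick_up("Coke")'

command = "find the Coke"
prompt = (
    "You control a robot with find_object(name), navigate_to(location) "
    f"and pick_up(name). Write only Python code for the command: {command}"
)
plan = ask_llm(prompt)
exec(plan)  # real systems vet generated code before running it on hardware
```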
“LLMs can also enable robots to speak more naturally and with varied language, allowing robots to move beyond a set of fixed verbal responses that remain constant during each interaction,” Sebo added. “Finally, LLMs allow robots to better understand human speech by providing the LLM with human speech and asking the LLM questions such as, ‘Is this person sad or happy?’”
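That last technique needs little machinery: the robot’s software simply hands the transcribed speech to the model along with a direct question. A minimal sketch, again with `ask_llm()` standing in for a real LLM call and a canned answer so the snippet runs on its own:

```python
def ask_llm(prompt: str) -> str:
    """Stand-in for a real LLM call; canned answer so the snippet runs."""
    return "sad"

transcript = "I dropped my medication again and I can't reach it."
answer = ask_llm(
    f'A person said: "{transcript}"\n'
    "Is this person sad or happy? Answer with one word."
)
if answer.strip().lower() == "sad":
    print("switching to a gentler tone and offering help")
```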
Another breakthrough in robotics has been the advancement of object recognition algorithms, Iu Ayala, founder and CEO of Gradient Insight, a data science and AI consulting firm, told PYMNTS in an interview. These algorithms have reached new heights of precision, enabling robots to accurately identify a broad spectrum of objects, textures and even minor environmental changes.
“Think about a scenario where a robot, equipped with advanced computer vision, can seamlessly differentiate between similar-looking products on a manufacturing line or precisely identify defective items in real-time,” he said. “This not only enhances efficiency but also ensures a level of quality control that was once unimaginable.”
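Prototyping this kind of detection has become straightforward with off-the-shelf models. The sketch below runs torchvision’s pretrained Faster R-CNN detector over a single frame; the image filename and the 0.8 confidence threshold are arbitrary choices for illustration, and a real inspection line would fine-tune a model on its own parts and defects rather than use generic COCO classes.

```python
import torch
from PIL import Image
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn,
    FasterRCNN_ResNet50_FPN_Weights,
)
from torchvision.transforms.functional import to_tensor

# Pretrained, general-purpose detector (COCO object classes).
weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights).eval()
categories = weights.meta["categories"]

image = Image.open("line_camera_frame.jpg").convert("RGB")  # hypothetical capture
with torch.no_grad():
    (result,) = model([to_tensor(image)])

# Keep confident detections only; 0.8 is an arbitrary demo threshold.
for box, label, score in zip(result["boxes"], result["labels"], result["scores"]):
    if score > 0.8:
        print(f"{categories[int(label)]} at {box.tolist()} (confidence {score:.2f})")
```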
While industrial and stationary robots have been around for decades, AI could let them move about more freely. AI enables robots to do more than just repeat a preprogrammed routine, Erik Nieves, CEO and co-founder of Plus One Robotics, which makes AI vision software for robotic parcel handling, told PYMNTS in an interview. Using AI, robots can detect changes in their surroundings and use algorithms to make decisions on the fly.
“One of the keys to this accomplishment is 3D sensor technologies that enable robots to understand their environment much more deeply,” he added. “Now, robots aren’t just operating in a predictable use case; with minimal guidance, they can intelligently see and adapt to novel situations.”
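To see why 3D sensing matters, consider what a depth camera actually provides: a distance for every pixel, which standard pinhole-camera math back-projects into a 3D point cloud the robot can reason about. In the sketch below, the camera intrinsics and the synthetic depth frame are made-up values for a hypothetical sensor.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) into an N x 3 point cloud using
    the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    points = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop pixels with no depth reading

# Made-up intrinsics and a synthetic 4x4 depth frame for illustration.
depth = np.full((4, 4), 1.2)  # everything 1.2 m away
cloud = depth_to_point_cloud(depth, fx=525.0, fy=525.0, cx=2.0, cy=2.0)
print(cloud.shape)  # (16, 3): one 3D point per pixel

# With points in hand, the robot can, e.g., flag anything within 0.5 m.
too_close = cloud[np.linalg.norm(cloud, axis=1) < 0.5]
print(len(too_close), "points within 0.5 m")
```

Once the scene is a set of 3D points rather than a flat image, questions like “is something blocking my path?” reduce to geometry.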
Advances in robotics could have a dramatic economic impact. Wherever there are dull, dangerous or difficult jobs that humans find undesirable, there is a potential role for robots, Jayaraman said. He said small-scale manufacturing, inspection and assembly line tasks are key areas where labor shortages and poor working conditions could soon drive the adoption of autonomous or semi-autonomous robots.
“Areas that involve physically interacting with people, such as nursing or elderly care, pose safety risks that we do not yet have a good handle on, and where I expect that we will need to tread more carefully — we are already seeing some of this with the delays in the arrival of truly autonomous driving technologies,” he added.