"MIT researchers have created an AI navigation method that utilizes language to guide robots, employing a large language model to process textual descriptions of visual scenes and generate navigation steps, simplifying the training process and enhancing adaptability to different environments." (ScitechDaily, From Text to Trajectory: How MIT’s AI Masters Language-Guided Navigation)
The new language model is revolutionary for robotics. Most of us know the AI-based image-generation systems that turn text into images: the user only writes a description, and the AI produces images that follow it.
MIT's new language model goes one step further. The AI turns descriptions into a physical robot's actions. This means that AI-controlled robots could clean houses and perform many other tasks simply by following instructions. Such robots are next-generation tools for many purposes, such as housekeeping and cleaning, and of course, the same robots that clean houses can also operate in the military sector.
In those systems, the user describes what the robot should do to a large language model (LLM). In theory, the robot could take orders over an ordinary GSM telephone, and it could also take orders by text message or email. The owner could tell the robot to clean the house and wash the laundry while that person is out.
The command system converts those words into text and passes it to the LLM, which activates the robot's control software. The software's principle is the same as in image-generating AI. The robot operates much faster if it has a floor plan of the house, and it can use assistants such as small robot wagons or miniature quadcopters that detect where there is dirt.
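The caption-to-LLM-to-action pipeline described above can be sketched in a few lines of Python. This is only an illustrative stub, not MIT's actual system: the function and rule names are hypothetical, and the LLM call is replaced by a trivial keyword placeholder where a real system would query an actual language model.

```python
def describe_scene(observations):
    """Turn raw sensor observations into a textual scene description."""
    return "You see " + ", ".join(observations) + "."

def llm_next_action(instruction, scene_description):
    """Placeholder for an LLM call.

    A real system would prompt a large language model with the
    instruction and the scene description, then parse its reply
    into a navigation step. Here a keyword rule stands in for that.
    """
    if "door" in scene_description and "kitchen" in instruction:
        return "go through the door"
    return "move forward"

def navigate(instruction, observations):
    # One step of the loop: observe -> describe in text -> ask LLM.
    scene = describe_scene(observations)
    return llm_next_action(instruction, scene)

print(navigate("clean the kitchen", ["a door", "a hallway"]))
# -> go through the door
```

The point of the design is that every stage works on plain text, which is why the training process is simpler than for vision-based navigation models.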
The cleaning robots might have so-called ghost protocols that let them operate on security missions. If the alarm sounds, those robots could apprehend an intruder, which makes them multi-mission tools. They could carry infrared sensors that let them operate in the dark, and such systems could also uncover hazards like a fire in the house.
The fact is this: human-looking androids, or terminators, can already operate in Ukraine. Those androids have rubber skin and are hard to distinguish from real humans. Law enforcement and intelligence officials are interested in them because they could perform covert operations such as infiltrating drug gangs, and nobody can bribe a robot. Robots can relay everything they see and hear to their operators.
The other side is this: AI-based robot technology in the wrong hands is dangerous. Those language models must be built so that ordinary people cannot order robots to hurt or kill somebody. AI-based robots that use image-recognition systems could search the streets for anybody.
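The safeguard described above could start with something as simple as screening commands before they ever reach the LLM or the robot's control software. The sketch below is a hypothetical keyword filter, far cruder than the safety classifiers a real deployment would need, and the term list and function name are my own assumptions.

```python
# Hypothetical deny-list of obviously harmful intents.
BLOCKED_TERMS = ("hurt", "kill", "attack", "harm")

def screen_command(command):
    """Refuse to forward commands containing obviously harmful intent.

    Returns the command unchanged if it looks safe, or None if it
    should be rejected before reaching the robot's control software.
    """
    lowered = command.lower()
    if any(term in lowered for term in BLOCKED_TERMS):
        return None
    return command

print(screen_command("clean the house"))          # forwarded unchanged
print(screen_command("attack the neighbor"))      # refused: prints None
```

A keyword check like this is easy to fool with rephrasing, which is exactly why the text argues the restriction must be built into the language model itself rather than bolted on afterward.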
R&D people must realize this: the robot is like a dog. It is dangerous if it has the skills and equipment. The terminators could carry extremely powerful loudspeakers that paralyze targets. In the scariest scenarios, a robot could even carry a nuclear bomb. These kinds of solutions are next-generation tools for military operations.
https://scitechdaily.com/from-text-to-trajectory-how-mits-ai-masters-language-guided-navigation/