There's no question that robotics is transforming our world. Thanks to computerized machines, manufacturing, healthcare, agriculture, supply chains, retail, automotive, construction, and other industries are seeing rapidly increasing efficiencies and new capabilities.
One challenge with bringing new robots online is that it's hard, expensive, and time-consuming to train them for the task at hand. Once you've trained them, you have to retrain them with every minor tweak to the system. Robots are capable, but highly inflexible.
Some of the training is handled by software coding. Other methods use imitation learning, where a person teleoperates a robot (which, during training, essentially functions as a puppet) to generate the movement data the robot learns from.
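To make that idea concrete, here is a minimal behavior-cloning sketch in Python using PyTorch. The data shapes, network size, and training loop are illustrative assumptions, not any particular robot vendor's pipeline: teleoperated demonstrations are logged as observation/action pairs, and a small network is trained to reproduce the operator's actions.

```python
import torch
import torch.nn as nn

# Hypothetical logged demonstrations: 7 joint angles in, 7 joint commands out.
observations = torch.randn(1000, 7)   # stand-in for recorded robot states
actions = torch.randn(1000, 7)        # stand-in for the teleoperator's commands

# A small policy network that maps observations to actions.
policy = nn.Sequential(nn.Linear(7, 64), nn.ReLU(), nn.Linear(64, 7))
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(100):
    optimizer.zero_grad()
    loss = loss_fn(policy(observations), actions)  # imitate the demonstrated actions
    loss.backward()
    optimizer.step()

# After training, policy(new_observation) proposes an action with no teleoperator in the loop.
```

The catch, as noted above, is that a policy trained this way is tied to the exact robot, sensors, and task it was trained on; change any of them and the demonstrations typically have to be collected all over again.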
Compounding the difficulty is a lack of standards. Each robot manufacturer uses its own specialized programming language. The interfaces used for teaching robots, especially "teach pendants," tend to lack the modern attributes of the major, non-proprietary software development environments. (A teach pendant is a handheld control device that lets operators program and control a robot, enabling precise manipulation of its movements and functions.)
The lack of standards adds both complexity and costs for obvious reasons. Robot programming courses can cost thousands of dollars, and companies often need to train many employees on several robotics programming platforms.
To solve the enormous problems of robot training, MIT researchers are developing a radical, brilliant new method called Heterogeneous Pretrained Transformers, or HPTs.
The approach is roughly based on the same idea behind the large language models (LLMs) now driving the generative AI boom.
LLMs use vast neural networks with billions of parameters to process and generate text based on patterns learned from massive training datasets.
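As a simplified, concrete example of that idea, the snippet below assumes the open-source Hugging Face transformers library and the publicly available GPT-2 model (far smaller than today's billion-parameter systems) and generates text purely from patterns learned during pretraining:

```python
from transformers import pipeline

# Load a small pretrained language model (GPT-2) as a text-generation pipeline.
generator = pipeline("text-generation", model="gpt2")

# The model continues the prompt based on patterns learned from its training data.
result = generator("Robots in manufacturing are", max_new_tokens=20)
print(result[0]["generated_text"])
```

HPT aims to bring the same pretraining recipe to robot data: train one large model on a wide variety of existing data so that adapting it to a new robot or task requires far less task-specific training.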