In 2026, we're seeing robots progress by leaps and bounds with markedly improved dexterity, the kind of progress long needed in the quest for truly useful household helpers. Now a new AI model has arrived to power robots through a range of activities, from folding laundry and constructing boxes to fixing other robots and even filling wallets with flimsy paper money.
Earlier this month, California-based company Generalist AI released Gen-1, a new physical AI model that makes robots capable of performing all of these tasks (and more). "It's a big step forward in terms of robots designed for the real world based on intelligence born from the real world," Pete Florence, co-founder and CEO of Generalist AI, told me.
In most of the example videos published by the company, Gen-1 is seen running on a pair of robotic arms, but that's not all it's built for. "Gen-1 is designed to be the brain of any robot, meaning the same model can run on a humanoid, an industrial arm or other robotic systems," said Florence.
Already, this has proved to be a breakthrough year for general-purpose humanoid robots, with companies including Boston Dynamics and Honor unveiling cutting-edge bots capable of uncannily humanlike movements. The market for robots is expected to explode, with one estimate from Morgan Stanley predicting growth to a $5 trillion market by 2050. Predictions see robots coming for industry, retail, hospitality and care environments before eventually landing in our homes. To get us there, we need to see further advances in AI.
Training robots to live alongside humans
Over the past few years we've seen large language models, such as ChatGPT, Gemini and Claude, evolve at lightning speed. The same hasn't been true of the physical AI models required to power robots, in large part because of a lack of data to train those models on. Robots -- and especially humanoid robots -- must learn to navigate a world built for humans just as a human would.
Often this data is collected from robots performing tasks while being teleoperated by humans, but not Gen-1. Instead, the dataset used to train Generalist AI's models has been assembled by humans completing millions of different tasks using wearable technology.
"We built our own lightweight 'data hands' and distributed them globally to learn how people actually interact with objects, with all the subtle force feedback, tactile feel, slips, corrections and recoveries that define human dexterity in the real world," said Florence. "That kind of data is critical for teaching robots physical common sense, the intuitive understanding and ability to adapt in real time rather than execute rigid instructions."
Generalist AI has released a series of videos showing the model running on robots as they repeatedly perform a range of different tasks, with the most compelling, perhaps, being a robot drawing cash out of a wallet before reinserting the bills into the same pocket. This is a fiddly task that many humans fumble over. It's clearly not easy for the robot, either, given the flimsiness of the paper money and the fabric of the wallet -- and yet it completes the task.
Another video shows a robot sorting socks by color, folding them in neat piles and counting the number of pairs using a touchscreen. Other tricky tasks the model can complete include unzipping and filling a pencil case with pens, stacking oranges in a neat pyramid and plugging in an Ethernet cable.