OpenAI appears to be ramping up its efforts in robotics, hiring researchers who work on humanoid systems as it explores new ways to advance artificial intelligence. The company has recently recruited a number of researchers with expertise in developing AI algorithms for controlling humanoid and other types of robots, and job listings show that it is assembling a team capable of creating systems that can be trained through teleoperation and simulation. Sources with knowledge of the company’s efforts also say OpenAI is recruiting people to work specifically on humanoid robots, meaning robots with a partial or full human form. One source who works in cutting-edge robotics says the company has begun training AI algorithms that are better able to make sense of the physical world and that could empower robots to navigate and perform tasks.

A number of recent hires suggest that OpenAI’s robotics efforts are now accelerating. Chengshu Li, for instance, joined OpenAI in June 2025 from Stanford University, where he worked on a number of robotics projects, including the development of a benchmark designed to measure the abilities of humanoid robots capable of performing a wide range of household chores. Li’s dissertation concerns the development of such benchmarks and focuses on robots with a partly humanoid form: two arms, but wheels instead of legs. Two other researchers from another robotics lab have already joined the company, according to their LinkedIn profiles. A professor at a third robotics lab that does humanoid work says one of their students was also recently recruited.

OpenAI declined to comment on its recruitment efforts or robot research plans. However, the company has recently posted a number of revealing robotics research job listings on its site. One opening requires expertise in teleoperation and simulation. Teleoperation is a crucial part of training partially or fully humanoid robots: a human operator performs chores and controls the limbs of the robot, while an algorithm learns how to mimic their actions. The role also calls for expertise in simulation tools, including Nvidia Isaac, which is widely used to train humanoids by having an algorithm learn inside a virtual physical environment.

It remains unclear whether OpenAI intends to build its own robots, use off-the-shelf hardware, or partner with a robotics company. However, another job posted in the past few weeks called for a mechanical engineer with expertise in prototyping and building robot systems with sensors for touch and motion. One roboticist says this could mean that OpenAI plans to build its own robot, or that it is developing teleoperation systems for robot training. The posting also calls for “experience designing mechanical systems intended for high volume (1M+), problem-solving on assembly lines,” which suggests systems that would be mass-produced, or that might even be deployed in manufacturing.

All of OpenAI’s robotics job openings say that the company’s robotics team “is focused on unlocking general-purpose robotics and pushing towards AGI-level intelligence in dynamic, real-world settings.”
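
For readers curious how training from teleoperation works in practice, the sketch below illustrates the general idea of imitation learning: recorded observation-action pairs from a human operator are used to supervise a policy network that learns to reproduce the operator's commands. The data, network sizes, and framework (PyTorch) are illustrative assumptions for this article, not details of OpenAI's systems.

```python
# Illustrative sketch of imitation learning from teleoperated demonstrations.
# All names, shapes, and data here are hypothetical; this is not OpenAI's code.
import torch
import torch.nn as nn

# Hypothetical demonstration data: each step pairs a robot observation
# (joint angles, gripper state, etc.) with the action the human operator took.
obs_dim, act_dim, num_steps = 32, 8, 1024
demo_obs = torch.randn(num_steps, obs_dim)       # stand-in for recorded observations
demo_actions = torch.randn(num_steps, act_dim)   # stand-in for operator commands

# A small policy network that maps observations to actions.
policy = nn.Sequential(
    nn.Linear(obs_dim, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, act_dim),
)
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)

# Behavior cloning: supervised regression of the operator's actions.
for epoch in range(10):
    pred = policy(demo_obs)
    loss = nn.functional.mse_loss(pred, demo_actions)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(f"final imitation loss: {loss.item():.4f}")
```

In real systems the same loop would run over demonstrations collected from a teleoperation rig or a simulator such as Nvidia Isaac, with far richer observations (camera images, force sensing) and larger models, but the training signal, mimicking what the human did, is the same.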