For decades, astronomers have wondered what the very first stars in the universe were like. These stars formed new chemical elements, which enriched the universe and allowed the next generations of stars to form the first planets.

The first stars were initially composed of pure hydrogen and helium, and they were massive—hundreds to thousands of times the mass of the Sun and millions of times more luminous. Their short lives ended in enormous explosions called supernovae, so they had neither the time nor the raw materials to form planets, and they should no longer exist for astronomers to observe.

At least that’s what we thought.

Two studies published in the first half of 2025 suggest that collapsing gas clouds in the early universe may have formed lower-mass stars as well. One study uses a new astrophysical computer simulation that models turbulence within the cloud, which causes it to fragment into smaller, star-forming clumps. The other study—an independent laboratory experiment—demonstrates how molecular hydrogen, a molecule essential for star formation, may have formed earlier and in greater abundance. The process involves a catalyst that may surprise chemistry teachers.

As an astronomer who studies star and planet formation and their dependence on chemical processes, I am excited by the possibility that chemistry in the first 50 million to 100 million years after the Big Bang may have been more active than we expected. These findings suggest that the second generation of stars—the oldest stars we can currently observe and possibly the hosts of the first planets—may have formed earlier than astronomers thought.

Primordial star formation

Stars form when massive clouds of hydrogen many light-years across collapse under their own gravity. The collapse continues until a luminous sphere surrounds a dense core that is hot enough to sustain nuclear fusion.
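
The collapse condition described above can be made concrete with the Jeans mass, the minimum mass a gas cloud needs for gravity to overcome its own thermal pressure. The short Python sketch below is illustrative only and is not from the article; the temperature and density values are assumptions typical of primordial clouds cooled by molecular hydrogen, chosen to show how the hundreds-to-thousands-of-solar-masses scale mentioned earlier arises.

```python
# Illustrative sketch (not from the article): estimate the Jeans mass, the
# minimum cloud mass that collapses under its own gravity, for primordial gas.
# The temperature and density below are assumed values, typical of
# early-universe clouds cooled by molecular hydrogen.
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
m_H = 1.6735e-27     # hydrogen atom mass, kg
M_sun = 1.989e30     # solar mass, kg

def jeans_mass(T, n, mu=1.22):
    """Jeans mass in solar masses for gas at temperature T (kelvin) and
    number density n (particles per cubic meter); mu is the mean molecular
    weight (~1.22 for a neutral hydrogen-helium mix)."""
    rho = mu * m_H * n                               # mass density, kg/m^3
    thermal = (5 * k_B * T / (G * mu * m_H)) ** 1.5  # thermal-pressure support
    geometry = math.sqrt(3 / (4 * math.pi * rho))    # cloud volume factor
    return thermal * geometry / M_sun

# Assumed primordial-cloud conditions: H2 cooling bottoms out near ~200 K
# at densities around 10^4 particles per cm^3 (10^10 per m^3).
print(f"{jeans_mass(T=200, n=1e10):.0f} solar masses")  # roughly ~1,800
```

Under these assumed conditions the estimate lands in the thousands of solar masses, which is why the very first clouds were expected to form only very massive stars unless something, such as turbulence, broke them into smaller clumps.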