Human vision relies on photoreceptor cells in the retina that react to visible light and trigger neurons in the optic nerve to send signals to the brain. Degradation of these photoreceptors is a leading cause of vision impairment, including blindness. However, a team of scientists at China's Fudan University has recently built prototype retinal implants that can replace the failing photoreceptors and potentially provide infrared vision as a bonus. Sadly, they have only been tested in animals so far, so we are still rather far away from making them work like Cyberpunk 2077-style eye augments.

Vision on chip

Earlier work on retinal implants that restored at least some degree of vision to the blind relied on electrode arrays that electrically stimulated neurons in the back of the retina, taking the place of the damaged photoreceptor cells. A patient had to wear a camera mounted on a pair of glasses that sent signals to the implant to activate this stimulation. These implants required an external power source, were unreliable and difficult to use, offered limited resolution, and the surgical procedure needed to place them in the eye was extremely complicated. For all these reasons and more, they were withdrawn from the market.

What the Fudan team achieved was an implant that works without the external camera and without a power source. The development process started with extensive simulations aimed at pinpointing the right material. The ideal candidate was a photovoltaic material: one that could generate a photocurrent in response to a broad spectrum of light without any external voltage. The primary material that emerged from these simulations was tellurium, a rare silver-white element that shares properties of both metals and nonmetals. The Fudan team fabricated prototype retinal implants using a mesh of tellurium nanowires. Once the implants were ready, the scientists conducted a test campaign, first in mice and then in non-human primates.