
Oxford spinout RADiCAIT uses AI to make diagnostic imaging more affordable and accessible — catch it at TechCrunch Disrupt 2025


If you’ve ever had a PET scan, you know it’s an ordeal. The scans help doctors detect cancer and track its spread, but the process itself is a logistical nightmare for patients.

It starts with fasting for four to six hours before coming into the hospital — and good luck if you live rurally and your local hospital doesn’t have a PET scanner. When you arrive, you’re injected with radioactive material, then wait about an hour while it circulates through your body. Next, you enter the PET scanner and must lie as still as you can for 30 minutes while radiologists acquire the image. Afterward, you have to keep your distance from the elderly, young children, and pregnant women for up to 12 hours, because you’re still mildly radioactive.

Another bottleneck? PET scanners are concentrated in major cities because their radioactive tracers must be produced in nearby cyclotrons — compact particle accelerators — and used within hours, limiting access in rural and regional hospitals.

But what if you could use AI to convert CT scans, which are much more accessible and affordable, into PET scans? That’s the pitch of RADiCAIT, an Oxford spinout that came out of stealth this month with $1.7 million in pre-seed financing. The Boston-based startup, which is a Top 20 finalist in Startup Battlefield at TechCrunch Disrupt 2025, has just opened a $5 million raise to advance its clinical trials.

“What we really do is we took the most constrained, complex, and costly medical imaging solution in radiology, and we supplanted it with what is the most accessible, simple and affordable, which is CT,” Sean Walsh, RADiCAIT’s CEO, told TechCrunch.

RADiCAIT’s secret sauce is its foundation model — a generative deep neural network invented in 2021 at the University of Oxford by a team led by the startup’s co-founder and chief medical information officer, Regent Lee.

Left: CT scan. Middle: AI-generated PET scan from RADiCAIT. Right: Chemical PET scan. Image Credits: RADiCAIT

The model learns by comparing CT and PET scans, mapping them, and picking out patterns in how they relate to each other. Sina Shahandeh, RADiCAIT’s chief technologist, describes it as connecting “distinct physical phenomena” by translating anatomical structure into physiological function. Then the model is directed to pay extra attention to specific features or aspects of the scans, like certain types of tissue or abnormalities. This focused learning is repeated many times with many different examples, so the model can identify which patterns are clinically important.
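RADiCAIT hasn’t published its architecture, but the general recipe Shahandeh describes — learning a mapping from CT to PET on paired scans, with extra weight on clinically important regions — can be sketched in a few lines. The following is a minimal, illustrative example only, with hypothetical names and a toy network; it assumes a supervised image-to-image setup with a region-of-interest-weighted loss, not the company’s actual model.

```python
# Hypothetical sketch (not RADiCAIT's code): a tiny CT-to-PET translation
# network trained with a loss that up-weights clinically important regions.
import torch
import torch.nn as nn


class TinyCT2PET(nn.Module):
    """Small encoder-decoder mapping a 1-channel CT slice to a synthetic
    PET slice. Production models would be far deeper (e.g., U-Net-style)."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1),
        )

    def forward(self, ct):
        return self.decoder(self.encoder(ct))


def weighted_l1_loss(pred_pet, true_pet, roi_mask, roi_weight=5.0):
    """L1 loss in which pixels inside a region-of-interest mask (e.g.,
    suspected lesions or specific tissue types) count roi_weight times more."""
    weights = 1.0 + (roi_weight - 1.0) * roi_mask
    return (weights * (pred_pet - true_pet).abs()).mean()


# Toy training step on random tensors standing in for paired CT/PET slices.
model = TinyCT2PET()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

ct = torch.randn(4, 1, 128, 128)                   # CT slices (input)
pet = torch.randn(4, 1, 128, 128)                  # matched PET slices (target)
roi = (torch.rand(4, 1, 128, 128) > 0.9).float()   # mask of "important" regions

pred = model(ct)
loss = weighted_l1_loss(pred, pet, roi)
loss.backward()
optimizer.step()
print(f"training loss: {loss.item():.4f}")
```

Repeating this kind of step over many paired examples, with the weighting steering the model toward the features that matter diagnostically, is the “focused learning” the company describes.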

