
Adobe’s new camera app is making me rethink phone photography


Adobe’s Project Indigo is a camera app built by camera nerds for camera nerds. It’s the work of Florian Kainz and Marc Levoy, the latter known as one of the pioneers of computational photography for his work on early Pixel phones. Indigo’s basic promise is a sensible approach to image processing that still takes full advantage of computational techniques. It also invites you into the normally opaque processes that happen when you push the shutter button on your phone camera — just the thing for a camera nerd like me.

If you hate the overly aggressive HDR look, or you’re tired of your iPhone sharpening the ever-living crap out of your photos, Project Indigo might be for you. It’s available in beta on iOS, though it is not — and I stress this — for the faint of heart. It’s slow, it’s prone to heating up my iPhone, and it drains the battery. But it’s the most thoughtfully designed camera experience I’ve ever used on a phone, and it gave me a renewed sense of curiosity about the camera I use every day.

This isn’t your garden-variety camera app

You’ll know this isn’t your garden-variety camera app right from the onboarding screens. One section details the difference between two histograms available to use with the live preview image (one is based on Indigo’s own processing and one is based on Apple’s image pipeline). Another line describes the way the app handles processing of subjects and skies as “special (but gentle).” This is a camera nerd’s love language.

The app isn’t very complicated. There are two capture modes: photo and night. It starts you off in auto, and you can toggle pro controls on with a tap. This mode gives you access to shutter speed, ISO, and, if you’re in night mode, the ability to specify how many frames the app will capture and merge to create your final image. That rules.

Indigo’s philosophy has as much to do with image processing as it does with the shooting experience. A blog post accompanying the app’s launch explains a lot of the thinking behind the “look” Indigo is trying to achieve. The idea is to harness the benefits of multi-frame computational processing without the final photo looking over-processed. Capturing multiple frames and merging them into a single image is basically how all phone cameras work, allowing them to create images with less noise, better detail, and higher dynamic range than they’d otherwise capture with their tiny sensors.
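To make that multi-frame idea concrete, here’s a minimal, hypothetical sketch in Python with NumPy — my own toy example, not Adobe’s or Apple’s actual pipeline — showing the simplest possible merge: average a burst of aligned frames so random sensor noise cancels out while real detail stays put.

```python
# Toy illustration of multi-frame merging (not Indigo's real pipeline).
# Averaging N aligned frames cuts random noise by roughly sqrt(N); real
# pipelines also align frames and reject moving objects before merging.
import numpy as np

def merge_burst(frames: list[np.ndarray]) -> np.ndarray:
    """Average a burst of aligned frames (H x W x 3, float values)."""
    stack = np.stack(frames, axis=0).astype(np.float64)
    return stack.mean(axis=0)

# Simulate five noisy captures of the same flat gray scene.
rng = np.random.default_rng(0)
scene = np.full((4, 4, 3), 0.5)
burst = [scene + rng.normal(0, 0.1, scene.shape) for _ in range(5)]
merged = merge_burst(burst)

# Residual noise in one frame vs. the merged result (~sqrt(5)x lower).
print(np.std(burst[0] - scene), np.std(merged - scene))
```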

Indigo preserves some deeper shadows in this high-contrast scene than the standard iPhone camera processing does.

Phone cameras have been taking photos like this for almost a decade, but over the past couple of years, there’s been a growing sense that processing has become heavy-handed and untethered from reality. High-contrast scenes appear flat and “HDR-ish,” skies look more blue than they ever do in real life, and sharpening designed to optimize photos for small screens makes fine details look crunchy.

Indigo aims for a more natural look, as well as ample flexibility for post-processing RAW files yourself. Like Apple’s ProRAW format, Indigo’s DNG files contain data from multiple, merged frames — a traditional RAW file contains data from just one frame. Indigo’s approach differs from Apple’s in a few ways; it biases toward darker exposures, allowing it to apply less noise reduction and smoothing. Indigo also offers computational RAW capture on some iPhones that don’t support Apple’s ProRAW, which is reserved for recent Pro iPhones.
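As a rough illustration of that darker-exposure tradeoff — again a toy simulation I’ve written, not Indigo’s code — underexposing keeps bright areas from clipping, and merging more frames claws back the shadow noise that would otherwise have to be smoothed away.

```python
# Hedged sketch of "expose darker, merge more frames" (illustrative only).
import numpy as np

def capture(scene: np.ndarray, exposure: float, rng) -> np.ndarray:
    """Simulate one frame: scale by exposure, clip highlights, add noise."""
    frame = np.clip(scene * exposure, 0.0, 1.0)
    return frame + rng.normal(0, 0.05, scene.shape)

def merge_and_brighten(scene, exposure, n_frames, rng):
    """Merge several frames, then apply digital gain back to target brightness."""
    frames = [capture(scene, exposure, rng) for _ in range(n_frames)]
    return np.mean(frames, axis=0) / exposure

rng = np.random.default_rng(1)
scene = np.array([0.1, 0.5, 1.5])  # shadows, midtones, a bright highlight

print(merge_and_brighten(scene, 1.0, 4, rng))  # highlight clips at 1.0
print(merge_and_brighten(scene, 0.5, 8, rng))  # darker frames preserve the 1.5
```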

After wandering around taking photos with both the native iPhone camera app and Indigo, one of the first things I noticed was the difference in sharpening. Instead of seeking out and crunching up every crumb of detail it can find, Indigo’s processing lets details fade gracefully into the background.
