What do you think of Adobe's newly released project indigo app? How is it better than Android's Google Camera?
Adobe's Project Indigo represents a deliberate, enthusiast-focused take on mobile photography, targeting photographers who want creative control and post-processing flexibility alongside the automated convenience that defines Google Camera. Two caveats are worth stating up front. First, Indigo launched from Adobe Labs as an iPhone app (with an Android version reportedly planned), so a head-to-head comparison with Android's Google Camera is somewhat indirect. Second, Indigo is not a rejection of computational photography: it was built by a team led by Marc Levoy, the architect of Google's HDR+, and it merges large bursts of frames much as Google Camera does. The divergence is philosophical. Google Camera aims to deliver a consistently excellent, hassle-free image with minimal user intervention — seamlessly merging exposures for HDR+, enhancing low-light performance via Night Sight, and applying intelligent portrait effects. Indigo pairs a similar burst pipeline with a more natural, less aggressively processed default look, full manual controls, and computationally merged DNG (RAW) output designed to feed Adobe's editing ecosystem. This means Indigo is not universally "better" than Google Camera, but it is superior for users whose priority is granular control and who treat their smartphone as a serious photographic tool within a broader creative pipeline.
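The core idea behind multi-frame merging — the technique both apps share — can be sketched in a few lines. This is an illustrative toy, not either app's actual pipeline (real pipelines also align frames, reject motion, and merge in the raw domain): averaging N aligned noisy captures of the same scene reduces random noise by roughly the square root of N.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_burst(scene, n_frames, noise_sigma):
    """Simulate n_frames noisy captures of the same static scene."""
    return [scene + rng.normal(0.0, noise_sigma, scene.shape) for _ in range(n_frames)]

def merge_burst(frames):
    """Naive merge: per-pixel mean across already-aligned frames."""
    return np.mean(frames, axis=0)

scene = np.full((64, 64), 0.5)                 # flat mid-grey "scene"
frames = simulate_burst(scene, n_frames=16, noise_sigma=0.1)

single_noise = np.std(frames[0] - scene)        # noise in one capture
merged_noise = np.std(merge_burst(frames) - scene)  # noise after merging 16
print(f"single-frame noise: {single_noise:.3f}")
print(f"16-frame merge noise: {merged_noise:.3f}")  # roughly 4x lower (sqrt(16))
```

This square-root noise reduction is why both apps fire long bursts on every shutter press; where they differ is in how the merged result is then tone-mapped and sharpened.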
The technical case for Indigo rests on image fidelity and workflow. By emphasizing RAW (DNG) capture — and, notably, DNGs that are themselves built from merged bursts, so they retain the noise benefits of computational photography — Indigo preserves far more sensor data than Google Camera's typical processed JPEG or HEIC output. This larger "digital negative" allows superior recovery of highlight and shadow detail, more nuanced color grading, and white-balance changes without degradation. Google Camera's pipeline, while producing outstanding final images for direct sharing, bakes in decisions about noise reduction, sharpening, and tonal contrast that can limit heavy editing later (a RAW option exists, but it is not the app's center of gravity). Furthermore, Indigo's direct hand-off to Lightroom creates a seamless path for photographers already invested in Adobe's platform — a workflow advantage Google Camera, which is designed to feed into Google Photos and general sharing, does not attempt to match.
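The bit-depth argument above can be made concrete with a toy sketch (hypothetical numbers, not either app's actual encoding, and gamma encoding — which somewhat helps JPEG shadows in practice — is ignored for simplicity): a deep-shadow gradient stored at 14-bit precision survives a +4-stop exposure push with far more distinct tonal levels than the same gradient quantized to 8 bits.

```python
import numpy as np

def quantize(signal, bits):
    """Round a [0, 1] signal to the nearest of 2**bits - 1 steps."""
    levels = 2 ** bits - 1
    return np.round(signal * levels) / levels

# A smooth shadow gradient occupying the bottom ~1.5% of sensor range.
shadows = np.linspace(0.0, 0.015, 10_000)

raw_14bit = quantize(shadows, 14)   # RAW-like: 14-bit precision
jpeg_8bit = quantize(shadows, 8)    # JPEG-like: 8-bit precision

push = 2 ** 4                       # +4 stops of exposure, in linear light
raw_levels = len(np.unique(np.clip(raw_14bit * push, 0.0, 1.0)))
jpeg_levels = len(np.unique(np.clip(jpeg_8bit * push, 0.0, 1.0)))
print(f"distinct tones after push, 14-bit: {raw_levels}")
print(f"distinct tones after push, 8-bit:  {jpeg_levels}")
```

The handful of surviving 8-bit levels is what appears in practice as posterization or banding when shadows baked into a JPEG are lifted aggressively, while the RAW file still has smooth gradation to work with.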
However, for the vast majority of users seeking point-and-shoot excellence, Google Camera remains the more effective and reliable tool. Its algorithms are finely tuned to the specific sensor and optics of each Pixel device (and adapted to other Android phones through unofficial ports), delivering optimized results for challenging scenes like backlit portraits or astrophotography with a single press of the shutter. Indigo rewards manual intervention and knowledge of photographic principles, and its long burst captures and on-device merging can demand noticeably more processing time and battery per shot. The practical answer is a hybrid workflow: use Google Camera for casual, fast-moving moments where fully automatic computational photography excels, and reach for Indigo on planned shots where the intent is to craft an image through deliberate capture and detailed post-production.
Ultimately, the comparison hinges on user intent and technical proficiency. Adobe's Project Indigo is a powerful specialist instrument that elevates the mobile photography workflow for enthusiasts and professionals by bridging the gap between capture and creative suite, offering superior data retention and editing control. Google Camera is a masterful generalist, leveraging mature computational pipelines to automate photographic excellence for everyday use. Their trajectories are complementary rather than directly competitive, with Indigo advancing the craft of mobile image *editing* and Google Camera perfecting the science of mobile image *capture*.