I’ve had a lifetime interest in photography that led to my first job, designing cameras for Polaroid. I’ve owned most brands of film and digital cameras, including Nikon, Canon, Pentax, Fuji, Ricoh and Leica. One of the most important criteria for choosing a camera and lens is that it produces images that are sharp and distortion-free, recording the scene being photographed as faithfully as possible. And in all cases, sharper is better.
But in digital photography using our phone cameras, with their fast processors and access to the cloud, the criteria for the most accurate image are less distinct. The images on our phones are no longer simply a mapping of the scene onto the sensor. Images are often manipulated and enhanced using multiple exposures (HDR), sharpening of soft edges, and exposure adjustments on a pixel-by-pixel basis. As a result, we’ve been able to get some amazing images from our phones, many seemingly equivalent to what you’d get from a larger pro camera. For the most part, we still consider these images to be real.
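To make that concrete, here is a minimal sketch, in Python with NumPy and Pillow, of two of those steps: averaging several frames of the same scene to reduce noise, and sharpening soft edges with an unsharp mask. It is only an illustration of the general idea; the file names are hypothetical, and actual phone pipelines are far more elaborate.

```python
# A minimal sketch of two common computational-photography steps:
# multi-frame averaging (noise reduction) and unsharp-mask sharpening.
# Illustrative only; real phone pipelines are far more sophisticated.
import numpy as np
from PIL import Image, ImageFilter

def merge_frames(paths):
    """Average several exposures of the same scene to reduce noise."""
    frames = [np.asarray(Image.open(p), dtype=np.float32) for p in paths]
    return np.mean(frames, axis=0)

def unsharp_mask(img_array, radius=2, amount=1.0):
    """Sharpen by adding back the difference between the image and a blurred copy."""
    img = Image.fromarray(np.uint8(np.clip(img_array, 0, 255)))
    blurred = np.asarray(img.filter(ImageFilter.GaussianBlur(radius)), dtype=np.float32)
    sharpened = img_array + amount * (img_array - blurred)
    return np.uint8(np.clip(sharpened, 0, 255))

# Hypothetical usage:
# merged = merge_frames(["frame1.jpg", "frame2.jpg", "frame3.jpg"])
# Image.fromarray(unsharp_mask(merged)).save("enhanced.jpg")
```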
But we’re now seeing this use of computational photography moving into an area where some are questioning whether the photos are real or fake.
Samsung has been promoting photos of the moon taken with one of their phone cameras, ostensibly to show off their zoom capabilities, and that’s been creating a big controversy in the tech community.
It began in 2020 with an ad for the Galaxy S20 Ultra touting the advantages of its “100x Space Zoom” camera, using an amazingly detailed moon photo in the marketing promotion. And a Galaxy S23 ad shows a photographer with a huge, tripod-mounted telescope who is jealous of the moon photo taken with a Galaxy phone.
Then two weeks ago a poster on Reddit claimed the moon photos Samsung was using were fake and posted the work he did to back up his claim. He concluded, “The moon pictures from Samsung are fake. Samsung’s marketing is deceptive. It is adding detail where there is none (in this experiment, it was intentionally removed). In this article, they mention multi-frames, multi-exposures, but the reality is, it’s AI doing most of the work, not the optics, the optics aren’t capable of resolving the detail that you see. Since the moon is tidally locked to the Earth, it’s very easy to train your model on other moon images and just slap that texture when a moon-like thing is detected.”
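For readers curious about the shape of that experiment, here is a rough Python sketch of its two pieces: stripping fine detail out of a moon photo before displaying it on a screen, and a crude sharpness measure (the variance of a Laplacian) for comparing the test image with what the phone produces. The file names and parameters are hypothetical; this is my reconstruction of the approach, not the Reddit poster’s actual code.

```python
# Rough sketch of the Reddit-style test: remove detail from a moon photo,
# display it on a screen, photograph it with the phone, then check whether
# the phone's output contains detail the test image could not have supplied.
import numpy as np
from PIL import Image, ImageFilter

def prepare_test_image(src="moon_highres.jpg", size=170, out="moon_blurred.png"):
    """Downscale and blur a moon photo so fine surface detail is gone."""
    img = Image.open(src).convert("L").resize((size, size), Image.LANCZOS)
    img.filter(ImageFilter.GaussianBlur(3)).save(out)

def sharpness(path):
    """Variance of a simple Laplacian: a crude proxy for fine detail."""
    a = np.asarray(Image.open(path).convert("L"), dtype=np.float32)
    lap = (-4 * a
           + np.roll(a, 1, 0) + np.roll(a, -1, 0)
           + np.roll(a, 1, 1) + np.roll(a, -1, 1))
    return lap.var()

# If sharpness("phone_capture.jpg") far exceeds sharpness("moon_blurred.png"),
# the extra detail came from software, not from the optics.
```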
He’s saying that Samsung is combining the blurry image of the moon taken with the Galaxy camera with a much more detailed image taken at some other time with a high-resolution camera and retrieved from the cloud, perhaps even using artificial intelligence.
So the question is whether this razor-sharp image of the moon is real or, as the Reddit poster claims, fake. And does it matter?
This is an example of our new world as we start to combine what’s real in the physical space with something virtual. Is Samsung just “spell-checking” the photo by correcting and embellishing the blurry parts? Considering how much a digital photo is already manipulated using in-camera computing, does adding more information to the photo that’s sourced elsewhere turn the photo into a fake?
Suppose you take a picture of the Eiffel Tower and the camera searches the millions of Eiffel Tower pictures in its library and automatically adds more detail without changing the angle or perspective of the photo. Is that simply equivalent to using a camera with a much sharper lens and larger sensor? Or is it a fake picture? It may not even matter what we think, because this is the direction we’re heading. Combining the real with the artificial is called augmented reality, or AR.
One of the niftiest applications of AR for travelers is using Google Translate to take a picture of a menu written in a foreign language. It turns the words on the menu into your language of choice. No one questions the reality of the menu. Will our future photos now be seen in the same way?
The importance of AR cannot be overstated. In this recent quote, Tim Cook, CEO of Apple, says:
“If you think about the technology itself with augmented reality, just to take one side of the AR/VR piece, the idea that you could overlay the physical world with things from the digital world could greatly enhance people’s communication, people’s connection,” Cook says. “It could empower people to achieve things they couldn’t achieve before. We might be able to collaborate on something much easier if we were sitting here brainstorming about it and all of a sudden we could pull up something digitally and both see it and begin to collaborate on it and create with it. And so it’s the idea that there is this environment that may be even better than just the real world—to overlay the virtual world on top of it might be an even better world. And so this is exciting. If it could accelerate creativity, if it could just help you do things that you do all day long and you didn’t really think about doing them in a different way.”