The moon photos taken with the “Space Zoom” feature of the Galaxy S23 Ultra have already impressed many people. It even made Elon Musk say “wow.” However, as many of us guessed, the feature is nothing but AI trickery. At least, that is what a Reddit user has concluded after an in-depth investigation.
Now, before you conclude anything, you need to understand that the Samsung Galaxy S23 Ultra has an extremely capable camera system. It can offer up to 100x zoom, achieved by combining its 3x and 10x telephoto cameras with AI-aided digital zoom. But does it actually work as advertised?
Galaxy S23 Ultra: Space Zoom or AI Trickery?
Samsung's AI-aided digital zoom is built on its Super Resolution technology. With it, the telephoto cameras of the Galaxy S23 Ultra can capture clear shots of faraway objects. Samsung calls it Space Zoom, and it claims the feature lets users take real photos of the moon.
However, the clarity of the final image may owe more to software shenanigans than to the optics. At least, that is what a Reddit user believes. On an Android subreddit, the user called the Space Zoom feature of the Galaxy S23 Ultra totally fake. And the user has proof.
The user pointed to the moon photographs taken by the S20 Ultra and later models. According to the user, those photos were never proven fake, but they might actually be nothing but AI trickery.
What Proof Does the User Have?
To prove that the Space Zoom feature of the Galaxy S23 Ultra is fake, the user tested the effect themselves. The Redditor downloaded a high-resolution image of the moon and downsized it to a 170 by 170 resolution. Then, the user applied a Gaussian blur, obliterating any fine surface detail.
Putting the low-res, blurry photo of the moon on a monitor, the user walked to the other end of the room. From that distance, the Redditor took a photo of the edited moon with the Galaxy S23 Ultra. The phone did some processing and produced a picture with considerably greater detail than the image on the monitor.
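The image-preparation steps described above can be sketched in a few lines of pure Python. This is only an illustrative stand-in for the Redditor's actual workflow: a block-average downscale substitutes for the resize, a repeated box blur approximates the Gaussian blur, and the synthetic "moon" image is a made-up stand-in for the downloaded photo.

```python
# Sketch of the experiment's image prep (assumed workflow, not the
# Redditor's exact code): shrink a high-res moon image to 170x170,
# then blur it so fine surface detail is destroyed.

def downscale(img, out_w, out_h):
    """Block-average a grayscale image (list of rows) to out_w x out_h."""
    in_h, in_w = len(img), len(img[0])
    out = []
    for oy in range(out_h):
        y0, y1 = oy * in_h // out_h, (oy + 1) * in_h // out_h
        row = []
        for ox in range(out_w):
            x0, x1 = ox * in_w // out_w, (ox + 1) * in_w // out_w
            block = [img[y][x] for y in range(y0, y1) for x in range(x0, x1)]
            row.append(sum(block) // len(block))
        out.append(row)
    return out

def box_blur(img, radius):
    """Naive box blur; applying it repeatedly approximates a Gaussian."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[cy][cx]
                    for cy in range(max(0, y - radius), min(h, y + radius + 1))
                    for cx in range(max(0, x - radius), min(w, x + radius + 1))]
            out[y][x] = sum(vals) // len(vals)
    return out

# Toy 340x340 "moon": a bright disc on a dark sky.
SRC = [[200 if (x - 170) ** 2 + (y - 170) ** 2 < 150 ** 2 else 0
        for x in range(340)] for y in range(340)]

small = downscale(SRC, 170, 170)           # step 1: shrink to 170x170
blurred = box_blur(box_blur(small, 2), 2)  # step 2: blur away fine detail
print(len(blurred), len(blurred[0]))       # 170 170
```

The point of the blur is that no genuine detail survives on the monitor, so any craters in the phone's output must come from somewhere other than the scene itself.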
Thus, the user concluded that Samsung is “leveraging an AI model to put craters and other details on places which were just a blurry mess.” The Redditor did give the S23 Ultra props for utilizing multiple images to recover otherwise lost detail.
How Does the AI Give the Galaxy S23 Ultra Such Great Detail?
After analyzing the experiment, the user proposed a theory of how the AI works. According to it, Samsung is supposedly using a dedicated AI model for the S23 Ultra's Space Zoom feature. This model is likely trained on a large set of moon images, making it efficient enough to recognize the moon under almost any conditions. The Redditor further explained:
“This is not the same kind of processing that is done when you're zooming into something else, when those multiple exposures and different data from each frame account to something. This is specific to the moon.”
Considering that the moon is tidally locked to the Earth, it is easy to train an AI model on other moon images. All the model needs to do is overlay some texture when it detects the moon. And the user stressed that the AI is doing all of the work in Space Zoom, not the optics of the Galaxy S23 Ultra.
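The hypothesized "detect the moon, then paint on texture" pipeline can be illustrated with a toy sketch. To be clear, this is not Samsung's actual code or pipeline; every function and threshold here is hypothetical, made up purely to show the shape of the claim.

```python
# Toy illustration of the Redditor's hypothesis (NOT Samsung's real
# pipeline): if a frame looks like the moon, blend in detail from a
# stored reference texture instead of relying on the optics alone.

def looks_like_moon(frame):
    """Crude hypothetical detector: a bright blob on a dark sky."""
    flat = [p for row in frame for p in row]
    bright = sum(1 for p in flat if p > 128)
    return 0.1 < bright / len(flat) < 0.9

def enhance(frame, reference):
    """Blend a stored texture over the blurry capture (75% reference)."""
    if not looks_like_moon(frame):
        return frame  # ordinary multi-frame processing would run instead
    return [[(f + 3 * r) // 4 for f, r in zip(fr, rr)]
            for fr, rr in zip(frame, reference)]

# A half-bright 4x4 "capture" and a uniform reference texture.
out = enhance([[150, 150, 0, 0]] * 4, [[200] * 4] * 4)
print(out[0])  # [187, 187, 150, 150]
```

If something like this were happening, the output detail would come from the reference data rather than the scene, which is exactly what the monitor experiment appeared to show.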
But computational photography is not a bad thing in itself. Even Apple iPhones rely on it, and Google Pixels have made a name for themselves with their computational photography skills. So, can you really blame Samsung for using AI in the Space Zoom feature?