Samsung's Moon shot explained: Scene Optimizer plus Super Resolution and AI magic

Over the past few days, Samsung has been taking some heat over allegations that it fakes the Moon shots on the Galaxy S23 Ultra. It all started when Redditor u/ibreakphotos displayed a blurry image of the moon on his monitor and took a picture of it with a Galaxy S23 Ultra, which then produced a good-looking moon. Outlets picked up the story, and Samsung felt it needed to explain its process to the world.

Samsung has put out a detailed and technical explanation of how its moon shot feature works. The article has actually been online for a while, just in Korean, but the recent controversy gave us the English version. Samsung uses Scene Optimizer, AI Deep Learning, and Super Resolution. The moon shot mode engages when you've enabled Scene Optimizer and zoomed beyond 25x: the AI Deep Learning engine, which is preloaded with a variety of moon shapes and details, recognizes the moon, and Super Resolution then applies multi-frame processing to enhance it.
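To make that flow concrete, here's a minimal sketch of the decision logic as Samsung describes it. Every name in it (detect_moon, super_resolve, and so on) is a hypothetical placeholder for illustration, not Samsung's actual implementation:

```python
# Illustrative sketch of the moon shot pipeline Samsung describes.
# All names and values here are hypothetical placeholders.

ZOOM_THRESHOLD = 25  # moon enhancement engages above 25x zoom

def detect_moon(image):
    """Stand-in for the AI Deep Learning engine, which Samsung says is
    preloaded with a variety of moon shapes and details."""
    return {"x": 100, "y": 80, "radius": 40}  # dummy detected region

def super_resolve(frames, region):
    """Stand-in for Super Resolution's multi-frame processing, which
    merges several captures to reduce noise and recover detail."""
    return frames[0]  # a real engine would fuse all frames here

def process_burst(frames, zoom_level, scene_optimizer_on):
    base = frames[0]
    # Enhancement is conditional: Scene Optimizer must be on and the
    # zoom level above the threshold, otherwise you get a normal shot.
    if not scene_optimizer_on or zoom_level <= ZOOM_THRESHOLD:
        return base
    region = detect_moon(base)
    if region is None:
        return base  # no moon recognized, fall back to normal output
    return super_resolve(frames, region)
```

The point the sketch makes is that the enhancement is conditional: turn off Scene Optimizer or stay at 25x zoom or below and the pipeline never runs.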

This is nothing new. Samsung has been doing the same thing since the Galaxy S20 Ultra debuted '100x Space Zoom', and it's certainly not the only manufacturer using this kind of processing.

So Samsung's moon shots aren't technically fake; they're enhanced using AI. In reality, the final image comes not so much from your phone's lens and sensor as from its processing engine. But really, what did you expect? Pulling off a decent image of the moon normally takes a humongous lens, a tripod, and an expensive dedicated camera.

Anyway, we'll likely be talking about this again in another two or three years, once people have forgotten about it and someone brings it up again.

Source