Samsung’s moon shots aren’t a new point of debate. The first smartphone with a usable periscope telephoto camera was the Huawei P30 Pro in 2019.
A regular telephoto camera has a traditional lens design with a long focal length. It’s rather difficult to achieve extreme amounts of magnification beyond three times optical zoom with a traditional setup. That’s where periscope zoom comes in.
Periscope cameras use prisms or mirrors to fold the light path, achieving a longer focal length in a more compact form factor. As a result, the zoom range is much greater. Thanks to periscope technology, we have phones with 4X, 5X, and even 10X optical zoom.
Periscope cameras come with a higher price tag due to the intricate and meticulous engineering and manufacturing required to produce them. Low-dispersion and aspherical glass elements help reduce distortion and improve image quality, but they add to the cost. Therefore, companies use periscope cameras only on their highest-end phones, while entry-level flagships stick to traditional 2X or 3X optical zoom cameras.
Samsung wasn’t the first to enter the periscope game, but it quickly mastered it. The S20 Ultra’s 103mm 4X optical zoom camera with 10X hybrid zoom was the first to be advertised heavily around moon shots, with claims of up to 100X zoom.
The successor, the S21 Ultra, got a new 240mm 1/3.24″ periscope telephoto camera with a whopping 10X optical zoom. A little oddly, the S22 Ultra and S23 Ultra have slightly smaller 10X sensors at 1/3.52″.
The controversies began with Huawei’s shenanigans, when it was found that the company was simply using AI to fake moon shots: if the software recognised a moon, it superimposed a ready-made image of the moon onto the shot, making it appear realistic.
In this article, we’ll explore whether the moon shots from Samsung phones are a gimmick or whether they actually capture the moon.
How do Samsung moon shots work?
Moon shots on Samsung phones use the periscope telephoto camera. Scientifically, there are two points to weigh against each other. On one hand, the moon is approximately 384,000km away, and that distance makes it difficult for even a 10X sensor to accurately capture all of its details.
On the other hand, sensors are good at capturing bright objects, and the moon is objectively the brightest object you see in the night sky.
Long-range periscope telephoto cameras capture really bright subjects easily. If you’re wondering why it’s easier to capture the moon at 50X or 100X than a sweeping landscape at 30X, that’s why: sensors handle bright subjects better.
As a result, the images still retain organic detail despite the heavy crop-in. These days, half of a smartphone photo’s end result depends on image processing.
Smartphone camera hardware is improving but is undeniably smaller than dedicated camera hardware. Main sensors are only now approaching the 1″ size, and most secondary sensors are still stuck below 1/2″.
Companies are currently prioritising software processing to maximise image quality, since upgrading hardware frequently is much harder. Software is responsible for several image enhancements: artificial sharpening, noise reduction, choosing target white and black points, setting the right ISO and shutter speed, balancing the target exposure, and more.
Not to forget that software is entirely responsible for high dynamic range capture, where the software intelligently fuses multiple images taken at various exposure levels.
It blends the images together into one evenly exposed shot with proper shadows (darker regions like trees or buildings) and highlights (bright skies and lights). Recently, AI-based scene recognition has also become popular.
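The multi-frame exposure blending described above can be sketched in a few lines. This is a toy, single-channel version under simplified assumptions (real pipelines use pyramid blending and per-channel weight maps, not a plain weighted average); the weighting function and values are purely illustrative:

```python
import math

def well_exposedness(value, mid=128.0, sigma=50.0):
    """Weight is highest for pixels near mid-gray, low for blown-out
    highlights or crushed shadows (values are 0-255 grayscale)."""
    return math.exp(-((value - mid) ** 2) / (2 * sigma ** 2))

def fuse_exposures(frames):
    """Fuse several equally sized grayscale frames (flat lists of pixel
    values) into one evenly exposed frame via weighted averaging."""
    fused = []
    for pixels in zip(*frames):
        weights = [well_exposedness(p) for p in pixels]
        total = sum(weights) or 1e-9  # avoid division by zero
        fused.append(sum(w * p for w, p in zip(weights, pixels)) / total)
    return fused

# Three "exposures" of the same scene: underexposed, normal, overexposed.
under = [10, 20, 5, 120]
normal = [80, 130, 40, 250]
over = [200, 250, 160, 255]
result = fuse_exposures([under, normal, over])
```

Each output pixel leans toward whichever frame exposed it best, which is the core idea behind recovering both shadows and highlights in one shot.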
Are the moon shots fake?
The Snapdragon 8 Gen 2 SoC’s ISP is a great example of AI recognition. Prior to this, Pixels could do it with their custom ISPs.
Using machine learning algorithms, AI recognises everyday objects (hair, trees, skies, water bodies, etc.) and fine-tunes them in the end result. As a result, the photos look much cleaner and more presentable. Some phones even have dedicated food modes.
Similar to the existing AI enhancements that are based on subject recognition, the moon is an easy subject to recognise for the built-in algorithms. It’s a combination of both Samsung’s software processing and the 8 Gen 2’s excellent new ISP.
This is where the controversy about Samsung’s moon shots begins and ends. A new post on Reddit claims that Samsung’s moon shots are fake. There’s no clear-cut answer to this, but not all of the moon shots that come out of Samsung phones are fake.
There are two scenarios to consider. In the first, you point your phone at the sky, and it recognises the moon. Here, Samsung does not superimpose an artificial image of a pre-saved moon.
Once AI detects the tiny blob of light in the sky, it applies sharpness and locks the target exposure to the moon to make the shot look more presentable. It identifies the moon’s details, enhances them, and adds artificial texture to the real subject.
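As a rough illustration of that "detect the bright blob, lock exposure to it" step, here is a toy sketch. The threshold, target level, and function names are all made up for illustration and have nothing to do with Samsung's actual code:

```python
def find_bright_pixels(frame, threshold=200):
    """Return coordinates of pixels at or above the brightness threshold.
    The frame is a 2-D grid of 0-255 grayscale values."""
    return [(y, x) for y, row in enumerate(frame)
            for x, v in enumerate(row) if v >= threshold]

def exposure_for_subject(frame, target=180, threshold=200):
    """Mean brightness of the bright region, plus the exposure multiplier
    that would bring it down to a moon-friendly target level."""
    pixels = find_bright_pixels(frame, threshold)
    if not pixels:
        return None, 1.0  # nothing bright found; leave exposure alone
    mean = sum(frame[y][x] for y, x in pixels) / len(pixels)
    return mean, target / mean

# Dark sky with one bright blob standing in for the moon.
frame = [
    [5, 5, 5, 5],
    [5, 240, 250, 5],
    [5, 245, 235, 5],
    [5, 5, 5, 5],
]
mean, scale = exposure_for_subject(frame)
```

The returned multiplier below 1.0 corresponds to pulling exposure down so the bright subject isn't blown out, which is why moon shots come out dark everywhere except the moon itself.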
Samsung confirms the use of AI to enhance moon shots with the scene optimiser on. Since it’s going through a machine-learning algorithm and improving image quality from training data, it’s not a fake photo.
Samsung isn’t recognising a moon and pulling an already stored photo to impose onto the existing shot. It’s still your photo, and machine learning algorithms do just some tampering to enhance it.
Does it matter?
Whether the photos are still organic is a drawn-out, subjective debate. Samsung’s software is clearly not layering a stored image on top; it uses AI to detect where the moon’s craters should be, then fills them in to give you the end result.
Other companies like Vivo, for example, detect a blob of white light in the sky and pull an existing image of the moon. That’s not real at all. Samsung’s AI is smarter and only adds details to the existing shot. It tampers with the image quite a bit.
However, it does not fully superimpose a new image on top. There’s also the moon’s synchronous rotation to consider: the moon is tidally locked to Earth, so we always see the same side of it, and it looks essentially the same from anywhere on Earth. Hence, Samsung can extensively train its AI to recognise the moon.
In the peculiar Reddit case, the AI simply inferred that there were craters in the region. Despite the heavy blur, you can still make out that it’s a moon and that there are craters.
That’s the very nature of machine learning. It learned from the moon shots that were fed into it during training and intelligently filled out the spots where the craters should have been, all while enhancing the sharpness.
Other tests from creators show that it doesn’t recognise a bright white ball as a moon, so it’s trained very well, and Samsung knows what it’s doing.
Samsung uses multi-frame technology and deep learning techniques to give you the best possible end result. Ultimately, it’s better than having no built-in moon mode at all.
What does the sensor actually capture?
We’ll step away from moon shots for a minute. Remember that 10X periscope on the S23 Ultra? Beyond 30X zoom, AI is doing most of the work. It recognises common objects, trees, signs, text, etc., and artificially sharpens them in post-processing.
Given current smartphone optics, you can’t get a detailed, untampered moon shot that looks anywhere near presentable at high magnification.
Samsung’s entire scene optimisation software uses artificial intelligence and deep learning. If you dislike AI enhancement of photos and you want a more realistic representation, we recommend turning scene optimisation off.
The S23 Ultra captures a really poor image with the scene optimiser turned off: it’s totally washed out, has zero detail, and is a soft, unusable mess.
This proves that the sensor does capture the moon; it just doesn’t look good without AI enhancement.
How to take moon shots on any phone?
You only need a camera app that lets you adjust the exposure/ISO and shutter speed. Open the Pro mode on an Android phone; on an iPhone, you’ll need a third-party app.
Another prerequisite is at least 3X optical zoom on your phone. It’s doable with 2X, but the image won’t look great, and without any optical zoom you won’t get a good photo at all.
- Drop the ISO to 100 or below.
- Set the shutter speed to about 1/125.
- Use Lightroom or Snapseed and dial up the sharpness.
- Adjust contrast levels and brightness according to your preferences.
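The "dial up the sharpness" step in editors like Lightroom or Snapseed is typically some form of unsharp masking: add back the difference between the image and a blurred copy of itself. A minimal sketch on a one-dimensional row of pixels (real editors work in 2-D, per channel, with Gaussian blurs rather than this simple box blur):

```python
def box_blur(row, radius=1):
    """Simple box blur: average each pixel with its neighbours."""
    out = []
    for i in range(len(row)):
        lo, hi = max(0, i - radius), min(len(row), i + radius + 1)
        out.append(sum(row[lo:hi]) / (hi - lo))
    return out

def unsharp_mask(row, amount=1.0, radius=1):
    """Sharpen via: sharp = original + amount * (original - blurred),
    clamped to the valid 0-255 range."""
    blurred = box_blur(row, radius)
    return [max(0, min(255, p + amount * (p - b)))
            for p, b in zip(row, blurred)]

# A soft edge (like a blurry crater rim) gains local contrast.
soft_edge = [50, 60, 120, 180, 190]
sharpened = unsharp_mask(soft_edge, amount=1.5)
```

Pixels on the dark side of the edge get darker and pixels on the bright side get brighter, which is what makes crater rims "pop" after post-processing.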
This further proves that you can get decent moon shots with a zoom camera. It’s the brightest object in the night sky, and you will see some craters or details of the moon if you do it right.
Twitter users tested the S22 Ultra’s moon shot against one from a professional camera. The findings remain similar to the above arguments. Samsung only enhances existing moon shots if you turn the scene optimiser on.
Samsung is not artificially superimposing a fake moon. However, the moon shots that come out of the S23 Ultra are not one hundred per cent real.
This has a degree of subjectivity since it depends on how much people want AI to tamper with their photos. If you consider sharpness boost and deep learning algorithms too much tampering, you’ll consider the moon shots from the S23 Ultra as fake.
If you’re someone who believes that AI enhancements are necessary for a good final output, the moon shots are definitely real. The fact, however, remains: Samsung is not pulling up a pre-existing image to fake moon shots.
Furthermore, metadata proves that Samsung has no “fake moon” stored on the phone.
The only code that corresponds to moon shots is buried in the scene optimiser, and all it does is add crater detail and sharpness based on previously trained algorithms.