The Galaxy S23 Ultra lies to us all (and why that’s ok)
Whether Samsung smartphones from the Galaxy S20 onwards invent image details with their “scene optimization” has been debated for a long time. The debate around moon photos in particular keeps flaring up, but Samsung also interprets image details quite generously elsewhere. c’t 3003 took a closer look at the technology.
Transcript of the video
(Note: This is bonus content for people who cannot or do not want to watch the video above. Information shown only in the video itself is not reflected in the transcript.)
Look here, this is Samsung’s flagship phone, the Galaxy S23 Ultra. It’s ultra fast and takes ultra good photos. At the very least, it’s considered one of the best photo smartphones out there, if not the best. In Mrwhosetheboss’s comparison, for example, it beats the iPhone 14 Pro Max on photo quality.
But the S23 Ultra also does some pretty weird stuff. Look here, this is the image optimization in the pre-installed photo app. Where are those teeth coming from? And these moon photos, you’ve probably already seen them; they were all over the internet recently. The striking thing is that this didn’t even happen via the built-in photo enhancement app, it came straight out of the camera. Yes, the phone photographs things that are definitely not there. So are the photos from the S23 Ultra possibly also ultra fake? We *zoom* out of there and take a closer look. Among other things, we compared the photos of the S23 Ultra with a mid-range DSLR and a high-end mirrorless camera. Stay tuned!
Dear hackers, dear Internet surfers, welcome to…
Well, editing photos has a long tradition. And now, beware, this may sound like a joke written by ChatGPT, but what do Instagram influencers and Josef Stalin have in common? Both use photo filters. Yes, of course, it was all a bit more complicated back then: you really had to work with scissors and a paintbrush. Stalin not only “shopped” unwanted people out of propaganda photos, he also had the pockmarks on his face retouched, for example. That the first image manipulations served propaganda purposes is hardly surprising; these image-editing methods are a good 100 years old and have naturally improved over time. Today, and this is of course a big difference, the manipulation already happens when the picture is taken, because the camera thinks it knows what the photographer wants to see. And if the camera isn’t quite sure at the moment of capture, it at least offers “creative” image improvements afterwards.
Look here, this is just a photo in Samsung’s pre-installed Gallery app when you tap “Retouch Image”. And that’s definitely more than phones used to offer: a bit of color correction, brightness and contrast. But sometimes the image-enhancement AI does very strange things to photos. Twitter user @earcity noticed that Samsung invents things in some images: teeth, for example, which the seven-month-old child in the photo doesn’t actually have yet. And of course we tried that directly on our S23 Ultra test device.
We didn’t get quite such a blatant result, but the AI repeatedly had problems in our tests too, and simply replaced a small child’s tongue with blurred teeth. Above all, though, the eyes in our test photos kept causing trouble: the AI regularly blundered and simply rendered them noticeably worse than they actually looked in the original photo.
Yes, with the S20 Ultra, Samsung announced a real milestone in smartphone photography: their phones could now photograph the moon particularly well. But the hype about moon photography didn’t go unchallenged for long. Emotions ran high in January 2021 when people, including German tech YouTuber AlexiBexi, claimed that Samsung was inventing image details on Galaxy phones that weren’t there, specifically when photographing the full moon. This culminated in a rather elaborate debunking article on inverse.com, which argued at great length that the fake allegations couldn’t be true and that it all came down to Samsung’s superb image-enhancement algorithms. As proof, the author took a comparison photo with an expensive 200-600mm lens on a Sony a7R III and superimposed the phone image on the image from the expensive Sony camera. The images largely matched, except that the photo from the phone was much sharper than the one from the professional camera. That might have made you suspicious, but the verdict was: no, Samsung isn’t inventing anything, it just has the greatest algorithms. A number of websites then parroted this; the topic seemed settled, the facts had spoken.
But even back then there was a Reddit post in which user moonfruitroar demonstrated photographing a blurred moon off a screen, onto which a smiley had been drawn in flat, monochromatic gray. In the resulting photo, there was moon texture where the smiley had been. Samsung’s AI detail-enhancement engine can’t have extracted that from the source, because there was nothing there, only a gray area. And we’ve seen almost the same game play out again with the latest S23 Ultra: Samsung shows how great its moon photos are, everyone celebrates, and a little later it comes out that, well, not quite. Because this time, too, someone on Reddit showed that what Samsung promises us can’t be true at all. The user ibreakphotos took a deliberately blurred photo of the moon and then photographed it again with his S23 Ultra; this is the photo that came out. And that shows very clearly that Samsung’s AI is at work here, not the actual camera and its zoom, as Samsung suggests. Incidentally, we were able to confirm this in our own tests.
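The core of the ibreakphotos experiment is an information argument: a strong blur destroys fine detail irreversibly, so any crisp craters in the final photo must have been synthesized rather than recovered. Here is a minimal sketch of that point; the 1D “surface” signal and the box blur are our own illustration, not Samsung’s pipeline:

```python
def box_blur_1d(signal, radius):
    """Average each sample with its neighbors -- a crude low-pass filter."""
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

# Toy "moon surface": alternating bright and dark patches (fine detail).
surface = [200 if (i // 4) % 2 == 0 else 60 for i in range(256)]
blurred = box_blur_1d(surface, 32)

detail_before = max(surface) - min(surface)   # 140
detail_after = max(blurred) - min(blurred)    # nearly 0: the detail is gone

print(detail_before, round(detail_after, 1))
```

After a blur this strong, the bright/dark pattern is averaged away almost completely; no enhancement algorithm can recover it from the blurred signal alone, which is why crisp detail appearing in the re-photographed moon must come from somewhere else.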
Samsung itself explains the moon photo function like this: the brightness is turned down, then several exposures are superimposed, and in the last step the AI removes noise and “improves the details of the moon”. That the AI improves the details of the moon: that phrase says it all. Of course it’s not so crude that Galaxy phones rummage through their flash memory whenever they recognize a moon and then paste a stock moon photo on top. No, it’s a bit more sophisticated than that.
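The multi-exposure part of that description is standard computational photography: averaging several noisy frames of the same scene suppresses random sensor noise, roughly by a factor of sqrt(n) for n frames. A minimal sketch of that stacking step with a simulated sensor; all names and parameters here are our own illustration, not Samsung’s actual pipeline:

```python
import random

def capture_frame(scene, noise_sigma=20.0):
    """Simulate one exposure: true pixel value plus Gaussian sensor noise."""
    return [p + random.gauss(0, noise_sigma) for p in scene]

def stack_frames(frames):
    """Average the frames pixel-wise; random noise cancels, the scene does not."""
    n = len(frames)
    return [sum(pixels) / n for pixels in zip(*frames)]

def rms_error(img, ref):
    return (sum((a - b) ** 2 for a, b in zip(img, ref)) / len(img)) ** 0.5

random.seed(42)
scene = [100.0] * 2000                         # flat gray patch as ground truth
frames = [capture_frame(scene) for _ in range(16)]

print(rms_error(frames[0], scene))             # noise of one frame, around 20
print(rms_error(stack_frames(frames), scene))  # roughly 20 / 4 with 16 frames
```

The crucial difference to the moon feature: stacking can only reveal detail that was present in the frames to begin with. The step where an AI “improves details” is exactly where invented texture can slip in.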
But the fact remains: things end up in the photo that are definitely not in the subject. Still, this isn’t meant as hate toward this development in smartphone photography. Smartphones are simply cameras you carry around in your pocket, and most people probably don’t want to dive deeply into photography and things like aperture, focal length and white balance. For them it’s important to be able to pull out the phone without thinking too much and get a photo that looks good in the end. That’s why many felt it was obvious what Samsung is doing with the moon images: it’s not meant for people who want to do scientific astrophotography, but for people who’d like to stage their glass of red wine in front of the moon for Instagram. Okay, the wine glass doesn’t really work despite the AI; we tried that for you.
But there’s also the fact that cameras, especially in smartphones, always deliver a certain interpretation of reality. We tested this: in addition to the S23 Ultra, we used an iPhone 13, a mid-range DSLR, the Nikon D5300, and a high-end mirrorless camera, the Lumix GH5. We simply took the same photo four times in the middle of the day, with all cameras set to full automatic. Here, too, it’s clearly noticeable that all the photos look different, especially in terms of color. This becomes particularly obvious when we place the photo from the Samsung phone next to the picture from the Lumix: the Lumix image simply contains far less color.
Many smartphones also tend to oversaturate photos. Things look a bit different when we look at the RAW images from the four test devices. Here the photos are stored largely unedited; after all, professional photographers want to decide for themselves how intense the colors should end up. If we now simply throw the RAW image from the Lumix camera into Photoshop, a few clicks are enough to make the image look much more like the S23’s.
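At its core, the punchy out-of-camera phone look often comes down to a saturation boost applied in the processing pipeline. A toy sketch of such a boost on a single pixel, using Python’s standard colorsys module; the factor 1.6 is an arbitrary illustration, not Samsung’s actual tuning:

```python
import colorsys

def boost_saturation(rgb, factor):
    """Scale the HSV saturation of an 8-bit RGB pixel, clamped to 1.0."""
    r, g, b = (c / 255 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    s = min(1.0, s * factor)
    return tuple(round(c * 255) for c in colorsys.hsv_to_rgb(h, s, v))

muted = (140, 120, 100)          # a muted, RAW-like tone
punchy = boost_saturation(muted, 1.6)
print(punchy)                    # (140, 108, 76): same hue, more "pop"
```

Applied to every pixel of an image, this is the kind of adjustment that makes the out-of-camera JPEG look “better” than the RAW file, at the cost of fidelity.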
And yet the RAW images from the smartphones still look much more color-intense than those from the big cameras, which definitely makes later editing more complicated. But here’s the good news: you can still turn off all the AI optimization on a phone and just take classic manual photos and edit them yourself afterwards. On Samsung phones, the feature is specifically called “scene optimization”. If you turn it off, things like the moon trick no longer happen. But then, if you want to photograph the moon, you also need much better photo equipment.
All in all, Samsung digs pretty deep into its bag of tricks and, for example, generates image content for the moon that isn’t in the source. As a result, the tech world is asking: what actually is a photo? Where do you draw the line? I mean, the transitions between an AI-optimized image and an AI-generated image à la Stable Diffusion are somehow fluid. Take this picture of the Pope, you may have seen it already. Even though I work professionally with AI image generation tools, I didn’t realize the photo was generated, because, well, even the fashion in the Vatican is sometimes a bit special.
Well, if my phone, with which I capture my reality, now fits my picture with fake teeth as an unrequested “optimization”, then that’s definitely a serious intervention in reality that I may not want at all. On the other hand, I think it’s great when my phone makes my sunset photo look like I’m a professional photographer and depicts my reality the way I perceive it, even if in reality it might look a little less spectacular than in my head.
But it’s a different matter when it comes to photos of people. What does it actually do to us when we only ever see people in their optimized version and then look in the mirror? How do you see it? Is this development toward AI and image optimization a good thing, or rather dangerous? Feel free to write it in the comments and, of course, subscribe. Bye!
c’t 3003 is c’t’s YouTube channel. The videos on c’t 3003 are independent content and independent of the articles in c’t magazin. The editors Jan-Keno Janssen and Lukas Rumpler and the video producers Şahin Erengil and Pascal Schewe publish a video every week.