2023 is the year of one-inch sensors, cameras with absurd megapixel counts, and… terrible previews. What do I mean by preview? What we see on the screen when we open the camera app and have not yet taken the photo. Phones post-process the image, so we don’t see the final result until we open the gallery.
However, there is one battle that Android manufacturers are completely losing: HDR rendering in the preview. It is hard to justify, and it is a topic I would like to dig into today. We have phones capable of competing with computers and professional cameras that, through manufacturer negligence, still show blown-out white skies in something as important as the preview.
Yes, it is possible to render HDR in preview
Apple has been rendering HDR in the preview for generations. This is especially relevant on phone cameras because, otherwise, we have to trust that the phone will recover the blown highlights. If everything looks controlled in the preview, we can be sure of how the final photograph will look in the gallery.
This is a challenge that requires some horsepower, a point on which Apple has historically led Android. The Google Pixel began showing HDR processing in the preview starting with the Pixel 4, whose Snapdragon 855 was capable of handling that workload. But what about the small Pixel 4a and its mid-range processor? The story, of course, is more than interesting.
Google wanted its Pixel 4a to display the preview exactly as the final photo would look. It’s a matter of philosophy, of caring about the shooting experience. Despite those intentions, the Snapdragon 730 barely had the muscle to pull off this feature: Live HDR+.
To make it work, Google designed an algorithm tailored to the GPU, as memory-efficient as possible. The phone calculated, in real time and per small group of pixels, the amount of light correction to apply. Essentially, the phone used several inputs (photographs at different exposures) to compute the corrections to apply, without fully processing the photograph. These calculations were carried out by a neural network that predicts, from the image, the HDR tone curves to apply.
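The per-tile idea can be sketched in a few lines. This is a toy illustration, not Google’s actual pipeline: where Live HDR+ uses a neural network to predict tone curves per block (and a bilateral grid to upsample them), the sketch below substitutes a simple brightness-dependent gamma per tile so the structure of the approach is visible.

```python
import numpy as np

def tile_tone_map_sketch(frame, grid=(8, 8)):
    """Toy per-tile tone mapping: estimate one correction per small block
    of pixels instead of processing the full-resolution frame.
    frame: float RGB array in [0, 1]. Returns the corrected frame."""
    h, w = frame.shape[:2]
    gh, gw = grid
    luma = frame.mean(axis=2)  # rough per-pixel brightness in [0, 1]

    # One coefficient per tile: darker tiles get a stronger lift.
    # (Stand-in for the learned curve in the real pipeline.)
    coeffs = np.empty(grid)
    for i in range(gh):
        for j in range(gw):
            tile = luma[i * h // gh:(i + 1) * h // gh,
                        j * w // gw:(j + 1) * w // gw]
            coeffs[i, j] = 1.0 / (1.0 + 2.0 * tile.mean())  # gamma <= 1

    # Upsample the coarse grid to full resolution (np.kron for brevity;
    # the real pipeline interpolates through a bilateral grid).
    gamma_map = np.kron(coeffs, np.ones((h // gh, w // gw)))
    return np.clip(frame ** gamma_map[..., None], 0.0, 1.0)

# A dark synthetic frame gets lifted; values stay within display range.
dark = np.full((64, 64, 3), 0.1)
out = tile_tone_map_sketch(dark)
```

Because only one small coefficient grid is computed per frame, this kind of scheme is cheap enough to run on every preview frame, which is the whole point on a mid-range chip.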
Google went even further: the Pixel lets you adjust highlights and shadows separately, something no other phone has achieved to date (and a lot of time has passed since the Pixel 4). In simple terms: we can lift the shadows of a photo without blowing out the sky, and pull down the highlights without darkening everything else. The possibilities are enormous.
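Conceptually, those two sliders apply separate gains weighted by how dark or bright each pixel already is. The sketch below is a minimal, assumed illustration of that idea (the weighting functions and slider scaling are mine, not Google’s):

```python
import numpy as np

def dual_exposure(frame, shadow_lift=0.0, highlight_cut=0.0):
    """Toy dual-exposure control: shadow_lift raises dark regions,
    highlight_cut pulls down bright regions, independently.
    frame: float RGB in [0, 1]; slider values in [0, 1]."""
    luma = frame.mean(axis=2, keepdims=True)
    shadow_w = (1.0 - luma) ** 2   # ~1 in shadows, ~0 in highlights
    highlight_w = luma ** 2        # ~1 in highlights, ~0 in shadows
    out = frame + shadow_lift * shadow_w * (1.0 - frame)  # lift toward white
    out = out - highlight_cut * highlight_w * out         # pull toward black
    return np.clip(out, 0.0, 1.0)

sky = np.full((4, 4, 3), 0.95)     # near-blown sky
shade = np.full((4, 4, 3), 0.05)   # deep shadow
lifted_shade = dual_exposure(shade, shadow_lift=0.5)
lifted_sky = dual_exposure(sky, shadow_lift=0.5)
```

With the squared weights, lifting the shadows moves the dark patch substantially while leaving the bright patch almost untouched, which is exactly the behavior the two sliders expose.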
For now, Google is the only company on Android that renders HDR in real time in the preview. That is why the shooting experience on Pixels is spectacular (even though a lot of processing still happens before the gallery), with no burned skies. Because it works from low-resolution inputs, this preview is sometimes grainy and short on detail, though the HDR works brilliantly. Apple handles this point best, with a preview that is also imperfect, but high resolution and with Live HDR.
Today’s processors are becoming more capable
The Snapdragon 8 Gen 2 gave me a breath of hope: its star feature was real-time image processing. To give you an idea, this processor is capable of processing highlights and even faces (to adjust sharpness and tone) in real time while recording 4K video. The processing power is simply wild.
I have already tried phones with this processor and… the preview is still bad. Until I open the gallery, I don’t have the photo the preview should have shown. It’s a geeky detail, irrelevant to average users, but it highlights those points manufacturers have completely forgotten. Sometimes complaining helps; it doesn’t hurt to try.
Image | xataka
In Xataka | The mobiles with the best cameras that we have analyzed in recent months (2023)