Google presented a new night mode, called Night Sight, at the launch of the Pixel 3 (Review), promising a seemingly huge improvement in the quality of photos taken in low light. Now the company is officially rolling this feature out to the entire range of Pixel smartphones. Unofficially, Night Sight has been available for several weeks: on the XDA Developers forum it was possible to get a modified Google Camera app that enabled it on any Pixel smartphone. Since then, we have also seen ports for the OnePlus 6 (Review) and OnePlus 6T (Review), and more recently for the Xiaomi Poco F1 (Review) and Xiaomi Mi 8.
In this article, we'll discuss how Night Sight works, what the results look like, and how you can get it on a Google Pixel smartphone.
What is Night Sight and how does it work?
The main purpose of a night shooting mode is to reduce noise and improve detail in photos taken in dark environments. In traditional photography, the easiest way to achieve this is to use a slower shutter speed so that the sensor can absorb more light. However, without a tripod, motion blur becomes a big problem if you (or your subject) move even slightly. On a smartphone this technique can produce disastrous results, because smartphones are almost always used handheld.
Night Sight in the Google Camera app is now rolling out to all Pixel smartphones
When you see Night Sight in action, it seems like pure magic. In fact, it works on the basis of computational photography and machine learning. According to Google's blog post, the main challenge the company faced was aligning all the objects in the frame, because Night Sight uses a frame averaging technique, like HDR+ and Super Res Zoom. The Pixel and Pixel 2 use a modified version of the HDR+ algorithm for Night Sight, while the Pixel 3 uses an improved version of the newer Super Res Zoom algorithm to get similar results.
From the moment you press the shutter button in Night Sight mode, the camera captures a series of frames in quick succession. The number of frames captured depends on the amount of light available and on whether you are holding the phone in your hand or using a tripod; it can range from 6 to 15 frames. The Pixel 3 and Pixel 2 (Review) have the advantage of being able to handle longer per-frame exposures because they have optical image stabilisation (OIS) to compensate for hand shake, while the original Pixel has to resort to shorter exposures because it lacks OIS.
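Google's actual pipeline also aligns and merges the frames robustly, but the core idea of frame averaging can be sketched in a few lines. The code below is a simplified illustration (not Google's implementation): it simulates a static scene, adds per-frame sensor noise, and shows that averaging a burst of frames reduces the noise compared with a single frame.

```python
import numpy as np

def average_frames(frames):
    """Average a burst of aligned frames; noise falls roughly with sqrt(N)."""
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

# Simulate a static 4x4 "scene" with Gaussian sensor noise on each frame
rng = np.random.default_rng(0)
scene = np.full((4, 4), 100.0)
frames = [scene + rng.normal(0.0, 10.0, scene.shape) for _ in range(15)]

single_frame_error = np.abs(frames[0] - scene).mean()
averaged_error = np.abs(average_frames(frames) - scene).mean()
```

In practice the hard part, as Google's post notes, is aligning the frames before averaging when the hand or the subject moves; misaligned frames have to be warped or discarded.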
A few things are worth pointing out at this stage: the camera needs a few seconds to capture all the frames, during which you have to stay still. If you move, or the subject moves, those frames will either be discarded or your subject may show slight motion blur in the final image.
After the frames are captured, the image is processed in the background to correct the white balance and exposure level. Google says it has developed a special learning-based automatic white balance algorithm that was trained on the Pixel 3, which is why, when you compare photos taken with Night Sight across the phones, photos from the Pixel 3 usually have better white balance. Google itself admits that this algorithm delivers its best results on the Pixel 3. The company has also tweaked its tone mapping techniques to strike the right balance between producing a well-lit image and preserving a true sense of the time of day.
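Google's learned white-balance model is proprietary, but the classic heuristic it improves upon, gray-world white balance, is easy to sketch. The example below is an illustrative baseline, not Google's algorithm: it assumes the average colour of a scene should be neutral gray, and scales each channel accordingly.

```python
import numpy as np

def gray_world_white_balance(img):
    """Scale each RGB channel so its mean matches the overall mean
    (the gray-world assumption: the average scene colour is neutral)."""
    img = img.astype(np.float64)
    channel_means = img.reshape(-1, 3).mean(axis=0)
    gains = channel_means.mean() / channel_means
    return np.clip(img * gains, 0.0, 255.0)

# A warm-tinted test image: red channel too strong, blue too weak
img = np.zeros((2, 2, 3))
img[..., 0], img[..., 1], img[..., 2] = 150, 100, 50
balanced = gray_world_white_balance(img)
```

Heuristics like this fail under strongly coloured lighting (street lamps, neon signs), which is exactly where a model trained on real low-light scenes, like Google's, has the advantage.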
How do you get Night Sight right now – and does it really make a difference?
If you have a Pixel smartphone, you should soon see an update for the Google Camera app in the Play Store. If it's not there yet, be patient: Google is rolling the update out gradually, so you should get it eventually.
After updating the Google Camera app, open it and go to the More tab, where you'll see Night Sight. If you are shooting in automatic mode and the lighting is very dim, you will see a small "Try Night Sight" hint in the viewfinder.
Night Sight function in action on Pixel 3 XL
Taking pictures is quite easy. Just point the camera at the subject, press the shutter button, and hold the phone still until the process is finished. You can, if you want, set a three- or ten-second timer to start a countdown to the shot, and exclusive to the Pixel 3 series is a manual focus option. This last feature is only needed when shooting in the dark and the camera is unable to lock focus. "Near" forces the camera to focus at a distance of about 1.22 m, while "Far" focuses at 3.66 m and beyond.
For our landscape test shots, we tried the Google Pixel 3 XL, Pixel 2 XL, and the original Pixel with a decent amount of ambient light. All three generations of phones did a decent job with detail, but there was still a lot of visible noise in the sky, and objects in the shadows were not clearly visible. With Night Sight, the noise was drastically reduced, giving us much cleaner images. Colours and details in distant objects were also much better defined, and we could see much more in the shadows.
Landscape test, shot with (below) and without (above) Night Sight
In our close-up test, our subject was barely visible when shooting in automatic mode; in fact, that's close to what we saw with the naked eye. But with Night Sight, it's as if someone turned on the lights. Colours, details, and clarity improved drastically, with good sharpness on the subject and pleasing background blur.
Close-up test, with (below) and without (above) Night Sight
Night Sight also works with the front cameras on Pixel phones. Here you can additionally turn on fill lighting if it's too dark to shoot. For this test we were in near-complete darkness, with only light from the buildings opposite us and a little spilling through a door behind us. As expected, none of the phones was able to capture anything usable in automatic mode, but with Night Sight the difference was huge. The picture quality got even better when the fill light for Night Sight was turned on.
Selfie test with (below) and without (above) Night Sight
Based on what we have seen so far, Night Sight seems to be a game changer, and in many situations it almost completely eliminates the need to use the phone's LED flash for photography. The feature does have limitations: for example, it is not very effective in pitch darkness, and it works best on stationary subjects.
It's nice to see Google bringing this feature to older Pixel phones, but we'd really like the company to open it up to OEMs so they can use it in their own camera apps. Google has done this with Google Lens and AR Stickers, so why not Night Sight?
Is Google Pixel's Night Sight the future of low-light photography? We discussed it on Orbital, our weekly technology podcast, which you can subscribe to via Apple Podcasts or RSS; download an episode, or simply press the play button below.