In this article, we'll be taking a deep dive into how Night Sight works, what the results look like, and how you can get it on your Google Pixel smartphone.
What is Night Sight and how does it work?
The main goal of the Night Sight shooting mode is to reduce noise and improve detail in images shot in dark environments. In traditional photography, the simplest way to achieve this is to use a slower shutter speed so the image sensor can take in more light. Without a tripod, though, motion blur becomes a big problem if you or your subject moves even slightly. On a smartphone, which is almost always used handheld, this technique can yield disastrous results.
When you see Night Sight in action, it seems like pure wizardry. In reality, it is built on computational photography and some machine learning. According to a Google blog post, the main challenge the company faced was correctly aligning all the objects in a frame, since Night Sight essentially uses a frame-averaging technique, much like HDR+ and Super Res Zoom. The Pixel and the Pixel 2 use a modified version of the HDR+ algorithm for Night Sight, while the Pixel 3 uses a tweaked version of the newer Super Res Zoom algorithm to deliver similar results.
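The core idea behind frame averaging is simple to demonstrate. The sketch below (a simplified illustration in Python with NumPy, not Google's actual pipeline) simulates a dim scene, adds random sensor noise to each frame of a 15-frame burst, and shows how averaging the aligned frames cuts that noise down:

```python
import numpy as np

def average_burst(frames):
    """Average a burst of already-aligned frames to reduce random sensor noise.

    Averaging N frames cuts noise standard deviation by roughly sqrt(N),
    which is the basic principle behind frame averaging.
    """
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

# Simulate a dark scene: a constant dim brightness plus random sensor noise.
rng = np.random.default_rng(0)
scene = np.full((64, 64), 20.0)                # true (dim) pixel values
burst = [scene + rng.normal(0, 8, scene.shape) for _ in range(15)]

print(f"noise in one frame: {np.std(burst[0] - scene):.2f}")
print(f"noise after averaging 15 frames: {np.std(average_burst(burst) - scene):.2f}")
```

The hard part in practice, and the challenge Google describes, is the alignment step that this sketch takes for granted: the frames must be registered to each other before averaging, or moving objects and hand shake would smear the result.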

From the moment you press the shutter button in Night Sight mode, the camera captures a series of frames in quick succession. The number of frames captured depends on the amount of available light and on whether you're using the phone handheld or on a tripod; it could be anywhere from 6 to 15 frames. The Pixel 3 and the Pixel 2 have the advantage of being able to handle longer exposures per frame, since they have optical image stabilisation (OIS) to compensate for shakes, but the original Pixel resorts to shorter exposures since it lacks OIS.
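The trade-off described above can be sketched as a simple exposure-budgeting heuristic. The numbers and the `plan_burst` function below are hypothetical, chosen only to illustrate the idea; Google has not published the actual limits it uses:

```python
def plan_burst(total_exposure_ms, has_ois, on_tripod=False):
    """Split a total exposure budget into per-frame exposures.

    Illustrative heuristic only: longer per-frame exposures need OIS
    (or a tripod) to avoid blur from hand shake, so a phone without
    OIS must capture more, shorter frames for the same light budget.
    Returns (frame_count, per_frame_exposure_ms).
    """
    if on_tripod:
        max_per_frame = 1000   # no shake: very long frames are fine
    elif has_ois:
        max_per_frame = 333    # OIS compensates for small hand shakes
    else:
        max_per_frame = 125    # no OIS: keep each frame short
    # Ceiling division, clamped to the 6-15 frame range mentioned above.
    frames = max(6, min(15, -(-total_exposure_ms // max_per_frame)))
    return frames, total_exposure_ms / frames

print(plan_burst(1500, has_ois=True))    # fewer, longer frames
print(plan_burst(1500, has_ois=False))   # more, shorter frames
```

With the same hypothetical 1.5-second light budget, the OIS phone gets away with 6 longer frames, while the phone without OIS splits the budget into 12 shorter ones.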
A couple of things are worth pointing out at this stage: it takes a few seconds for the camera to finish capturing all the frames, during which you will need to stay still. If you or your subject moves, those frames are either discarded or your subject might have slight motion blur in the final picture.
Once a picture is saved, it undergoes some more processing in the background to correct the white balance and exposure level. Google says it has developed a special deep learning-based auto white-balance algorithm, which was trained on the Pixel 3. This is why, when you compare Night Sight images across the three phones, those from the Pixel 3 typically have better white balance; Google itself says the algorithm delivers its best results on a Pixel 3. The company has also tweaked its tone-mapping techniques to strike the right balance between giving you a well-lit image and staying true to the actual time of day.
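To get a feel for what white-balance correction does, here is the classic gray-world algorithm, a simple textbook baseline and nothing like Google's learned approach, though the goal of removing a colour cast is the same. The street-lighting scenario in the comments is purely illustrative:

```python
import numpy as np

def gray_world_wb(image):
    """Gray-world white balance: scale each RGB channel so their
    averages match. A simple classical baseline; Night Sight's
    learned white-balance algorithm is far more sophisticated.
    """
    img = image.astype(np.float64)
    means = img.reshape(-1, 3).mean(axis=0)   # per-channel averages
    gains = means.mean() / means              # gains that equalise them
    return np.clip(img * gains, 0, 255)

# An image with a strong orange cast, as from sodium street lighting.
tinted = np.dstack([np.full((4, 4), v) for v in (180.0, 120.0, 60.0)])
balanced = gray_world_wb(tinted)
print(balanced.reshape(-1, 3).mean(axis=0))   # channel means are now equal
```

Gray-world assumes the scene averages out to neutral gray, which fails badly in exactly the low-light, strongly tinted scenes Night Sight targets; that failure is why a learned, scene-aware algorithm pays off here.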
How to get Night Sight right now — and does it really make a difference?
If you own any Pixel smartphone, you should soon see an update for the Google Camera app waiting in the Play Store. If it isn't there yet, be patient: Google is rolling the update out in phases, so it should reach you eventually.
Once your Google Camera app is updated, open it and go to the More tab, where you'll find Night Sight. If you're shooting in Auto mode and your surroundings are very dim, you'll also see a small 'Try Night Sight' prompt in the viewfinder.