Black Magic

Google’s Night Sight Shows the Growing Importance of Soft Features

Although most smartphones launched these days are great, few introduce genuinely new “wow” features. Night Sight is one of the rare exceptions.

In October 2018, with the launch of its third generation of Pixel smartphones, Google introduced a photo mode called Night Sight. This improves low-light photography, going beyond what any camera targeted at consumers has done before. Google says it’s like “seeing in the dark.” It’s an accurate description; Night Sight is almost spooky.

Night Sight isn’t a hardware feature. It doesn’t use a flash or special lens. Rather, it’s powered by Google’s algorithms and computational power, allowing the company to roll it out to all previous generations of Pixel phones as well.

Google explains the basics behind Night Sight in a blog post, but this remains a mysterious tool, leaving users scratching their heads about how it could possibly work. The setting allows Pixel users to capture a photo with only a small amount of light — not quite complete dark, but close — and the result is a shot that looks like it could’ve been taken in the middle of the day. You may not be able to see the scene clearly, but the camera and its supporting cast can.

According to Google’s blog, Night Sight is made possible by “computational photography and machine learning,” combining a burst of photos into one clear snapshot and cleaning away the noise. Google is using its talents in artificial intelligence and cloud computing to bring magic to smartphone photography.
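The core principle behind merging a burst of frames is simple to demonstrate: averaging N exposures of the same scene shrinks random sensor noise by roughly the square root of N. The sketch below is a toy illustration of that statistical idea only, assuming perfectly aligned frames and purely Gaussian noise; Google’s actual pipeline also aligns frames, rejects motion, and applies learned white balancing, none of which is modeled here.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated "true" low-light scene: a dim 64x64 gradient image (values 0-1).
scene = np.tile(np.linspace(0.05, 0.25, 64), (64, 1))

# Capture a burst of 15 short exposures, each corrupted by random sensor noise.
burst = [scene + rng.normal(0.0, 0.05, scene.shape) for _ in range(15)]

# Merge: averaging N aligned frames reduces random noise by ~sqrt(N).
merged = np.mean(burst, axis=0)

noise_single = np.std(burst[0] - scene)   # noise in one frame
noise_merged = np.std(merged - scene)     # noise after merging
print(noise_single / noise_merged)        # roughly sqrt(15), i.e. close to 3.9
```

The improvement ratio is why a burst of quick, dark exposures can be merged into one shot that looks far cleaner than any single frame the sensor captured.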

But Google isn’t stopping there. The company is expected to roll out its Duplex assistant to Pixel phones over the coming months. Duplex, announced back in May at Google I/O, is an advanced example of what an artificial assistant is capable of, making restaurant reservations using natural conversation. Users simply instruct Google Assistant to make a booking in a certain area, and Duplex does the rest.

Another tool that Google introduced in October was its call-screening feature, a very welcome addition given the vast number of spoofed calls that subscribers receive. The feature intercepts phone calls, with a robot asking the caller to identify themselves. The recipient sees a near-live transcription on their Pixel’s screen, giving them enough information to decide whether to take the call.

Google is using its prowess in artificial intelligence, mixing on-device and cloud computing to pack more features into smartphones and other devices. Although hardware components are available to everyone in the smartphone industry, it’s the secret, algorithm-based ingredient that will allow device makers to distinguish their products in the future. Google is showing the importance of artificial intelligence wizardry.

For those who think smartphone development has become boring, there’s light at the end of the tunnel.