Posted: 2023-10-04 18:31:00

With its Pixel 8 and Pixel 8 Pro smartphones, Google is bringing its big guns to the battle for smartphone photo and video leadership. Among more than a dozen notable improvements coming to the Android phones is a tool called Video Boost that uses AI processing in Google's server-packed data centers to dramatically increase image quality.

When you first shoot a video on the Pixel 8 Pro, you'll have just a 1080p preview version. But after a couple of hours or so of uploading and processing, Google uses new artificial intelligence models too big for a smartphone to improve shadow detail, reduce pesky noise speckles and stabilize the video. That means Google's Night Sight technology, which in 2018 set a new standard for smartphone photos taken in dim and dark conditions, has now come to video, too. Or at least it will when Video Boost ships later this winter.

"Night Sight means something very big to us," said Isaac Reynolds, the lead product manager in charge of the Pixel cameras. "It is the best low-light smartphone video in the market, including any phones that might have recently come out," he said in an unsubtle dig at Apple's iPhone 15 models. But Video Boost improves daytime videos, too, with better detail and smoother panning.

Reynolds spoke during an exclusive deep dive interview about the new photo and video technology in the $699 Pixel 8 and $999 Pixel 8 Pro, unveiled Wednesday. He detailed the new hardware Google calls dual exposure that makes Night Sight video possible, along with a range of new Pixel photography advances: higher-resolution photos, better lenses, a new app with advanced photography controls, an ability to create a group photo where everybody is smiling and an Audio Magic Eraser to clean up the sound in a video.

Camera abilities are key to smartphones, but especially to Google's Pixel phones. They're gaining market share but remain relatively rare, accounting for just 4% of North American phone shipments in the second quarter. Good photos, bolstered by years of computational photography work, are arguably the Pixel line's strongest selling point.

But the Pixel phones' video has been weak when there's not much light. Improving that, even if it takes a helping hand from Google's servers, is crucial to making a Pixel phone worth buying.

"Where we really wanted to make a huge difference this year was video," Reynolds said. Video Boost is "the most exciting thing that I've done in years."

Here's a detailed look at how Google is trying to wring better photos and videos out of its Pixel 8 phones.

How Google's Video Boost works

Many developments were necessary to make Video Boost possible.

At the foundation is a newer image sensor technology in the main camera called dual conversion gain that improves image noise and dynamic range -- the ability to capture both shadow and highlight details. Google refers to its approach as "dual exposure," but unlike conventional HDR (high dynamic range) technology, it doesn't blend multiple separate shots.

A side-by-side comparison of video from Apple's iPhone 15 Pro and Google's Pixel 8 Pro processed with Video Boost. Google says the technology produces better video dynamic range than the iPhone 15 Pro, including shadow detail and highlights that aren't blown out; in the comparison, Google's video shows more detail on a person's shadowed face and a bluer sky. But you'll have to wait hours to get your Video Boost video back from Google's data centers.

Google; Screenshot by Stephen Shankland/CNET

Instead, the dual conversion gain technology is able to simultaneously capture details from both low-light and bright areas of a scene pixel by pixel, then blend the best of both. The result: "Whether it's a high-contrast scene or a low-light scene, you're going to see dramatically better performance versus the Pixel 7 and Pixel 7 Pro," Reynolds said. "You don't have to give up the dynamic range. That means less underexposure, which means less shadow noise."

The technology is on both the Pixel 8 and 8 Pro, but only the 8 Pro gets Video Boost.
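
Google hasn't published how the two readouts are merged, but the general idea is to blend a high-gain readout (cleaner shadows, clips early in the highlights) with a low-gain readout (noisier shadows, intact highlights) pixel by pixel. The sketch below is a minimal, hypothetical illustration; the function name, gain ratio and blending curve are assumptions for demonstration, not Google's implementation.

```python
import numpy as np

def blend_dual_gain(high_gain: np.ndarray, low_gain: np.ndarray,
                    gain_ratio: float = 4.0, knee: float = 0.8) -> np.ndarray:
    """Blend two per-pixel readouts of the same exposure (illustrative only).

    high_gain: cleaner in the shadows but clips early (values in [0, 1])
    low_gain:  noisier in the shadows but preserves highlights (values in [0, 1])
    gain_ratio: how much more sensitive the high-gain readout is
    knee: where to start fading from the high-gain to the low-gain readout
    """
    # Bring both readouts onto a common linear scale.
    high_linear = high_gain / gain_ratio
    low_linear = low_gain

    # Use the high-gain data in the shadows; fade to the low-gain data
    # as the high-gain signal approaches its clipping point.
    w = np.clip((high_gain - knee) / (1.0 - knee), 0.0, 1.0)
    return (1.0 - w) * high_linear + w * low_linear
```

Because the two readouts come from the same exposure, there's no time gap between them to cause motion artifacts, which is the practical difference from blending separate HDR frames.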

Next is the new Tensor G3 processor, the third generation of Google's Pixel phone processors. The G3 has more built-in Google circuitry for AI and image processing than last year's G2, and Google uses it to produce two videos. One is the 1080p preview version you can watch or share immediately.

The other is the Video Boost version that's uploaded to Google for more editing. The G3 preprocesses that video and, for each frame, adds up to 400 metadata elements that characterize the scene, Reynolds said.

The last Video Boost step takes place in Google's data centers, where servers use newly developed algorithms for noise reduction, stabilization and sharpening of low-light imagery. That processed video then replaces the preview video on your phone -- at 4K, if that's the resolution you originally shot at.
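
Google hasn't documented the hand-off beyond what Reynolds describes, but the division of labor roughly looks like the sketch below. Every name and data structure here is made up purely to show the shape of the pipeline: a 1080p preview plus per-frame metadata produced on the phone, and a processed render that later replaces it.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class BoostJob:
    """Hypothetical bookkeeping for one Video Boost clip (not Google's actual API)."""
    clip_id: str
    preview_path: str                                    # 1080p version, watchable immediately
    frame_metadata: list = field(default_factory=list)   # up to ~400 elements per frame
    processed_path: Optional[str] = None                 # filled in once the data center finishes

def capture(clip_id: str) -> BoostJob:
    # On-device: the Tensor G3 encodes a 1080p preview and attaches up to 400
    # metadata elements per frame that characterize the scene.
    job = BoostJob(clip_id=clip_id, preview_path=f"{clip_id}_1080p.mp4")
    job.frame_metadata = [{"frame": i, "scene_tags": {}} for i in range(300)]
    return job

def finish(job: BoostJob) -> None:
    # Hours later: the cloud returns a denoised, stabilized, sharpened render,
    # which replaces the preview in the gallery (at 4K if the clip was shot in 4K).
    job.processed_path = f"{job.clip_id}_boosted.mp4"

job = capture("sunset_clip")
print(job.preview_path)    # available right away
finish(job)
print(job.processed_path)  # swapped in after cloud processing
```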

Reynolds defends the video's data center detour as worthwhile.

"The results are incredible," he said. Besides, people like to reminisce, revisiting a moment through photos and videos hours later, not just months or years later. "I don't think there's any downside at all to waiting a couple of hours," he said.

It may be worth the wait, but the wait might also be longer than a couple of hours if you have to pump gigabytes of video to Google. If you're away from home Wi-Fi, you might be worried about blowing through your plan's mobile data cap. And when you're at home, you might be among the millions of people whose broadband doesn't actually offer fast upload speeds.

If you're nervous about sending your videos to Google's cloud, Video Boost largely follows the same privacy policies Google already uses. "If you delete the backed-up version after processing but keep it on-device, Google will not retain any copy," Google said. The difference lies in the intermediate versions of the video the company creates during Video Boost processing; those are discarded once the new video is done.

More megapixels on Pixel 8 cameras

If you're taking photos, the camera doesn't use the image sensor's dual conversion gain technology -- at least not yet, though Google says it's excited about the technology's potential.

But there are other big improvements: Like Samsung and Apple, Google is now advancing beyond the 12-megapixel smartphone photo resolution we've had for years.

When Apple introduced its iPhone 14 Pro in 2022, it let photographers shoot 48-megapixel photos with the main camera. Samsung goes even further with a 200-megapixel sensor, though the results aren't generally impressive beyond 50 megapixels. In comparison, even though the Pixel 7 and 7 Pro had 50-megapixel main cameras, Google offered photos at only 12-megapixel resolution. (Though it did offer 2x and 10x modes that took advantage of the full resolution of its sensors.)

This year, Google is leapfrogging Apple when it comes to pixel count on the Pixel 8 Pro. Not only can you take photos at the main camera's full 50-megapixel resolution, you can also take 48-megapixel ultrawide (like this year's OnePlus 11) and 48-megapixel 5x telephoto shots. (The Pixel 8 can take only 12-megapixel ultrawide shots.)

Google's Pixel 8 Pro and Pixel 8 smartphones in their bay and rose colors.

Stephen Shankland/CNET

Like all flagship phone makers, Google has employed a technology called pixel binning that offers photographers a choice between full-resolution shots and lower resolutions that work better in low-light situations. But this year, it'll let you shoot full-res shots even in low light if you prefer. That's unlike Apple, which switches to pixel binning and low resolution automatically when it's dark.

"You will always get more detail by enabling 50 megapixels [than when shooting at 12 megapixels], even in very low light, although you may suffer some noise penalty," Reynolds said.

You can also use Night Sight for lower noise. It works at 50-megapixel resolution.

Apple changed its main camera's default photo resolution from 12 megapixels to 24 megapixels with the iPhone 15 models, released last month. It also uses the HEIF image format, which stores files more compactly than the older JPEG. If you're a Pixel photographer, you can choose the full resolution or 12 megapixels, but nothing intermediate. And there's no HEIF support, because Google prefers JPEG's universal compatibility.

We won't know whether Google's high-resolution shots are worth it until we can test them. Small pixels are worse when it comes to image noise and dynamic range. But Google has invested in better hardware, including wider aperture lenses that gather more light.

Pixel 8 camera hardware upgrades

Both the Pixel 8 and Pixel 8 Pro get the new higher-end main camera with dual conversion gain, which gathers 21% more light than the Pixel 7 generation's.

Both phones get a new selfie camera, but it autofocuses only on the Pixel 8 Pro. Better image processing, made possible by the G3, helps improve color and reduce noise on both phones, Google said.

And as with the 2022 phones, only the Pro model gets a 5x telephoto camera. It uses the same sensor as the Pixel 7 Pro, but this year, the Pixel 8 Pro gets a wider f2.8 aperture lens to gather 56% more light. That's the same aperture as on the iPhone 15 Pro Max's 5x telephoto.

"We're going hard on low light," Reynolds said. A secondary benefit: Switching to the 5x camera is faster because the phone can lock focus more swiftly than with the Pixel 7 Pro's f3.5 lens.

Photos and videos are a top priority and a competitive strong suit for Google's phones. This closeup of the "obsidian" colored Google Pixel 8 Pro shows its larger new 48-megapixel ultrawide camera at left, its updated 50-megapixel main camera in the center and its 48-megapixel 5x telephoto camera at right.

Stephen Shankland/CNET

Only the Pro gets an improved ultrawide camera. It can gather 105% more light thanks to a larger sensor with a wider-aperture lens. That's important for low-light scenes and for supporting the 48-megapixel resolution.

The Pro's ultrawide camera also gets autofocus abilities and reduces its close-focus distance from last year's 3cm to 2cm -- about 0.8 inch. That means macro shots will have much better background blur, Reynolds said.

New Google camera app gets "pro controls"

Google is proud of its "computational raw" technology, which combines the multishot blending used to create ordinary JPEGs with the editing flexibility of raw photos. That means more dynamic range than with single-frame raw shots, which is handy for people who edit their photos afterward in software like Adobe Lightroom.
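
Google hasn't detailed the merge itself, but the benefit is easy to see with a toy multi-frame average; frame alignment and everything else in the real pipeline are omitted, and none of this is Google's actual code.

```python
import numpy as np

def merge_raw_frames(frames: list) -> np.ndarray:
    """Toy multi-frame raw merge: average N aligned raw frames.

    Averaging cuts random noise by roughly sqrt(N) while the data stays
    linear, so the merged 'computational raw' keeps far more editing
    headroom than any single frame would.
    """
    stack = np.stack(frames).astype(np.float64)
    return stack.mean(axis=0)

# Simulated burst of 8 noisy raw frames of a flat gray scene.
burst = [np.random.poisson(100, size=(300, 400)).astype(np.float64) for _ in range(8)]
merged = merge_raw_frames(burst)
print(burst[0].std(), merged.std())  # noise drops by roughly sqrt(8)
```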

With the Pixel 8 Pro's camera app, Google is giving photographers new controls to fine-tune their shots as they're taken. The app's new "pro controls" expose options for shutter speed, exposure length, white balance, ISO sensitivity and focus, Reynolds said.

Most people won't shoot raw or use the manual controls, but that doesn't mean they're not important. "It gets you the photo you need when you absolutely have to have it a certain way," Reynolds said.

Another big change to Google's camera app has already started arriving. For years, that app has presented buttons with a choice of modes like photo, video, panorama, Night Sight and slow motion video. Now Google offers a master switch for video and photo, each with its own range of buttons.

Also new to the camera app is an improvement to the DNG files used for raw photos. They now store more metadata so software like Adobe Lightroom will display a version that better matches the colors and tones of the Pixel's fine-tuned JPEG. Raw photos in general have been retuned for better color and subtle tonal differences, Google said.

The Pixel 8 and 8 Pro phones are the first smartphones to use Ultra HDR, an Android photo format that adds extra information to a JPEG photo so compatible software can show a high dynamic range version. For instance, Ultra HDR brightens stars in astrophotography photos, which show more detail at 50-megapixel resolution.
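
The Ultra HDR format itself is defined by Android; the sketch below only illustrates the general gain-map idea -- a standard JPEG base image that HDR-aware software boosts per pixel, up to the display's available headroom -- and simplifies away the real format's encoding and metadata details. Function names and parameters are illustrative.

```python
import numpy as np

def apply_gain_map(sdr: np.ndarray, gain_map: np.ndarray,
                   headroom_stops: float = 2.0) -> np.ndarray:
    """Toy HDR reconstruction from an SDR base image plus a gain map.

    sdr: linear base image with values in [0, 1]
    gain_map: per-pixel values in [0, 1] saying how much extra brightness to apply
    headroom_stops: how many extra stops the display can show above SDR white
    """
    boost = 2.0 ** (gain_map * headroom_stops)  # up to 'headroom_stops' stops brighter
    return sdr * boost

sdr = np.random.rand(120, 160, 3)   # stand-in for the decoded base JPEG
gain = np.random.rand(120, 160, 1)  # stand-in for the decoded gain map
hdr = apply_gain_map(sdr, gain)     # what an HDR-aware viewer would display
```

Software that doesn't understand the gain map simply ignores it and shows the ordinary JPEG, which is how the format stays backward compatible.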

New Pixel 8 shooting tricks

"The computational photography in Pixel camera is one of the biggest mobile computing breakthroughs in the last decade, and it's moved the entire industry forward," said Google devices chief Rick Osterloh at a press conference Wednesday. That sounds self-congratulatory, but Google can claim credit for effectively tackling the challenges of wringing useful image quality out of small image sensors.

Computational photography, often employing artificial intelligence technology, figures into some other abilities of the Pixel 8 and 8 Pro:

  • For group photos, a new feature called Best Take lets you choose the faces you want from a group of photos. It shows thumbnails of each face in the photos, and when you tap on one, it shows you the various expressions. You can pick everybody's best smiles or goofy faces for the composite photo the camera creates.
  • The G3's AI acceleration automatically cuts crowd noise and wind out of videos. But a new editing tool called Audio Magic Eraser isolates different sounds to let you pick what you want: sound level sliders adjust the mix of speech, wind, music, crowd hubbub and background noise. (A minimal mixing sketch follows this list.)
  • A new Magic Editor tool lets you increase or decrease the size of scene elements like people. One tap outlines a scene element, then pinching or dragging moves it around. Like Video Boost, this tool goes to Google's cloud computing system to do the heavy lifting of creating any new imagery that's needed. The tool will only be available in an early access version to start.
  • A tool called Zoom Enhance will use generative AI to create higher-resolution photos out of smaller, pixelated originals.
  • Google has taken some measures to improve lens flare problems common on smartphones. Among other things, when shooting toward bright point sources of light like the sun, the Pixel can remove the distracting green dot such sources often produce.
  • Magic Eraser, which lets you obliterate scene elements like distracting people in the background, gets a big AI boost. It now uses generative AI so the phone can fill in larger areas. It also can remove shadows from selected elements you're erasing.
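
Here's the mixing sketch promised above. The genuinely hard part -- splitting one recording into separate stems with a machine-learning source-separation model -- isn't shown; this only illustrates how slider values could recombine already-separated tracks, and every name here is made up.

```python
import numpy as np

def remix(stems: dict, sliders: dict) -> np.ndarray:
    """Recombine separated audio stems using per-stem level sliders.

    stems: name -> mono waveform (values in [-1, 1]), all the same length
    sliders: name -> gain; stems without a slider keep their original level
    """
    mix = np.zeros_like(next(iter(stems.values())))
    for name, audio in stems.items():
        mix += sliders.get(name, 1.0) * audio
    return np.clip(mix, -1.0, 1.0)

# One second of fake 48kHz stems standing in for a separation model's output.
stems = {name: np.random.randn(48000) * 0.1 for name in ("speech", "wind", "crowd")}
quieter = remix(stems, {"wind": 0.2, "crowd": 0.5})  # keep speech, duck wind and crowd
```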

How well all these features and technologies work remains to be seen. But it's clear Google is investing heavily in the Pixel photo and video technology.
