Posted: 2023-10-15 18:00:00

Despite a similar hardware design, Google’s own take on the new Android 14 software does a lot to give the Pixel 8s a fresh look. There are a lot of lock-screen styles to choose from, which double as always-on-display designs (I especially like the new one that puts the date and temperature against the phone’s longer sides), and the options to colour theme your phone, widgets and icons automatically to your wallpaper continue to impress.

Embedded in the top of the display is a little selfie shooter similar to last year’s, but AI gives it a new trick: it can recognise you far more accurately. The change means you can authenticate banking apps and sign in to services with your face, just like on an iPhone, without a big black spot at the top of the phone. The one downside to Google’s approach is that it doesn’t work in the dark, so you still need a fingerprint or PIN.

The phone for photo AI

Speaking of AI, this year’s update is big on generative features, both obvious and subtle. One example of the former is the new “AI wallpaper” feature, which is a DALL-E style text-to-image generator but with a lot of limitations. Choose a theme, pick some keywords, wait 20 seconds and you’ll be given eight different options to choose from. They’re always fairly abstract and don’t stand up to close scrutiny — to be honest, I’d always prefer one of the many real photos or paintings from Google’s wallpaper collection — but as a gimmick it works and will only get better.

Google’s Best Take feature can swap people’s faces when you take multiple shots.

Generative AI is more present in the new Google Photos editing suite too, where it walks the line between helping you realise the intention of a photograph and letting you straight-up invent stuff. The marquee feature here is Magic Editor, which is scarily good at replacing a gloomy sky with a nice blue one, shifting the colours for a “golden hour” look, removing unwanted elements or even completely changing the composition.

For example, I grabbed an older photo of my two kids at mini golf, posing on the left of the frame. Behind them is a giant statue of a cartoon rhino, centre frame. In the Magic Editor I tapped one kid to select him, held down to edit, then dragged him over to the right of the frame. After processing, the image just looks like the kids were naturally standing on either side of the rhino. The spot where the repositioned kid used to be standing now has some convincing invented detail, including a bit of path, some scattered bark chips and one of the rhino’s hands. It even gives you a few options to choose from so you can pick the most natural-looking one. The feature is also great at enlarging the moon or removing dead tree branches. But for whatever reason, any time I tried to change the size of a person I was told it was against the company’s ethics policy.

On the left is my original photo, the middle is with Google’s AI-generated sky, the right has “golden hour” turned on. Credit: Tim Biggs

There’s also a feature called Best Take, which appears if you’ve taken a series of photos featuring a group of people. Pick one of the photos, and you can tap on each person’s face to cycle through the various expressions they made throughout the set, ending up with one picture featuring everyone’s best face. Like the Magic Editor it’s far from foolproof but can result in fakes nobody would pick at a glance.

To be honest, I can’t ever see myself using these tools on my own personal photos. I’m aware that some level of AI processing has been present in smartphone photography for a long time and is here to stay, but intentionally changing the content feels unnerving. That said, I can definitely see using it in place of Photoshop if I needed a specific edit of a non-human subject quickly.

For video, a new Audio Eraser will analyse clips for sounds and show a few of them as separate waveforms (for example, wind, speech or nature). Then you can watch and listen to the video while moving the levels around to cut out talking or annoying gusts. Like a lot of AI photo editing, it works very well but leaves artefacts you’ll notice if you’re specifically looking for them. Other features, like a video enhancer that utilises Google’s cloud servers, are not present at the phones’ launch.

Comparing Apples and Androids

The new Pixels come hot on the heels of new iPhones, and the two families of devices share some similarities despite being fundamentally tough to compare. Google’s phones have grown more premium and expensive year on year, while Apple’s have become more open — notably this year with the introduction of a USB-C port — so they’re closer to equivalent than ever.

Comparing the standard phones, Pixel 8 and iPhone 15, Google’s immediately stands out as more premium despite being $300 cheaper. They’re similar sizes and have similar camera set-ups, but Apple’s lacks a fast refresh rate and an always-on display. Under the hood Apple has also withheld some features from the standard iPhone that the Pixel happily supports, including USB 3.2 for much faster data transfer, and autofocus on the ultra-wide camera, which makes for a great close-up macro mode.

When it comes to the high end, Pixel 8 Pro against the iPhone 15 Pro Max, the gap is much closer though the prices are farther apart. (The iPhone starts at 256GB here, so a true like-for-like comparison would put the Pixel 8 Pro at $1800, but that’s still a $400 gap.)

The phones are evenly matched across almost all specs, though Apple has the strong advantage of an immensely powerful processor that outpaces the Pixel in raw strength. In cameras Google has opted for bigger sensors and more flexibility in editing, and the Pro has an excellent manual control mode, but Apple’s shots tend to be more pleasant for quick snaps since you can preset your preferred temperature and crop. When it comes to portrait mode and low light photography the Pixel is far more confident, and both phones now support adding a bokeh blur after the fact; Apple by capturing depth data when it detects a face and Google through the Magic Editor.

Both happily shoot 4K HDR video that looks amazing, though Apple supports Dolby Vision and ProRes which may suit professionals better.

iPhone users have access to a pretty full suite of Google products these days, and even features like the Magic Eraser have made their way from Pixel to iPhone, though the opposite is not true; those who choose Apple services tend to have a hard time on Android. The unique strength of the Pixel, then, is hardware that’s been specially tuned for Google’s AI tasks, and the latest in experimental features that may or may not become widely used in the future.

Still, AI features aside, these are the nicest Pixels Google has yet produced, with some of the most genuinely useful features and best cameras you’ll find on an Android.
