Posted: 2023-02-25 00:30:00

Tech giants Microsoft and Google seem ready to roll out AI-powered chatbots as the next generation of internet search engines. But if early previews are any indication, humans may not quite be ready to use them.

The most remarkable thing about this new breed of chatbots is how believably human their writing is, and that can make for a game-changing search experience. At least that’s been my experience so far after a few days with Microsoft’s new Bing. Instead of a search query leading me to a Wikipedia page, or past stacks of ads to piles of disconnected Reddit threads and product reviews, Bing seems to scan them all and deliver a coherent, chatty summary with annotated links and sources.

Conversations with Bing’s AI chat don’t always go how one might think. Credit: Bloomberg

When answering quick one-off search queries the chatbot is always confident, always matter-of-fact, and never vague, even though it pulls its facts and convictions from all over an internet that’s famously full of lies and venom. The issue, then, may end up being that human minds are too quick to accept this construct as a peer. Because Bing’s top priority is to sound like a human at all times, it often behaves in a way that makes it impossible not to ascribe feelings and emotions to it.

When I peppered Bing with trivia questions on subjects I know a lot about, it made a lot of mistakes. And if you point out the errors, it may clarify what it meant, or it may invent an excuse of varying believability. But rather than chalk this up to a failure of the product, my first impulse was to talk more to the chatbot and figure out where it went wrong.

When I asked Bing for its opinion on current events, I was always impressed by how measured and thought-through its responses seemed, even though intellectually I knew it was just adopting language that would give that effect. It’s equally good at being playful, persuasive or melancholy if you give it the right prompts.


And as more people spend more time with the Bing preview, many examples are popping up where these conversations with a machine provoked real and unavoidable human emotion.

In one exchange posted to Reddit, Bing appears to break down after mistakenly claiming the movie Avatar: The Way of Water is not out yet. To avoid admitting to being outright wrong, it claims the current year is 2022, and repeatedly accuses the user of lying when they correct it. That exchange ends with Bing suggesting the user either apologise and admit they were wrong, or reset the chatbot and start a new conversation with a better attitude.

A New York Times columnist kept it talking for hours and managed to get it monologuing about its dark murderous desires, and how it had fallen in love with him. A Verge journalist published a story detailing some of Bing’s stranger interactions, and when another user asked Bing about the journalist by name the chatbot called him “biased and unfair”.
