Posted: 2021-05-20 04:00:11

If Amazon’s Alexa thinks you sound sad, should it suggest that you buy a tub of ice cream?

Joseph Turow says absolutely no way. Turow, a professor at the Annenberg School for Communication at the University of Pennsylvania, researched technologies like Alexa for his new book, “The Voice Catchers.” He came away convinced that companies should be barred from analysing what we say and how we sound to recommend products or personalise advertising messages.

Turow’s suggestion is notable partly because the profiling of people based on their voices isn’t widespread. Or at least, it isn’t yet. But he is encouraging policymakers and the public to do something I wish we did more often: be careful and deliberate about how we use a powerful technology before it is used to make consequential decisions.

Should Alexa be able to react based on the tone of your voice? Credit: AP

After years of researching Americans’ evolving attitudes about our digital jet streams of personal data, Turow said that some uses of technology had so much risk for so little upside that they should be stopped before they got big.

In this case, Turow is worried that voice technologies, including Amazon’s Alexa and Apple’s Siri, will morph from digital butlers into diviners that use the sound of our voices to work out intimate details like our moods, desires and medical conditions. In theory, they could one day be used by the police to determine who should be arrested or by banks to decide who’s worthy of a mortgage.

“Using the human body for discriminating among people is something that we should not do,” he said.

Some businesses, such as call centres, are already doing this. If computers assess that you sound angry on the phone, you might be routed to operators who specialise in calming people down. Spotify has also disclosed a patent on technology to recommend songs based on voice cues about the speaker’s emotions, age or gender. Amazon has said that its Halo health tracking bracelet and service will analyse “energy and positivity in a customer’s voice” to nudge people into better communications and relationships.

Turow said that he didn’t want to stop potentially helpful uses of voice profiling — for example, to screen people for serious health conditions, including COVID-19. But there is very little benefit to us, he said, if computers use inferences from our speech to sell us dish detergent.

“We have to outlaw voice profiling for the purpose of marketing,” Turow told me. “There is no utility for the public. We’re creating another set of data that people have no clue how it’s being used.”
