
Posted: 2017-11-29 20:46:34

Persuasion architecture – in the physical world we are somewhat blasé about it; after all, there are only so many chocolate bars that can be placed at a child's eye level near a 7-Eleven checkout.

But digital persuasion architecture can be scaled to billions and tailored and targeted to each person's specific likes, dislikes, what they post, what they delete. And it can be delivered in private. One on one, intimate, just the algorithm and you. We have no real visibility on this.

In the fascinating and, at times, disturbing TED talk "We're building a dystopia just to make people click on ads", Turkish academic Zeynep Tufekci points out that algorithms that predict human behaviour only work if there is a tremendous amount of data, so they intrinsically encourage deep surveillance. It is difficult to predict who might buy something unless you know a lot about everyone else who bought the thing, so that you can find the common denominators.

She also posits a (hopefully) hypothetical scenario: what happens if the algorithm learns that it is easier to sell to people who are bipolar and about to enter the manic phase? In case you are wondering, the onset of mania can be predicted from social media posts. But I'm sure it's fine, right? I mean, those companies would have something in place to stop that, right?

Well, Facebook managed to not notice – or, bluntly, possibly not care – that 126 million people saw posts produced by Russian-government-backed agents. With all its vaunted ability to process billions of data points, the company didn't seem to notice that electoral ads were coming from Russia.

There is also the phenomenon of escalating extremism. As Tufekci points out, "you are never hardcore enough for YouTube". Watching a Trump rally leads to white supremacist videos; watching Clinton and Sanders leads to the conspiracy left. The rabbit holes can be divisive, immersing people in ever deeper echo chambers and further removing common bases of information. And all the while you are being served ads.

Tufekci mentions that Trump's social media manager revealed that they used Facebook non-public "dark-posts" to demobilise voters, to convince them not to vote. They targeted, among others, African-American men in specific key cities. One can't help wondering whether Cambridge Analytica, the company that proudly boasts 5000 data points on every American and that worked on the Trump campaign, may have assisted in this process.

So, whither democracy? Is it concerning enough merely to glimpse the manipulation available to the well-resourced, or should we also ponder to what extent the owners of such platforms could manipulate elections for their own ends, and how opaque that process would be if they chose to?

The same persuasion architectures and authoritarian opportunities that once were used to sell shoes are now being used to sell ideologies and to make public debate impossible.

The mission statement "to give people the power to build community and bring the world closer together", and the famous and much-discussed "don't be evil", do not appear to be entirely consistent with how these companies are allowing their platforms to be used.

It may be that platforms such as these will eventually force a level of regulation, but the companies themselves have no financial incentive to alter their approach, and there appears to be little real will to act from the political sphere.
