
Posted: 2024-11-03 18:44:46

They were the people tasked with keeping users safe on the world's biggest social media platforms.

One by one they left the companies, disturbed by the risks of disinformation, threats to democracy and extreme dangers, particularly for young people.

These former insiders and whistleblowers say the guardrails meant to protect users are limited and ineffective, and that social media companies are reluctant to respond to criticism.

Whether or not you use X, Instagram, TikTok or Facebook, these platforms are shaping the world, upending lives and having profound consequences for their youngest users.

Instagram inaction

Meta product safety engineer-turned-whistleblower Arturo Béjar says "every parent should know how bad it is" on social media.

He left Facebook in 2015 but rejoined the company in 2019 because he was so concerned about the unwanted sexual advances his own teenage daughter was experiencing on Instagram.

When he returned, he was shocked to learn that most of the safeguards he and his team had put in place had been abandoned.

Arturo Béjar says Meta could absolutely act to make its platforms safer. (Four Corners: Rob Hill)

In October 2021, Béjar wrote to founder Mark Zuckerberg to warn him of the dangers to young people.

Béjar pointed to internal company research that showed "51 per cent of Instagram users say 'yes' to having had a bad or harmful experience in the last seven days".

The research also found more than a quarter of 13 to 15-year-olds said they had received unwanted sexual advances.

Béjar says that Zuckerberg didn't even bother to write back.

"I think that the company cares more about protecting its reputation and the risk of a leak than they do to understand actual harm and reducing it."

Béjar says some of Instagram's youngest users were reporting the most harm. (Four Corners: Rob Hill)

Béjar also shared his concerns, and his recommendations for addressing the issues, with the head of Instagram, Adam Mosseri.

"I had my meeting with [Mosseri] and he understood everything I was talking about," Béjar says.

"He didn't disagree with any of it."

That meeting was in 2021. Béjar says Mosseri did not make any of the changes to the platform he suggested.

Meta disagrees, saying it has implemented measures such as warnings to discourage people from making hurtful comments, "kindness reminders", and restrictions that stop teens from seeing content related to self-harm and eating disorders.

On many of the measures of harm examined by Béjar and his team — including witnessing self-harm content on the platform — 13 to 15-year-olds were the worst-affected age group.

"It's insane," Béjar says, "there's no way for a kid to press a button to let Instagram know that that happened."

When asked why he thinks that is the case, Béjar replies curtly: "Because they don't want to know."

"Because I think that if they knew … they might feel they have a responsibility to deal with it."

Béjar has no doubt that Meta could act if it wanted to.

"If Mark [Zuckerberg] woke up tomorrow morning and said, we are no longer going to show anything that has remotely to do with self-harm to 13-year-olds, it would take a few months [but] … they have the infrastructure and the capacity to do that.

Béjar took his concerns to the very top of Instagram and Meta. (Four Corners: Rob Hill)

"I've seen what the company can do when Mark deems it a priority. And I've seen how hard it is for the company to do something if Mark doesn't deem it a priority."

He has three words to sum up social media as it now operates: "Unnecessarily, tragically harmful."

Meta says its latest move, introducing "Teen Accounts", provides teens with "built-in protections" and "gives parents more oversight, including seeing who their teens are chatting with and setting time limits".

Whistleblowers say these measures don't go far enough.

Meta 'knew what was happening'

Lori Schott believes her daughter would still be alive if Zuckerberg had made his platforms safer for young people.

Lori Schott wants Meta to be held accountable for the deaths of her daughter and other teenagers. (Four Corners: Rob Hill)

Her daughter, Annalee, known as Anna, killed herself in 2020. Lori said Anna had become addicted to social media, including Instagram and TikTok.

"Instagram fed her the worst content. I feel like it is a heat-detecting missile that finds these weaknesses in these children and just zone[s] in on it.

"They knew that Anna's weakness was anxiety or depression … going to accounts or looking these things up. And it just magnified it and it did not stop."

She's now one of the thousands of American parents suing Meta.

Internal Meta documents discovered during litigation against the company show that in the months before Anna's death, Zuckerberg was warned about Instagram causing dangerous body dysmorphia in teen girls.

Zuckerberg cancelled a meeting called to consider proposed changes, describing them as "paternalistic".

"They knew what was happening," Schott says of the Meta leadership.

Anna Schott took her life in 2020. (Supplied)

"How could this happen? And they're not accountable to this day."

In February, after being grilled by the United States Senate Judiciary Committee, Mark Zuckerberg apologised to families impacted by his company's platforms.

"I'm sorry for everything you have all been through," Zuckerberg told the parents.

"No one should go through the things that your families have suffered," he said, adding that Meta continued to invest and work on "industry-wide efforts" to protect children.

Lori Schott says Zuckerberg needs to be held legally liable for the suicide deaths of teenagers like her daughter.

"He had the ability … stop it, change it, make it for the good of the kids."

"But trying to improve it was not on his agenda and he's going to have to live with that for the rest of his life. He has blood on his hands."

Meta says it is confident the evidence will demonstrate that neither the company nor any of its executives have any liability.

It says it has strict rules against content that is graphic or glorifies suicide or self-harm.

The X 'disinformation machine'

Eddie Perez led the team at Twitter tasked with tackling false information and extremism.

Perez left in 2022 just before Elon Musk took over and renamed it X, slashing about 80 per cent of the company's workforce — particularly in the trust and safety division.

He says the company once set high standards around accuracy of information — especially when it came to elections — but those standards, along with a tremendous amount of institutional knowledge, have been lost.

"The attempt to balance freedom of speech with quality of information and reduction of harm was a very, very serious part of our mission," Perez says.

"In its place, Twitter today is … the most generous thing you can say, is a wild, wild West."

Eddie Perez says X is operating like a "disinformation machine". (Four Corners: Rob Hill)

Perez says the new standard being set at X comes right from the top of the company and its billionaire owner who, he believes, has weaponised the social media platform for political purposes.

Elon Musk now has more followers than anyone else on the platform.

Eddie Perez says there are multiple examples of Musk posting things that would have violated Twitter's old rules and would have been taken down by the platform.

For example, Perez says, when false claims about the person responsible for the murder of three girls sparked a series of racially motivated riots in the UK in July, Musk suggested in a post that "civil war is inevitable".

In September, Musk mused about why no one had tried to assassinate Joe Biden or Kamala Harris — also a content violation according to Perez.

There was also the false claim, whipped up on social media and posted on X by Musk, that Haitian migrants in an Ohio town were eating people's pets.

Elon Musk's reply on X: "Apparently, people's pet cats are being eaten". (X)

This was then repeated in the presidential debate by Republican candidate Donald Trump.

"I mean, it would be laughable if it weren't so harmful," Perez says.

"X is fundamentally operating like an amplifier and disinformation machine, almost like a factory for these inflammatory right-wing messages.

"The bottom line here is there is very real human harm. What happened to that small town [in] Ohio was terrible. They ended up with bomb threats, they ended up having to evacuate schools."

Musk has now donated more than $US100 million ($153 million) to Trump's campaign and has used his more than 200 million followers on X to amplify false claims and conspiracy theories about the election.

Eddie Perez believes that social media is tearing at the heart of democracy.

"I don't think the stakes could be any higher," he says.

Perez no longer uses the platform he once worked for.

But he says it's a mistake for people who don't use X, or who have vacated the platform, to write it off or underestimate its influence on culture and politics.

"X has a very harmful place in the current media ecosystem, whether people are actively participating in it or not. The ripple effects from that will be felt."

Four Corners made several attempts to contact Elon Musk and his companies, but we did not receive any replies to our correspondence.

Facebook safety 'cut to the bone'

Concerns about democracy aren't confined to X.

Facebook whistleblower Frances Haugen believes that Elon Musk has effectively given permission to other social media proprietors to slash the teams tasked with stopping disinformation and protecting users.

"Seeing that you could fire huge numbers of people and not face really any meaningful consequences, Mark Zuckerberg followed and fired 20,000 employees," says Haugen, who worked as a product manager at the platform's parent company Meta.

Haugen blew the whistle after Facebook dissolved its civic integrity team weeks after the 2020 US election.

She says it was the "one area of the company … that was responsible for trying to make sure that Facebook was a positive force in the world".

Haugen says safety teams have been "cut to the bone". (ABC News: Billy Cooper)

In 2021, Haugen leaked thousands of pages of documents from Facebook and appeared before the US Congress to highlight her concerns.

She says Facebook users have become more vulnerable to false and harmful content since then.

"They closed [CrowdTangle], the only mechanism for transparency that existed for either Facebook or Instagram … on the 15th of August [2024].

"During the US election season, there will be no tool available from Facebook to be able to see what's going on, on the platform, even in a moderate way."

Frances Haugen was also concerned about safety for young people on the platform.

"Unquestionably, we have a more dangerous form of social media today than we had in 2020," Haugen says.

"They've really begun to cut down to the bone what they invest in safety."

Meta says it has more people working in safety and security now than it did in 2020.

Krieger says most parents are out of touch with what their kids are seeing on social platforms. (Four Corners: Rob Hill)

Zvika Krieger was hired to address user safety when he went to work for Facebook in 2020.

It came at a cost.

"Spending two years every day reviewing the worst of the worst … child pornography and human trafficking and violent and extremism, body dysmorphia, suicide, all of these horrible things, it weighs on you," Krieger says.

After two years, Krieger left Facebook, completely burnt out.

Krieger doesn't see himself as a whistleblower because, he says, he has always been up-front with the companies he's worked for that he is a critic who wants to make their platforms better.

He is now a leader of a Jewish spiritual community but still consults for social media companies on how to make their platforms safer.

Despite having spent his adult life working for social media companies, Krieger is adamant that he would not allow his own children to be on the platforms.

"I've spent much of the past 10 years exploring and being exposed to and trying to mitigate the dregs of the social media world."

"I would not want to expose my child to that."

Krieger says the companies tend to see harm reduction measures as points of "friction" — slowing down the user experience of the platform, potentially turning them off it and leading to a loss of revenue.

Krieger says companies don't want to unnecessarily impact the user experience on their platforms. (Four Corners: Nick Wiggins)

He says this generation of parents is still completely out of touch with what their children are experiencing, even though they use social media themselves.

"Parents are lulled into this sense that they understand what's happening on the platform, and they can protect their children, but they don't even know what they don't know."

"There's a huge gap in digital literacy, digital knowledge, digital intelligence between parents and children in every generation.

"But the potential harms of that gap are exponentially more worrisome in this generation than in previous generations, given the ways that social media has been proven to harm children."

TikTok's hate and harm

Lori Schott's anger isn't just reserved for Instagram. She is also furious at and disturbed by what her daughter saw on TikTok.

The platform is designed for young people, but after Anna's death she was horrified to discover the videos that had been suggested to her daughter in her "For You" feed.

Lori Schott was stunned by the content Anna was seeing on TikTok. (Four Corners: Rob Hill)

Anna was experiencing depression and had also at one point looked up suicide prevention charities for a school project.

The videos in her feed — suggested by TikTok's algorithm — are preoccupied with death and suicide:

"I hope death is like being carried to your bedroom when you were a child," one video said.

"Knowing there's only one way to stop the pain," another said.

On a page in her journal headed "TikToks", Anna wrote in her own handwriting:

"Technically, if I kill myself, the problem will be gone."

It was only by hiring private investigators that Lori Schott discovered what Anna had been seeing on TikTok.

"When I opened it up … it brought me to my knees. It crippled me," Lori says.

"I can't unsee what I saw. And to think that children of all ages are exposed to something like this and to try and figure out why, these things called algorithms were feeding these to Anna."

TikTok says it has strict community guidelines and doesn't allow content depicting self-harm or eating disorders.

Andrew Kaung saw the dangers of TikTok up close when he worked as a planning analyst for the social media company in its headquarters in Ireland.

"I've seen so many videos … people dying, people committing suicide, people getting beheaded," Kaung says.

"I remember a video where this guy, he was stomping a cat to death, and they think that it's a good, good idea to upload that video on TikTok, child sexual abuse, child pornography, sexual assault."

Andrew Kaung says he saw horrific content when working at TikTok. (Four Corners: Alex McDonald)

He says some of this content is still on the platform today.

"On a platform that is targeted toward younger generation and younger people — it's horrible."

TikTok says it has strict controls for users under 18, and actively removes accounts of those it suspects are under 13.

Kaung demonstrates for Four Corners the dangers of TikTok's algorithm-driven feed, setting up a new account for a male user and putting in the search term "immigration UK".

Within minutes, the suggested content turns xenophobic and dark, constantly throwing up videos about England being invaded by Muslims.

"The more time you spend on it, you're going to go down further into the rabbit hole," Kaung says,

"That's sort of how they keep you addicted."

Like so many other people employed to help protect the public from the worst excesses of social media, Kaung also got burnt out.

He says he had been assigned to work with the team moderating the war in Ukraine and got swamped with a blizzard of disinformation.

Kaung says self-regulation isn't working. (Four Corners: Rob Hill)

Like all of these insiders, Kaung supports much more comprehensive external regulation and has a phrase he likes to use to describe the futility of self-regulation by the social media companies.

"It's [like] asking a tiger not to eat you," Kaung says with a wry chuckle.

"We are asking a social media company to regulate themselves to remove … harmful content.

"But the thing is, that goes against their objective, because the objective is to attract as many users as possible."

Watch Four Corners' full investigation, Disconnected, tonight from 8:30pm on ABC TV and ABC iview.
