
Facebook CEO Mark Zuckerberg once hailed streaming as the next big thing in social media.

James Martin

There's a lot not to like about Facebook these days. But when one of its core features, livestreaming video, is being used as a tool for terror, simple outrage just isn't enough.

The latest evidence that live video needs to change came Friday when, before entering a mosque and committing one of the deadliest mass murders in New Zealand history, the accused gunman began a stream on Facebook Live.

As he prepared to carry out his heinous act, he quipped, "Remember, lads, subscribe to PewDiePie." Then his followers watched for nearly six minutes as he streamed video from the massacre that otherwise could have looked like it came from Call of Duty, Battlefield or any other realistic war simulation game.

It was a shocking reminder that the magical devices tech companies create, the ones that give us access to nearly all human knowledge and easy ways to take photos and connect with friends, have a darker, more grisly side.

Technology has always had pluses and minuses that we take in stride because, on the whole, it's worth it. But now is the time to consider whether livestreaming in particular may be the first Silicon Valley invention that needs radical reform, or should simply go away.


A floral tribute near the scene of the mass murder in Christchurch, New Zealand, that was streamed live on Facebook.

Getty Images

"The New Zealand shooter was able to livestream a 17-minute video of his murderous rampage that continues to spread like wildfire online. This is flatly unacceptable," said Farhana Khera, executive director of civil rights organization Muslim Advocates, in a statement. "Tech companies must take all steps possible to prevent something like this from happening again."

Of course, livestreaming won't go away. It's already ingrained in internet culture as a driving force behind online news services, quiz shows and popular video games like Fortnite.

At a minimum, though, the approach to livestreaming needs to change. Facebook and others, like YouTube, Twitter's Periscope and Amazon's Twitch, need to treat this technology as the potential tool for mass terror that it is. Otherwise, this situation is only going to get worse.

None of the companies offered details when asked if they had plans to prevent violent streams in the future.


A lot of the internet uses livestreaming these days. But it needs a lot of work.

Getty Images

A dark start

When Facebook CEO Mark Zuckerberg announced Facebook Live in April 2016, he heralded it as the next big thing in social media. "This is a big shift in how we communicate, and it's going to create new opportunities for people to come together," he wrote in a Facebook post.

Within a year, the service had been used to broadcast at least 50 acts of violence, including murders and suicides, according to a tally by The Wall Street Journal. One showed a man with special needs being tortured during a 30-minute livestream in Chicago. Another showed the alleged sexual assault of a 15-year-old girl, watched live by dozens of people. And there was Philando Castile, a 32-year-old school cafeteria worker, killed by a police officer during a traffic stop over a faulty taillight.

Facebook vowed in each case to do better, and it has failed every time.

Once again, Facebook didn't stop the live video of the horrific shootings in New Zealand that left 49 people dead and at least 20 wounded. That video is now being uploaded to YouTube, Twitter and other websites.

Lavish Reynolds livestreamed her boyfriend Philando Castile's dying moments after a police officer shot him during a traffic stop in 2016.

Getty Images

Some sort of fix

The good news for Facebook is that news organizations have already come up with a solution. They time-delay their broadcasts to ensure nothing unsavory hits the air.

With that extra time, Facebook could have its employees identify anything that could spell trouble. You know, stuff like people with guns, screaming, or anyone who appears to be in distress.

Then, Facebook could take the $22 billion in profits it notched last year and put it to some use. With that kind of money, Facebook could fund an army larger than the one that protects Israel. So why not raise an army of content reviewers instead?

It could give those reviewers the tools and the time they need to keep an eye out for when the uglier parts of the internet show up.

To be sure, this wouldn't stop every terrorist, rapist or bully. But it would put a dent in the problem.

And maybe then, Facebook could stop making promises, and actually do better instead.

First published March 15 at 12:01 p.m. PT.
Updated 3:20 p.m. PT: Adds company comments.
