Facebook said it deleted 1.5 million copies of the New Zealand mosque terror attack video from its platform in the first 24 hours after the massacre.
The social networking giant said in a tweet Saturday night that the majority of the videos were blocked at upload, before they ever appeared on the site.
"In the first 24 hours we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload...," Facebook said in the tweet.
Facebook said it's also removing all edited versions of the video that don't show graphic content. This is being done, a New Zealand Facebook spokeswoman said, "out of respect for the people affected by this tragedy and the concerns of local authorities."
On Friday, a gunman in Christchurch attacked Muslims praying at a mosque and livestreamed the shooting on Facebook. The social network removed the video and deleted the shooter's account. But that didn't stop the clip from spreading across the internet.
The roughly 17-minute video was downloaded from Facebook and then re-uploaded to YouTube multiple times, with new posts often appearing within minutes of one another. YouTube is encouraging users to flag any videos showing the clip and said it has removed thousands of videos related to the shooting over the last 24 hours.
Authorities in New Zealand reported that 50 people were killed and at least 20 wounded at two mosques on Friday. One suspect is in custody and has been charged in the attacks.