

Social media and video sharing sites have faced criticism for being slow to respond to the first-ever live-streamed mass shooting, recorded from the first-person perspective of the shooter, the camera seemingly mounted atop the killer's helmet.

A Facebook vice president said fewer than 200 people saw the Christchurch massacre while it was being streamed live on the site. But the video was viewed about 4,000 times before Facebook removed it, he added. Countless more views occurred in the hours afterward, as copies of the video proliferated more quickly than online platforms like Facebook could remove them.

But executives from the sites say they have been doing what they can to combat the spread of the video, one possibly designed for an age of virality.

Al Noor mosque is shaded by clouds in Christchurch, New Zealand, on Tuesday.

Experts say social media companies could do more

Social media companies used to take a mostly hands-off approach to moderating content on their sites, but now more than ever they are trying to manage the societal problems their sites create, reports Allyn. Facebook, Twitter and other sites like them have teams of thousands working to moderate content and block violent media from reaching people.

For example, Twitch, the site the Buffalo shooter livestreamed on, could make it harder for people to open accounts and instantly upload live videos. Other video-streaming sites like TikTok and YouTube require users to have a certain number of followers before they're able to stream live, reports Allyn.

Listen to his discussion on Morning Edition.

"The social media platforms that profit from their existence need to be responsible for monitoring and having surveillance, knowing that they can be, in a sense, an accomplice to a crime like this, perhaps not legally but morally," Hochul said.

Allyn reports that social media companies usually are not held liable for what they don't police on their sites.
