The shooting rampage on Saturday at a grocery store in Buffalo is believed to have been streamed live online by the gunman. The shooter used a GoPro camera attached to a military-style helmet to broadcast live on Twitch; the service terminated the broadcast after roughly two minutes. Clips of the video have since spread to other websites.
Experts think platforms might do more to prevent atrocities from being live streamed
It is not out of the ordinary for white nationalists to use social media to publicize an impending attack. The white supremacist who carried out the 2019 shooting rampage in Christchurch, New Zealand, likewise gloated about his crimes online, and he was not the first of his kind to do so. Since the Christchurch tragedy, social media companies have refined their responses to violent material posted on their platforms.
One particular area of improvement is the speed with which they can shut down live streams of attacks. However, some users save violent videos, such as footage of mass killings, and those clips eventually resurface on platforms such as Facebook, Instagram, Twitter, and TikTok. According to Bobby Allyn of NPR, videos that have been re-uploaded many times are harder for companies to delete.
Allyn reports that more than three million people viewed the clip of the Buffalo shooting on Streamable before the site removed it. New York Governor Kathy Hochul has said that the companies running social media platforms should bear some of the responsibility for tragedies like the one in Buffalo.
Hochul stated that the platforms profiting from this content must be held accountable for monitoring it, because they are complicit in such crimes, if not legally then at least ethically. According to Allyn, social media networks are generally not held legally liable for user content they fail to filter or moderate. Listen to his commentary on Morning Edition for his view on recent events.
Experts argue social media firms can do more
Social media companies once took a hands-off approach to moderating user-generated content. However, as Allyn reports, platforms now make more of an effort than ever to address the social problems their services create. Twitter, Facebook, and similar platforms employ thousands of moderators whose job is to regulate content and keep hazardous material from reaching users.
For example, Twitch, the video-streaming site from which the Buffalo shooter broadcast his rampage, could make it harder for new accounts to begin streaming immediately after signing up. According to Allyn, sites such as TikTok and YouTube already require users to accumulate a certain number of followers before they can broadcast live.