WARNING: This story contains graphic details.
On Saturday afternoon, Hoda Awale was scrolling through her Twitter feed when she came across news of a shooting outside a mall in suburban Dallas.
Almost immediately, she was confronted with graphic images of bloody victims, some of them children.
The video started playing automatically because of her account settings.
“There was no indication that the video would be as horrific as what I saw,” said Awale, who works for a public health nonprofit in Seattle.
“Especially seeing the kids. It really traumatized me in the moment.”
In all, eight people were killed and seven others were wounded in the parking lot of an outlet mall in Allen, Texas, before the gunman was shot dead by a police officer.
In an effort to prevent others from accidentally viewing the video, Awale tweeted a screenshot of its opening frame, warning of what lay ahead.
There was a mass shooting at the Allen Outlets in Texas, the video went viral of the bodies of the kids and I want to warn everyone not to open it if it’s on your TL. It’s horrible but that’s how the video starts #allenoutletmall pic.twitter.com/71myIfJse8
Awale was among a number of Twitter users who criticized the platform for not immediately removing the video, or at least adding a warning notice. She, like many others, had not turned off Twitter’s video autoplay function, a default setting on the platform, which is owned by Elon Musk.
As the video circulated on the platform, a debate played out about the limits of free expression and the power of images to both cause harm and drive change.
The power of video
In response to Awale, one Twitter user said the images should be widely seen in order to force U.S. lawmakers to finally change the country’s gun control laws.
“The world needs to see what is happening here. Everyone needs to watch,” they wrote.
Others questioned the value of censoring the video: “Why? So we can continue to live in denial?”
Twitter users protested over the gruesome images from Dallas and other mass murders. I say let them be seen so they can’t be denied. Juxtapose the victims’ corpses with the wacky Christmas cards sent by politicians showing them posing with their families packing AR-15s.
There is a long history of disturbing photos and videos leading to social change, such as the video recording of George Floyd’s killing and the Black Lives Matter protests that followed, said Prof. Heidi Tworek, director of the Centre for the Study of Democratic Institutions at the University of British Columbia.
The question, she emphasized, is whether “this video really belongs in that category, or whether it is a display of gratuitous violence, whether it disrespects the victims, whether it could inspire imitators.”
James Turk, director of the Center for Free Expression at Toronto Metropolitan University, points out that similar debates have been raging in traditional news media for years and show no signs of letting up.
“I don’t think there’s a right answer,” he said.
Either way, the video appears to violate Twitter’s terms of service, which state that “excessively gory content” is not allowed. Company rules also say media is prohibited if it has “the potential to normalize violence and cause distress to viewers.”
The video, published by multiple accounts and viewed many times, could still be seen on Twitter on Monday afternoon. It was not easy to find on other social platforms, such as Facebook or YouTube. (CBC News has decided not to publish the video.)
Musk describes himself as a “free speech absolutist,” but it is hard to say whether the recording was allowed to circulate on purpose, or whether that was a result of the company’s deep cuts to its content moderation team.
The company did not return a request for comment on Monday; CBC News received a poop emoji as an automated email reply.
Facebook declined to comment, while a YouTube spokesperson said it “quickly removes offending content” and ensures people “connect with quality information when searching for details about this tragic event.”
Legislating online harms
Awale, who has been a Twitter user for more than a decade, said she was concerned that the video, which she described as “trauma porn,” had still not been taken down.
At the very least, she said, there should be a content warning on the video so people can decide for themselves whether to watch.
In Canada, that kind of consideration may be part of upcoming federal legislation regulating online harms on social media platforms.
Tworek was part of a group of experts consulted on the issue in 2022. The proposed legislation, Bill C-36, died on the order paper; new legislation is expected in the coming months.
“This is not the first time we have had discussions about images and videos,” she said. “This is clearly a systemic issue with the platforms’ processes.”
She added that by now, even if it is removed from Twitter, the video will have been downloaded many times and reposted elsewhere on the internet.
According to Turk, drafting the legislation will be a challenge for the government, given the evolving nature of online content and differences of opinion on how it should be regulated.
“In my view, the only things the state should restrict are those that are already illegal,” he said, referring to content such as hate speech or the advocacy of violence.
“In terms of posting graphic material of a shooting or fighting, I’m not sure we can ban it.”