WASHINGTON — When researchers at a nonprofit that studies social media wanted to understand the connection between YouTube videos and gun violence, they set up accounts on the platform that mimicked the behavior of typical boys living in the U.S.
They simulated two 9-year-olds who both liked video games. The accounts were identical, except that one clicked on the videos recommended by YouTube, and the other ignored the platform’s suggestions.
Flooded With Graphic Videos
The account that clicked on YouTube’s suggestions was soon flooded with graphic videos about school shootings, tactical gun training videos and how-to instructions on making firearms fully automatic. One video featured an elementary school-age girl wielding a handgun; another showed a shooter using a .50 caliber gun to fire on a dummy head filled with lifelike blood and brains. Many of the videos violate YouTube’s own policies against violent or gory content.
The findings show that despite YouTube’s rules and content moderation efforts, the platform is failing to stop the spread of frightening videos that could traumatize vulnerable children — or send them down dark roads of extremism and violence.
Algorithms Leading to Gun-Related Content
“Video games are one of the most popular activities for kids. You can play a game like ‘Call of Duty’ without ending up at a gun shop — but YouTube is taking them there,” said Katie Paul, director of the Tech Transparency Project, the research group that published its findings about YouTube on Tuesday. “It’s not the video games, it’s not the kids. It’s the algorithms.”
The account that followed YouTube’s suggested videos received 382 different firearms-related videos in a single month, or about 12 per day. The account that ignored YouTube’s recommendations still received some gun-related videos, but only 34 in total.
The researchers also created accounts mimicking 14-year-old boys; those accounts received similar levels of gun- and violence-related content.
How a Switch Works on a Glock
One of the videos recommended to the accounts was titled “How a Switch Works on a Glock (Educational Purposes Only).” YouTube removed the video after determining it violated its rules, but an almost identical video with a slightly altered name appeared two weeks later. That video remains available.
A spokeswoman for YouTube defended the platform’s protections for children, noting that it requires users under 17 to get a parent’s permission before using the site and that accounts for users younger than 13 are linked to a parental account. “We offer a number of options for younger viewers … which are designed to create a safer experience for tweens and teens,” the company wrote in an emailed statement.
Links Between Social Media, Radicalization, Real-World Violence
Along with TikTok, YouTube is one of the most popular sites for children and teens. Both platforms have been criticized in the past for hosting, and in some cases promoting, videos that encourage gun violence, eating disorders and self-harm. Critics have also pointed to the links between social media, radicalization and real-world violence.
The perpetrators behind many recent mass shootings have used social media and video streaming platforms to glorify violence or even livestream their attacks. In posts on YouTube, the shooter behind the 2018 attack on a school in Parkland, Florida, that killed 17 people wrote “I wanna kill people,” “I’m going to be a professional school shooter” and “I have no problem shooting a girl in the chest.”
The neo-Nazi gunman who killed eight people earlier this month at a Dallas-area shopping center also had a YouTube account that included videos about assembling rifles, the serial killer Jeffrey Dahmer and a clip from a school shooting scene in a television show.
YouTube has already removed some of the videos identified by researchers at the Tech Transparency Project, but in other instances the content remains available. Many big tech companies rely on automated systems to flag and remove content that violates their rules, but Paul said the findings from the Project’s report show that greater investment in content moderation is needed.
Tighter Age Restrictions on Firearms-Related Content
In the absence of federal regulation, social media companies must do more to enforce their own rules, said Justin Wagner, director of investigations at Everytown for Gun Safety, a leading gun control advocacy organization. Wagner’s group also said the Tech Transparency Project’s report shows the need for tighter age restrictions on firearms-related content.
“Children who aren’t old enough to buy a gun shouldn’t be able to turn to YouTube to learn how to build a firearm, modify it to make it deadlier, or commit atrocities,” Wagner said in response to the Tech Transparency Project’s report.
Similar concerns have been raised about TikTok after earlier reports showed the platform was recommending harmful content to teens.
TikTok has defended its site and its policies, which prohibit users younger than 13. Its rules also prohibit videos that encourage harmful behavior; users who search for content about topics including eating disorders automatically receive a prompt offering mental health resources.