Disturbing videos of animal abuse circulated on Twitter over the last few weeks, sparking outrage and concern over the platform's moderation systems, according to NBC News.
One notorious video, which users said the platform directed them to via a suggested search term in the search bar, appeared to show a kitten being placed in a blender and killed.
Laura Clemens, a 46-year-old Londoner who first learned of the video when her 11-year-old son asked her about it two weeks ago, told the outlet that she then searched for "cat" on Twitter, and the search box returned a suggestion for "cat in a blender."
Clemens explained that when she clicked on the suggested term, a video of the kitten being killed instantly appeared on her screen; for users who have not manually disabled the platform's autoplay feature, it would begin playing immediately. NBC News said it was able to replicate Clemens' steps and find the video on Wednesday.
Clemens said she was grateful her child told her about the video rather than finding it on Twitter himself.
"I'm glad that my child has talked to me, but there must be lots of parents whose kids just look it up," she said.
The video's circulation and its appearance in Twitter's suggested searches are part of a broader trend of animal cruelty and other graphic videos spreading on the platform following Elon Musk's takeover as CEO.
Just last weekend, graphic videos of two violent events in Texas — a shooting at a shopping center and a car ramming into a group of migrants — spread on the app, with some users saying the footage appeared on their algorithmically driven "For You" pages.
The animal cruelty videos appear to predate those clips: some users, Clemens included, have been trying to get Twitter's and Musk's attention on the issue since early May. Clemens said she alerted Twitter's support account and its vice president of trust and safety, Ella Irwin, to the video on May 3, but neither responded to the tweet.
"Young children know this has been trending on your site. My little one hasn't seen it but knows about it. It should not be an autofill suggestion," she wrote.
The company likely dismantled the platform's built-in safeguards meant to prevent these search bar autocomplete issues, Yoel Roth, the app's former head of trust and safety, told NBC News. The "type-ahead search" system was constructed to keep illegal and dangerous content from becoming autocomplete terms.
"There is an extensive, well-built and maintained list of things that filtered type-ahead search, and a lot of it was constructed with wildcards and regular expressions," Roth said.
"Type-ahead search was really not easy to break. These are longstanding systems with multiple layers of redundancy," Roth added, referencing the several-step process combining automatic and human moderation to flag violent videos before they appear in searches that existed on the platform. "If it just stops working, it almost defies probability."
When NBC News approached the company for comment on Thursday, the outlet also found that searches for "dog" and "cat" still autocompleted to terms surfacing videos of animal abuse.
Twitter's press account responded with a poop emoji, which has reportedly been the company's standard reply to press inquiries for the last month.
As of Friday, suggested searches appeared to be turned off on the platform.