Publisher's Note: This post appears here courtesy of The Daily Wire. The author of this post is Brandon Drey.
Hashtags and keywords promoting the sale of child sex abuse material (CSAM) were blocked on Twitter last weekend following an investigative report.
NBC News originally reported that individuals on Twitter seeking to trade or sell CSAM had used several hashtags linked to Mega, a file-sharing service. Dozens of users allegedly published hundreds of tweets related to the hashtags every day, according to a review by the outlet.
Ella Irwin, Twitter's vice president of product trust and safety, told the outlet in an email on Friday, after the report was published, that the platform would conduct a "specific additional review" over the weekend.
"As you probably know the links you shared relate to a file sharing service broadly used for a wide variety of purposes and so that makes it much harder to find the specific illegal content being posted using the hashtags in question,"
Irwin told the outlet.
Irwin told the outlet the next day that her team had met over the weekend and banned the hashtags, which were already under company review.
"We were already reviewing doing this in the coming weeks, given that we have banned other hashtags used commonly for trafficking [CSAM] material already, however we made the decision to accelerate this action for these terms," Irwin said.
Since taking ownership of the company in October, CEO Elon Musk has claimed Twitter faced a barrage of problems relating to child sex abuse material under previous ownership.
Those issues included more than 500 accounts soliciting such material, which led approximately 30 major advertisers to pull or pause their advertising on Twitter.
An independent cybersecurity data analyst also identified several active accounts exploiting CSAM, which allegedly included videos of children and teens involved in sexual activity.
Irwin has spoken openly about partnering with organizations worldwide to combat illegal content and trafficking, saying such partnerships help the platform identify material it would otherwise take longer to find.
"We can do a lot with the information we're given by organizations," she said last month in a Twitter Space. "We can go out and then do massive investigations and take down a lot of things," adding that it's "much more effective when that communication channel is there."
Irwin added that she has seen Musk push for discussions with organizations focused on various abuse issues to help the platform improve.
Irwin told NBC that her team had been analyzing thousands of hashtags for a project scheduled for completion soon but made an exception to ban the hashtags promoting such material.
"If bad actors are successfully evading our detection using these specific terms and in spite of our detection mechanisms currently in place, then we would rather bias towards making it much harder to do this on our platform," Irwin said.
Mega's executive chairman, Stephen Hall, told NBC News in an email on Friday, before Twitter's trust and safety team blocked the hashtags, that the New Zealand-based encrypted service had a zero-tolerance policy toward CSAM.
"If a public link is reported as containing CSAM, we immediately disable the link, permanently close the user's account, and provide full details to the New Zealand authorities, and any relevant international authority,"
Hall said. In a follow-up email, he called Irwin's blocking of the Mega-related terms on Twitter "a rather blunt reaction to a complex situation."
Twitter's CSAM problem has been documented for more than a decade.
Twitter claimed last summer that the company aggressively fights online child sexual abuse; however, a Twitter Red Team report from an Adult Content Monetization project found that although the platform had invested in technologies to detect and manage the issue, that investment "has not" kept pace with the problem's growth.
The National Center for Missing and Exploited Children, which plays a crucial role after internet sites like Twitter remove such material, reported that 86,666 CSAM reports were made to the organization from Twitter last year. When CSAM is detected and removed, a report goes to the center, which contacts law enforcement for further action.
Musk said last year the social media platform would make addressing its alleged child sexual exploitation problem his number one priority.
Securities and Exchange Commission filings and internal records obtained by NBC News show that the platform's trust and safety department has less than half the number of employees it did at the end of 2021.
Irwin said last month that when she joined Twitter last June, before Musk acquired the platform, she was "shocked to find that there were such gaping holes in some of these really critical areas like, for example, child safety."
However, Bloomberg reported the trust and safety team underwent further cuts this month.
Amid alarming concerns over the internet's child sexual exploitation epidemic, Twitter watchdogs have called on Elon Musk to ban pornography on the platform until it can implement an age- and consent-verification system for those depicted in images and videos.
Officials from the National Center on Sexual Exploitation told The Daily Wire that since Musk laid off most of Twitter's trust and safety team and disbanded its advisory council, the remaining skeleton staff could leave the platform even less equipped to confront its child sexual abuse material problem.