Elon Musk vowed to end child exploitation on Twitter, but the workforce is too thin

Elon Musk declared he could stop child exploitation on Twitter, the social media platform he spent $44 billion to buy.

Less than a month ago, he spoke about the issue and called it his top priority.

Since he took charge, however, there has been little evidence that Twitter is acting more aggressively on the problem.

According to conversations with four former employees, one current employee, internal business records, and outside advocates working toward the same goal, Musk hasn’t made a significant investment in removing child exploitation content from Twitter.

Arguments

Instead, Elon Musk has turned the discussion of online child safety into a sustained effort to malign Twitter’s former leaders.

He has also cast his ownership of the platform as part of a fight against the “woke mind virus,” his term for the far-left to center-left principles he opposes.

The shift came after he embraced far-right internet rhetoric that often features exaggerated accusations of child sex abuse.

“It is a crime that they refused to take action on child exploitation for years,” Musk tweeted on Friday.

His comment was in reaction to a letter of resignation from a member of Twitter’s Trust and Safety Council who focused on child abuse issues.

Jack Dorsey, a former CEO, responded, “This is false.”

New management

Under Musk’s management, Twitter said it suspended more accounts for child sex abuse content in November than in any other month.

The company attributed the suspensions to new partnerships with unnamed organizations and to new “detection and enforcement mechanisms.”

At the same time, the company’s ability to address child sex abuse online has been hindered by:

  • Layoffs
  • Mass firings
  • Resignations

Internal records

Internal records obtained by NBC News and CNBC show that just 25 of the roughly 1,600 workers still employed by Twitter hold positions related to “Trust and Safety,” although the company’s headcount remains in flux.

That total includes more than 100 people Musk has authorized to work across Twitter, Tesla, SpaceX, and The Boring Company, as well as numerous investors and advisers.

A former employee who focused on child safety said she was aware of a small team at Twitter still tackling the issue.

But most of the team’s engineers and product managers have already left.

The employee asked to remain anonymous because she feared retaliation.

By the end of 2021, Twitter employed more than 7,500 people.

According to former employees, some layoffs could have happened even if Musk hadn’t purchased the company.

Child safety groups

Twitter’s ties to outside groups that promote child safety have been cut back under the current management.

On Monday, the social media firm dissolved its Trust and Safety Council, which was made up of 12 groups and advised Twitter on its campaigns to increase public awareness of child sexual exploitation.

The National Center for Missing & Exploited Children (NCMEC), an organization the US government has tasked with monitoring reports of child sexual abuse content online, claims that not much has changed under Musk’s direction.

Gavin Portnoy, a spokesperson for the organization, addressed NCMEC’s consolidated system for reporting child sexual abuse material (CSAM):

“Despite the rhetoric and some of what we’ve seen people posting online, their CyberTipline numbers are almost identical to what they were prior to Musk coming on board.”

Portnoy also noted that Twitter was absent from the organization’s annual discussion with social media companies.

“The previous person was one of the folks who resigned,” he said.

According to Portnoy, Twitter declined when asked whether it wanted to send a proxy.

Reports

Twitter reported that 86,666 CSAM incidents were found on the platform in 2021, although Portnoy believes the actual number is likely higher.

“We’ve always felt that there should have been more reports coming out of Twitter, no matter how you cut it, and just given the sheer number of users that are there,” he said.

Twitter continues to be plagued by child sexual exploitation content, a problem that affects most social media sites.

Some advertisers left the platform earlier this year after learning that their advertisements frequently appeared next to harmful content.

Last year, a victim of child sex abuse and their mother sued the company, claiming it was negligent in its response to reports of a video of the child circulating on the site.


Content moderation

Content moderation requires detecting and removing child abuse content using automated detection technologies, internal expert teams, and outside contractors.

Under Twitter’s policies, that content includes:

“imagery and videos, referred to as child pornography, but also written solicitation and other material that promotes child sexual exploitation.”

According to people with knowledge of the situation and internal records, Twitter’s engineering staff has been reduced by more than half through layoffs, firings, and resignations, including several employees and leaders who worked on trust and safety features and improvements to the existing platform.

Ella Irwin, Twitter’s current head of trust and safety, said Musk also let go of contractors as the company shifted toward automation to meet its moderation needs.

“You tend to think more bodies equals more safety,” said Portnoy.

“So, I mean, that is disheartening.”

How many Twitter staff are still working on child safety issues remains unknown.

Reference:

Elon Musk says he can stop child exploitation on Twitter. So far, he’s axed jobs and pushed out watchdogs