Twitter Has Cut Its Team That Monitors Child Sexual Abuse


While Elon Musk has said that Twitter should remove child sexual abuse content, the number of people responsible for monitoring the site and taking down such content has decreased significantly since Musk became the company’s chief executive. Bloomberg reported last month that fewer than 10 people now have the job of tracking such content, down from 20 at the start of the year.

Even more concerning, only one person in the Asia-Pacific region remains responsible for removing child sexual abuse material from Twitter.

“The digital age has made child sexual exploitation more common and harder to combat. The Internet has made criminals more sophisticated in finding ways to hide from detection. It’s much easier to exploit children today than it was 20 years ago,” said Dr. Mellissa Withers, an associate professor of preventive medicine at the University of Southern California, where she directs the online master of public health program.

Multiple studies have found that the majority of teens spend at least four hours a day on electronic devices, and social media sites including Twitter, Instagram, and YouTube could provide the perfect opportunity for a predator to identify potential victims with little risk of being caught.

Withers said that victims may never meet their abusers or traffickers in person; instead, they are groomed through chat rooms, social media, and gaming platforms.

Catfished

She explained that children and teens can become victims of child sexual abuse material (CSAM) when they upload images and videos to online sharing platforms, often without realizing how easily their photos can be shared with others or used against them. Predators often use “catfishing,” pretending to be teenagers themselves to win the trust of potential victims.

One example is the Virginia sheriff’s deputy who posed online as a 17-year-old boy and asked a teenage girl in California for nude pictures, then drove across the country and murdered her grandparents and mother.

Sextortion

Other cases involve “sextortion,” in which predators manipulate victims into sending them nude photographs over time.

Withers said this eventually escalates into harassment and threats to share the images unless money is sent. Sextortion frequently targets children: one study found that 25 percent of victims were under 13 when they were threatened, and more than two-thirds were threatened before they turned 16 (Thorn, 2018).

Are Our Children Being Failed by Twitter?

Experts say it is deeply concerning that Twitter and other social media platforms have not done their part to eradicate the CSAM being spread on their services. The volume of content that must be reviewed is overwhelming, and even with help from outside agencies, it is not a job that one or two people can handle.

“Having an online child safety team is crucial for organizations operating on social media,” said Dr. Brian Gant, assistant professor of cybersecurity at Maryville University.

Gant noted that this is especially true for Twitter, a popular platform for sharing pornography, where an internal team is essential for determining what content is acceptable and which images constitute child exploitation.

Failing to act can be seen as giving predators an opening to strike.

Lois A. Ritter, an associate professor in the master of public health program at the University of Nevada, Reno, said social media platforms exacerbate child abuse when they tolerate users who condone pedophilia and other forms of exploitation, and when they make it easier for children to be groomed and controlled for exploitation.

Evidence that the number of child safety staffers has dropped is alarming.

Ritter said social media platforms have a responsibility to monitor the material posted to their sites in order to stop and disrupt these horrific acts of child abuse and to protect children from being victimized. Staff should monitor content and promptly follow up on every complaint. Too often, she said, child welfare is overlooked in favor of profit, and children will be hurt if this staffing shift becomes permanent.

Even with many people involved, however, it is not possible to keep track of all the content.

Withers said that while automated tools can be helpful, they should not replace human moderators, who are still needed to judge what does and does not constitute child sexual abuse material. These companies may need to be held to higher standards, she said, and could be considered responsible for creating an environment that allows child sexual abuse material to grow.

This type of content is not shared only on social media, and it existed well before the Internet age.

“We must also keep in mind that the United States is the biggest consumer and producer of child abuse content around the globe,” Withers said. “We must ask why this is happening and then consider what we can do to reduce the demand.”




