The bishops’ letter noted the dangers of abuse, extortion, and blackmail online, including the coercion of sexual favors or money under threat of releasing sexual images.
“Legislation should ensure that social media platforms do not permit abuse by predators or undermine the rights of parents to protect their children from harm,” they said.
Researchers have sought to determine whether and to what extent popular social media sites help spread illegal pornography and CSAM.
Instagram, owned by Facebook’s parent company Meta, has many user accounts that seek to purchase sexual content depicting underage persons. Investigators and researchers with the Wall Street Journal, Stanford University, and the University of Massachusetts-Amherst analyzed these accounts and how Instagram treats them.
They found sexually explicit hashtags and pornographic accounts purporting to be run by minors themselves. Some Instagram accounts appear to let other users commission custom works of illegal pornography or arrange in-person meetings with children. Instagram's recommendation systems, which identify shared interests among users, appear to promote these accounts, the researchers and investigators found.
“Child exploitation is a horrific crime,” the company said, according to the Wall Street Journal. “We’re continuously investigating ways to actively defend against this behavior.”
Promoting underage sexual content violates both Meta policy and federal law. Meta said it has an internal task force dedicated to policing such content. In the past two years, it said, it has removed 27 networks of accounts distributing pedophilic material and blocked thousands of hashtags that sexualize children. The company is also working to prevent its recommendation systems from connecting adults who may have an interest in CSAM.
Alex Stamos, who was chief security officer at Meta through 2018 and is now head of the Stanford Internet Observatory, told the Wall Street Journal that a sustained effort is needed to combat the material.
“That a team of three academics with limited access could find such a huge network should set off alarms at Meta,” he said, voicing hope that the company reinvests in human investigators.
Other researchers at the Stanford Internet Observatory, analyzing 100,000 Twitter posts from March to May, reported that the platform appears to have failed to block dozens of known images of child pornography, despite the availability of screening software, hash databases, and other established practices for combating CSAM.