Trinity Mount Ministries

Wednesday, June 7, 2023

Meta vows to take action after report found Instagram’s algorithm promoted pedophilia content


 Steve Dent June 7, 2023, 6:33 am

Meta has set up an internal task force after reporters and researchers discovered its systems helped "connect and promote a vast network of accounts" devoted to underage-sex content, The Wall Street Journal has reported. Unlike forums and file transfer services, Instagram not only hosts such activities but promotes them via its algorithms. The company acknowledged enforcement problems and has taken actions including restricting its systems from recommending searches associated with sex abuse.

"Child exploitation is a horrific crime," Meta told the WSJ in a statement. "We’re continuously investigating ways to actively defend against this behavior."

Along with the task force, Meta told reporters that it is working to block child sexual abuse material (CSAM) networks and taking steps to change its systems. In the last two years, it has taken down 27 pedophile networks and is working to remove more. It has blocked thousands of related hashtags (some with millions of posts) and has taken action to prevent its systems from recommending CSAM-related search terms. It's also trying to stop its systems from connecting potential abusers with each other.

However, the report should be a wake-up call for Meta, the company's former security chief Alex Stamos told the WSJ. "That a team of three academics with limited access could find such a huge network should set off alarms at Meta," he said, noting that the company has far better tools than outside investigators to map CSAM networks. "I hope the company reinvests in human investigators."

Academics from Stanford's Internet Observatory and UMass's Rescue Lab were able to quickly find "large-scale communities promoting criminal sex abuse," according to the report. After creating test accounts and viewing a single such account, they were immediately hit with "suggested for you" recommendations of possible CSAM sellers and buyers, along with accounts linking to off-platform content sites. Following just a handful of those recommendations was enough to inundate the test accounts with sex-abuse content.

“Instagram is an onramp to places on the internet where there’s more explicit child sexual abuse,” said UMass Rescue Lab director Brian Levine. The Stanford group also found that CSAM content is "particularly severe" on the site. "The most important platform for these networks of buyers and sellers seems to be Instagram."

Meta said it actively seeks to remove such users, having taken down 490,000 accounts that violated child safety policies in January alone. Its internal statistics show that child exploitation appears in fewer than one in 10,000 posts, it added.

However, until queried by reporters, Instagram allowed users to search for terms that its own systems knew might be associated with CSAM. A pop-up screen warned users that "These results may contain images of child sexual abuse," which can cause "extreme harm" to children. It then let users choose to either "Get resources" or "See results anyway." The latter option has since been disabled, but Meta didn't respond when the WSJ asked why it was allowed in the first place.

Furthermore, attempts by users to report child-sex content were often ignored by Instagram's automated systems. And the platform's own efforts to exclude hashtags and terms were sometimes overridden by those same systems, which suggested that users try variations on the name. In testing, researchers found that viewing even one underage seller account caused the algorithm to recommend new ones. "Instagram’s suggestions were helping to rebuild the network that the platform’s own safety staff was in the middle of trying to dismantle," the report noted.

A Meta spokesperson said the company is currently building systems to prevent such recommendations, but Levine said the time to act is now. "Pull the emergency brake. Are the economic benefits worth the harms to these children?" Engadget has reached out to Meta for comment.


Sunday, June 4, 2023

Trinity Mount Ministries - CyberTipline - NCMEC - REPORT CHILD ABUSE! REPORT CSAM! CALL 1-800-843-5678

 Help Find Missing Children. Let's Put An End To Child Abuse And Exploitation... Care.


Overview

NCMEC’s CyberTipline is the nation’s centralized reporting system for the online exploitation of children. The public and electronic service providers can make reports of suspected online enticement of children for sexual acts, child sexual molestation, child sexual abuse material, child sex tourism, child sex trafficking, unsolicited obscene materials sent to a child, misleading domain names, and misleading words or digital images on the internet.

Every child deserves a safe childhood.

What Happens to Information in a CyberTip?

NCMEC staff review each tip and work to find a potential location for the incident reported so that it may be made available to the appropriate law-enforcement agency for possible investigation. We also use the information from our CyberTipline reports to help shape our prevention and safety messages.

Is Your Image Out There?

Get Support

One of the worst things about having an explicit image online is feeling like you’re facing everything alone. But you have people who care for you and want to help. Reach out to them!

A trusted adult can offer advice, help you report, and help you deal with other issues. It could be your mom, dad, an aunt, a school counselor, or anyone you trust and are comfortable talking to. You can also “self report” by making a report on your own to the CyberTipline.

Families of exploited children often feel alone in their struggle and overwhelmed by the issues affecting their lives. NCMEC provides assistance and support to victims and families such as crisis intervention and local counseling referrals to appropriate professionals. Additionally, NCMEC’s Team HOPE is a volunteer program that connects families to others who have experienced the crisis of a sexually exploited child.

Don't Give Up

Having a sexually exploitative image of yourself exposed online is a scary experience. It can make you feel vulnerable and isolated, but remember, others have been in the same situation as you – and they’ve overcome it. Learn the steps you can take to limit the spread of the content.

By the Numbers

Total Reports

In 2021, reports to the CyberTipline increased by 35% from 2020.

NCMEC alerted law enforcement to over 4,260 potential new child victims. 

Find more data in the CyberTipline Report.

More

Learn more about online exploitation and safety.

Coping with Child Sexual Abuse Material (CSAM) Exposure for Families

Production and Active Trading of Child Sexual Exploitation Images Depicting Identified Victims

Trends Identified in CyberTipline Sextortion Reports

The Online Enticement of Children: An In-Depth Analysis of CyberTipline Reports

How NCMEC is responding to the ever-changing threats to children online.
