Trinity Mount Ministries

Saturday, March 1, 2025

Intelligence Notification: Violent online communities threaten children


Europol today issues an Intelligence Notification calling attention to the rise of violent online communities dedicated to the serious harm of children. This strategic document focuses on online grooming cult groups that normalise violence and corrupt minors, advocate for the collapse of modern society through acts of terror, chaos and violence, and spread ideologies that inspire mass shootings, bombings and other crimes. These communities recruit offenders and victims on a global scale and function as cults formed around charismatic leaders who use manipulation and deception to lure and control their victims. The communities’ hierarchy is based on the amount of content shared, with the most prolific contributors earning higher rankings. Community members share extremely violent content, ranging from gore and animal cruelty to child sexual exploitation material and depictions of murder.


Europol’s Executive Director Catherine De Bolle said: 


"Today, digital platforms enable communications globally; violent extremist online communities also leverage this opportunity. Violent perpetrators spread harmful ideologies, often targeting our youth. These networks radicalise minds in the shadows, inciting them to bring violence into the real world. Awareness is our first line of defence. Families, educators and communities must stay vigilant and equip young people with critical thinking skills to resist online manipulation. International cooperation is also imperative – by sharing intelligence and holding perpetrators accountable, we can combat these dangerous communities and safeguard future generations from the grip of extreme violence and crime."

 

Vulnerable minors targeted through gaming platforms and self-help communities


The perpetrators leverage online gaming platforms, streaming services and social media platforms to identify and lure their victims. The members of these groups target vulnerable young people, particularly minors between 8 and 17 years old – especially those who are LGBTQ+, members of racial minorities, or struggling with mental health issues. In some cases, perpetrators infiltrate online self-help or support communities dedicated to individuals affected by these issues.


These violent criminal actors employ different tactics to lure and manipulate their victims into producing explicit sexual content, engaging in self-harm, harming others and even carrying out murders. In the beginning, perpetrators often use ‘love bombing’ techniques – extreme expressions of care, kindness and understanding intended to gain the minors’ trust – while collecting personal information about their victims. The criminal actors use this information in the exploitation phase of the grooming, when they force the vulnerable minors into producing sexual content and committing acts of violence. The perpetrators then blackmail the victims into even more harmful acts by threatening to share the victims’ explicit content with their families, friends or online communities.


Once caught in the predators’ net, minors become even more vulnerable – early detection of these criminal activities is crucial.


Beware of these behaviours in your children:


  • Secrecy about online activities
  • Withdrawal and isolation
  • Emotional distress
  • Interest in harmful content
  • Changes in language or symbols used
  • Concealing physical signs of harm

 

Do not ignore these signs in your children’s online behaviour:


  • Unusual activity on platforms
  • Interaction with unknown contacts
  • Encrypted communications
  • Exposure to disturbing content

The European Multidisciplinary Platform Against Criminal Threats (EMPACT) tackles the most important threats posed by organised and serious international crime affecting the EU. EMPACT strengthens intelligence, strategic and operational cooperation between national authorities, EU institutions and bodies, and international partners. It runs in four-year cycles focusing on common EU crime priorities.


Thursday, February 27, 2025

MAYER | BROWN - Children’s Online Privacy: Recent Actions by the States and the FTC


February 25, 2025
Authors:

Amber C. Thomson,
Howard W. Waltzman,
Kathryn Allen,
Megan P. Von Borstel

At A Glance

As the digital world becomes an integral part of children's lives, state legislatures are placing greater emphasis on regulating how companies handle children’s personal information. This Legal Update explores the recent developments in state and federal children’s privacy legislation, examining how states are shaping the future of online safety for minors and shedding light on amendments to the federal Children’s Online Privacy Protection Act.

As social media companies and digital services providers increasingly cater to younger audiences, state legislatures are placing greater emphasis on regulating how companies handle children’s personal information. This Legal Update explores the recent developments in state and federal children’s privacy legislation, examining how states are shaping the future of online safety for minors and shedding light on amendments to the federal Children’s Online Privacy Protection Act (“COPPA”).

I. US State Developments
Social Media Legislation

Several states, including California, Connecticut, Florida, Georgia, Louisiana, New York, Tennessee, and Utah, have passed legislation focused on regulating the collection, use, and disclosure of children’s data in connection with social media use. Below is a brief summary of notable requirements and trends across each state law.

California. On September 20, 2024, California Governor Newsom signed the Protecting Our Kids from Social Media Addiction Act. The law prohibits companies from collecting data on children under 18 without parental consent and from sending notifications to minors during school hours or late at night. The Ninth Circuit has temporarily blocked the law until April 2025, when the court will examine whether it infringes on free speech rights.

Connecticut. Effective October 1, 2024, Connecticut’s law prohibits features designed to significantly increase a minor’s use of an online service (such as endless scrolling), unsolicited direct messaging from adults to minors, and the collection of geolocation data without opt-in consent.

Florida. Effective January 1, 2025, Florida’s Social Media Safety Act requires social media companies to verify the age of users and terminate accounts for children under 14 years old.

Georgia. Effective July 1, 2025, the Protecting Georgia’s Children on Social Media Act will require platforms to verify users’ ages and obtain parental consent for users under 16. The law will also require schools to adopt policies that restrict social media access on school devices. 

Louisiana. Effective July 1, 2024, the Louisiana Secure Online Child Interaction and Age Limitation Act requires platforms to verify users’ ages and obtain parental consent for users under 16 to create accounts. The law also bans users under 16 from direct messaging unknown adults and restricts the collection of unnecessary personal information.

New York. On June 21, 2024, New York Governor Kathy Hochul signed the Stop Addictive Feeds Exploitation (“SAFE Kids”) Act. The SAFE Kids Act requires platforms to obtain verifiable parental consent before providing addictive feeds to users under 18. The law also bans platforms from sending notifications to children between 12:00 a.m. and 6:00 a.m. and prohibits them from degrading the quality, or increasing the price, of a product or service because an addictive feed cannot be provided to a minor.

Tennessee. Effective January 1, 2025, the Tennessee Protecting Children from Social Media Act requires social media companies to verify express parental consent for users under 18. It also gives parents the ability to monitor their children’s privacy settings, set time restrictions, and schedule breaks in account access.

Utah. Passed in March 2023, and amended in 2024, the Utah Social Media Regulation Act mandates that social media platforms require parental consent for minors to use their services. Unless required to comply with state or federal law, social media platforms are prohibited from collecting data based on the activity of children and may not serve targeted advertising or algorithmic recommendations of content to minors. Enforcement of Utah’s law is also currently blocked by litigation.

This year, children’s privacy bills related to social media regulation continue to be introduced in other state legislatures. For instance, Utah’s App Store Accountability Act recently passed the State Senate and would require app store providers to verify users’ ages. South Carolina’s Social Media Regulation Act would require social media companies to make commercially reasonable efforts to verify the age of South Carolina account holders and require parental consent for users under the age of 18 to have an account. Similar children’s privacy bills have also been introduced in Alabama (HB 276), Arizona (HB 2861), Arkansas (HB 1082 and HB 1083), Colorado (SB 86), Connecticut (SB 1295), Iowa (HF 278), New York (S 4600 and S 4609), and Tennessee (SB 811 and HB 825).

Age-Appropriate Design Codes

Last year, multiple states enacted laws requiring age-appropriate design codes to improve online privacy protections for children. The success of these laws has varied.

California. The California Age-Appropriate Design Code Act (the “Act”) mandates that online services likely to be accessed by children under 18 prioritize their well-being and privacy. The Act requires businesses to assess and mitigate risks from harmful content and design features that may exploit children. Initially set to take effect on July 1, 2024, the Act is currently subject to a partial injunction by the Ninth Circuit Court of Appeals. In August 2024, the Ninth Circuit upheld a preliminary injunction of the Act’s data protection impact assessment provisions but lifted the injunction on provisions restricting the collection, use, and sale of children’s data and geolocation data.

Connecticut. As of October 1, 2024, Connecticut’s amended Consumer Data Privacy Act includes provisions governing the collection of data on children under the age of 18. The law also requires companies to complete a data protection impact assessment for each product likely to be accessed by children. Additionally, companies must exercise “reasonable care to avoid any heightened risk of harm to minors” caused by their products or services and must delete minors’ accounts and data upon request.

Maryland. Effective October 1, 2024, the Maryland Kids Code requires social media platforms to implement default privacy settings for children, prohibits collecting minors’ precise locations, and requires a data protection impact assessment for products likely to be accessed by children.

Illinois, South Carolina, and Vermont. Illinois, South Carolina, and Vermont have each introduced bills requiring age-appropriate design codes in their 2025-2026 legislative sessions.

Harmful Content Age Verification Legislation

States are increasingly enhancing online privacy protections for children through “harmful content age verification” laws. These laws require companies to implement reasonable age verification measures before granting children access to potentially harmful content (such as pornography, violence, or other mature themes) or face liability for failing to do so. As of January 2025, 19 states have passed laws requiring age verification to access potentially harmful content: Alabama, Arkansas, Florida, Georgia, Idaho, Indiana, Kansas, Kentucky, Louisiana, Mississippi, Montana, Nebraska, North Carolina, Oklahoma, South Carolina, Tennessee, Texas, Utah, and Virginia.

On January 15, 2025, Texas Attorney General Ken Paxton defended Texas’ law (HB 1181) before the Supreme Court. The case centers on whether the law, which requires that websites with harmful content verify users’ ages to prevent minors from accessing such content, infringes on the First Amendment. The Court has not yet issued its opinion on the matter.

Children’s Data Protection Legislation

States’ privacy measures for children extend beyond social media regulation. For example, Texas passed the Securing Children Online through Parental Empowerment (SCOPE) Act last year, which applies to all digital service providers. Effective September 1, 2024, the SCOPE Act prohibits digital service providers from sharing, disclosing, or selling a minor’s personal identifying information without parental consent. It also requires companies to provide parents with tools to manage and control the privacy settings on their children’s accounts. These protections extend to how minors interact with AI products.

Similarly, the New York Child Data Protection Act (CDPA) will prohibit websites, mobile applications, and other online operators from collecting, using, disclosing, or selling personal information of children under the age of 18 unless:

  • For children 12 years or younger, such processing is permitted by COPPA; or
  • For children 13 years or older, “informed consent” is obtained or such processing is strictly necessary for certain specified activities. The request for informed consent must be made clearly and conspicuously.

Companies will be subject to the CDPA if they have both: (a) actual knowledge that data is from a minor user; and (b) the website, online service, online application, mobile application, or device is “primarily directed to minors.” The CDPA comes into effect on June 20, 2025.

Other states, including New Hampshire and New Jersey, have passed COPPA-style laws that impose additional restrictions on the processing of minors’ data for targeted advertising. Similarly, Maryland’s Online Data Privacy Act prohibits the sale of personal data, or its processing for targeted advertising, if the business knew or should have known the consumer is under 18.

Virginia amended its general consumer privacy law to address children’s privacy protections. The amendment to the Consumer Data Protection Act, effective January 1, 2025, requires parental consent for processing personal information of a known child1 under 13 and requires data protection assessments for online services directed to known children. Similarly, Colorado amended its privacy law to strengthen protections for minors’ data. Companies are prohibited from processing minors’ data for targeted advertising and must exercise reasonable care to avoid any heightened risk of harm to minors. The Colorado amendment takes effect on October 1, 2025.

Pending Legislation

California is leading the way in enacting legislation to protect children from the risks associated with Artificial Intelligence (AI). On February 20, 2025, the California legislature introduced AB 1064, known as the Leading Ethical Development of AI (LEAD) Act. Among its provisions, the LEAD Act would require parental consent before using a child's personal information to train an AI model and mandate that developers conduct risk-level assessments to classify AI systems based on their potential harm to children. It would also prohibit systems involving facial recognition, emotion detection, and social scoring. Additionally, the LEAD Act would establish an advisory body, the LEAD for Kids Standards Board, to oversee AI technologies used by children.

II. US Federal Developments

COPPA aims to protect children’s privacy online and imposes various requirements on online content providers. On January 16, 2025, the FTC finalized updates to COPPA, which originally took effect in 2000 and had not been revised since 2013. The new changes will become effective 60 days after publication in the Federal Register, with a compliance date set for one year after publication. The updates include several revised definitions, new retention requirements, and expanded consent requirements. Additionally, there will be increased transparency regarding compliance with the COPPA Safe Harbor Programs.

The revisions to COPPA were approved by a unanimous 5-0 vote and include updates addressing new technology and data collection practices, such as:

Clarifying definitions to assist companies navigating compliance “gray areas.” The amendments introduce a new definition of “mixed audience website or online service,” which covers cases where websites might fall under COPPA’s scope. A “mixed audience website or online service” is a website or online service that is directed to children (as further described in COPPA), but which:

a) does not target children as its primary audience; and
b) does not collect personal information from any visitor (other than for certain limited purposes outlined in the statute) before collecting age information or using technology to determine whether the visitor is a child.

Further, to qualify as a mixed audience website or online service, any collection of age information or other means of determining whether a visitor is a child must be done in a neutral manner without defaulting to a set age or encouraging visitors to falsify age information.

Accounting for new types of data collection by updating the definition of “personal information” to include “a biometric identifier,” which “can be used for the automated or semi-automated recognition of an individual, such as fingerprints; handprints; retina patterns; iris patterns; genetic data, including a DNA sequence; voiceprints; gait patterns; facial templates; or faceprints.”

Expanding on ways for parents to give their consent.2

Operators may use a text message coupled with additional steps to ensure the person providing the consent is the parent for use cases where personal information is not “disclosed” (as defined in COPPA). These additional steps include sending a confirmatory text message to the parent after receiving consent or obtaining a postal address or telephone number from the parent and confirming consent by letter or telephone call. Operators using this method must notify parents that they can revoke any consent given by responding to the initial text message.

There have been other attempts to pass federal legislation on children’s privacy rights and online safety in recent years, including the Kids Online Safety Act, which was introduced in 2022 and passed the Senate (packaged together with an update to COPPA) but did not pass in the last Congress. More recently, on February 19, 2025, the Senate Judiciary Committee held a hearing on children’s online safety and efforts to strengthen safeguards for children.

III. Enforcement

Children’s privacy is also a subject of enforcement scrutiny by state attorneys general and the FTC. For example, the Texas Attorney General has launched investigations into several technology companies regarding their handling of minors’ data and potential violations of the SCOPE Act. In his press release about the investigation, Attorney General Ken Paxton warned, “[t]echnology companies are on notice that [the Texas Attorney General’s] office is vigorously enforcing Texas’s strong data privacy laws. These investigations are a critical step toward ensuring that social media and AI companies comply with our laws designed to protect children from exploitation and harm.” 3

The FTC has been actively enforcing COPPA violations against website operators. From January 2023 to January 2025, the FTC published six enforcement actions related to COPPA investigations on its website. Earlier this year, the FTC settled with an application owner for $20 million for allowing children under the age of 16 to make in-app purchases without parental consent and deceiving children about the costs of such purchases.

IV.  Key Takeaways and Predictions

States are moving to enhance parental controls around children’s privacy. Social media legislation across various states is helping parents maintain greater control over their children’s privacy by requiring companies to obtain parental consent and giving parents the ability to set time restrictions and monitor their children’s account use. To stay aligned with regulatory trends, companies should develop and provide tools that enable parents to manage their children’s online experiences.

States will likely begin regulating the threats AI chatbots pose to young kids over the coming months. AI chatbots are increasingly central to discussions about children’s safety. For example, one technology company’s chatbot is currently facing litigation from a mother alleging that the platform’s AI chatbot encouraged her son to commit suicide.  Another lawsuit claims that the same company’s chatbot service suggested a child should kill his parents over screen time limits. State legislatures may expand their regulatory scrutiny to address threats to children posed by AI chatbots.

Businesses may face steep penalties and injunctions for violations. Several of these laws grant state attorneys general the authority to impose civil penalties for violations. For example, the New York Attorney General can impose civil penalties of up to $5,000 per violation, issue injunctive relief, and obtain disgorgement of any profits gained from violating the SAFE Kids Act. Companies violating Florida’s social media law may face fines of up to $50,000 per violation. Companies should implement robust age verification processes to accurately verify users’ ages and obtain necessary parental consent to avoid potential risk and enforcement scrutiny.

COPPA was updated for the first time in 12 years. The amendments to COPPA reflect technological advancements made since the law was last revised in 2013. While some of these updates introduce additional compliance requirements for website operators, others clarify “gray areas” that the previous version of the law did not address.
 
1 The Virginia Consumer Data Protection Act does not define a “known child” but defines a child as any natural person younger than 13 years of age. § 59.1-575.

2 COPPA Final Rule at § 312.2.



Saturday, February 22, 2025

CyberTipline - NCMEC - Trinity Mount Ministries - REPORT CHILD ABUSE! REPORT CSAM! 1-800-843-5678


Overview

NCMEC’s CyberTipline is the nation’s centralized reporting system for the online exploitation of children. The public and electronic service providers can make reports of suspected online enticement of children for sexual acts, child sexual molestation, child sexual abuse material, child sex tourism, child sex trafficking, unsolicited obscene materials sent to a child, misleading domain names, and misleading words or digital images on the internet.

Every child deserves a safe childhood.

What Happens to Information in a CyberTip?

NCMEC staff review each tip and work to find a potential location for the incident reported so that it may be made available to the appropriate law-enforcement agency for possible investigation. We also use the information from our CyberTipline reports to help shape our prevention and safety messages.

Is Your Image Out There?

Get Support

One of the worst things about having an explicit image online is feeling like you’re facing everything alone. But you have people who care for you and want to help. Reach out to them!

A trusted adult can offer advice, help you report, and help you deal with other issues. It could be your mom, dad, an aunt, a school counselor, or anyone you trust and are comfortable talking to. You can also “self report” by making a report on your own to the CyberTipline.

Families of exploited children often feel alone in their struggle and overwhelmed by the issues affecting their lives. NCMEC provides assistance and support to victims and families such as crisis intervention and local counseling referrals to appropriate professionals. Additionally, NCMEC’s Team HOPE is a volunteer program that connects families to others who have experienced the crisis of a sexually exploited child.

Don't Give Up

Having a sexually exploitative image of yourself exposed online is a scary experience. It can make you feel vulnerable and isolated, but remember, others have been in the same situation as you – and they’ve overcome it. Learn the steps you can take to limit the spread of the content.

By the Numbers

In 2023, reports made to the CyberTipline rose more than 12% from the previous year, surpassing 36.2 million reports.

There were 105,653,162 data files reported to the CyberTipline in 2023.

Reports of online enticement increased by more than 300% from 44,155 in 2021 to 186,819 in 2023. 

Find more data in the CyberTipline Report.


More

Learn more about online exploitation and safety.

Coping with Child Sexual Abuse (CSAM) Exposure For Families

Production and Active Trading of Child Sexual Exploitation Images Depicting Identified Victims

Trends Identified in CyberTipline Sextortion Reports

The Online Enticement of Children: An In-Depth Analysis of CyberTipline Reports

How NCMEC is responding to the ever-changing threats to children online.




Thursday, February 20, 2025

Police Arrest Santa Clara Teacher For Having An Inappropriate Relationship With A Student



David Alexander

POSTED 11:00 AM, February 19, 2025 | UPDATED AT 09:33 AM, February 20, 2025

Los Gatos-Monte Sereno police have arrested Darrin Garcia, a former teacher and current athletic director at SCUSD, for allegedly having sex with a minor.

A former Santa Clara High School teacher and “well-known” coach stands accused of having sex with a student.

Darrin Garcia, 53, has been an athletic director in the Santa Clara Unified School District (SCUSD) since 2022, according to SCUSD. Prior to that role, Garcia was a gym and health teacher at Santa Clara High School, where he also coached track.

He has also been a sports mentor across the county.

On Feb. 12, Los Gatos-Monte Sereno police nabbed Garcia at Kathleen McDonald High School in San Jose where he works, charging him with statutory rape.

“These are extremely serious charges. The safety and security of our students are our top priorities,” SCUSD officials wrote in a statement. “Our priorities of student safety and well-being will continue to guide our actions.”

Police executed the warrant after they received a report alleging that Garcia had a sexual relationship with a student from 2021 to 2023 while teaching at Santa Clara High School, according to police.

In an email exchange, Sgt. Katherine Mann, with Los Gatos-Monte Sereno police, wrote that Garcia no longer lives in Los Gatos, where he allegedly had sex with the girl.

The investigation continues, Mann wrote, with detectives scouring Garcia’s electronic devices and conducting follow-up. She wrote that she does not know whether there are more victims.

Garcia is a father of two, according to a 2016 article in The Roar, the Santa Clara High School student news website. At the time of the article, his son was 11 and his daughter was 13, which would put them in their early twenties today. He has also taught at Lynbrook High School in San Jose and Saratoga High School in Saratoga, according to the article.

SCUSD has placed Garcia on administrative leave, according to the school district’s statement. The school district conducts background checks with the Department of Justice and FBI prior to hiring employees and gets “timely notifications” if its employees are charged with a crime.

The school district is conducting its own investigation but due to employee privacy concerns, is limited in how much information it can release, according to the statement.

While school officials acknowledge that the news is “shocking and can bring up a range of emotions,” the district does not plan to “communicate directly with students” about the arrest.

However, Jennifer Dericco, SCUSD’s public information officer, wrote in an email that school officials met with students Feb. 14 to “share resources, including our school’s wellness team.”

“Working together, we can maintain an environment where all students can remain focused on learning,” school officials wrote in the statement.

How much jail time Garcia will face if convicted will depend on how the Santa Clara County District Attorney decides to prosecute the case, Mann wrote.

The rape charge, California Penal Code Section 261.5, could be either a felony or a misdemeanor. If convicted of a misdemeanor, according to the code, Garcia would face a year in county jail. If convicted of a felony, he could face up to four years in county jail.

A second charge, “bigamy, incest and crimes against nature,” is a felony punishable by “imprisonment in the state prison for the term prescribed,” according to the California Penal Code.

Because Garcia is older than 21, the charge’s severity depends on whether the victim was 16 years old or younger at the time of the crime, information that police and the district did not provide.

However, given that the alleged relationship began in 2021 and continued until 2023, it is unlikely the girl was 17 at the start: a victim who was 17 in 2021 would have been about 19 in 2023, making it unlikely she was still in high school.

The law does not ordinarily require statutory rapists to register as sex offenders, according to California Penal Code.

Police are asking anyone with information about this case to contact Detective Riley Frizzell at (408) 827-3558 or police@losgatosca.gov.

Garcia is out on bail.

Contact David Alexander at d.todd.alexander@gmail.com

The Silicon Valley Voice


Wednesday, February 19, 2025

These young men were tricked into sending nude photos, then blackmailed: The nightmare of sextortion

 


Rachel Hale
USA TODAY

It was around 10 p.m. on a Friday night in Indiana when one young man began messaging with a pretty girl from Indianapolis on a dating app. Lying in bed feeling lonely and bored, he was exhilarated when she suggested they exchange nude photos.

Minutes later, he started violently shaking after the conversation took a turn. The woman was really a cybercriminal in Nigeria – and threatened to expose the nude photographs to his family and friends if he didn’t pay $1,000. The scammer had located his Facebook profile and compiled a photo collage of their sexts, nudes, a portrait from his college graduation and a screenshot of his full name and phone number. 

He caved to the threats and sent $300, but a month later his fears became reality. A childhood friend told him that she had received the nude photos in her Facebook spam inbox.

“I just felt my blood get hot, and my heart went down to the center of the earth,” says the 24-year-old, who requested that his name be withheld, citing concerns that the cybercriminals may track him down again and further extort him. “I can’t even begin to describe how embarrassing and humiliating it was.”

He fell victim to a growing crime in the United States: financial sextortion, a form of blackmail where predators persuade people to send explicit images or videos, then threaten to release the content unless the person sends a sum of money. In some cases, the crime can happen even if the participant doesn't send nude photos − the criminals use artificial intelligence to create highly realistic images. The most common victims are young men, particularly teenage boys ages 13 to 17.

Sextortion can lead to mental health problems and, in extreme cases, suicide. It has been connected to at least 30 deaths of teenage boys by suicide since 2021, according to a tally of private cases and the latest FBI numbers from cybersecurity experts.

More than half a dozen young men detailed their experiences to USA TODAY and recounted the shame, embarrassment and fear that kept them from telling someone they were being blackmailed or reporting it to the police.

Financial sextortion has exploded since the pandemic

Financial sextortion is the fastest-growing cybercrime targeting children in America, according to a report from the Network Contagion Research Institute. It probably has been around for decades, but in years past people didn't have the terminology or resources to report it in large numbers, says Lauren Coffren, executive director of the Exploited Children Division at the National Center for Missing and Exploited Children (NCMEC). 

In the years since the pandemic, reports of the blackmail surged − kids were online more, cybercriminals became more effective, and their operations grew in scale and organization.

In 2022, the FBI issued a public safety alert about "an explosion" of sextortion schemes that targeted more than 3,000 minors that year. From 2021 to 2023, tips received by NCMEC’s CyberTipline increased by more than 300%. The recently tabulated 2024 numbers reached an all-time high, the organization says.

That increase, Coffren says, is because cybercriminals have begun exploiting young people across the globe using the same scripts with each interaction.

One 17-year-old victim, who traced his blackmailer to Nigeria, says it’s “really frustrating” to navigate prosecution options. Another teen, whose predator was based in the Philippines, described the cyber abuse he experienced as “torture.”

“Even now, my blackmailer sometimes tries to contact me, but nothing has been shared because he would lose his leverage,” the second teen says.

The increased prevalence of the crime is also reflected by a surge in victims looking for support. Searches for “sextortion” on Google have increased fivefold over the last 10 years. One of the largest financial sextortion support forums, r/Sextortion on Reddit, has grown to 33,000 members since its creation in 2020. 

Of forum posts that included gender information, 98% were male, according to a 2022 study of the thread. The thread’s main moderator, u/the_orig_odd_couple, says that in the past two years, there has been a noticeable increase in posts from victims who are under 18. 

Because predators are often abroad, these crimes typically land with the FBI. The agency declined to comment.

Online sexual exploitation can have long-term mental health effects

Teens are relying more on online friends than ever and often feel comfortable disclosing information to an online friend that they may not tell a physical one, said Melissa Stroebel, vice president of research and insights at Thorn, a technology nonprofit organization that creates products to shield children from sexual abuse. In 2023, more than 1 in 3 minors reported having an online sexual interaction.

Roughly 25% of sextortion is financial. Ninety percent of financial sextortion victims are young men ages 13 to 17, according to the NCMEC. Boys have a lower likelihood of disclosing victimization regarding sexual abuse but have higher risk-taking tendencies when it comes to sexual and romantic exploration in their teens, creating a perfect opportunity for blackmailers. Boys also aren’t featured as often in sexual abuse prevention conversations and materials, according to Stroebel.

“It’s really distinctly and disproportionately targeting that community,” Stroebel says. “Criminals are banking on the fact that they might have more success here.”

Because the human brain doesn’t finish developing until about age 25, young people respond to stress and decision-making differently from adults, which affects their ability to navigate these scams.

“Fear can compound and become very overwhelming in their brains, and then things start to feel bigger and bigger and bigger,” said Dr. Katie Hurley, senior director of clinical advising for the Jed Foundation. “Because often the threats are not just to themselves, but to other people they know, it feels like an intense amount of responsibility, and that's where they get frozen.”

Early experiences of abuse have long-term effects on their ability to build healthy relationships and establish trust with significant others later in life. Victims may develop anxiety, depression and post-traumatic stress disorder and are more prone to future experiences of online abuse, according to Laura Palumbo, communications director for the National Sexual Violence Resource Center.

“Emotionally, the worst thing is not even the images themselves, it's the feeling of knowing that someone is after me with very, very bad intentions,” says the 17-year-old male victim.

Another male, who was just 13 years old when he was sextorted, says it took five years for the guilt and fear to subside.

'Hey I have ur nudes'

The exploitation typically starts with what seems like an innocent message through Instagram or Snapchat: “Hey there! I found your page through suggested friends.” The predator will direct the conversation to a sexual nature, and in some cases, send unsolicited nudes − often with the pressure or ask that the teen exchange their own.

Then the blackmail starts. Scammers ask for an amount, most commonly $500, to delete the images − or risk them being sent to the victim’s friends and family. To heighten these feelings of intimidation, criminals often create a countdown of how long victims have to send money, spamming teens with dozens of threats over the course of minutes or hours. The 17-year-old who spoke to USA TODAY says his abuser threatened to share the photos with child porn websites and live-camera porn sites; other blackmailers falsely told their victims they would become registered sex offenders. The act of grooming minor victims in order to receive nudes is illegal in the U.S.

Dozens of scripts obtained by USA TODAY outlined how extortionists create a sense of isolation and manipulate young victims.

“Hey I have ur nudes and everything needed to ruin your life, I have screenshot all ur followers and tags and those that comment on ur post. If you don’t cooperate with me, I won’t mind going through the stress of sending it to all of them,” one script read.

In reality, the account sending these messages is often a team of three to four foreign cybercriminals who simultaneously contact the victim, handle a money transfer, and conduct open-source research on the victim to find their family members, contacts and school.

Financial sextortion has often been traced to scammers in West African countries, including Nigeria and Ivory Coast, and Southeast Asian countries like the Philippines, according to the FBI.

For teens on social media, it should raise alarms if the person they receive a message from doesn't share mutual friends and if a profile’s photos look unusual, blurry or highly edited. In other cases, the Instagram accounts are highly believable, having been hacked from a real teenage girl or curated with photos over months.

A 14-year-old who spoke to USA TODAY said he initially had suspicions about the account that sextorted him − the user was posing as a 15-year-old girl based in California but followed only 26 people and didn’t have any mutual followers.

Because scammers may be non-native English speakers, poor grammar or unusual vernacular can also be a tip-off of someone taking on a fake identity. 

Teens should also be alarmed if a new follower immediately guides a conversation to a romantic or sexual nature and should be wary of someone asking to move the conversation off social media onto a private text platform. Predators typically send unsolicited nudes within minutes, according to Coffren.

“This is a romance scam on steroids,” says cyber intelligence analyst Paul Raffile. “They are, within an hour, convincing these kids that they are trustworthy, that they can do something that potentially compromises themselves.”

Scammers have also abused the rise of generative artificial intelligence tools to create highly realistic deepfake images and videos. Roughly one of 10 reports Thorn reviewed involved artificially generated content.


Here’s what to do if you or your teen is sextorted 

Experts say victims should report the predator’s account but keep their own account and documentation of all messages. Having a paper trail of time frames and messages can be vital in finding a criminal's identity.

If a predator is going to send out images, it will typically happen within two weeks of contact. Once the images are sent out, the blackmailer loses their leverage and usually moves on, Coffren says.

Victims should report any attempt at sextortion to NCMEC’s CyberTipline, contact their local FBI field office, or report to the FBI at tips.fbi.gov. Teens experiencing sextortion should tell a trusted adult. For immediate mental health assistance, teens can also call or text the 988 suicide hotline.

Those who have been scammed can work to remove the images from the internet through NCMEC’s Take It Down service, which works by assigning a digital fingerprint called a hash value to a reported sexually explicit photo or video from a minor. These hash values allow online platforms to remove the content, without the original image or video ever being viewed.
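The matching idea behind a hash-based takedown service can be illustrated with a minimal sketch. (This is only an illustration of the concept: real image-matching systems typically use perceptual fingerprints rather than the simple cryptographic hash shown here, and the function and variable names below are hypothetical.)

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a hex digest serving as the content's 'hash value'."""
    return hashlib.sha256(data).hexdigest()

# A platform stores only the reported fingerprints, never the images themselves.
reported_fingerprints = {fingerprint(b"reported-image-bytes")}

def should_remove(upload: bytes) -> bool:
    """Match an upload against reported fingerprints without viewing it."""
    return fingerprint(upload) in reported_fingerprints

print(should_remove(b"reported-image-bytes"))   # True
print(should_remove(b"unrelated-image-bytes"))  # False
```

Because only the digest is compared, a platform can detect and remove a reported image without anyone ever viewing the original content.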

Experts agree that telling teens to avoid social media platforms or not engage with strangers online is outdated advice given the sheer scale of the problem. Stroebel adds that sex-shaming teen boys can inadvertently backfire. What’s more, a child can be blackmailed regardless of whether they’ve shared a nude image to begin with.

Parents should employ a mentality of discussing online exploitation “before it happens in case it happens,” Coffren says.

One young man, who was 23 at the time of blackmail, urged other victims to tell their parents. He panicked over “how stupid” he was after a scammer contacted him on Instagram but says his parents helped him navigate how to ignore his blackmailer and stay calm − and they blamed the predator, not their son, for what happened.

“Sextortion can happen to anyone. If it happens to you, please tell someone,” he says. “They will support you and be sympathetic.”

This article is the first in an ongoing USA TODAY series investigating a surge in financial sextortion and its mental health impact on teenage boys, which in extreme cases has been connected to suicide.

Rachel Hale’s role covering youth mental health at USA TODAY is funded by a grant from Pivotal Ventures. Pivotal Ventures does not provide editorial input. Reach her at rhale@usatoday.com and @rachelleighhale on X.