Trinity Mount Ministries

Monday, March 3, 2025

PROJECT SAFE CHILDHOOD - DOJ - Trinity Mount Ministries - UPDATE - 03/31/2025

Help Find Missing Children. Let's Put An End To Child Abuse And Exploitation... Care.

PROJECT SAFE CHILDHOOD


About Project Safe Childhood

Project Safe Childhood is a nationwide initiative launched in May 2006 by the Department of Justice to combat the growing epidemic of child sexual exploitation and abuse. Led by the U.S. Attorneys' Offices and the Criminal Division's Child Exploitation and Obscenity Section (CEOS), Project Safe Childhood marshals federal, state and local resources to better locate, apprehend and prosecute individuals who exploit children via the Internet, as well as to identify and rescue victims.

Learn More About Project Safe Childhood

Saturday, March 1, 2025

Intelligence Notification: Violent online communities threaten children


Europol today issues an Intelligence Notification calling attention to the rise of violent online communities dedicated to the serious harm of children. This strategic document focuses on online grooming cult groups dedicated to normalising violence and corrupting minors, advocating for the collapse of modern society through acts of terror, chaos and violence, and spreading ideologies that inspire mass shootings, bombings and other crimes. These communities recruit offenders and victims on a global scale and function as cults formed around charismatic leaders who use manipulation and deception to lure and control their victims. The communities’ hierarchy is based on the amount of content shared, with the most prolific contributors earning higher rankings. Community members share extremely violent content, ranging from gore and animal cruelty to child sexual exploitation material and depictions of murder.


Europol’s Executive Director Catherine De Bolle said: 


"Today, digital platforms enable communications globally; violent extremist online communities also leverage this opportunity. Violent perpetrators spread harmful ideologies, often targeting our youth. These networks radicalise minds in the shadows, inciting them to bring violence into the real world. Awareness is our first line of defence. Families, educators and communities must stay vigilant and equip young people with critical thinking skills to resist online manipulation. International cooperation is also imperative – by sharing intelligence and holding perpetrators accountable, we can combat these dangerous communities and safeguard future generations from the grip of extreme violence and crime."

 

Vulnerable minors targeted through gaming platforms and self-help communities


The perpetrators leverage online gaming platforms, streaming services and social media platforms to identify and lure their victims. The members of these groups target vulnerable young people, particularly minors between 8 and 17 years old – especially those who are LGBTQ+, members of racial minorities, or struggling with mental health issues. In some cases, perpetrators infiltrate online self-help or support communities dedicated to individuals impacted by these issues.


These violent criminal actors employ different tactics to lure and manipulate their victims into producing explicit sexual content, committing self-harm, harming others and even carrying out murders. In the beginning, perpetrators often use ‘love bombing’ techniques – extreme expressions of care, kindness and understanding to gain the trust of the minors – while collecting personal information about their victims. The criminal actors use this information in the exploitation phase of the grooming, when they force the vulnerable minors into producing sexual content and committing acts of violence. The perpetrators then blackmail the victims into even more harmful acts by threatening to share the victims’ explicit content with their families, friends or online communities.


Once caught in the predators’ net, minors become even more vulnerable, which makes early detection of these criminal activities crucial.


Beware of these behaviours in your children:


  • Secrecy about online activities
  • Withdrawal and isolation
  • Emotional distress
  • Interest in harmful content
  • Changes in language or symbols used
  • Concealing physical signs of harm

 

Do not ignore these signs in your children’s online behaviour:


  • Unusual activity on platforms
  • Interaction with unknown contacts
  • Encrypted communications
  • Exposure to disturbing content

The European Multidisciplinary Platform Against Criminal Threats (EMPACT) tackles the most important threats posed by organised and serious international crime affecting the EU. EMPACT strengthens intelligence, strategic and operational cooperation between national authorities, EU institutions and bodies, and international partners. EMPACT runs in four-year cycles focusing on common EU crime priorities.


Thursday, February 27, 2025

MAYER | BROWN - Children’s Online Privacy: Recent Actions by the States and the FTC


February 25, 2025
Authors:

Amber C. Thomson,
Howard W. Waltzman,
Kathryn Allen,
Megan P. Von Borstel

At A Glance

As the digital world becomes an integral part of children's lives, state legislatures are placing greater emphasis on regulating how companies handle children’s personal information. This Legal Update explores the recent developments in state and federal children’s privacy legislation, examining how states are shaping the future of online safety for minors and shedding light on amendments to the federal Children’s Online Privacy Protection Act.

As social media companies and digital services providers increasingly cater to younger audiences, state legislatures are placing greater emphasis on regulating how companies handle children’s personal information. This Legal Update explores the recent developments in state and federal children’s privacy legislation, examining how states are shaping the future of online safety for minors and shedding light on amendments to the federal Children’s Online Privacy Protection Act (“COPPA”).

I. US State Developments
Social Media Legislation

Several states, including California, Connecticut, Florida, Georgia, Louisiana, New York, Tennessee, and Utah, have passed legislation focused on regulating the collection, use, and disclosure of children’s data in connection with social media use. Below is a brief summary of notable requirements and trends across each state law.

California. On September 20, 2024, California Governor Newsom signed the Protecting Our Kids from Social Media Addiction Act. The law prohibits companies from collecting data on children under 18 without parental consent and from sending notifications to minors during school hours or late at night. The Ninth Circuit has temporarily blocked the law until April 2025, when the court will examine whether it infringes on free speech rights.

Connecticut. Effective October 1, 2024, Connecticut’s law prohibits features designed to significantly increase a minor’s use of an online service (such as endless scrolling), unsolicited direct messaging from adults to minors, and the collection of geolocation data without opt-in consent.

Florida. Effective January 1, 2025, Florida’s Social Media Safety Act requires social media companies to verify the age of users and terminate accounts for children under 14 years old.

Georgia. Effective July 1, 2025, the Protecting Georgia’s Children on Social Media Act will require platforms to verify users’ ages and obtain parental consent for users under 16. The law will also require schools to adopt policies that restrict social media access on school devices. 

Louisiana. Effective July 1, 2024, the Louisiana Secure Online Child Interaction and Age Limitation Act requires platforms to verify users’ ages and obtain parental consent for users under 16 to create accounts. The law also bans users under 16 from direct messaging unknown adults and restricts the collection of unnecessary personal information.

New York. On June 21, 2024, New York Governor Kathy Hochul signed the Stop Addictive Feeds Exploitation for Kids Act (“SAFE Kids Act”). The SAFE Kids Act requires platforms to obtain verifiable parental consent before providing addictive feeds to users under 18. The law also bans platforms from sending notifications to children between 12:00 a.m. and 6:00 a.m. and prohibits them from degrading the quality, or increasing the price, of a product or service because an addictive feed cannot be provided to the minor.

Tennessee. Effective January 1, 2025, the Tennessee Protecting Children from Social Media Act requires that social media companies verify express parental consent for users under 18. It also allows parents the ability to monitor their children’s privacy settings, set time restrictions, and schedule breaks in account access.

Utah. Passed in March 2023, and amended in 2024, the Utah Social Media Regulation Act mandates that social media platforms require parental consent for minors to use their services. Unless required to comply with state or federal law, social media platforms are prohibited from collecting data based on the activity of children and may not serve targeted advertising or algorithmic recommendations of content to minors. Enforcement of Utah’s law is also currently blocked by litigation.

This year, children’s privacy bills related to social media regulations continue to be introduced in other state legislatures. For instance, Utah’s App Store Accountability Act recently passed the State Senate and would require app store providers to verify users’ ages. South Carolina’s Social Media Regulation Act would require social media companies to make commercially reasonable efforts to verify the age of South Carolina account holders and require parental consent for users under the age of 18 to have an account. Similar children’s privacy bills have also been introduced in Alabama (HB 276), Arizona (HB 2861), Arkansas (HB 1082 and HB 1083), Colorado (SB 86), Connecticut (SB 1295), Iowa (HF 278), New York (S 4600 and S 4609), and Tennessee (SB 811 and HB 825).

Age-Appropriate Design Codes

Last year, multiple states enacted laws requiring age-appropriate design codes to improve online privacy protections for children. The success of these laws has varied.

California. The California Age-Appropriate Design Code Act (the “Act”) mandates that online services likely to be accessed by children under 18 prioritize their well-being and privacy. The Act requires businesses to assess and mitigate risks from harmful content and design features that may exploit children. Initially set to take effect on July 1, 2024, the Act is currently subject to a partial injunction by the Ninth Circuit Court of Appeals. In August 2024, the Ninth Circuit upheld a preliminary injunction of the Act’s data protection impact assessment provisions but lifted the injunction on provisions restricting the collection, use, and sale of children’s data and geolocation data.

Connecticut. As of October 1, 2024, Connecticut’s amended Consumer Data Privacy Act includes provisions for collecting data on children under the age of 18. The law also requires companies to complete a data protection impact assessment for each product likely to be accessed by children. Additionally, companies must exercise “reasonable care to avoid any heightened risk of harm to minors” caused by their products or services and to delete minors’ accounts and data upon request.

Maryland. Effective October 1, 2024, the Maryland Kids Code requires social media platforms to implement default privacy settings for children, prohibits collecting minors’ precise locations, and requires a data protection impact assessment for products likely to be accessed by children.

Illinois, South Carolina, and Vermont. Illinois, South Carolina, and Vermont have each introduced bills requiring age-appropriate design codes in their 2025-2026 legislative sessions.

Harmful Content Age Verification Legislation
States are increasingly enhancing online privacy protections for children through “harmful content age verification” laws. These laws require companies to implement reasonable age verification measures before granting children access to potentially harmful content (such as pornography, violence, or other mature themes) or face liability for failing to do so. As of January 2025, 19 states have passed laws requiring age verification to access potentially harmful content: Alabama, Arkansas, Florida, Georgia, Idaho, Indiana, Kansas, Kentucky, Louisiana, Mississippi, Montana, Nebraska, North Carolina, Oklahoma, South Carolina, Tennessee, Texas, Utah, and Virginia.

On January 15, 2025, Texas Attorney General Ken Paxton defended Texas’ law (HB 1181) before the Supreme Court. The case centers on whether the law, which requires that websites with harmful content verify users’ ages to prevent minors from accessing such content, infringes on the First Amendment. The Court has not yet issued its opinion on the matter.

Children’s Data Protection Legislation

States’ privacy measures for children extend beyond social media regulation. For example, Texas passed the Securing Children Online through Parental Empowerment (SCOPE) Act last year, which applies to all digital service providers. Effective September 1, 2024, the SCOPE Act prohibits digital service providers from sharing, disclosing, or selling a minor’s personal identifying information without parental consent. It also requires companies to provide parents with tools to manage and control the privacy settings on their children’s accounts. These protections extend to how minors interact with AI products.

Similarly, the New York Child Data Protection Act (CDPA) will prohibit websites, mobile applications, and other online operators from collecting, using, disclosing, or selling personal information of children under the age of 18 unless:

For children 12 years or younger, such processing is permitted by COPPA; or
For children 13 years or older, “informed consent” is obtained or such processing is strictly necessary for certain specified activities. Informed consent must be made clearly and conspicuously.

Companies will be subject to the CDPA if they have both: (a) actual knowledge that data is from a minor user; and (b) the website, online service, online application, mobile application, or device is “primarily directed to minors.” The CDPA comes into effect on June 20, 2025.

Other states have passed COPPA-style laws that impose additional restrictions on processing of minors’ data for targeted advertising, including New Hampshire and New Jersey. Similarly, Maryland’s Online Data Privacy Act prohibits the sale or processing of personal data for targeted advertising if the business knew or should have known the consumer is under 18.

Virginia amended its general consumer privacy law to address children’s privacy protections. The amendment to the Consumer Data Protection Act, effective January 1, 2025, requires parental consent for processing personal information of a known child1 under 13 and requires data protection assessments for online services directed to known children. Similarly, Colorado amended its privacy law to strengthen protections for minors’ data. Companies are prohibited from processing minors’ data for targeted advertising and must exercise reasonable care to avoid any heightened risk of harm to minors. The Colorado privacy law amendment will take effect on October 1, 2025.

Pending Legislation

California is leading the way in enacting legislation to protect children from the risks associated with Artificial Intelligence (AI). On February 20, 2025, the California legislature introduced AB 1064, known as the Leading Ethical Development of AI (LEAD) Act. Among its provisions, the LEAD Act would require parental consent before using a child's personal information to train an AI model and mandate that developers conduct risk-level assessments to classify AI systems based on their potential harm to children. It would also prohibit systems involving facial recognition, emotion detection, and social scoring. Additionally, the LEAD Act would establish an advisory body, the LEAD for Kids Standards Board, to oversee AI technologies used by children.

II. US Federal Developments

COPPA aims to protect children’s privacy online and imposes various requirements on online content providers. On January 16, 2025, the FTC finalized updates to COPPA, which originally took effect in 2000 and had not been revised since 2013. The new changes will become effective 60 days after publication in the Federal Register, with a compliance date set for one year after publication. The updates include several revised definitions, new retention requirements, and expanded consent requirements. Additionally, there will be increased transparency regarding compliance with the COPPA Safe Harbor Programs.

The revisions to COPPA were unanimously approved by a 5-0 vote and include updates to address new technology and data collection practices, such as:

Clarifying definitions to assist companies navigating compliance “gray areas.” The amendments introduce a new definition for “mixed audience website or online service,” which covers cases where websites might fall under COPPA’s scope. A “mixed audience website or online service” is a website or online service that is directed to children (as further described in COPPA), but which:

a) does not target children as its primary audience, and
b) does not collect personal information from any visitor (other than for certain limited purposes outlined in the statute) before collecting age information or using technology to determine whether the visitor is a child.

Further, to qualify as a mixed audience website or online service, any collection of age information or other means of determining whether a visitor is a child must be done in a neutral manner without defaulting to a set age or encouraging visitors to falsify age information.

Accounting for new types of data collection by updating the definition of “personal information” to include “a biometric identifier,” which “can be used for the automated or semi-automated recognition of an individual, such as fingerprints; handprints; retina patterns; iris patterns; genetic data, including a DNA sequence; voiceprints; gait patterns; facial templates; or faceprints.”

Expanding on ways for parents to give their consent.2

Operators may use a text message coupled with additional steps to ensure the person providing the consent is the parent for use cases where personal information is not “disclosed” (as defined in COPPA). These additional steps include sending a confirmatory text message to the parent after receiving consent or obtaining a postal address or telephone number from the parent and confirming consent by letter or telephone call. Operators using this method must notify parents that they can revoke any consent given by responding to the initial text message.

There have been other attempts to pass federal legislation regarding children’s privacy rights and children’s online safety in recent years, including the Kids Online Safety Act, which was introduced in 2022 and passed the Senate (packaged together with an update to COPPA), but did not pass last Congress. More recently, on February 19, 2025, the Senate Judiciary Committee held a hearing on children’s online safety and efforts to boost safeguards for children.

III. Enforcement

Children’s privacy is also a subject of enforcement scrutiny by state attorneys general and the FTC. For example, the Texas Attorney General has launched investigations into several technology companies regarding their handling of minors’ data and potential violations of the SCOPE Act. In his press release about the investigation, Attorney General Ken Paxton warned, “[t]echnology companies are on notice that [the Texas Attorney General’s] office is vigorously enforcing Texas’s strong data privacy laws. These investigations are a critical step toward ensuring that social media and AI companies comply with our laws designed to protect children from exploitation and harm.” 3

The FTC has been actively enforcing COPPA violations against website operators. From January 2023 to January 2025, the FTC published six enforcement actions related to COPPA investigations on its website. Earlier this year, the FTC settled with an application owner for $20 million for allowing children under the age of 16 to make in-app purchases without parental consent and deceiving children about the costs of such purchases.

IV.  Key Takeaways and Predictions

States are moving to enhance parental controls around children’s privacy. Social media legislation across various states is helping parents maintain greater control over their children’s privacy by requiring companies to obtain parental consent and by giving parents the ability to set time restrictions and monitor their children’s account use. To stay aligned with regulatory trends, companies should develop and provide tools that enable parents to manage their children’s online experiences.

States will likely begin regulating the threats AI chatbots pose to young kids over the coming months. AI chatbots are increasingly central to discussions about children’s safety. For example, one technology company’s chatbot is currently facing litigation from a mother alleging that the platform’s AI chatbot encouraged her son to commit suicide.  Another lawsuit claims that the same company’s chatbot service suggested a child should kill his parents over screen time limits. State legislatures may expand their regulatory scrutiny to address threats to children posed by AI chatbots.

Businesses may face steep penalties and injunctions for violations. Several of these laws grant state attorneys general the authority to impose civil penalties for violations. For example, the New York Attorney General can impose civil penalties of up to $5,000 per violation, issue injunctive relief, and obtain disgorgement of any profits gained from violating the SAFE Kids Act. Companies violating Florida’s social media law may face fines of up to $50,000 per violation. Companies should implement robust age verification processes to accurately verify users’ ages and obtain necessary parental consent to avoid potential risk and enforcement scrutiny.

COPPA was updated for the first time in 12 years. The amendments to COPPA reflect technological advancements made since the law was last revised 12 years ago. While some of these updates introduce additional compliance requirements for website operators, others clarify “gray areas” that the previous version of the law did not address.
 
1 The Virginia Consumer Data Protection Act does not define a “known child” but defines a child as any natural person younger than 13 years of age. § 59.1-575.

2 COPPA Final Rule at § 312.2



Saturday, February 22, 2025

CyberTipline - NCMEC - Trinity Mount Ministries - REPORT CHILD ABUSE! REPORT CSAM! 1-800-843-5678


Overview

NCMEC’s CyberTipline is the nation’s centralized reporting system for the online exploitation of children. The public and electronic service providers can make reports of suspected online enticement of children for sexual acts, child sexual molestation, child sexual abuse material, child sex tourism, child sex trafficking, unsolicited obscene materials sent to a child, misleading domain names, and misleading words or digital images on the internet.

Every child deserves a safe childhood.

What Happens to Information in a CyberTip?

NCMEC staff review each tip and work to find a potential location for the incident reported so that it may be made available to the appropriate law-enforcement agency for possible investigation. We also use the information from our CyberTipline reports to help shape our prevention and safety messages.

Is Your Image Out There?

Get Support

One of the worst things about having an explicit image online is feeling like you’re facing everything alone. But you have people who care for you and want to help. Reach out to them!

A trusted adult can offer advice, help you report, and help you deal with other issues. It could be your mom, dad, an aunt, a school counselor, or anyone you trust and are comfortable talking to. You can also “self report” by making a report on your own to the CyberTipline.

Families of exploited children often feel alone in their struggle and overwhelmed by the issues affecting their lives. NCMEC provides assistance and support to victims and families such as crisis intervention and local counseling referrals to appropriate professionals. Additionally, NCMEC’s Team HOPE is a volunteer program that connects families to others who have experienced the crisis of a sexually exploited child.

Don't Give Up

Having a sexually exploitative image of yourself exposed online is a scary experience. It can make you feel vulnerable and isolated, but remember, others have been in the same situation as you – and they’ve overcome it. Learn the steps you can take to limit the spread of the content.

By the Numbers

In 2023, reports made to the CyberTipline rose more than 12% from the previous year, surpassing 36.2 million reports.

There were 105,653,162 data files reported to the CyberTipline in 2023.

Reports of online enticement increased by more than 300% from 44,155 in 2021 to 186,819 in 2023. 

Find more data in the CyberTipline Report.


More

Learn more about online exploitation and safety.

Coping with Child Sexual Abuse (CSAM) Exposure For Families

Production and Active Trading of Child Sexual Exploitation Images Depicting Identified Victims

Trends Identified in CyberTipline Sextortion Reports

The Online Enticement of Children: An In-Depth Analysis of CyberTipline Reports

How NCMEC is responding to the ever-changing threats to children online.




Thursday, February 20, 2025

Police Arrest Santa Clara Teacher For Having An Inappropriate Relationship With A Student



David Alexander

POSTED 11:00 AM, February 19, 2025 | UPDATED AT 09:33 AM, February 20, 2025

Los Gatos-Monte Sereno police have arrested Darrin Garcia, a former teacher and current athletic director at SCUSD, for allegedly having sex with a minor.

A former Santa Clara High School teacher and “well-known” coach stands accused of having sex with a student.

Darrin Garcia, 53, has been an athletic director in the Santa Clara Unified School District (SCUSD) since 2022, according to SCUSD. Prior to that role, Garcia was a gym and health teacher at Santa Clara High School, where he also coached track.

He has also been a sports mentor across the county.

On Feb. 12, Los Gatos-Monte Sereno police nabbed Garcia at Kathleen McDonald High School in San Jose where he works, charging him with statutory rape.

“These are extremely serious charges. The safety and security of our students are our top priorities,” SCUSD officials wrote in a statement. “Our priorities of student safety and well-being will continue to guide our actions.”

Police executed the warrant after they received a report alleging that Garcia had a sexual relationship with a student from 2021 to 2023 while teaching at Santa Clara High School, according to police.

In an email exchange, Sgt. Katherine Mann, with Los Gatos-Monte Sereno police, wrote that Garcia no longer lives in Los Gatos, where he allegedly had sex with the girl.

The investigation continues, Mann wrote, with detectives scouring Garcia’s electronic devices and conducting follow-up. She wrote that she does not know whether there are more victims.

Garcia is a father of two, according to a 2016 article in The Roar, the Santa Clara High School student news website. At the time of the article, his son was 11 and his daughter was 13, which would put them in their early twenties today. He has taught at Lynbrook High School in San Jose and Saratoga High School in Saratoga, according to the article.

SCUSD has placed Garcia on administrative leave, according to the school district’s statement. The school district conducts background checks with the Department of Justice and FBI prior to hiring employees and gets “timely notifications” if its employees are charged with a crime.

The school district is conducting its own investigation but due to employee privacy concerns, is limited in how much information it can release, according to the statement.

While school officials acknowledge that the news is “shocking and can bring up a range of emotions,” the district does not plan to “communicate directly with students” about the arrest.

However, Jennifer Dericco, SCUSD’s public information officer, wrote in an email that school officials met with students Feb. 14 to “share resources, including our school’s wellness team.”

“Working together, we can maintain an environment where all students can remain focused on learning,” school officials wrote in the statement.

How much jail time Garcia will face if convicted will depend on how the Santa Clara County District Attorney decides to prosecute the case, Mann wrote.

The rape charge, California Penal Code Section 261.5, could be either a felony or a misdemeanor. If convicted of a misdemeanor, according to the code, Garcia would face up to a year in county jail. If convicted of a felony, he could face up to four years in county jail.

A second charge, “bigamy, incest and crimes against nature,” is a felony punishable by “imprisonment in the state prison for the term prescribed,” according to the California Penal Code.

Because Garcia is older than 21, the charge’s severity depends on whether the victim was 16 years old or younger at the time of the crime, information that police and the district did not provide.

However, given that the alleged relationship began in 2021 and continued into 2023, it is unlikely the girl was 17 at the start of it: a 17-year-old in 2021 would likely have been 19 by 2023, making it unlikely she was still in high school.

The law does not ordinarily require statutory rapists to register as sex offenders, according to the California Penal Code.

Police are asking anyone with information about this case to contact Detective Riley Frizzell at (408) 827-3558 or police@losgatosca.gov.

Garcia is out on bail.

Contact David Alexander at d.todd.alexander@gmail.com

The Silicon Valley Voice