Trinity Mount Ministries

Tuesday, April 8, 2025

Washington Man Indicted on 11 Counts of Sex Trafficking Children, Production of Child Sexual Abuse Material, and Forced Labor

For Immediate Release

U.S. Attorney's Office, Eastern District of Washington

Richland, Washington - Acting United States Attorney Richard R. Barker announced that on April 2, 2025, a federal grand jury for the Eastern District of Washington returned an indictment charging Jonathan Michael Atkinson, age 34, with 11 criminal counts including Sex Trafficking Children, Production and Attempted Production of Child Pornography, Online Enticement of a Minor, and Forced Labor. The criminal charges against Atkinson carry a maximum sentence of up to life in prison.

Atkinson was arrested on April 8, 2025, by the Southeast Regional Internet Crimes Against Children Task Force, consisting of Homeland Security Investigations, Richland Police Department, Kennewick Police Department, and the Benton County Sheriff’s Office. Additional assistance was provided by Pasco Police Department, ATF and DEA. Atkinson will be arraigned in federal court on April 10, 2025.

“The U.S. Attorney’s office for the Eastern District of Washington will continue to aggressively prosecute all versions of human trafficking,” stated Acting United States Attorney Richard Barker. “We will continue to work closely with our federal, state, local, and Tribal law enforcement partners to seek justice for the most vulnerable among us.”

“Human trafficking is a heinous crime that preys on the most vulnerable members of our communities and the most effective way we can dismantle these criminal networks is through strong partnerships,” said Matthew Murphy, acting Special Agent in Charge of HSI Seattle. “HSI is proud to work alongside our federal, state, and local law enforcement partners to protect victims, bring traffickers to justice, and put an end to this exploitation.”

If members of the public have any information regarding this case, please contact the Pasco Police Department.

This case was investigated by Homeland Security Investigations and the Southeast Regional Internet Crimes Against Children Task Force. It is being prosecuted by Assistant United States Attorneys Laurel J. Holland and Stephanie A. Van Marter.

An indictment is merely an allegation, and all defendants are presumed innocent until proven guilty beyond a reasonable doubt in a court of law.


Monday, April 7, 2025

CyberTipline - NCMEC - Trinity Mount Ministries - REPORT CHILD ABUSE! REPORT CSAM! 1-800-843-5678

 



Overview

NCMEC’s CyberTipline is the nation’s centralized reporting system for the online exploitation of children. The public and electronic service providers can make reports of suspected online enticement of children for sexual acts, child sexual molestation, child sexual abuse material, child sex tourism, child sex trafficking, unsolicited obscene materials sent to a child, misleading domain names, and misleading words or digital images on the internet.

Every child deserves a safe childhood.

What Happens to Information in a CyberTip?

NCMEC staff review each tip and work to find a potential location for the incident reported so that it may be made available to the appropriate law-enforcement agency for possible investigation. We also use the information from our CyberTipline reports to help shape our prevention and safety messages.

Is Your Image Out There?

Get Support

One of the worst things about having an explicit image online is feeling like you’re facing everything alone. But you have people who care for you and want to help. Reach out to them!

A trusted adult can offer advice, help you report, and help you deal with other issues. It could be your mom, dad, an aunt, a school counselor, or anyone you trust and are comfortable talking to. You can also “self report” by making a report on your own to the CyberTipline.

Families of exploited children often feel alone in their struggle and overwhelmed by the issues affecting their lives. NCMEC provides assistance and support to victims and families such as crisis intervention and local counseling referrals to appropriate professionals. Additionally, NCMEC’s Team HOPE is a volunteer program that connects families to others who have experienced the crisis of a sexually exploited child.

Don't Give Up

Having a sexually exploitative image of yourself exposed online is a scary experience. It can make you feel vulnerable and isolated, but remember, others have been in the same situation as you – and they’ve overcome it. Learn the steps you can take to limit the spread of the content.

By the Numbers

In 2023, reports made to the CyberTipline rose more than 12% from the previous year, surpassing 36.2 million reports.

There were 105,653,162 data files reported to the CyberTipline in 2023.

Reports of online enticement increased by more than 300% from 44,155 in 2021 to 186,819 in 2023. 

Find more data in the CyberTipline Report.


More

Learn more about online exploitation and safety.

Coping with Child Sexual Abuse Material (CSAM) Exposure for Families

Production and Active Trading of Child Sexual Exploitation Images Depicting Identified Victims

Trends Identified in CyberTipline Sextortion Reports

The Online Enticement of Children: An In-Depth Analysis of CyberTipline Reports



National Center for Missing and Exploited Children, CyberTipline, 1-800-843-5678

Report It

If you think you have seen a missing child, or suspect a child may be sexually exploited, contact the National Center for Missing & Exploited Children 24 hours a day, 7 days a week.

Report Child Sexual Exploitation

Use the CyberTipline to report child sexual exploitation.

Make a CyberTipline Report »

The banner is a tool that lets you conveniently share a link to NCMEC's CyberTipline so visitors can create a report. To display this banner on your website:

  • Read the terms of use. Your use of any National Center for Missing & Exploited Children® banner signifies your agreement to these terms of use.
  • Enter the code snippet below into your site.

<iframe src="https://www.missingkids.org/gethelpnow/cybertipline/widget" width="300" height="500"></iframe>







Friday, March 28, 2025

CRC Technology Assists in Arrest of Man Involved in Online Child Abuse Network


A man from the Ostalbkreis district has been arrested on suspicion of watching, and giving instructions for, the sexual abuse of children in Asia through live online streams. The 60-year-old is now being investigated by the Baden-Württemberg Cybercrime Center in Karlsruhe and the Aalen Criminal Police for sexual abuse and exploitation of children and the production of child sexual abuse material.

According to investigators, the man from Ostalbkreis was not only watching live abuse of children in the Philippines but allegedly also provided instructions for the abuse to occur. Authorities have dismantled a pedophile network centered around a woman in the Philippines, who was reportedly offering live broadcasts of horrific acts of abuse in exchange for payment.

Crucially, the investigation and arrest were aided by CRC technology*. Devices seized by Philippine authorities played a critical role in tracking down the suspect in the Ostalbkreis district. The Baden-Württemberg State Criminal Police Office then took over the investigation. During a search of the 60-year-old’s apartment, several data storage devices were found and seized by investigators. Mike McGonigle, Director of Special Projects & Industry Outreach at CRC, commended the teamwork: “Great job by the Baden-Württemberg Cybercrime Center, the Aalen Criminal Police, and Philippine authorities. CRC remains dedicated to creating and providing technology that helps law enforcement identify, arrest, and prosecute live-stream offenders. We greatly appreciate our collaboration with our global law enforcement partners who are actively combating this form of child sexual abuse.”

The man is now in custody, and the children affected by the abuse in the Philippines are under the care of local authorities. Additionally, several other investigations are ongoing in Baden-Württemberg, targeting individuals who are also alleged to have watched live-streamed abuse from the same network in the Philippines. CRC technology continues to aid in identifying and apprehending those involved in these horrific crimes. Thank you to the Tim Tebow Foundation for your support and collaboration towards this arrest.

*Child Rescue Coalition (CRC):
  • This organization provides free technology to law enforcement agencies to help protect children from Child Sexual Abuse Material (CSAM) and identify individuals involved in the possession and distribution of such material. 



Thursday, March 27, 2025

@elonmusk by Brett Fletcher @TrinityMount

 

NOTE - This is a Trinity Mount Ministries blog entry from December 4, 2022. I decided to repost it.

@elonmusk

I think it is hilarious how the world of social media seems very upset since the brilliant and wealthy man, Elon Musk, purchased Twitter! "That’s it! I'm leaving Twitter!" "Twitter is ruined!" "Advertisers are leaving Twitter!" Etc., ad nauseam!

The only thing I know: when I founded Trinity Mount Ministries back in 2011, the two main social media platforms that would be instrumental for the cause of sharing information about missing and exploited children, related news articles, and child safety content were Twitter and Facebook. Now, as we approach 2023 (12 years later), this remains true. No other social media platform comes close to the combination of Twitter and Facebook in relation to Trinity Mount Ministries, Trinity Mount Global Missing Kids, and Trinity Mount International Missing Kids.

So, leave these platforms if you must... I will continue to utilize these valuable tools for the above-mentioned cause: helping to find missing and exploited children, domestically and internationally, as well as promoting child safety. I hardly think Elon Musk or Mark Zuckerberg are too concerned about anybody's departure(s) from their social media platforms. My guess - these two platforms will do whatever is necessary to stay afloat.

The only difference I've noticed on Twitter: it's interesting and even exciting to some extent. I've always believed, and still maintain, this to be true: if you don't like what someone shares, ignore it, fight against it, block it, protest it... though the platform is there for you to use as well, as you see fit... to some extent. I believe in moderation as well as free speech. They can co-exist... when clear, cool, and sound heads prevail.

Brett Fletcher, Founder of Trinity Mount Ministries 

@TrinityMount https://www.twitter.com/TrinityMount


Thursday, March 20, 2025

Arrested in underage sex sting, Minnesota lawmaker resigns

by Elizabeth Russell


The chair for Minnesota state Sen. Justin Eichorn, a Republican from Grand Rapids, sits empty in a Senate hearing room. (Associated Press / Steve Karnowski)

Republican state senator Justin Eichorn resigned Thursday before the Minnesota Senate could vote on his expulsion. Bloomington Police arrested Eichorn, 40, on Monday on charges of attempted coercion and enticement of a minor. He is presumed innocent until proven guilty in court. He was expected to appear in federal court Thursday afternoon.

How have public officials responded to the arrest? Minnesota Republicans called for Eichorn to resign on Tuesday when the charges became public. They were preparing to expel him on Thursday when he resigned, the Minnesota Star Tribune reported. State Senate Majority Leader Mark Johnson said Eichorn’s resignation was the best option for both the Senate and his family. Eichorn is married and has four children.

How did officers catch Eichorn? Eichorn was caught by a police sting operation meant to suppress the demand for juvenile sex trafficking, according to an FBI agent’s affidavit. According to the Wednesday court filing, Eichorn communicated with an undercover law enforcement officer posing as a 17-year-old girl. Eichorn allegedly solicited inappropriate photos from the fictitious girl and arranged to meet up for sex. Police arrested him when he arrived at the predetermined location.





Saturday, March 15, 2025

TBI Arrests, Charges Medina Man in Ongoing Child Exploitation Case

 

GIBSON COUNTY – Special agents assigned to the Tennessee Bureau of Investigation’s Cybercrime & Digital Evidence Unit have arrested and charged a Medina man accused of uploading child sex abuse material.

On December 26, 2024, TBI agents opened the investigation after receiving a cybertip from the National Center for Missing and Exploited Children (NCMEC) about an individual uploading child sex abuse material to an electronic service provider account. During the course of the investigation, agents identified the user account to be associated with Brandon Fairchild.

On March 9, 2025, a search warrant was executed for the electronic service provider account, and Brandon Fairchild (DOB: 05/27/1981) was subsequently taken into custody and charged with one count of Sexual Exploitation of a Minor and two counts of Unlawful Photography. He was booked into the Gibson County Jail.

The charges and allegations referenced in this release are merely accusations of criminal conduct and not evidence. The defendant is presumed innocent unless and until proven guilty beyond a reasonable doubt and convicted through due process of law.

The Tennessee Bureau of Investigation is an ICAC affiliate of the Tennessee ICAC Task Force. Anyone with information about these cases or other cases of online child exploitation should contact the Tennessee Bureau of Investigation Tipline at 1-800-TBI-FIND, TipsToTBI@tbi.tn.gov, or report via the National Center for Missing and Exploited Children (NCMEC) CyberTipline at CyberTipline.org.

Parents seeking additional information about cybercrime, child exploitation, and how best to safeguard their loved ones can visit www.NetSmartz.org for a variety of topical, age-appropriate resources.




Thursday, March 13, 2025

Child Safety Experts Testify in Support of "Duty of Care" to Protect Kids Online

 


WASHINGTON, D.C. – U.S. Senator Richard Blumenthal (D-CT), the author of the Kids Online Safety Act (KOSA) with U.S. Senator Marsha Blackburn (R-TN), asked two child safety experts about their support for a “duty of care” that would require online platforms to prevent and mitigate certain harms that they know their platforms and products are causing to young users.

KOSA includes a “duty of care” that forces online platforms to consider and address the negative impacts of their specific product or service on younger users, including things like their recommendation algorithms and addictive product features. The specific covered harms include suicide, eating disorders, substance use disorders, and sexual exploitation.

“There needs to be a duty of care because ultimately these children are on their platforms,” answered John Pizzuro, the CEO of Raven, an advocacy organization focused on ending child exploitation. “So there's a burden on them to make sure that the children are safe.”

Michelle DeLaune, the CEO of the National Center for Missing and Exploited Children (NCMEC), agreed: “We cannot prosecute our way out of the problem. The reports are coming in, law enforcement rightly is investigating. Really, we need to be looking upstream about preventing these crimes from happening in the first place.”

Blumenthal and Blackburn first introduced KOSA in February 2022 following reporting by the Wall Street Journal and after spearheading a series of five subcommittee hearings with social media companies and advocates on the repeated failures by tech giants to protect kids on their platforms. KOSA will require platforms to enable the strongest privacy settings by default, force platforms to prevent and mitigate specific dangers to minors, provide parents and educators new controls to help protect children, and require independent audits and research into social media companies.

The full text of Blumenthal’s exchange with Pizzuro and DeLaune is available here and copied below:

U.S. Senator Richard Blumenthal (D-CT): Thank you very much, Senator Hawley. We have worked together, and my hope is that we will continue to work together, especially on the issues that are before us today—and most especially, the Kids Online Safety Act, which was approved by the United States Senate in a vote of 91-3. 91-3. Doesn’t happen very often these days in the Senate. It was last session, and unfortunately the House never gave it a vote, which in my view, is a tragedy because it helps protect kids against the toxic content and the algorithms, the black box methodology that social media uses.

And of course, the tech companies who would be held accountable under this law say they are for it and then they worked behind the scenes against it, and they try to shift blame for this skyrocketing increase in online harms to others, avoiding the blame that they well deserve. But more important than the blame is reforms that they could well institute, providing tools and safeguards for parents and children and a duty of care so that they are required to mitigate harm if they know it is happening or have reason to know what is happening. And of course disclosure of the algorithms—the black box drivers of that toxic content.

Mr. Pizzuro, you say that we are not going to arrest our way—I think in your testimony, we cannot arrest our way out of this problem. Let me ask you, perhaps you and Ms. DeLaune, what you think about the duty of care as a means of providing some safeguards here.

John Pizzuro: Well, I think from my standpoint, is that there is no safeguards. And I think that's the problem, right? The AI algorithms push all this content to them, and it doesn't matter what the mechanism is. So, there needs to be a duty of care because ultimately these children are on their platforms. So there's a burden on them to make sure that the children are safe.

Michelle DeLaune: Thank you, Senator. It is really incumbent upon the companies to know their customer. You know, at this point, most of the sites, most things online, you just check a box, “You’re over 13,” “You’re over 21,” whatever it may be. They are working and looking at age assurance, knowing who the child is, what age they are at. Going back to the case that we just saw a moment ago, knowing who they are engaging with, whether or not they are over age, under age. There is a shared responsibility, in our view, for the platforms, for the app stores, for the device, in knowing who the child is and building and designing safer experiences for them, recognizing their age.

I will also talk briefly about the necessity. We cannot prosecute our way out of the problem. The reports are coming in, law enforcement rightly is investigating. Really, we need to be looking upstream about preventing these crimes from happening in the first place.

One feature where we are seeing an increase, we actually saw a 1300% increase in one year, is the use of generative AI to create child sexual abuse imagery. There are blockers right now in trying to prevent that. The current legislation allows the National Center to provide specific elements to help in the prevention of these crimes only with electronic service providers.

What the Stop CSAM Act also allows us to do is share this information with other entities who are furthering the protection of children—whether it be an NGO, whether it be a gen AI tech provider. Right now, we are hearing requests working with Meta, with OpenAI, with Google, who are looking to build classifiers to detect generative artificial CSAM, AI CSAM. But there is limited information in some cases about what we can share. So, another really important thing that we just keep going back to is we will continue responding. We need to be preventing, they need to know who their customers are, and we need to be able to share good data, helpful data, to help them build solutions to the problems.

Blumenthal: And I agree that they have the technology. They know the customers. The burden should be on them. That is the importance of the duty of care. It is a design feature, it’s not a censorship mechanism. It does not block content. It gives the consumers choices so that they can block it if they wish, or their parents to take action to protect their children with tools that they deserve to have. And the duty of care imposes a measure of responsibility on the tech companies themselves to address the kind of problems that they are seeing and we are seeing children facing.

Thank you all for your testimony today. Thanks, Mr. Chairman.