Trinity Mount Ministries

Showing posts with label algorithms. Show all posts

Tuesday, May 5, 2026

Caught in the Algorithm: When Child Advocacy Triggers Social Media Suspensions

 


By Brett Fletcher 

​The digital frontlines of child protection are complex, constantly shifting, and heavily guarded—as they should be. Social media platforms carry a massive responsibility to police their networks for predatory behavior and illegal content. To do this at scale, they rely heavily on automated algorithms and aggressive safety protocols.

Usually, this is a good thing. We want platforms to err on the side of caution when it comes to the safety of the vulnerable. But recently, we experienced firsthand what happens when one of those automated nets is cast a little too wide.

​Without warning, our X (formerly Twitter) account was suspended.

​For an organization dedicated to finding missing children and providing advocacy for exploited children, suddenly losing access to a primary communication channel is jarring. Information moves at lightning speed in our line of work. A delay in sharing a missing child flyer or an update on an international law enforcement operation can feel agonizing.

​After an appeal, the account was restored—though we are currently navigating the standard 48-hour waiting period for full functionality to return.

​This brief digital exile highlighted a unique paradox that legitimate child advocates face online. The algorithms designed to flag malicious actors are often triggered by the very terminology we use to fight them. When we discuss the realities of exploitation, share updates on law enforcement stings, or use specific keywords to educate the public, we inadvertently trip the wire.

​We become "friendly fire" in the algorithmic war against exploitation.

​Is it frustrating? Absolutely. But looking at the bigger picture, it is a side effect of a system that is trying—however imperfectly—to do the right thing. If an overly sensitive algorithm occasionally inconveniences an advocate but successfully blocks a predator, that is a trade-off we can survive.

The heart of Trinity Mount Ministries has never been a single social media account. The true impact of this work relies entirely on our incredible community of online supporters. You are the ones who share the alerts, read the updates, and keep the awareness alive, even when the algorithms get confused.

​We will be back to full digital strength shortly. Until then, the work doesn't stop. The mission remains, the advocacy continues, and our community stands strong—algorithm or no algorithm.

Thank you for continuing to support Trinity Mount Ministries. We truly appreciate it.

Brett Fletcher - Founder of Trinity Mount Ministries 




Thursday, April 9, 2026

The Grimm Reality of the Feed: How Social Media Architecture Leaves Youth Vulnerable

By Brett Fletcher

It is no secret that modern childhood and young adulthood have shifted into the digital realm. But when we see young people endlessly scrolling through social media or participating in baffling, reckless trends, it is easy to dismiss it as a lack of discipline. The truth is far more complex, and in some corners of the internet, far darker.

​Young users are not just making isolated poor choices; they are navigating highly engineered digital environments that actively exploit their developmental vulnerabilities. And when those vulnerabilities are exposed, organized predators are waiting in the shadows.

​To understand the danger, we have to look at the machinery driving these platforms and the digital underworld that capitalizes on it.

​The Architecture of Attention: The Hook

​Social media platforms are not passive tools; they are active environments powered by AI designed to maximize engagement. They analyze every click, pause, and interaction to curate a personalized "filter bubble" that constantly offers novel stimuli.

​This relies heavily on a variable reward system. Much like a slot machine, the delivery of content and notifications is unpredictable. This triggers continuous dopamine releases, compelling young users to keep scrolling because the "next" video or post might be the one that provides a massive hit of entertainment. Over time, this leads to reduced reward sensitivity, creating a behavioral loop that is incredibly difficult to break.
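The narrowing effect of that personalization loop can be illustrated with a toy simulation. Everything below is invented for illustration (the topics, weights, and learning rate are not any platform's real code): each engagement nudges a topic's weight up, each skip nudges it down, and the feed samples the next post from those weights.

```python
import random

def update_weights(weights, topic, engaged, lr=0.3):
    """Nudge a topic's weight up on engagement, slightly down on a skip."""
    w = dict(weights)
    w[topic] = max(0.01, w[topic] + (lr if engaged else -lr * 0.2))
    return w

def pick_topic(weights, rng):
    """Sample the next post's topic in proportion to the learned weights."""
    topics = list(weights)
    return rng.choices(topics, weights=[weights[t] for t in topics])[0]

rng = random.Random(0)
weights = {"sports": 1.0, "diet": 1.0, "news": 1.0}

# Simulate a user who only ever engages with "diet" posts.
for _ in range(200):
    topic = pick_topic(weights, rng)
    weights = update_weights(weights, topic, engaged=(topic == "diet"))

# After 200 scrolls, the feed has tilted heavily toward the one engaged topic.
share = weights["diet"] / sum(weights.values())
print(round(share, 2))
```

Even with a mild learning rate, the loop converges on a single interest: the "filter bubble" is not a glitch but the expected behavior of this kind of feedback system.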

​The Validation Loop: The Trap

​Adolescence is a period naturally driven by identity formation, social comparison, and the need for peer acceptance. These apps digitize and amplify that process to an unrealistic degree.

​Likes, views, and follower counts become tangible metrics for social standing. When the algorithm rewards extreme, sensational, or boundary-pushing content with high visibility, it implicitly teaches young users that making bad, risky decisions is exactly the behavior required to be "seen" and valued by their peers.

​The Impulsivity Engine: The Fallout

​This digital environment collides directly with human biology. The adolescent brain is still maturing. The amygdala—which processes immediate emotion and impulses—is highly active, while the prefrontal cortex—responsible for long-term planning, impulse control, and risk assessment—is still under construction.

When you combine a highly stimulated emotional center with an undeveloped braking system, you get profound impulsivity. This manifests as "stress posting" in the heat of the moment, sharing inappropriate images, or engaging in risky viral trends simply because those behaviors are heavily endorsed online.

​Unfortunately, these impulsive moments of poor judgment are exactly what digital predators use as bait.

​The Digital Underworld: A Modern Grimm Tale

​At the fringes of mainstream social media, organized predatory groups actively hunt for vulnerable, isolated youth. This is where the platform architecture turns from a behavioral trap into a genuine, true-crime nightmare.

  • The Weaponization of the Taboo: Networks like the notorious "764" group often cloak themselves in dark, "demonic," or extreme occult aesthetics. This branding is not a genuine religious movement; it is a calculated psychological weapon. It is designed to project absolute power, exploit a young person's natural fascination with the taboo, and terrify them into submission.
  • Coercion and Control: Predators gather compromising material by catfishing targets, using social engineering, or simply saving the impulsive posts a young person regrets. They then use brutal extortion tactics to force the youth into a corner, threatening to ruin their lives or harm their families if they do not comply.
  • The Gamification of Cruelty: Once control is established, the demands escalate. These networks operate on a twisted hierarchy where members gain status by forcing their young victims to commit terrible acts. This can range from producing abusive material to committing real-world self-harm or violence.
  • The Wall of Silence: Because the victim feels intense shame about how they were trapped, and genuine terror of the group's artificially inflated "demonic" power, they rarely reach out to authorities. The algorithmic design of platforms keeps this entire ordeal hidden in direct messages and encrypted chats, entirely invisible to the adults in the room.

​Breaking the Silence


These groups are not supernatural forces; they are highly organized criminals exploiting the exact psychological vulnerabilities that modern apps amplify. By understanding the intersection of platform design, adolescent psychology, and digital extortion, we can move past the standard "screen time" debate.

​Protecting our youth requires recognizing that the digital world has real-world consequences, and the first step in dismantling these predatory networks is pulling their tactics out of the shadows and into the light.




Sunday, April 5, 2026

The Ghost in the Machine: How Social Media Algorithms Actually Work

By Brett Fletcher

At its simplest, an algorithm is just a recipe.

​If you’re baking a cake, the recipe tells you exactly what to do to get a specific result. In the world of social media, the "result" the platform wants is your attention. The algorithm is a set of mathematical rules that looks at everything you do—what you like, how long you pause on a photo, and even what you search for—to decide what to show you next.

​It’s Not "Content," It’s "Data"

​Every time you interact with a post, you are training the algorithm. It tracks:

  • Engagement: Did you like, comment, or share?
  • Watch Time: Did you watch the whole video or skip it?
  • Relevance: Does this post match the topics you’ve looked at before?

​The goal isn't necessarily to show you the "best" content; it’s to show you the content most likely to keep you from closing the app.
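As a rough sketch of that decision, imagine each post being given a score built from the three signals above. The posts and weights here are invented for illustration; real ranking systems use far more signals and learned models, but the principle is the same: the feed is ordered by predicted engagement, not by quality.

```python
def engagement_score(post):
    """Toy ranking score combining engagement, watch time, and relevance.
    The weights are made up for illustration."""
    return (
        2.0 * post["likes"]
        + 3.0 * post["shares"]
        + 1.5 * post["watch_fraction"] * 10   # fraction of the video watched
        + 4.0 * post["topic_match"]           # 1.0 = matches your history
    )

feed = [
    {"id": "cooking-tips",  "likes": 4, "shares": 0, "watch_fraction": 0.3, "topic_match": 0.0},
    {"id": "outrage-clip",  "likes": 9, "shares": 5, "watch_fraction": 0.9, "topic_match": 1.0},
    {"id": "friend-update", "likes": 2, "shares": 1, "watch_fraction": 0.5, "topic_match": 0.5},
]

# Sort by predicted engagement, highest first -- not by quality or recency.
ranked = sorted(feed, key=engagement_score, reverse=True)
print([p["id"] for p in ranked])
```

Notice that the most provocative item wins the top slot even though a friend's update may matter more to you, which is exactly the bias the paragraphs above describe.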

​The Danger Zone: Why This is Risky for Children

​For adults, algorithms can be annoying. For children and teenagers, they can be dangerous. Because a child’s brain—specifically the prefrontal cortex, which handles impulse control—isn't fully developed until their mid-20s, they are uniquely vulnerable to algorithmic manipulation.

  1. The Rabbit Hole Effect: If a child clicks on one video about a diet, the algorithm may start flooding their feed with extreme fitness or "thinspiration" content. This can lead to body dysmorphia or eating disorders.
  2. Echo Chambers: Algorithms show us more of what we already believe. For a child, this can lead to radicalization or the spread of misinformation, as they are never challenged by a different point of view.
  3. Validation Addiction: The "Like" button acts as a social scorecard. When the algorithm hides a post or it doesn't perform well, children often internalize this as a personal failure or social rejection.

​The "Infinite Scroll": A Trap by Design

​Have you ever noticed that social media feeds never actually end? This is called the Infinite Scroll, and it is one of the most effective psychological "hacks" ever created.

​1. The Slot Machine Effect

​Algorithms use something called Variable Ratio Reinforcement. This is the same logic used in gambling. You scroll through three boring posts, and then—BAM—a funny cat video or a post from a crush. That tiny hit of dopamine keeps you scrolling, hoping the next "win" is just one flick of the thumb away.
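A variable-ratio schedule is easy to simulate. In this sketch (the 25% reward rate is an invented parameter), a "rewarding" post appears with fixed probability on each scroll, so rewards average out over time while any individual gap stays unpredictable:

```python
import random

def scrolls_until_reward(p_reward, rng):
    """Count scrolls until a 'rewarding' post appears (coin flip per scroll)."""
    n = 1
    while rng.random() >= p_reward:
        n += 1
    return n

rng = random.Random(42)
gaps = [scrolls_until_reward(0.25, rng) for _ in range(10_000)]

# On average a reward arrives about every 4 scrolls, but any single gap is
# unpredictable: sometimes the very next post, sometimes a long dry streak.
print(sum(gaps) / len(gaps), min(gaps), max(gaps))
```

That unpredictability is the point. A fixed schedule ("every fourth post is good") would let you stop after a reward; a variable one keeps the next "win" feeling perpetually one flick of the thumb away.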

​2. Removing "Stopping Cues"

​In the old days of the internet, you had to click "Next Page." That click was a stopping cue—a tiny moment for your brain to ask, "Do I really want to keep doing this?" By removing pages and making the feed infinite, social media sites bypass your conscious brain, keeping you in a "flow state" where time seems to disappear.

​3. Artificial Urgency (FOMO)

​Algorithms often show you things "out of order" to create a sense of urgency. By highlighting what's "trending" or "happening now," they trigger the Fear Of Missing Out (FOMO), making you feel like if you stop scrolling, you’ll be left behind by your social circle.

​How to Fight Back

​You don’t have to delete your accounts to stay safe. Try these three steps:

  • Turn off "Autoplay": Don't let the app decide when the next video starts.
  • Set Time Limits: Use the "Screen Time" settings on your phone to hard-stop your usage.
  • Reset Your Algorithm: Periodically go into your settings and clear your "Interested" or "Search" history to give the machine a fresh start.


Saturday, March 8, 2025

New report urges legislation to address addictive social media algorithms harming children

By Brianna Kraemer

The North Carolina Department of Health and Human Services has advised lawmakers to take action against addictive social media algorithms that officials say are harming youth mental health across the state.

The Child Fatality Task Force released its 2025 annual report this week, offering recommendations on how the governor and the General Assembly can create policies they say will save lives. The report offers 11 recommendations that address a range of issues that threaten child health and safety, including teens spending an average of 3.5 hours per day on social media.

“Frequent social media use may be associated with changes in the developing brain,” the report reads. “Youth who spend more than three hours a day on social media face double the risk of poor mental health.”

According to DHHS, one-quarter of adolescents perceive that they are “moderately” or “severely” addicted to social media. Data shows 78% of 13- to 17-year-olds report checking their devices at least hourly, and 46% check almost constantly (compared to 24% in 2018).

Social media use can interfere with sleep, and poor sleep is linked to physical and mental health issues, risky behaviors, poor school performance, and altered brain development. Many experts and national organizations are expressing concern and issuing advisories about the impact of social media on youth mental health.

“Children and adolescents on social media are commonly exposed to extreme, inappropriate, and harmful content, and those who spend more than 3 hours a day on social media face double the risk of poor mental health including experiencing symptoms of depression and anxiety,” stated a US Surgeon General’s Advisory.

House Bill 644 was introduced in 2023 to combat social media addiction. Though it had bipartisan support, it failed to garner substantial action in the House.

Other proposals include more spending to increase the number of school nurses, social workers, counselors, and psychologists. To address firearm safety, DHHS officials call for recurring funding of $2.16 million for the NC S.A.F.E. Campaign, which educates the public about safe firearm storage. In his previous role as North Carolina’s attorney general, Gov. Josh Stein targeted social media platforms multiple times.

In 2024, Stein joined 12 other states in suing TikTok, alleging it was purposely designed to keep children addicted. In October 2023, Stein and a bipartisan group of more than 40 other attorneys general sued Meta, which owns Facebook and Instagram, making a similar claim that its platforms were purposefully designed to be addictive to children.

While the report endorses legislation targeting addictive social media algorithms, officials provide additional recommendations to address youth suicide, promote mental health, and prevent firearm deaths and injuries among children. 

The full list of legislative recommendations from the Child Fatality Task Force is:

  • Raise the legal age for tobacco product sales from 18 to 21 and require licensing for retailers.
  • Prevent child access to intoxicating cannabis by regulating sales, packaging, and retailer permits.
  • Increase investment in early child care, including child care subsidies.
  • Fund more school nurses, social workers, counselors, and psychologists.
  • Address addictive algorithms in social media that harm children.
  • Provide $2.16 million for the NC S.A.F.E. firearm safe storage campaign.
  • Strengthen firearm storage laws to protect minors.
  • Fund efforts to prevent sleep-related infant deaths.
  • Fund Medicaid reimbursement for doula services throughout pregnancy and postpartum.
  • Enact legislation for Fetal and Infant Mortality Reviews (FIMR).
  • Update child passenger safety laws to reflect best practices.