Trinity Mount Ministries


Saturday, March 8, 2025

New report urges legislation to address addictive social media algorithms harming children

By Brianna Kraemer

The North Carolina Department of Health and Human Services has advised lawmakers to take action against addictive social media algorithms that officials say are harming youth mental health across the state.

The Child Fatality Task Force released its 2025 annual report this week, offering recommendations on how the governor and the General Assembly can create policies they say will save lives. The report offers 11 recommendations that address a range of issues that threaten child health and safety, including teens spending an average of 3.5 hours per day on social media.

“Frequent social media use may be associated with changes in the developing brain,” the report reads. “Youth who spend more than three hours a day on social media face double the risk of poor mental health.”

According to DHHS, one-quarter of adolescents perceive that they are “moderately” or “severely” addicted to social media. Data shows 78% of 13- to 17-year-olds report checking their devices at least hourly, and 46% check almost constantly (up from 24% in 2018).

Social media use can interfere with sleep, and poor sleep is linked to physical and mental health issues, risky behaviors, poor school performance, and altered brain development. Many experts and national organizations are expressing concern and issuing advisories about the impact of social media on youth mental health.

“Children and adolescents on social media are commonly exposed to extreme, inappropriate, and harmful content, and those who spend more than 3 hours a day on social media face double the risk of poor mental health including experiencing symptoms of depression and anxiety,” stated a US Surgeon General’s Advisory.

House Bill 644 was introduced in 2023 to combat social media addiction. Though it had bipartisan support, it failed to garner substantial action in the House.

Other proposals include more spending to increase the number of school nurses, social workers, counselors, and psychologists. To address firearm safety, DHHS officials call for recurring funding of $2.16 million for the NC S.A.F.E. Campaign, which educates the public about safe firearm storage. In his previous role as North Carolina’s attorney general, Gov. Josh Stein targeted social media platforms multiple times.

In 2024, Stein joined 12 other states in suing TikTok, alleging it was purposely designed to keep children addicted. In October 2023, Stein and a bipartisan group of more than 40 other attorneys general sued Meta, which owns Facebook and Instagram, making a similar claim that the platforms were purposefully designed to be addictive to children.

While the report endorses legislation targeting addictive social media algorithms, officials provide additional recommendations to address youth suicide, promote mental health, and prevent firearm deaths and injuries among children. 

The full list of legislative recommendations from the Child Fatality Task Force is:

  • Raise the legal age for tobacco product sales from 18 to 21 and require licensing for retailers.
  • Prevent child access to intoxicating cannabis by regulating sales, packaging, and retailer permits.
  • Increase investment in early child care, including child care subsidies.
  • Fund more school nurses, social workers, counselors, and psychologists.
  • Address addictive algorithms in social media that harm children.
  • Provide $2.16 million for the NC S.A.F.E. firearm safe storage campaign.
  • Strengthen firearm storage laws to protect minors.
  • Fund efforts to prevent sleep-related infant deaths.
  • Fund Medicaid reimbursement for doula services throughout pregnancy and postpartum.
  • Pass legislation supporting Fetal and Infant Mortality Reviews (FIMR).
  • Update child passenger safety laws to reflect best practices.




Wednesday, February 19, 2025

Grassley Opening Statement on Ensuring Children’s Safety in the Digital Era

 

Prepared Opening Statement by Senator Chuck Grassley of Iowa
Chairman, Senate Judiciary Committee


Wednesday, February 19, 2025
 
Good morning. In today’s digital era, our young people face risks that previous generations couldn’t have imagined. Even though technology brings amazing opportunities for education and growth, it also opens the door to new dangers that we must confront. This isn’t the first hearing we’ve had on this issue. And unfortunately, it probably won’t be the last.
 
We held a hearing on this same subject roughly a year ago, when we brought in CEOs from some of the largest social media companies to discuss safety issues on their platforms. And we held a similar hearing a year before that.
 
On the one hand, this is alarming because the problem is getting worse. In 2023, for instance, the NCMEC CyberTipline received 36.2 million reports of suspected online child sexual exploitation, a 12% increase over 2022. And even though the numbers haven’t been published for 2024, I hear they’re only expected to go up.
 
Additionally alarming are the new technologies that are being used by bad actors to exploit children online. Predators can use Generative AI, for instance, to take normal images of children and manipulate them to create novel forms of CSAM. In 2024 alone, NCMEC reported over 60,890 instances of Generative AI CSAM.
 
Despite this, so far, Congress has enacted no significant legislation to address the dangers children face online. And tech platforms have been unhelpful in our legislative efforts. Big Tech promises to collaborate, but they’re noticeably silent in supporting legislation that would effect meaningful change. In fact, Big Tech’s lobbyists swarm the Hill, armed with red herrings and scare tactics, suggesting that we’ll somehow break the Internet if we implement even modest reforms.
 
Meanwhile, these tech platforms generate revenues that dwarf the economies of most nations. How do they make so much money? By compromising our data and privacy, and by keeping our children’s eyes glued to the screens through addictive algorithms.

Indeed, in one recent study, 46% of teens reported that they are online “almost constantly.” This has had severe mental health consequences for adolescents. It has also led to a rise in sexual exploitation, as some algorithms have actually connected victims to their abusers.
 
Should tech platforms be allowed to profit at the expense of our children’s privacy, safety and health? Should they be allowed to contribute to a toxic digital ecosystem without being held accountable? I believe the answer is clear. When these platforms fail to implement adequate safety measures, they are complicit in the harm that follows, and they should be held accountable.
 
That said, there are some signs of encouragement. Just as new technologies are being developed that exacerbate harms to children online, so too are technologies being developed to combat exploitation. For example, with AI rapidly evolving, open-source safety tools are being developed to recognize and report CSAM. Some of the witnesses here today will be able to speak to these tools.
 
Additionally, on a Committee with some of the most diverse viewpoints in the Senate, we’ve advanced bipartisan legislation that addresses legal gaps in our current framework—especially those related to the blanket immunity that Section 230 provides.
 
Last Congress, for example, we reported several online safety bills out of Committee with overwhelming bipartisan support. And there are a number of bills that are being considered and refined this Congress, which we’ll give attention to in due course.
 
That being said, we can’t come up with a wise and effective legislative solution without first understanding the nature and scope of the problem. That’s why we’re here today. Our witnesses come from various backgrounds and represent diverse perspectives, all of which point to the need for our Committee to improve legislation and continue working to keep our kids safe.


Friday, August 11, 2023

‘We have the internet, it’s not going away’: How to keep children safe online


Greece Police began investigating Milam on May 3. Prosecutors say he posed as a high school student on Snapchat and exchanged sexually explicit photos and videos with young girls.

Since that arrest, News10NBC has been hearing from parents rattled by the disturbing allegations.

Parents are wondering what they can do to keep their children safe. News10NBC’s Natalie Faas spoke with the Bivona Child Advocacy Center on Wednesday to see what advice they could offer.

The people at Bivona deal with cases of child abuse in all forms. They are trained to teach both children and adults how to prevent and respond to abuse and how to stay safe in all environments.

Bivona’s outreach work is all about educating children and adults. The experts help kids learn how and when to tell a safe adult that they need help. They work with adults on how to react if their child tells them someone is hurting them, and what they should do. 

A major focus is on online safety. With social media constantly evolving, it can be hard to keep up. 

Bivona keeps an extensive database of child safety resources — like a list of apps that parents can use to monitor what their children are doing on their devices, and who they are interacting with. 

“Check your children’s phones, check your children’s devices on a regular basis,” explains Danielle Lyman-Torres, president and CEO of Bivona Child Advocacy. “Make sure you have some settings on there, some parental settings, especially for younger children to maybe block some content. The reality is that we have the internet, it’s not going away, and we are not going to be able to keep children from using it. But we need to be sure that we are being vigilant and checking.”

If you want to know more or need Bivona’s help, the center can be reached at (585) 935-7800 or through its website.




Thursday, July 14, 2022

TikTok to filter ‘mature or complex’ videos as child safety concerns mount


TikTok said it is rolling out a new feature designed to stop underage users from seeing videos with “mature or complex themes” amid mounting scrutiny over the wildly popular video app’s effect on children.

The move comes as TikTok faces wrongful death lawsuits filed earlier this month in California by parents who alleged that their eight- and nine-year-old children died after trying to recreate “blackout challenge” videos that had been served to them through TikTok.

In an effort to protect underage users from “content with overtly mature themes,” TikTok is introducing a ratings metric the company says is similar to systems used in the film, television and video game industries. 

TikTok will start introducing “maturity scores” in the coming weeks, the company said in a blog post.


     TikTok is cracking down on “mature” content. (Image: TikTok)

“When we detect that a video contains mature or complex themes, for example, fictional scenes that may be too frightening or intense for younger audiences, a maturity score will be allocated to the video to help prevent those under 18 from viewing it across the TikTok experience,” TikTok’s head of trust and safety Cormac Keenan said in a blog post. 

In addition to the maturity score feature, TikTok is introducing another tool that will allow all users to manually block videos with certain words or hashtags from both their “Following” and “For You” feeds.

For example, vegan users can block videos about dairy or meat recipes, Keenan said. 

Scrutiny of TikTok, which is owned by Beijing-based ByteDance, ranges far beyond the “blackout challenge.” 


          An example of TikTok’s new filter tool. (Image: TikTok)

American lawmakers have raised concerns about TikTok serving videos glorifying eating disorders and self-harm to children suffering from such conditions. They have also questioned whether TikTok shares data with the Chinese government, a practice the company has denied.

In addition, consumer protection advocates have raised concerns about the spread of misleading advertisements for sketchy payday loans on TikTok, as exclusively reported by The Post in June. 

Following The Post’s reporting, TikTok banned several of the ads. 





Sunday, April 26, 2020

Thorn - Keeping your kids safe online in the age of COVID: Usable tips for parents - PARENTING PREVENTION


By James, Thorn Staff 

As a parent, I’m concerned.

Not only am I worried about my family’s health in the midst of a once-in-a-century pandemic, but I’m figuring out how to run a school out of my dining room, learning how to work with my spouse a few feet away from me at all times, and my cat needs ear drops twice a day.

Since I work at Thorn, I’m also acutely aware of the fact that children are spending way more time online—at least 50% more time on screens for ages 6-12—and now I’m hearing about Zoombombing, where nefarious users hack into Zoom sessions, including elementary school lessons, to share abusive content. I’m not used to hearing about child safety from the news — I usually hear it from my team first. But these days, we’re all in this together.

I know I need to have some really important and difficult conversations with my child about staying safe online right now. Parenting a child with both special needs and a proclivity for technology, I’m constantly striving to balance keeping their digital experience safe while adjusting to a sudden increase in our reliance on technology as a family.

I recognize that I sit in a privileged position in the grand scheme of things, being able to continue working remotely on a mission that I care deeply about. But regardless of where we are in the world or what our daily lives look like right now, parents the world over are facing the same dilemma as me: how do I keep my child safe online — not only right now, but in a future that is based in technology?

My colleagues at Thorn, and our partners in the child-safety community, have been developing and sharing resources that make both my day-to-day parental duties and those tough conversations a little less intimidating.

My hope is that these tips are approachable, pragmatic, and helpful, and in no way act as a source of stress for caregivers that already have a lot to balance.

Here’s how to start thinking about, talking about, and addressing online child safety with your kids in the age of COVID-19:

Ask your kids to teach you about their favorite apps

For children who have their own devices, this is the perfect time to ask about the apps and games they use the most. But don’t stop there: take it a step further and let your kids actually show you how to download and use their favorite apps and games.

By letting your child become the teacher, you get a chance to hear directly from them how, when, and why they use these apps. You’re invited into their world and can see it through their eyes. Most importantly, you’ll see how the games work, where it’s easy to meet new people, and which behaviors are risky and which aren’t, so you can help your child navigate their digital world.

This is critical information that takes guesswork out of the equation and also reveals where safety issues might arise. Now when you have conversations with your kids about online safety, you’ll be able to speak their language.

Participate in online trends with your kids
You’ve heard of TikTok, but have you completed the latest viral challenge?

Ask your kids about the viral challenges they’re seeing on TikTok and which one you can do together. That might mean acting a bit silly or feeling a little awkward at first, but it’s both great bonding time and an opportunity to learn more about the platform.

Again, let your kids lead the way here. It’s a chance for them to teach you something, which they don’t often get to do, and a fun way to spend time together while still allowing the kids to interact with technology.

Talk to your kids about sharing content

Times have changed, and just as many adults share suggestive pictures with their partners, sexting is also more common among youth. One recent survey found that as many as 40% of kids are exposed to a sext by the age of 14.
That means content sharing of nude or partially nude images isn’t just an issue that applies to teenagers, but something any child interacting with a device should be aware of.
As a parent, I know just how uncomfortable and awkward this conversation can be. Thankfully, StopSextortion.com has some excellent resources for caregivers on how to start the conversation. There’s also important information on what to do if your child has had an image of themself shared beyond the intended recipient without their consent, and the next steps to take.
Check out the Stop Sextortion site for more ideas to explain the risks of self-generated content.

Know the words “sextortion” and “grooming”


We’ll go into these in more depth in future posts, but sextortion and grooming are two important risks for you to know when keeping kids safe online.
Grooming refers to the tactics used by online predators to convince or coerce children into making and sharing sexually exploitative content. Grooming can take a variety of forms, but hinges on creating trust and leveraging vulnerabilities.
Sextortion refers to the coercion that can happen after that content is produced. Through grooming, for example, a child may be convinced to share a nude, partially nude, or sexually suggestive image of themself, which predators then use as leverage to coerce the child into further sexual exploitation. This could take the form of a predator pretending to be a child’s peer through text chat, gaining their trust and persuading them to share an image. Once that image is shared, the predator uses it as leverage to coerce the child into sharing more.
One easy way to get the conversation started? Tell younger kids that if they ever receive a message or interaction from someone they don’t know on any platform, from video games to social media or texts, to never respond and come straight to you.
We’re looking forward to diving into these topics and sharing directly from our team of experts over the coming months. Subscribe to our emails below and follow us on social media to be the first to see our future deep dives on these topics.

Become your kids’ safety net

For older children in particular, but young ones as well, make sure they know you’re a safe person to come to, even if the thought of them sharing content makes you feel afraid or frustrated.
I think of how my parents always told us that if we were ever in a situation where people were drinking and we needed a ride home, we could call them and they would pick us up, no questions asked. This message was coupled with frank conversations about the risks of drinking and about my parents’ expectation that I not be drinking, but I felt safe enough to ask for help when I needed it. I trusted that I had a safety net.
Make sure your kids know you’re a safety net. And also make sure they’re aware of resources like the Crisis Text Line, where kids can go if they don’t feel comfortable approaching adults.

Make sure classroom video meetings (and peer video chats) are secure

Schools are doing an exceptional job pivoting to remote learning, but with so much going on it can be easy to miss some key steps in keeping everything secure. Some companies have been caught off guard by the massive increase in users, which has in turn exposed security flaws.
You can help by keeping an eye out for some basic security practices:
  • Ensure video chats are always private, and when possible, password protected.
  • Share meeting links only through private channels (like emails or texts), never publicly.
  • Designate someone to be the meeting supervisor, who will manage participants and watch for uninvited guests. For most conferencing apps this will default to whoever set up the meeting.
  • Ensure everyone has installed the most up-to-date version of the app. Zoom, for example, has recently been adding new security features every few days.
Make sure schools and your kids are using these basic security protocols for video chats—and talk to them if you find they aren’t. We’re all in this together, and shared knowledge can make the whole community safer.

Report abuse content and sextortion—and never share it

No matter where, when, or how it happens, if you or your kids come across CSAM, report it. If you’re not sure what constitutes CSAM, it is legally defined as any visual depiction of sexually explicit conduct involving a minor (a person under 18 years old). This can include images, videos, and other visual content.
You should report it to whichever platform you find it on, and be sure to also report it to the National Center for Missing and Exploited Children (NCMEC). NCMEC is the clearinghouse for all reports of CSAM and also fields reports from online platforms. Cover all your bases in this case.
And remember: never share abuse content, even if you’re trying to report it. It’s actually illegal, no matter your intentions, and can keep the cycle of abuse going.

Use existing resources

  • NCMEC’s NetSmartz cartoon is a great way for young children to learn about staying safe online while also being entertained.
  • The Zero Abuse Project has compiled 25 tips for responding to child abuse during a pandemic.
  • Child Rescue Coalition has some additional tips for keeping Zoom meetings safe.
  • The Global Partnership to End Violence Against Children has compiled useful, evidence-based resources for positive parenting during a pandemic.
  • Common Sense Media can help to provide guidance for parents on apps, games, and websites.
  • The Family Online Safety Institute has developed resources for digital parenting.
  • And if anxiety is high and you or your kids just need to talk to someone, you can contact the Crisis Text Line 24 hours a day, 7 days a week. As a remote organization, they are well equipped to connect you or your kids to resources, whether they need help with a potentially abusive situation or just feel anxiety due to the COVID-19 pandemic.

Start the conversation wherever you’re most comfortable – but start it

Online safety is an ongoing conversation that will likely change and grow as quickly as your kids. It’s not always easy to broach these topics, but starting wherever you are most comfortable and taking it in steps can help.
These conversations can happen more organically if you’re spending time with them on the apps they use and showing interest in learning about the virtual world in which all of our children are growing up.
Importantly, one instance in which you should react immediately is if you discover CSAM or find that your child’s images have been shared without their consent. Reporting content as quickly as possible can help mitigate long-term harm. For more info on getting content removed, check out NCMEC’s guide.

The new normal

Parenting is really different today than it was a few months ago, and it’s going to be that way for a significant amount of time. We’re not always going to have all the answers, and just being here and learning more is a great first step.
We’re all in this together, and together we will be able to best defend our children from online sexual abuse. You are not only a part of a global ecosystem of parents and caregivers, but a community of people dedicated to eliminating child sexual abuse from the internet, which Thorn and our partners work toward every single day.
No matter what you’re doing, or how you’re doing it, thank you for being a defender of happiness and being willing to learn more.


Thursday, September 26, 2019

National Center For Missing And Exploited Children Speaks About Online Safety

by Rachel Ellis


A man whose brother was abducted and killed before he was born is speaking out after an 11-year-old from Simpsonville drove 200 miles to Charleston early Monday morning. The boy told police he planned to live with a man he met on Snapchat.

The 11-year-old took his brother’s car and got lost on Rutledge Street when his GPS went out. The National Center for Missing and Exploited Children said this is just one of many cases that leave unanswered questions.

“You know you read the headlines and you think that’s where it’s going to be the worst and then you start getting the details and your head spins,” said Callahan Walsh, son of John Walsh, who founded the center.

He said it’s crucial for parents to know what their kids are doing online.

“There’s three things I always tell parents. One is know the technology. Get on those apps yourself and try to figure out how they work and you know see what apps your children are using. Number two is set ground rules and stick to them. Especially if there’s been bad behavior in the past. And number three is have ongoing conversations with your kids about safety,” said Walsh.

Walsh also added that online encounters have gone up and said it sometimes can be hard to keep track of the many apps that are right at our fingertips.

“It can be very difficult to provide parents with specific tips on specific apps because it’s sort of like whack-a-mole there. When one pops up it becomes popular, and then three more like it pop up. So there’s always a constant landscape that’s changing. There’s always new apps with new features coming out,” said Walsh.


Tuesday, April 16, 2019

THORN - We can eliminate child sexual abuse from the internet:



Reports of child sexual abuse material (CSAM) online have increased 10,000% since 2004.

Our response to this epidemic must be redesigned for the digital age. It won’t be easy, but it is possible.

When we don’t talk about it, abuse and injustice thrive.

Stand with us. Help us have the hard conversations.

Share one post.


Talk to a friend.

Spread the word that we have a solution in sight.




Monday, December 4, 2017

UNICEF highlights child online safety at World Internet Conference

Source: Xinhua

WUZHEN, Zhejiang, Dec. 4 (Xinhua) -- Four sculptures stood in stark contrast to the advanced technology on display at the Wuzhen Internet International Conference and Exhibition Center; however, they attracted just as much, if not more, attention from visitors.

The "Cyber Cocoon Kids" art installation, presented by UNICEF China at the ongoing World Internet Conference, shows the four key online risks for children: cyberbullying, excessive internet use, online child sexual abuse and oversharing of personal information.

Artist Xie Yong and creative director Kevin Wang came up with the concept of "Cyber Cocoon Kids" to represent the potential isolation that can occur when children inhabit a cyber world that parents and caregivers do not fully understand.

"Protecting children online is a vital issue in internet governance, and also closely linked to the Sustainable Development Goals," said Fatoumata Ndiaye, UNICEF deputy executive director.

At the conference, UNICEF also co-hosted a Child Online Safety Forum bringing global experts to share learning, experience and practice in this area.

"The internet offers children access to a whole world of possibilities to learn, connect and play," said Rana Flowers, UNICEF Representative to China. "As policy makers, digital industry representatives or as parents and caregivers, we need to protect children from the worst that digital technology has to offer and expand their access to the best."

For Missing Children and Child Safety updates, join Trinity Mount Ministries on Facebook: https://www.facebook.com/trinitymount

And on Twitter: https://twitter.com/TrinityMount (@TrinityMount)

Friday, July 17, 2015

Do Your Homework Before Sharing “Missing Person” Posts:


You see a post on Facebook or Twitter from someone you interact with online. They’re not a close friend, family member or someone you work with.
It’s someone you met through Twitter, an online game, or they belong to one of your Facebook groups, but you’ve never met them in person.
They’re saying their wife or son is missing. Could you let them know if you see their missing family member?
What do you do?

Do Your Homework

If you’re on social media, it may seem natural to quickly spread the news or share the post. You want to help others and social media makes it easy.
But do you have all the facts?
As Kimberley Chapman points out in Be Careful About “Missing Person Posts”:
It’s one thing to circulate a current Amber Alert, ensuring that all of the information is there, that it comes from a proper source (ie don’t just reshare, CHECK THE LINKS), and that the answer is to call police, not just a random number.
And be sure to update your post when the issue is resolved.
But when someone you don’t know personally asks you to contact them about their missing family member, and they don’t provide:
  • Date
  • Location
and they ask you not to call police, think twice.
You may not know the full back story.
The family member may have escaped an abusive relationship. Or they may have changed their identity and left the area.
Is the person really missing?
Always check the story. Follow up on any links provided.

Be Responsible

You want to share posts from police looking for abducted or missing persons.
If you see the person or know something about someone who is missing or abducted, always call the police.
Be suspicious of posts about a missing person with no mention of date or location.
Rather than spreading the information, contact the police. The request may be legitimate, but always confirm with the police first.