Trinity Mount Ministries

Showing posts with label CSAM.

Friday, September 1, 2023

Apple’s Decision to Kill Its CSAM Photo-Scanning Tool Sparks Fresh Controversy

By Lily Hay Newman | Security

Child safety group Heat Initiative plans to launch a campaign pressing Apple on child sexual abuse material scanning and user reporting. The company issued a rare, detailed response on Thursday.

PHOTOGRAPH: LEONARDO MUNOZ/GETTY IMAGES

IN DECEMBER, APPLE said that it was killing an effort to design a privacy-preserving iCloud photo-scanning tool for detecting child sexual abuse material (CSAM) on the platform. Originally announced in August 2021, the project had been controversial since its inception. Apple had first paused it that September in response to concerns from digital rights groups and researchers that such a tool would inevitably be abused and exploited to compromise the privacy and security of all iCloud users. This week, a new child safety group known as Heat Initiative told Apple that it is organizing a campaign to demand that the company “detect, report, and remove” child sexual abuse material from iCloud and offer more tools for users to report CSAM to the company. 

Today, in a rare move, Apple responded to Heat Initiative, outlining its reasons for abandoning the development of its iCloud CSAM scanning feature and instead focusing on a set of on-device tools and resources for users known collectively as Communication Safety features. The company's response to Heat Initiative, which Apple shared with WIRED this morning, offers a rare look not just at its rationale for pivoting to Communication Safety, but at its broader views on creating mechanisms to circumvent user privacy protections, such as encryption, to monitor data. This stance is relevant to the encryption debate more broadly, especially as countries like the United Kingdom weigh passing laws that would require tech companies to be able to access user data to comply with law enforcement requests.

“Child sexual abuse material is abhorrent and we are committed to breaking the chain of coercion and influence that makes children susceptible to it,” Erik Neuenschwander, Apple's director of user privacy and child safety, wrote in the company's response to Heat Initiative. He added, though, that after collaborating with an array of privacy and security researchers, digital rights groups, and child safety advocates, the company concluded that it could not proceed with development of a CSAM-scanning mechanism, even one built specifically to preserve privacy.

“Scanning every user’s privately stored iCloud data would create new threat vectors for data thieves to find and exploit," Neuenschwander wrote. "It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types.”

Heat Initiative is led by Sarah Gardner, former vice president of external affairs for the nonprofit Thorn, which works to use new technologies to combat child exploitation online and sex trafficking. In 2021, Thorn lauded Apple's plan to develop an iCloud CSAM scanning feature. Gardner said in an email to CEO Tim Cook on Wednesday, August 30, which Apple also shared with WIRED, that Heat Initiative found Apple's decision to kill the feature “disappointing.”

“Apple is one of the most successful companies in the world with an army of world-class engineers,” Gardner wrote in a statement to WIRED. “It is their responsibility to design a safe, privacy-forward environment that allows for the detection of known child sexual abuse images and videos. For as long as people can still share and store a known image of a child being raped in iCloud we will demand that they do better.”

In the email to Cook, Gardner wrote that Apple's photo-scanning tool “not only positioned Apple as a global leader in user privacy but also promised to eradicate millions of child sexual abuse images and videos from iCloud. … Child sexual abuse is a difficult issue that no one wants to talk about, which is why it gets silenced and left behind. We are here to make sure that doesn’t happen.”

Apple maintains that, ultimately, even its own well-intentioned design could not be adequately safeguarded in practice, and that on-device nudity detections for features like Messages, FaceTime, AirDrop, and the Photo picker are safer alternatives. Apple has also begun offering an application programming interface (API) for its Communication Safety features so third-party developers can incorporate them into their apps. Apple says that the communication platform Discord is integrating the features and that appmakers broadly have been enthusiastic about adopting them.
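The article doesn't name the interface, but the developer-facing piece of this is Apple's SensitiveContentAnalysis framework introduced with iOS 17. As a rough, hedged sketch of how a third-party app might use it to check an incoming image before display, assuming iOS 17 or later, the framework's client entitlement, and a user who has turned on Sensitive Content Warning:

```swift
import Foundation
import SensitiveContentAnalysis

/// Returns true if the image at `url` should be blurred before it is shown.
/// Assumes iOS 17+, the SensitiveContentAnalysis client entitlement, and a
/// user who has enabled Sensitive Content Warning in Settings.
func shouldBlur(imageAt url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // If the user hasn't opted in, the policy is .disabled and no analysis runs.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // The nudity check runs entirely on-device; nothing is sent to Apple.
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive
    } catch {
        // How to handle failure is an app-level choice; this sketch does not blur.
        return false
    }
}
```

The analysis runs on the device and returns only a yes/no result to the app, which is consistent with the on-device, non-reporting approach Neuenschwander describes.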

“We decided to not proceed with the proposal for a hybrid client-server approach to CSAM detection for iCloud Photos from a few years ago,” Neuenschwander wrote to Heat Initiative. “We concluded it was not practically possible to implement without ultimately imperiling the security and privacy of our users.”





Tuesday, September 21, 2021

The Dark Side of Apple’s New Child Safety Features

 


by Siddharth Parmar

In early August, Apple announced some new child safety features, slated to arrive in upcoming updates to iOS, macOS and iPadOS. The first change is that the Messages app warns minors (and their parents) about sexually explicit images and gives parents the option to be alerted if a child views or sends such an image. The second involves Siri and Search being tweaked to intervene when someone makes queries related to Child Sexual Abuse Material (CSAM). The last, and most major, introduces automatic on-device matching of photos stored in iCloud Photos against a database of known CSAM content. If the system discovers enough flagged pictures, a report will be sent to a moderator for evaluation. If the moderator confirms the assessment, Apple will decrypt the photos and share them with the relevant authorities.
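Apple's actual design combined a perceptual hash (NeuralHash) with cryptographic machinery (private set intersection and threshold secret sharing) that is not reproduced here. But the basic control flow described above, comparing each photo against a list of known hashes and escalating only after enough matches accumulate, can be sketched in a few lines. Every name below is illustrative rather than Apple's code; the threshold of roughly 30 matches is the figure Apple cited publicly.

```swift
import Foundation

/// Illustrative only. Apple's real proposal used NeuralHash plus cryptographic
/// protocols (private set intersection, threshold secret sharing) so that
/// neither device nor server learns anything below the threshold; this sketch
/// reproduces only the match-then-threshold control flow described above.
struct UploadScanner {
    let knownHashes: Set<UInt64>   // fingerprints of known CSAM (e.g., NCMEC-derived)
    let reportThreshold = 30       // roughly the figure Apple cited publicly

    private(set) var matchCount = 0

    /// `fingerprint` stands in for a NeuralHash-style perceptual hash of one photo.
    /// Returns true only once enough matches have accumulated to warrant review.
    mutating func process(fingerprint: UInt64) -> Bool {
        if knownHashes.contains(fingerprint) {
            matchCount += 1
        }
        // Below the threshold, nothing would be surfaced for human review.
        return matchCount >= reportThreshold
    }
}
```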

 These new features were announced with the stated intent of protecting children from sexual predators, and they do sound like measures with great intentions behind them. But the changes have been met with considerable backlash and feelings of betrayal.

Of the three features, the changes to Siri and Search have been generally uncontroversial. The others, however, have seen massive opposition, ranging from discontent about the tweak to the Messages app to outrage about the CSAM scanning. So much so that Apple was forced to delay (but not stop) the implementation of these features.  

It may still be unclear why there is opposition at all, or why you should be worried.

Even if well-intended, these new features are a massive invasion of privacy and have the potential to inflict serious damage. Coming from Apple, a company that prides itself on taking customer privacy seriously (a theme that extends even into its advertising), this is a huge disappointment.

The largest change is the monitoring of people's Photos app. Some might be tempted to think the detection process itself is novel and problematic. That concern is a natural spillover from tech's well-documented issues with image scanning and detection. For example, studies show that facial recognition software from IBM, Amazon and Microsoft has underperformed for people of color and women. Recognition software is only as good as its training dataset: train it on a homogeneous dataset, and it will struggle with diversity when deployed.

These are valid concerns, but they are not the whole picture. Though not widely known, it is standard practice for major cloud service providers to scan for CSAM hosted on their platforms. This is done with the help of a database of known CSAM content maintained by the National Center for Missing and Exploited Children, along with a few checks to prevent false reports.
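It is worth noting that this matching is not a byte-for-byte comparison. Tools like Microsoft's PhotoDNA compute a perceptual hash, so resized or lightly edited copies of a known image still match, and part of those "checks to prevent false reports" is requiring a candidate's hash to be extremely close to a known hash before anything reaches a human reviewer. A hedged sketch of that nearness test, using a made-up 64-bit fingerprint rather than any real hashing algorithm:

```swift
import Foundation

/// Compares two 64-bit perceptual fingerprints by Hamming distance.
/// The fingerprint function itself is assumed to exist elsewhere; real systems
/// use algorithms such as PhotoDNA, which this sketch does not reproduce.
func isLikelyMatch(_ a: UInt64, _ b: UInt64, maxDistance: Int = 4) -> Bool {
    // XOR leaves a 1 bit wherever the fingerprints disagree;
    // nonzeroBitCount tallies those disagreements (the Hamming distance).
    (a ^ b).nonzeroBitCount <= maxDistance
}

/// A candidate photo is flagged only if its fingerprint is very close to a known
/// hash, and even then it goes to human review rather than straight to a report.
func matchesKnownDatabase(_ candidate: UInt64, knownHashes: [UInt64]) -> Bool {
    knownHashes.contains { isLikelyMatch(candidate, $0) }
}
```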

If this is an established procedure with checks in place to avoid false flagging, why is there backlash at all? The answer lies in where Apple is conducting these scans.  

Traditionally, all of this happens on a company's servers, which cannot see the contents of end-to-end encrypted data. Apple's proposed scanning would instead occur on-device, with an option to decrypt photos if need be. That is dangerous, because end-to-end encryption does not hide information from the device itself, so on-device scanning could become a backdoor for cyberattacks.

Speaking of cyberattacks, Apple only recently came out with a security patch to protect iPhones from spyware attacks, which could turn on the camera and microphone on-demand and read messages and other local data, all without any visible sign. Apple, like any other tech giant, is only a few steps ahead of attackers at any given time (and perhaps a few steps behind as well in some cases). 

This leads to other issues. Countries could put pressure on Apple to report photos it finds objectionable, such as photos of protests or dissenters. Will Apple always be able to say no?

Even if Apple does manage to resist these demands, many companies sell software exploits that give access to devices to governments. These are all scary scenarios. So while the cause for Apple’s new software updates may be noble, the risks are too high to be considered safe.  







Monday, July 5, 2021

What to do when you find CSAM or evidence of Child Sex Trafficking Online

 


In the year 2000, just about half of all American adults were online. Today, nine-in-ten adults use the internet in the United States, according to Pew Research Center.


And in 2020, Americans, along with the rest of the world, are spending even more time online. Globally, people have spent 45% more time on social media since March 2020, with a 17% increase in the U.S., according to Statista.
Unfortunately, as our time spent online has increased, so has the chance that we may come across abusive content on the platforms where we should all feel safe.
Because we are not a direct-service organization, Thorn is not able to field reports of child sexual abuse material (CSAM) or child sex trafficking. But since we build software and tools aimed at detecting, removing, and reporting abuse content, we can help to point you in the right direction should you ever inadvertently come across harmful content.
Reporting this content through the right channels as a community helps to keep platforms safe, and could lead to the identification of a victim or help to end the cycle of abuse for survivors.
Here’s what to do if you find CSAM or evidence of child sex trafficking online.

Child sexual abuse material and child sex trafficking

First we need to talk about what child sexual abuse material (CSAM) is, and how it’s different from child sex trafficking.
Child sexual abuse material (legally known as child pornography) refers to any content that depicts sexually explicit activities involving a child. Visual depictions include photographs, videos, and digital or computer-generated images indistinguishable from an actual minor. To learn more about CSAM and why it’s a pressing issue, click here.
As defined by the Department of Justice, child sex trafficking “refers to the recruitment, harboring, transportation, provision, obtaining, patronizing, or soliciting of a minor for the purpose of a commercial sex act.” To learn more, click here.
Now let’s look at how to report this type of content should you ever come across it online:

1. Never share content, even in an attempt to make a report

It can be shocking and overwhelming if you see content that appears to be CSAM or related to child sex trafficking, and your protective instincts might be kicked into high gear. Please know that you are doing the right thing by wanting to report this content, but it’s critical that you do so through the right channels.
Never share abuse content, even in an attempt to report it. Social media can be a powerful tool to create change in the right context, but keep in mind that every instance of CSAM, no matter where it’s found or what type of content it is, is a documentation of abuse committed against a child. When that content is shared, even with good intentions, it spreads that abuse and creates a cycle of trauma for victims and survivors that is more difficult to stop with every share.
It’s also against federal law to share or possess CSAM of any kind, which is legally defined as “any visual depiction of sexually explicit conduct involving a minor (someone under 18 years of age).” State age of consent laws do not apply under this law, meaning federally a minor is defined as anyone under the age of 18.
The same goes if you think you’ve found illegal ads promoting the commercial sexual exploitation of children (CSEC), such as child sex trafficking. Sharing this content publicly may unwittingly extend the cycle of abuse. Instead, be sure to report content via the proper channels as outlined below.

2. Report it to the platform where you found it

The most popular platforms usually have guides for reporting content, and a quick search will turn up the one you need. But let’s take a moment to look at reporting content to Facebook and Twitter.
Reporting content to Facebook
Whether you want to report a page, post, or profile to Facebook, look for the three dots to the right of the content, click on them, and then click on Find support or report Page.

From there you will be guided through the process and will get a confirmation that your report has been received. Be sure to select Involves a Child when making your report.
Reporting content on Twitter:
While you can report an individual tweet for violating Twitter’s policies in much the same way as Facebook content, if you are reporting child exploitation content on Twitter, there’s a separate process that ensures reports of CSAM or other exploitative content are given priority.
First, click here to see what content violates Twitter’s child exploitation policies. Then fill out this form with the appropriate information, including the username of the profile that posted the content, and a link to the content in question.
To find the direct link to a tweet, click the share button at the bottom of the tweet and select Copy link to Tweet.

For any other platforms, you should always be able to easily find a way to report abuse content with a quick online search. For example, search for: Report abusive content [platform name].
The National Center for Missing and Exploited Children also offers overviews for reporting abusive content for multiple platforms here.

3. Report it to CyberTipline

The National Center for Missing and Exploited Children (NCMEC) is the clearinghouse for all reports of online child sexual exploitation in the United States. That means they are the only organization in the U.S. that can legally field reports of online child sexual exploitation. If NCMEC determines it to be a valid report of CSAM or CSEC, they will connect with the appropriate agencies for investigation.
Fill out the CyberTipline report by clicking here.
This is a critical step in addressing the sexual exploitation of children online. Be sure to fill out as much detail as you’re able.

4. Report CSEC to the National Human Trafficking Hotline

If you find evidence of child sex trafficking, call the National Human Trafficking Hotline at 1-888-373-7888.
Managed by Polaris, the hotline offers 24/7 support, as well as a live chat and email option. You can also text BEFREE (233733) to discreetly connect with resources and services.

5. Get your content removed and connect with resources

If you have been the victim of explicit content being shared without consent, the Cyber Civil Rights Initiative put together a guide for requesting content to be removed from most popular platforms.
If you have been the victim of sextortion—a perpetrator using suggestive or explicit images as leverage to coerce you into producing abuse content—take a look at our Stop Sextortion site for more information and tips on what to do.
NCMEC has put together a robust list of resources for survivors of sexual abuse material.

6. Practice wellness

Close the computer. Take a deep breath. Go for a walk.
This is an extremely difficult issue, and if you’ve just gone through the steps above, it means you’ve recently encountered traumatic material.
But you’ve also just taken a first step in what could ultimately be the rescue of a child or the cessation of a cycle of abuse for survivors.
Practice the things that create balance and support in your life, and if you need to, connect with additional resources. Text the Crisis Text Line to connect discreetly with trained counselors 24/7.
Or maybe you’re left with the feeling that there’s more work to be done. Learn more about local organizations working in this space and see if they offer volunteer opportunities. Fundraising for your favorite organizations can also make a huge difference.
If you’re here, whether you’re making a report or just equipping yourself with knowledge should you ever need it, you’re joining a collective movement to create a better world for kids. Know that you are part of a united force for good, one that won’t stop until every child can simply be a kid.
