Tech giant Meta has introduced new updates on Facebook and Instagram that make it harder for potentially suspicious adults to interact with teens.
The move comes on the heels of Meta rolling out similar privacy defaults for teens on Instagram and aligns with its safety-by-design and ‘Best Interests of the Child’ framework.
“Last year, we shared some of the measures we take to protect teens from interacting with potentially suspicious adults. For example, we restrict adults from messaging teens they aren’t connected to or from seeing teens in their People You May Know recommendations,” Meta said.
“In addition to our existing measures, Meta is now testing ways to protect teens from messaging suspicious adults they aren’t connected to, and we won’t show them in teens’ People You May Know recommendations.
“A ‘suspicious’ account is one that belongs to an adult that may have recently been blocked or reported by a young person, for example. As an extra layer of protection, we’re also testing removing the message button on teens’ Instagram accounts when they’re viewed by suspicious adults altogether.”
“We’ve developed a number of tools so teens can let us know if something makes them feel uncomfortable while using our apps, and we’re introducing new notifications that encourage them to use these tools,” it said.
Starting now, everyone who is under the age of 16 (or under 18 in certain countries) will be defaulted into more private settings when they join Facebook.
Meta has encouraged teens already on the app to choose these more private settings for:
- Who can see their friends list
- Who can see the people, pages and lists they follow
- Who can see posts they’re tagged in on their profile
- Whether to review posts they’re tagged in before they appear on their profile
- Who is allowed to comment on their public posts
Meta also shared an update on the work it does to stop the spread of teens’ intimate images online, particularly when these images are used to exploit them — commonly known as “sextortion”.
“The non-consensual sharing of intimate images can be extremely traumatic and we want to do all we can to discourage teens from sharing these images on our apps in the first place.
“We’re working with the National Center for Missing and Exploited Children (NCMEC) to build a global platform for teens who are worried intimate images they created might be shared on public online platforms without their consent.”
Sharing intimate images
“This platform will be similar to work we have done to prevent the non-consensual sharing of intimate images for adults. It will allow us to help prevent a teen’s intimate images from being posted online and can be used by other companies across the tech industry.
“We’re also working with Thorn and their NoFiltr brand to create educational materials that reduce the shame and stigma surrounding intimate images, and empower teens to seek help and take back control if they’ve shared them or are experiencing sextortion,” Meta said.
Meta added that it is planning to launch a new campaign that encourages people to stop and think before resharing intimate images online and to report them to Meta instead.