Meta launches new teen safety features, removes 635,000 accounts that sexualize children
Briefly

Meta introduced new safety features to protect teens, including notifications with information about messaging accounts and an easier way to block and report inappropriate users. The company also removed 635,000 accounts that made sexualized comments about minors or targeted their accounts. As scrutiny of social media's impact on youth mental health grows, Meta reported that teens blocked and reported over a million accounts after receiving an in-app safety reminder. The company is also testing AI tools to verify users' ages on Instagram, with strict enforcement against accounts that misrepresent their age. Legal challenges over the design of its platforms are ongoing.
Meta's new safety measures include information about messaging accounts and a one-tap option to block and report; teens blocked over a million accounts after receiving the safety reminder.
Of the 635,000 accounts removed for inappropriate behavior, 135,000 were actively leaving sexualized comments on kids' accounts. Social media companies face mounting scrutiny over their influence on youth mental health.
Read at ABC7 Los Angeles