Meta, the parent company of Instagram, has recently introduced enhanced privacy features and parental controls for Instagram accounts belonging to users under 18 years old. These changes aim to address growing concerns about the potential negative impacts of social media on teenagers, especially their mental well-being.
Teen Accounts and Enhanced Privacy
Instagram accounts linked to teenagers will automatically transition to "Teen Accounts," which default to private profiles. This means teens can receive messages and tags only from accounts they follow or are already connected with. The platform has also tightened settings for sensitive content, limiting what young users can view and interact with.
Furthermore, users under 16 years old will require parental permission to adjust these default privacy settings. This measure aims to ensure that young users are shielded from inappropriate content and interactions.
Parental Control and Monitoring Features
Parents will have access to a range of controls that allow them to monitor their children's interactions on Instagram. These features include the ability to restrict app usage, track activity, and manage who their children can communicate with. This increased parental involvement is designed to foster a safer and more controlled online environment for teenagers.
Meta's decision to roll out these privacy features and parental controls follows years of concern about social media's effects on teenagers. Studies have suggested a link between excessive social media use and higher rates of depression, anxiety, and learning difficulties, particularly among young people.
Social Media and Mental Health
The addictive nature of social media platforms like Instagram, TikTok, and YouTube has been a subject of scrutiny and legal action. Numerous lawsuits have been filed against these companies on behalf of children and school districts, alleging that these platforms are designed to be addictive and contribute to mental health issues.
Last year, 33 US states, including California and New York, sued Meta, accusing the company of misleading the public about the harm its platforms can cause young users. These lawsuits highlight the increasing concern about the impact of social media on young people and the need for more robust safety measures.
Meta's Response to Growing Concerns
Meta's move to enhance privacy and introduce parental controls comes roughly three years after it paused development of a separate Instagram app aimed at younger users. That pause came in response to pressure from lawmakers and advocacy groups who raised concerns about the safety and well-being of young users.
In July, the US Senate passed two online safety bills, the Kids Online Safety Act and the Children and Teens' Online Privacy Protection Act. These bills aim to hold social media companies accountable for the impact of their platforms on children and teenagers.
Meta's recent changes reflect a growing trend among social media companies to address concerns about the safety and well-being of young users. These changes, along with the legislation moving through Congress, indicate a shift in focus towards protecting children and teenagers from potential harms associated with social media.