Social media platforms have been ramping up their efforts to create safe communities online, from Twitter Spaces, designed by Maya Gold Patterson, to Tinder’s panic button, introduced in January 2020. Instagram is now adding a few new safety features of its own, including one that will prevent adults from direct messaging teenagers who don’t follow them.
“Around the world it’s widely understood that most social media platforms require a 13-year minimum age requirement, but the complexity of age verification remains a long-standing, industry-wide challenge. That’s why it’s positive to see Instagram investing in innovative technologies that can and will create a safer online environment for younger users,” said Lucy Thomas, co-founder and co-CEO of Project Rockit.
Thomas continued: “By using machine learning to flag potentially inappropriate interactions, improving teen privacy features and DM-ing younger users with realtime safety info, Instagram is equipping young people with tools to be the architects of their own online experience.”
The restriction will work as follows: if an adult tries to DM a user under 18 who isn’t following them, the adult will receive a notification that messaging that account is not an option.
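To make the rule concrete, here is a minimal, hypothetical sketch of that logic in Python. The class, function names, thresholds, and notification text are illustrative assumptions for this article, not Instagram’s actual implementation.

```python
# Hypothetical sketch: an adult may only DM an under-18 user if that user
# already follows them. All names and data structures here are assumptions.
from dataclasses import dataclass, field


@dataclass
class User:
    user_id: str
    age: int                                       # declared at sign-up or estimated
    following: set = field(default_factory=set)    # user_ids this account follows


def can_send_dm(sender: User, recipient: User) -> tuple[bool, str]:
    """Return whether the DM is allowed, plus a notification message if not."""
    is_adult_sender = sender.age >= 18
    is_teen_recipient = recipient.age < 18
    recipient_follows_sender = sender.user_id in recipient.following

    if is_adult_sender and is_teen_recipient and not recipient_follows_sender:
        return False, "You can't message this account."
    return True, ""


# Example: an adult messaging a teen who doesn't follow them is blocked.
adult = User(user_id="adult_42", age=35)
teen = User(user_id="teen_07", age=15)
allowed, notice = can_send_dm(adult, teen)
print(allowed, notice)  # False "You can't message this account."
```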
Instagram’s statement read: “This feature relies on our work to predict people’s ages using machine learning technology, and the age people give us when they sign up. As we move to end-to-end encryption, we’re investing in features that protect privacy and keep people safe without accessing the content of DMs.”
Currently, Instagram requires users to be a minimum of 13 years old and to verify their age upon signing up for a profile. While Instagram is aware that users can lie about their age when creating an account, the company has decided to develop “new artificial intelligence and machine learning technology to help us keep teens safer and apply new age-appropriate features,” such as the restrictions mentioned above.
According to the official Instagram blog post, the platform will continue its ongoing efforts to keep its youngest users safe. Moreover, Instagram has collaborated with The Child Mind Institute and ConnectSafely to publish a new Parents Guide, which includes safety tools and guidelines, as well as talking points to help parents discuss the dos and don’ts of social media with their teens, from what they post to how they shape their online presence.
Instagram plans to expand these safety features further, potentially restricting adults from seeing teens in their ‘Suggested Users’ tab and preventing adults from seeing Reels content made by teens.