Instagram Implements Stricter Content Rules for Teens by Default

Instagram is taking a more aggressive stance on teen safety by implementing a stricter set of content rules that will be applied by default to all users under the age of 18. This new PG-13-inspired system marks a significant shift from optional controls to built-in protection.
The new “13+” setting will be automatically activated for all teen accounts. This ensures a more filtered experience is the standard, rather than something parents have to find and turn on. Parental consent will be required for a teen to remove these protections.
The stricter rules will expand the definition of sensitive content. Posts containing profanity, dangerous stunts, and themes that could encourage harmful behaviors will now be hidden from teen accounts or have their reach limited. Searches for sensitive keywords will also be blocked.
This move toward a “safety by default” model is a direct response to criticism that existing, optional tools were not sufficient to protect young users from harm. The change reflects a growing consensus that platforms must take a more proactive role in safeguarding minors.
While the new rules are being welcomed as a step in the right direction, safety advocates are emphasizing the need for ongoing vigilance. They are calling for transparency from Meta on how these rules are enforced and are demanding independent audits of their effectiveness.