Instagram is expanding its Teen Accounts in Malaysia with stricter age-based content controls, introducing a 13+ content standard inspired by movie ratings as well as a new “Limited Content” setting for parents seeking tighter supervision.

The updated framework, which will roll out gradually over the coming months, will automatically place users under 18 into a default 13+ experience, Instagram's parent company Meta said in a statement on Wednesday.

Under this setting, teens will not be able to opt out without parental permission, and the content they see will be designed to resemble material suitable for viewers aged 13 and above.

Meta said the approach is modelled on familiar film classification standards to help parents better understand the type of content their children are exposed to online.

While acknowledging that social media differs from cinema, the company said teens may “occasionally” encounter mild suggestive content or language, but that such instances will be minimized through tighter enforcement.

“At Meta, keeping teens safe online is our top priority,” said Clara Koh, Director of Public Policy for Central Southeast Asia and ASEAN.

She said the updated Teen Accounts are designed to provide “age-appropriate content by default,” while giving parents more tools to shape their children’s experience.

A key addition is the new “Limited Content” mode, which allows parents to impose stricter controls beyond the default 13+ setting.

This option further reduces the type of content teens can view and removes the ability to see, leave or receive comments on posts.

Meta said it has also strengthened its content policies for teen users, aligning them more closely with 13+ movie rating criteria and parent feedback.

The updated guidelines expand restrictions beyond existing rules that already block sexually suggestive content, graphic imagery and adult themes such as tobacco or alcohol promotion.

Under the revised system, Instagram will also limit or remove recommendations involving strong language, risky stunts and other potentially harmful behavior, including posts featuring marijuana-related paraphernalia.

The company said these changes are intended to reduce exposure to borderline or potentially unsafe content across its platforms.

The protections extend across Instagram’s ecosystem, including Search, Explore, Reels, Feed, Stories and direct messaging.

Teens will be blocked from searching for or viewing a broader range of mature terms, including words such as “alcohol” or “gore”, with safeguards designed to also catch misspellings.

Instagram said its improved detection systems will also prevent teens from following or interacting with accounts that frequently post age-inappropriate content.

Such accounts will be restricted from engaging with teen users, including sending messages or commenting, while their content will no longer be recommended or easily discoverable.

The platform has also updated its artificial intelligence features for teens, ensuring responses remain consistent with the 13+ content framework and avoid inappropriate outputs.

Meta said the changes reflect ongoing efforts to respond to parental feedback and evolving concerns over teen safety online.
