Teenage users of Instagram will face stricter rules under a new “PG-13” rating system announced by Meta Platforms, an initiative aimed at limiting their exposure to adult content. Meta says it already prohibits or hides such content but is now tightening its approach. Under the new guidelines, Instagram will hide, or avoid recommending, posts featuring strong language, risky stunts, or anything that could encourage harmful behavior, including marijuana imagery. Meta says teens should see only content equivalent to what they would find in a PG-13 movie.
Additionally, parents will gain more control over their children’s viewing experience on the platform. This move comes amid a wave of lawsuits against Meta, asserting that the company has deliberately targeted minors. One suit, backed by 33 states, accused Meta of using advanced technology to captivate and entrap youths, with profit as its primary motive.
The announcement follows mounting concerns about safety and exploitation on social media. In one lawsuit, an Apple executive claimed that his 12-year-old child was solicited on Facebook. The New Mexico Attorney General labeled Meta’s platform a “breeding ground” for predators targeting children.
Meta’s announcement appears to be a step toward addressing these serious allegations, but skepticism remains among advocacy groups. Ailen Arreaza, executive director of ParentsTogether, conveyed doubts about the effectiveness of these measures, noting that previous promises have often failed to deliver. She stated, “Our children have paid the price for that gap between promise and protection.” Arreaza emphasized the need for transparent and independent testing, not just proclamations from the company.
Critics have also dismissed the new rating system as a gimmick. Charles Rivkin, chairman of the Motion Picture Association, said that while “we welcome efforts to protect kids from content that may not be appropriate for them,” claims linking Instagram’s system to the movie rating process are misleading.
Meta anticipates that these updated safety measures will be fully operational in the United States by the end of this year. As scrutiny over social media’s role in the lives of minors intensifies, it remains to be seen whether this new approach will genuinely enhance user safety or if it will merely serve as a public relations move without meaningful change.
"*" indicates required fields
