
A parent-managed Instagram account for a 12-year-old public figure is drawing attention to how social platforms enforce their minimum age rules.
North West, 12, posted her first image on Instagram on Dec. 20, appearing on a newly created account labeled as managed by her parents.
The post showed North posing in a dimly lit room decorated for the holidays; a second image followed on Dec. 21, continuing the account’s initial rollout.
The debut matters beyond celebrity interest because Instagram’s published rules require users to be at least 13 years old.

North West appears in a low-light, motion-blurred photo shared on her newly launched Instagram account in December.
Meta, Instagram’s parent company, says it removes accounts it believes belong to younger users, even when those users are widely known.
As lawmakers and regulators increase scrutiny of how platforms protect children online, the appearance of a supervised account linked to a high-profile family highlights ongoing questions about age verification, parental controls, and consistent enforcement.
North’s first post referenced a pose long associated with her mother, Kim Kardashian, whose early Instagram presence helped shape influencer culture in the 2010s.
Within a day, the account drew hundreds of thousands of followers, including members of the Kardashian-Jenner family.
The account biography states that it is managed by North’s parents. Meta’s Family Center allows guardians to supervise teen accounts, but the company’s terms of service still set 13 as the minimum age to hold an Instagram account.
Meta reiterated in its 2024 transparency updates that accounts suspected of belonging to underage users may be removed.
North has previously appeared on a shared TikTok account with her mother, where TikTok’s family pairing tools permit adult oversight.
Instagram’s minimum age requirement of 13 is tied to the U.S. Children’s Online Privacy Protection Act (COPPA), which limits how online services collect, use, and store personal data from children under that age.
The law applies to U.S.-based users and services operating in the U.S., regardless of a user’s public profile or visibility.
Meta has stated in U.S. transparency reports that it removes millions of suspected underage accounts each year, using a combination of automated detection tools and user reports. The company says these enforcement efforts are ongoing and apply across all account types.
Within the United States, scrutiny of youth safety on social platforms has increased.
Meta executives testified before Congress multiple times in 2023 and 2024, where lawmakers questioned age verification practices, parental supervision tools, and whether existing safeguards are being applied consistently.
Against that backdrop, neither Kim Kardashian nor North’s father, Kanye West, has issued a formal public statement about the Instagram account.
Online discussion in the U.S. has focused primarily on age eligibility, parental oversight, and enforcement of platform rules rather than on the content of the posts themselves.
For families, the situation illustrates how parental supervision tools are increasingly used to manage children’s online visibility.
Instagram recommends private accounts, content controls, and time limits for teens, but it does not currently offer full accounts for users under 13.
Comparable cases have involved celebrity children appearing on shared or restricted profiles rather than independent accounts.
Consumer advocates note that visibility does not change how platform rules apply, even when an account is closely monitored by adults.
Can a user under 13 hold an Instagram account? No. Instagram’s terms of use require users to be at least 13 years old, and accounts believed to belong to younger users can be removed.
What does it mean that the account is parent-managed? It indicates adult oversight of posts and settings. Management does not exempt an account from Instagram’s minimum age requirement.
Has North West appeared on social media before? Yes. She appears on a shared TikTok account with her mother, which operates under TikTok’s family supervision features.
Do laws regulate children’s accounts? Yes. In the U.S., COPPA governs data collection from children under 13, while other countries apply separate online safety laws.
Has Meta taken action against the account? Meta has not announced any enforcement action related to it. Under Instagram’s published policies, action is taken when age violations are identified, but the company does not typically confirm or explain individual moderation decisions.
In company updates released in late 2024, Meta said it would continue expanding parental supervision features and age-detection systems, framing enforcement as an ongoing internal process rather than a public one.
The episode carries wider relevance beyond a single family. Children with high public visibility are increasingly appearing on social platforms earlier than most users, drawing attention to how age limits are applied and how parental oversight works in practice.
For parents and audiences in the U.S., the situation highlights the need to understand platform rules, supervision tools, and enforcement standards as youth social media use continues to grow.
