What Meta's AI Age Detection System Actually Does
Here's something that might stop you mid-scroll: Meta has started scanning photos and videos on Facebook and Instagram using AI — not to recognize your face, but to read your body. Specifically, physical cues like height and bone structure. The goal is to figure out whether someone using the platform is under 13 and shouldn't be there in the first place.
Meta was pretty direct about what this system is and isn't. The company clarified it's not using facial recognition — instead, the AI looks at general visual themes and cues, such as height or bone structure, to estimate a user's approximate age without identifying the specific individual in the image. That's a meaningful distinction, even if it still raises plenty of questions about how personal physical data is being processed at scale.
The visual scanning doesn't work in isolation, either. Meta says it combines these visual insights with an analysis of text and interactions, which the company believes will significantly increase the number of underage accounts it can identify and remove. So it's not just looking at a photo and making a call — it's piecing together signals from multiple directions.
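Meta hasn't published how these signals are actually weighed against each other, so as a purely illustrative sketch, multi-signal fusion of this kind often looks something like a weighted score with a high flagging threshold (the signal names, weights, and threshold below are all assumptions, not Meta's):

```python
# Toy sketch only -- NOT Meta's actual system, which is unpublished.
# Each input is a per-signal probability in [0.0, 1.0] that the
# account belongs to someone under 13.

def combine_signals(visual_score: float,
                    text_score: float,
                    interaction_score: float) -> float:
    """Weighted average of the three signal scores (weights assumed)."""
    return (0.4 * visual_score
            + 0.4 * text_score
            + 0.2 * interaction_score)

def should_flag(combined: float, threshold: float = 0.8) -> bool:
    """Flag only when combined evidence clears a high bar,
    limiting false positives from any single weak signal."""
    return combined >= threshold

# Strong visual and text evidence, weaker interaction signal:
score = combine_signals(0.9, 0.9, 0.5)
print(should_flag(score))  # True
```

The point of combining signals this way is that no single cue, such as a photo alone, has to carry the decision.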
How Meta Builds a Full Picture of a User's Age
The visual component is one piece of a much larger effort. Meta's broader system analyzes entire profiles for contextual clues — things like birthday celebrations or mentions of school grades — scanning across posts, comments, bios, captions, and more. Think about it: if someone's friends are commenting "happy 12th birthday!!" on their wall, that's a pretty clear signal. The AI is trained to catch exactly that kind of thing.
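To make the "happy 12th birthday" example concrete, here's a deliberately simplified sketch of scanning comments for age-revealing phrases. A production system would use trained language models across many phrasings and languages, not one regular expression; everything below is illustrative:

```python
import re
from typing import Optional

# Toy sketch: extract a stated age from birthday comments.
# Real systems would be far more robust than a single regex.
BIRTHDAY_PATTERN = re.compile(
    r"happy\s+(\d{1,2})(?:st|nd|rd|th)?\s+(?:birthday|bday)",
    re.IGNORECASE,
)

def age_from_comment(comment: str) -> Optional[int]:
    """Return the age mentioned in a birthday comment, or None."""
    match = BIRTHDAY_PATTERN.search(comment)
    return int(match.group(1)) if match else None

def flags_underage(comments: list[str], cutoff: int = 13) -> bool:
    """True if any comment implies the user is under the cutoff."""
    ages = (age_from_comment(c) for c in comments)
    return any(age is not None and age < cutoff for age in ages)

comments = ["happy 12th birthday!!", "great post"]
print(flags_underage(comments))  # True
```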
And Meta isn't stopping there. The company plans to expand this technology to more parts of its apps, including Instagram Live and Facebook Groups, in the future. That suggests a much wider net is coming — one that will eventually touch almost every corner of how people interact on these platforms.
What Happens When the AI Flags an Account
If the system decides someone looks underage, it doesn't just add a warning label and move on. Meta will deactivate the account, and the user will then need to go through the company's age verification process to prevent the account from being permanently deleted. That's a real consequence — your account goes dark until you prove you're old enough to be there.
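The enforcement flow described above can be pictured as a small state machine: flagged accounts go dark, and age verification is the only way back. The state names and transitions below are assumptions for illustration; Meta hasn't published its internal workflow:

```python
from enum import Enum, auto

# Toy sketch of the described enforcement flow (states assumed,
# not Meta's actual implementation).
class AccountState(Enum):
    ACTIVE = auto()
    DEACTIVATED = auto()   # flagged as likely under 13
    RESTORED = auto()      # passed age verification
    DELETED = auto()       # never verified

def on_flagged(state: AccountState) -> AccountState:
    """An AI flag takes an active account dark immediately."""
    return AccountState.DEACTIVATED if state is AccountState.ACTIVE else state

def on_verification(state: AccountState, passed: bool) -> AccountState:
    """Verification is the only path out of deactivation."""
    if state is not AccountState.DEACTIVATED:
        return state
    return AccountState.RESTORED if passed else AccountState.DELETED

state = on_flagged(AccountState.ACTIVE)      # DEACTIVATED
state = on_verification(state, passed=True)  # RESTORED
print(state.name)  # RESTORED
```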
The visual analysis system is currently operating in select countries, with Meta working toward a broader rollout. So if you're outside those initial markets, this may be coming your way sooner than you think.
Teen Accounts Are Expanding Too
Alongside the age detection announcement, Meta made another move that's worth paying attention to. The company is expanding the technology that automatically places teens into stricter "Teen Accounts" on Instagram to 27 countries in the EU, as well as Brazil.
These aren't just slightly tweaked profiles. Teen Accounts impose a stricter experience: direct messages arrive only from people the teen follows or is already connected to, harmful comments are hidden, and accounts are set to private by default. Basically, the defaults are locked down in ways that adult accounts are not.
Meta also said it's expanding the Teen Accounts technology to Facebook in the U.S. for the first time, followed by the U.K. and EU in June. So this isn't just an Instagram story anymore — it's becoming a platform-wide shift.
The Legal Pressure Behind These Moves
It's hard to look at all of this without acknowledging what's been happening in the courts. Weeks before these announcements, a New Mexico jury ordered Meta to pay $375 million in civil penalties for misleading consumers about platform safety and putting children at risk — and the company was also ordered to implement fundamental changes to its platforms.
That's a staggering number. And the fallout didn't stop there. Meta has since threatened to shut down its social media services in the state of New Mexico. Whether that's a genuine threat or a negotiating posture, it signals how seriously the company is taking this legal pressure.
That New Mexico case is one of many lawsuits that Meta and other large tech companies are currently facing over child safety. The age detection and Teen Account expansions, then, aren't just product decisions — they're happening inside a legal environment that's become increasingly hostile to platforms that can't demonstrate they're doing enough to protect minors.