When I was in fifth grade, I downloaded Instagram onto my phone without either of my parents knowing. I yearned to be like my friends, most of whom already had Instagram and Musical.ly accounts. I chose the username "rainbowfluffyunicorn," which I found clever and obviously not something a child would pick. Within the first few weeks of having my private Instagram account, I started to receive follow and message requests.
The follow requests weren’t only from people at school but from adult men and women alike. The message requests, always from men, asked if they could follow me and if I could send them photos of myself because I was so young, fresh and cute.
I was not the only one of my friends getting messages like these. As I grew older, the requests never stopped, and as my other friends started downloading Instagram, it started for them, too.
As someone who was just 11 when I started using Instagram, I didn’t understand the dangers that lurk on platforms like it, and that’s why social media companies need to start enforcing age limits on accounts to protect younger kids.
Social media is infested with people who prey on younger kids. From sexual predators messaging children to adults stalking them, many kids on social media platforms have experienced some type of harassment by predatory adults on the same apps. Even now, in the age of artificial intelligence, there is a new risk: AI-generated child porn can be made of young kids from just a few images of their faces online.
Not only does social media provide a place where kids can be exploited, it also exposes them to inappropriate content such as violence and explicit material. Unlike platforms made for children — such as YouTube Kids and PBS Kids — apps like Instagram and TikTok lack filtering mechanisms that shield younger users from such content. Instead, they give kids access to every type of media produced, good and bad. For adults, that access is important, allowing a new level of freedom of expression, but for young kids, it’s just harmful.
Additionally, recent studies have shown that social media can harm young kids’ mental health, shortening attention spans, disrupting sleep and eroding self-esteem. By setting unattainable standards and delivering a constant barrage of consumerist content, it wears down kids’ sense of self-worth.
In a 2022 study, the Pew Research Center found that most parents worry that social media is “leading to cyberbullying, anxiety or depression, exposure to explicit content, or lower self-esteem.”
Now yes, parents should be monitoring their kids so that they aren’t misusing social media or accessing content beyond their years, but there also need to be safety measures from the social media companies themselves to protect younger, more vulnerable users.
Social media companies should build more mechanisms to keep children safe on their apps, but there are simpler places to start. One way to enforce age limits is to follow Uber’s example and require users to submit a photo of their ID or passport, proving they meet the age requirement. By doing this, social media companies can properly age-gate their platforms and prevent young kids from joining without direct parental aid and supervision.
If Instagram had required age verification to create an account, 11-year-old Chloë would never have had to endure the grown, male aggression that came with having one.