Facebook, the world’s largest social network (which requires users to be at least 13 years old), has created a messaging app for children ages 6 to 12.
Messenger Kids has parental controls and policies in place to ban inappropriate content and cyberbullying, but that doesn’t make the service exempt from Facebook’s pattern of moderation failures. And if a child is harassed or exposed to banned content in Messenger Kids, the burden falls in part on Facebook’s human moderators to act. Facebook says it will have “specialized reviewers” for Messenger Kids who will look for signs of potential abuse and take appropriate action, whether that means removing specific messages or entire accounts for repeated abuse.
Parents have to be Facebook friends with the parents of any kid that their kid wants to talk to. Once a parent adds someone to their child’s contact list through the main Facebook app, kids can video chat as well as send photos, videos, and texts, or pick something from “a library of kid-appropriate and specially chosen GIFs, frames, stickers, masks, and drawing tools,” according to Facebook’s announcement post.
Facebook claims the new app gives parents control over how long their kids can use it. Messenger Kids includes a tool that lets kids report when someone is being “mean,” and, according to Facebook’s product manager, both humans and machines at Facebook will be moderating the space for inappropriate content. When (or if, given Facebook’s shoddy moderation history) such content is detected, it will be scrubbed from the app.
Should you allow your child to use this app?
I think there are two important issues for parents to consider before giving their very young child access to social media:
1. Screen time. Do you want to give your child more opportunities to spend time on a phone?
In a peer-reviewed study that appeared in the journal Clinical Psychological Science, Jean Twenge showed that, after 2010, teens who spent more time on new media were more likely to report mental health issues than those who spent time on non-screen activities. Using data collected between 2010 and 2015 from more than 500,000 adolescents nationwide, Twenge’s study found kids who spent three hours or more a day on smartphones or other electronic devices were 34% more likely to suffer at least one suicide-related outcome—including feeling hopeless or seriously considering suicide—than kids who used devices two hours a day or less. Among kids who used electronic devices five or more hours a day, 48% had at least one suicide-related outcome.
2. Facebook is promising to protect your kids…what could possibly go wrong?
A Facebook spokesperson said in an email to Gizmodo (an online tech blog), “We’ve built automated systems that can detect things like nudity, violence, and child exploitative imagery to help limit that content from being shared on Messenger Kids. We also have blocking and reporting mechanisms, and have a dedicated team of human reviewers that review all content that is reported.” Facebook’s systems are reactive: they won’t stop your child from seeing these traumatizing images; they will clean them up after the fact. Unfortunately, by then the damage is already done. And there are still privacy concerns to consider. Facebook is not immune to security breaches, and the social network has a history of experimenting on its users. Parents would be misguided to believe that they have the ultimate authority over their child’s experience. In reality, Facebook does.
Cyber Safety Cop’s Recommendation on Messenger Kids
I do not recommend parents give their very young child (6 to 12 years old) access to social media, even if it is moderated for children. I encourage parents to look for non-screen activities for their children, especially family-oriented activities. This is an uphill battle: the average teen consumes 6 to 9 hours of screen time per day.
Lastly, why does Facebook want your child to be on social media? Ultimately, it is doing this to create a new generation of users who are addicted to its product. Sean Parker, the co-founder of Facebook, admitted at a recent symposium that Facebook was created to make you come back for more. Parker said, “It’s a social-validation feedback loop … exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology … God only knows what it’s doing to our children’s brains.”
Social media and technology are a moving target. Parents must know what is happening in their child’s digital world. Subscribe to Cyber Safety Cop’s free newsletter, which will let you know when there is a new app or online issue parents need to know about.
Get the companion guide to every device your child is using to connect to the internet and other people online - the step-by-step guide to making your family safe online, “Parenting in the Digital World.”