In March, Facebook confirmed a rumor that it was developing a version of Instagram for kids under 13 that would include parental monitoring features. Through its own research, Facebook is well aware that Instagram has a toxic effect on tween mental health, making body image issues worse for one in three teen girls. Earlier this week, under pressure from critics and lawmakers, the company announced it was pausing the project “to work with parents, experts, policymakers and regulators, to listen to their concerns, and to demonstrate the value and importance of this project for younger teens online today.” Although this decision prioritizes the mental health of tweens, Facebook’s responsibility for the wellbeing of its users remains hazy.

In its own defense, Facebook says, “The reality is that they're already online, and with no foolproof way to stop people from misrepresenting their age, we want to build experiences designed specifically for them, managed by parents and guardians.” Does Facebook have a responsibility to make sure its products aren’t harming its users? Would a version of Instagram for kids under 13, paired with parental controls for kids 13 and older, be a good thing?

No matter how Facebook proceeds with product development and how lawmakers guide that process, it’s up to parents to teach kids how to be safe on social media.

Tips to keep kids safe on social media

1. Keep an open conversation going about your expectations for their social media use and presence.

Make it a habit to have ongoing, open conversations with your kids about both online and offline safety. If you post unhealthy content, they will, too. You can’t expect kids to limit their time and attention on highly addictive social media channels if you aren’t doing the same. Show them how to create a strong password and security answers and how to choose a private or limited account, and remind them never to share their physical address with anyone. Talk about the consequences people have faced after making poor decisions online or getting scammed on social media, and ask your child what they would do in the same situation. Show them trust, but don’t let them forget that you care about the decisions they make.