In a recent survey of 1,000 UK children aged 9-17 and 2,000 parents, a troubling 7 in 10 children reported experiencing harm online, including contact from strangers, hate speech, and misinformation. Alarmingly, only 36% of those affected reported the incidents, revealing significant barriers to reporting. For example, just 23% reported misinformation, often because the reporting process seemed too complex. Nearly half of children (46%) felt that current systems are unclear, and many expressed concern about anonymity, particularly when reporting someone they know. Among those who did report, however, 83% found the process straightforward. Nevertheless, 40% of children believe their reports go unanswered, which contributes to a culture of silence around online harm.

Parents share this frustration. They want better clarity on outcomes and an independent body to handle escalations, rather than the platform itself, similar to Australia's eSafety Commissioner, which can remove harmful content within 24 hours.

Encouragingly, the Protection of Children and Illegal Harms Codes are set to come into force this summer and will address some of the needs and concerns raised by parents and children in the research. In the meantime, parents can continue to foster a safe online environment for their children by researching the reporting tools for the apps and games their child uses and encouraging an open dialogue.