Hello Cath, thank you for making time to speak with me about your latest book, Tech-Smart Parenting: how to keep your kids happy and safe online. I think it could easily have been called Tech-Smart Counselling or even Tech-Smart Adulting, as I’m sure many non-parents and professionals will find it hugely valuable too. With that in mind, why did you decide to write this book for parents, and why now?

I was approached by the publisher via social media because I had gone viral with over six million views on one of my videos. They liked my sensible approach and thought I could write a book that would support parents rather than scare them. 

Do you think it’s possible for young people to be both happy and safe online, as the subtitle suggests? 

Yes, when they are supervised and the adults around them take an interest in what they’re doing. For example, when children play computer and video games, adults should ask about the type of gaming (e.g. one with save points, such as Minecraft in creative mode, or timed elements, such as the matches in Fortnite) and who they are playing with, as each game has its own set of fans. There are so many social media platforms, types and ways in which we can ‘social’ media with each other (email, for example, which many people don’t realise fits under the typology of social media). Asking how young people engage with each platform can give insights into what they do there and why, which in turn gives an indication of how safe they are, depending on the risks associated with that platform (e.g. end-to-end encrypted spaces and file-sharing apps mean children can send or receive images and videos that are untraceable and cannot be viewed by adults or by many parental control apps). A very quick example is the way in which young people use many platforms at once to communicate and answer each other (e.g. receiving on instant messaging and replying on Snapchat), for several different reasons which are too lengthy to go into here. This is what I call multi-platform communication, and it requires a really large memory bank to keep up with it all. 

You suggest thinking about digital spaces as a city park or a shopping centre. Can you say more about this idea? 

This is a way to help parents think of the digital landscape, which can feel abstract and alien to some of them, through a metaphorical lens, so they can appreciate how it works and understand what children are doing, where they ‘go’ and the risks they can encounter. Some of the risks that children face are described in online safety circles as the four Cs: content, contact, conduct and commercialism. These risks can expose children to issues such as grooming, bullying, stalking, data collection, violence, sexual content and more (I currently list about 70 online harms in my books and training, so it’s a very large area). 

You suggest it’s in young people’s nature to take ‘non-thinking risks’ and ‘doing it on purpose risks’. Can you elaborate in terms of online activity? 

This can look like accidentally or purposefully accessing sites, games or content with a higher age rating than the child’s actual age. That does not always mean porn; it could be games whose content is beyond the child’s maturity level, not just beyond the age rating applied. I say much more about this in my PhD: ‘An exploration with seven to 10-year-olds of digital media in everyday life and the potential for this to generate traumatic experiences of gore/violent content’, which has a planned publication date of March 2026. 

I like your definition of safeguarding as an awareness of potential harm included in the four (plus two) Cs of online risk. Can you elaborate? 

Risk in online spaces can be assessed in terms of content (e.g. gore, violence, sexual images, abuse); contact (e.g. adoption issues such as being contacted by the family of origin on social media, or grooming and sextortion); conduct (e.g. threatening language, hate crimes); and commercialism (e.g. data used to target children with adverts for products). Harm is not always deliberate: for example, therapists who have not taken training in the use of technology may not secure and protect emails, voice or video recordings appropriately, and as a result cybercriminals can use tactics to access those files. The two additional Cs I suggest are the child (their developmental stage, age and abilities) and context (where potential harms may occur, such as in the homes of different or separated family members, friends, older siblings, sports clubs, schools etc.). This is what we might think of as contextual safeguarding. 

You draw on the work of Dan Siegel by emphasising children’s needs to be safe, seen, soothed and secure. How can we ensure this online? 

We may not be able to. Some children and young people are seeking online what is missing in the corporeal world. As therapists, we need to understand their digital lives, help them use digital spaces safely, and ensure they feel secure in the knowledge that they can bring their digital life to us and won’t be met with a blank face that either lacks interest or is full of fear about this space. 

Most chapters in your book include notes on children with social, emotional or educational needs, neurodiversity and/or trauma. Why was it important to you to highlight those children? 

Originally, there was going to be a chapter or two on working with trauma and tech, autism and tech, and other issues that present in therapy, so that professionals could also learn about, manage and work with these issues. However, the book became way too long! Plus, it was primarily aimed at parents, who may not have the academic background or want to know about the science of trauma, for example, and may just want the ‘what do I do?’ approach. So I added boxes to highlight some differences for this cohort of children and young people, so parents can apply the thinking and reasoning that is relevant to their child, rather than suggesting blanket techniques. 

What does the evidence say about the effects of technology on brain development and is it all bad? 

Evidence is extremely hard to assess, and actual hard evidence that tech is causing delays, changes or even rewiring of brains is pretty poor. Many pieces of research test cognitive tasks (and of course, we know children are not cognitively fully developed, and some have learning difficulties or brain-related disorders, mental health issues or a diet that can impact their levels of cognition). Many studies rely on self-reported measures of screen time, which is not really a measurable concept, and I bet you underestimate yours quite considerably. This is a major issue, as not all screen time is equal or requires the same cognitive motivation, attention and engagement. It skews results and makes it difficult to say ‘this causes that’, as we know (or should know) that correlation does not equal causation, and much of the neuroscience as it stands is correlational.

Who are the ‘bad actors’ online and how can we help young people avoid them? 

‘Bad actors’ is a term from the field of cybersecurity (my old field) and is another way to describe cybercriminals, or those with nefarious intent towards another person by means of trickery or manipulation. In the context of child safety, bad actors are often adults with harmful intentions, such as creating or sharing child sexual abuse material and child exploitation – a field I’ve worked in for over 15 years. Learning about how bad actors operate is the most robust way to help children manage their online activity and contact risks. 

You mention that the field of sexology mostly ignores emerging adolescent sexuality. Why do you think that is? And do you think that’s why many teens turn to the internet for sex education? 

Sexology has mostly been aimed at educating psychosexual therapists who work with adults, because some of the interventions used are illegal when working with under-18s. As a sex education tutor many years ago, I noticed that conversations around porn and the anatomical names of genitalia were omitted from sex and relationship or personal, social, health and economic (PSHE) lessons for political, moral and religious reasons, and it is still early days for including these conversations in school settings. From around the age of five, in some gaming spaces, young people overhear and use language that can be sexual, foul or fall under ‘ist’ types (racist, sexist). But don’t forget, children can be exposed to this language offline too. This can pique their curiosity and lead them to sites with sexual content, where they may be sent images or videos that provide them with a sexual experience. Children are experimenting with nude images, and we don’t have accurate data about the numbers here, except for when things go wrong and they are reported or ‘found out’. 

While some child-focused trainings cover developing sexuality, many do not include specific modules on porn viewing, because it’s complicated and under-researched, and many adults aren’t aware of the issues either, because they are not in the online spaces where this content exists. When I speak with attendees on my courses, or with other professionals in education settings (for example, when there is a safeguarding issue relating to this topic), they are overwhelmed or lack knowledge – not because they are incompetent, but because they have not had updated training. There is a basic understanding of sexual development, but a reluctance, avoidance or difficulty in finding good training. Children and young people seek out sexual content for education, to arouse sexual feelings, for self-soothing, because of child sexual abuse, to find their own images, to send images to others as pranks, or because they are plain curious about the types of sex that exist. There are lots of reasons! 

What do we need to know about the Online Safety Act (2023) and does it go far enough to protect young people? 

It is a UK framework that tasks Ofcom with regulating the media we engage with, just as TV, radio and cinema are overseen by Ofcom. We need much more training around online issues to deepen our understanding and illuminate these digital spaces, given the secondary, unintended roles bestowed upon us because we ‘work with children’: safeguarding lead, tech expert, advisor, cybersecurity consultant, to name just a few of the unspoken ones. We also need the vernacular with which to communicate and understand what, where, with whom and how children are behaving online. This is not to say that we must understand every platform, game, meme or trend, but we do need to keep up to date with emerging issues and spaces and the risks they pose, beyond the corporeal aspects of young people’s lives. 

I love your conversational style of writing. You advocate for boundaries rather than bans, discussions rather than dictating, checking in rather than checking on. Do you have more to say on this, or anything to add? 

Nope, that’s summed me up nicely!