WARNING: This blog post talks about suicide. If you, or anyone you know, requires help in a crisis, call Lifeline on 13 11 14.
In recent weeks, if I’m really honest, I’ve found it incredibly challenging to be a digital wellbeing ‘expert’. I’ve been bombarded with horrific and devastating stories of kids, some of whom are as young as 8, who’ve seen distressing, deplorable and age-inappropriate content online. One of many incidents that I’ve heard about is the live streaming and digital dissemination of a death by suicide video on popular social media platforms.
As many of you are aware, a death by suicide was originally streamed live on Facebook on August 31st, 2020, and recordings then surfaced on Instagram, TikTok and other platforms in early September. The platforms struggled to stop the videos from being broadcast to millions of users. TikTok and other platforms attempted to use artificial intelligence to detect and flag this content, but their recommendation algorithms meant that the content was often recommended to kids and teens. To further complicate matters, the footage was subsequently spliced into innocent cat videos and other kid-directed content, meaning many young users viewed the footage. Many young people saw the video unintentionally, whilst others sought out the viral suicide footage, eager to know what their friends were discussing (the video became part of their social capital).
The problem when our kids see anything inappropriate online is that they can’t unsee it. The images can be indelibly etched on their minds (and mentally replayed over and over again). Another problem with our kids and teens seeing distressing or age-inappropriate digital content is that they have mirror neurons in their brains, meaning that they’re hard-wired to copy. They may imitate what they see online, especially if they don’t have trusted adults helping them to interpret and contextualise what they’ve viewed (one of the multitude of reasons why pornography is a problem for young people). Christine Morgan, the National Suicide Prevention Adviser to the Australian Prime Minister, highlighted the concerns with young people viewing a video of suicide, stating, “content which includes explicit descriptions, images or footage of suicide, especially where methods are shown, have been linked to increases in suicidal thoughts, suicide attempts and suicide deaths.”
Finally, when our kids see something perturbing or unusual online they sometimes, as they are naturally curious, go in search of other related content (and unfortunately the recommendation algorithm can accelerate this process).
I spent a couple of weeks seriously questioning whether my voice and work in the digital wellbeing space could make a significant difference. And if I’m honest, I almost gave up. However, I’ve decided that we need to use this unsavoury and distressing event as the catalyst for conversation. We need parents to seriously consider our kids’ and teens’ use of social media and access to other platforms and devices.
In parent seminars I encourage parents to be the pilot of the digital plane. To be the pilot of the plane, I propose that we need to focus on establishing the 3Bs (with, not on, our kids): boundaries, basic needs and boredom.
Most parents try to enforce boundaries around how much time their kids and teens spend online, and this is usually the area where they’re seeking guidance: how much time is healthy for my kids/teens to spend online? However, enforcing screen time rules often results in tears and tantrums (from kids and their parents alike), with the dreaded techno-tantrum a common outcome. Whilst the amount of time that ‘screenagers’ (aka kids and teenagers) spend online is important, especially if it impacts on their basic needs such as sleep, relationships, physical movement and play, focusing exclusively on how much time they’re spending online means that we’re overlooking other essential aspects of their digital wellbeing. We need to have more nuanced conversations about screen time and broader digital boundaries beyond simply quantifying how much time they’re spending plugged in.
I’m not for a moment deflecting responsibility from the tech companies, who, in my opinion, should be putting up the guardrails to stop users from accessing such abominable content. They have a moral and social obligation to deploy the full repertoire of technologies (such as artificial intelligence) to stop the dissemination of such material. However, even with the best filtering tools, software and restrictions in place, the sad reality is that our kids will still see inappropriate things online. We need to use this event as an opportunity to reflect on our digital boundaries.
Five simple things parents can do to minimise the likelihood our kids will see inappropriate content online
// Don’t prematurely dunk them in the digital stream– I understand that it’s hard (really hard) when your son or daughter is telling you that every child in Year 3 has TikTok and you prohibit them from setting up an account. One of the reasons we struggle and often allow our kids to have access to apps, websites and digital tools before we feel they’re ready is because our kids tell us they’ll be socially ostracised as everyone has access to the app/website/tool they’re requesting. As humans, one of our fundamental psychological needs is for relational connection- we need to be part of a tribe and we want that for our kids.
When you deny their access, this exchange is usually followed by a statement along the lines of, “I hate you. You suck. It’s so unfair that I can’t have TikTok.” We have to be alright with limiting the apps, games, websites and digital tools our kids have access to. We have to be okay if our kids don’t agree with our boundaries.
// Set firm boundaries as to WHAT they can use/download/play/watch– I firmly believe that focusing on what, more than how much, is really important. Know the digital playground where they’re playing, and know its potential pitfalls and risks. I strongly recommend that you do your due diligence before allowing them to set up a social media account or access a game- the eSafety Commissioner and Common Sense Media are good sources of advice.
The legal age for most social media platforms is 13 years. This has nothing to do with children’s psychological readiness to use these tools; rather, the US Children’s Online Privacy Protection Act (COPPA) imposes certain requirements on operators of websites or online services directed to children under 13 years of age. You know your child or teen best and know when they’ll be able to cope with the myriad of demands that the online world poses.
When you need to say no, try to provide reasons and a rationale for your decision (not just a hard no). I’ve found that it’s hard for kids and teens to argue with science, facts and the law. “No, you can’t have a TikTok account because you’re not 13 years old and that’s the legal age when you can have an account.” Also, remind them that it’s a “no for now.” Let your children or teens know that you’ll reconsider their request at a later date (and by later, you don’t mean tomorrow night).
// Don’t use technology as a punishment tool- if there’s any perceived threat of ‘digital amputation’, it will discourage children and adolescents from seeking guidance from the pilot of the plane (i.e. their parents or caregivers) when they see something inappropriate online. This is true for viewing unsavoury content such as pornography and violent or racist material. It is also the case when online predators approach children or teens, or when they’re dealing with cyber-bullying incidents. Instead of approaching you as the pilot of the digital plane, they often seek advice from their fellow passengers (i.e. their peers and/or siblings), who are often just as ill-equipped as they are to safely navigate and deal with these situations.
By all means, have digital rules and expectations in place and clearly articulate these to your children. But removal of devices or withdrawal of access to technology will not, as with any punishment, deal with the root of the problem. Confiscating their devices tends to make us feel ‘good’ as parents, as we feel like we’ve handled the situation and curtailed their behaviour, but only in the short term. Long-term behavioural change requires us to explore what’s driving their behaviour. I particularly like Dr Justin Coulson’s Explore, Explain and Empower approach.
We need, as pilots of the digital plane, to encourage our kids to come to us when they experience a problem online. When they see a distressing video, when they’re a victim of cyber-bullying, or when a predator approaches them online, we want our kids and adolescents to feel assured that they can come to us and report the incident, without fear that they’ll be stripped of their digital devices. Talking openly, having ongoing conversations about their online activities and showing a vested interest in their digital pursuits can help to build this rapport and assurance.
// Keep devices in open areas (and out of bedrooms and bathrooms)– as the pilot of the digital plane, we also need to establish boundaries around where devices can be used at home. Where are your no-go tech zones? I strongly encourage keeping devices out of bedrooms, bathrooms and meal areas. Rather than focusing on where devices are prohibited, outline where they can be used instead. Shift the conversation to the best ergonomic positions and to setting up zones in your home for particular activities. Why? Your teenage son or daughter is much less likely to be sending ‘nudes’ when they use their phone in the kitchen or lounge room, and much more likely to be doing it in the bathroom or bedroom.
Again, encourage your children and teens to establish these boundaries with you, and provide a rationale for your preferred areas (without terrifying them). For example, if you want to keep devices out of bedrooms, you could explain that using devices before sleep can have a detrimental impact on both the quality and quantity of their sleep, and that poor sleep can have a negative impact on their mood, focus, physical health and even growth.
// Minimise their use of social media at night- at night, the prefrontal cortex (the logical part of the brain that supports self-regulation and working memory) switches off and the amygdala (the emotional centre of the brain) fires up. This can be a diabolical combination: their logical, problem-solving brain is off and their emotional brain is switched on. This is why a lot of cyberbullying and online predatory behaviour occurs at night. So as the pilot of the plane, you also need to set limits around when your kids and teens can use devices.
For further information about depression contact beyondblue on 1300 224 636 or talk to your GP, local health professional or someone you trust.