Risk factors: all you need to know about apps

We all know that social media and messaging apps are a huge part of young people’s lives.

According to research from Ofcom, 70 per cent of 12 to 15-year-olds and 21 per cent of 8 to 11-year-olds who go online have a social media profile. While there can be positives to using social media and apps – such as helping children feel closer to friends and family – they do pose safeguarding risks too.

Make sure you know your Snapchat from your WhatsApp and that you’re clear on the main existing and emerging threats that apps can pose.

Instagram

Instagram is a photo and video-sharing app for smartphones. You can post content publicly or to ‘followers’, see content posted by others, ‘follow’ each other, ‘like’ posts, comment, and send direct messages. According to research from Ofcom, nearly a quarter (24%) of 12 to 15-year-olds say Instagram is their main site or app. The minimum age according to Instagram is 13, but it’s not difficult for a younger child to set up and use an account, and people can easily set up multiple or fake accounts.

What can children be at risk of?

Pupils using Instagram can be at risk of inappropriate contact from strangers, who can see and comment on their photos and videos if their account is set to ‘public’. They can also be exposed to upsetting or harmful material, such as images relating to eating disorders, self-harm and suicide, and to bullying, through fake accounts and unkind comments on posts. Because it’s an image-centred app, it can also create pressure to ‘look’ a certain way.

What’s the risk level?

According to research, there’s a ‘high’ risk of seeing sexual content and bullying content. ‘High risk’ here means that, when asked by the NSPCC and O2, more than 25% of children and parents reported seeing these types of content on this app. There’s also a ‘medium’ risk of seeing content related to violence and hatred, suicide and self-harm, and drink, drugs and crime. ‘Medium risk’ means that between 5% and 25% of children and parents reported seeing these types of content on the app in the same research.

What should I be alert to?

Listen out for pupils talking about Instagram in general, particularly any pupil you also have mental health concerns about. Red flags can include references to seeing upsetting or harmful material, being contacted by strangers and getting unkind comments.

Snapchat

Snapchat is a messaging app used to share photos, videos and messages with contacts. A ‘snap’ stays on screen for up to 10 seconds and then disappears, or you can opt for no time limit; you can also share snaps in a sequence that’s visible for up to 24 hours. The minimum age according to Snapchat is 13, but it’s not difficult for a younger child to set up and use.

What can children be at risk of?

Pupils using Snapchat can be at risk of grooming, as the app shares their location unless they use ‘ghost mode’, and strangers can send them messages and requests. Image-sharing without consent can also occur, as recipients can screenshot images before they ‘disappear’.
Other issues include sexting, via requests for sexual images from people they don’t know, and bullying, through photos being posted with unkind comments.

What’s the risk level?

According to the NSPCC and O2 research mentioned above, there’s a ‘high’ risk of seeing sexual content and bullying content.

There’s also a ‘medium’ risk of seeing content related to violence and hatred, suicide and self-harm, and drink, drugs and crime.

What should I be alert to?

Listen out for pupils talking about Snapchat or ‘Snap’ in general, especially talk of getting inappropriate messages, requests for photos or strangers they’ve made contact with.

TikTok

TikTok is a video-sharing app. You can record and upload short video clips, watch other people’s videos, ‘follow’ people, gain ‘fans’, and ‘like’ and comment on posts. The minimum age according to TikTok is 13, but you don’t have to prove your age when creating an account, so younger children can still use it easily. It’s most popular with under-16s. Users cannot exchange images and videos via in-app messaging, but once they’ve made contact, they may move to another platform, such as Snapchat, to trade them.

What can children be at risk of?

Children are at risk of exposure to explicit or inappropriate videos, such as pornography and upsetting or harmful content, and to age-inappropriate lyrics.
Strangers may also see videos they have shared if their account is set to ‘public’, and anyone can see their profile information. They could also be contacted by strangers asking to ‘trade’ explicit images or videos, and feel pressured to record inappropriate or explicit videos to gain more followers.

What’s the risk level?  

TikTok is a newer, emerging app to be aware of (it was formerly called Musical.ly). At the moment it still has a reputation for being comparatively free of trolling and danger, but there are some known risks: there have been reports of users harassing children for nude images and videos.

What should I be alert to?

Listen out for pupils talking about TikTok, especially talk of videos that sound inappropriate, being asked to ‘trade’ or swap pictures/videos, and strangers they’ve made contact with.

WhatsApp

WhatsApp is a free, multi-function, instant-messaging app that allows you to send messages, images, videos and your location, as well as make calls. The minimum age according to WhatsApp is 16, but it’s not that difficult for a younger child to set up and use.

What can children be at risk of?

Children can be at risk of bullying, e.g. directly in a group chat or by being excluded from one; sexting, as they can send and receive explicit photos; and grooming, if they share their location.

What’s the risk level?

According to the NSPCC and O2 research, there’s a ‘medium’ risk of seeing sexual content, bullying content, and content related to violence and hatred.

What should I be alert to?

Listen out for pupils talking about WhatsApp in general or ‘group chats’, especially people being unkind in groups, excluding people from groups, or sharing photos, videos or locations.

If you have any concerns about children using these apps, or any others, talk to your designated safeguarding lead (or deputy) and follow your school’s safeguarding procedures, as you would for any safeguarding concern. Remember, too, that new functionalities and new apps appear all the time, and many of the risks outlined above could be present for any app.

Jenny Moore is a lead content editor at The Key, a provider of up-to-the-minute sector intelligence and resources that empower education leaders with the knowledge to act. The information in this article is taken from The Key’s Safeguarding and child protection INSET pack 2019/20, part of its Safeguarding Training Centre.