Cyberbullying Attacks the Young
Cyberbullying inquiry finds the mental health of young people is severely affected by online abuse.

Social media companies such as Facebook, Snapchat and Twitter have been accused of failing to protect young people from harassment after a cyberbullying inquiry found that online abuse severely affects their mental health.
Almost half of young people have experienced threatening, intimidating or abusive messages on social media, pushing some to the verge of suicide in the most extreme cases, according to a survey commissioned by the Children’s Society and YoungMinds.
The findings were based on oral and written evidence from young people, social media companies, mental health experts and children’s charities, including an online survey of 1,089 children. Sixty-two percent of survey respondents were under 18 and three-quarters were female.
Respondents said they felt let down by social media platforms, and wanted companies to take tougher action against cyberbullying, including banning abusive users.
The children’s charities have recommended that social media companies pilot approaches to identify children using their platforms, and to gain explicit parental consent for under-13s. They said the government should require social media firms to publish data on their responses to reports of online bullying, which the inquiry found to be “inadequate”.
“You would kind of expect to experience it: nasty comments on the selfie, Facebook posts and Twitter posts, people screen grabbing your Snapchat story to laugh about it … I feel like it’s something people don’t take seriously. But leaving just one nasty comment could really hurt someone,” a 15-year-old girl told the inquiry.
“Social media companies should take complaints more seriously. If someone reports something, they shouldn’t take days to review it, they should literally just remove it straight away. The reaction from adults is just delete your account to stop the bullying, but that’s taking something away from that young person’s life for something that’s not their fault,” she added.
The inquiry’s findings have been published in advance of the government’s response to its Internet Safety Strategy consultation. Forty-seven percent of respondents had experienced threatening or abusive messages on social media, and 61% had opened their first account at age 12 or under, despite platforms requiring users to be at least 13.
Alex Chalk, the Tory MP who led the inquiry, said: “Cyberbullying can devastate young lives, but to date the response from social media companies has been tokenistic and inadequate. It has failed to grip the true scale of the problem. For too long they have been marking their own homework and it’s time they became far more transparent, robust and accountable.”
Frequent social media users are most likely to have symptoms of anxiety and depression, the inquiry found, with some young people who had experienced bullying repeatedly checking their feeds to see what had been posted about them.
The chief executive of the Children’s Society, Matthew Reed, said: “The inquiry has heard from young people describing cyberbullying as ‘inescapable’, and in the most extreme cases it has pushed some to the verge of suicide.
“But we’ve also heard about the positives that social media brings for young people. Social media is part and parcel of teenage life and we all need to support young people to stay safe online, including better education in schools and information for parents.”
Sarah Brennan, chief executive of YoungMinds, said: “With so much of young people’s everyday lives involving the online world, it’s crucial that it is a place that young people can feel safe and enjoy being part of.
“We need to see platforms creating age-appropriate content for younger users, as well as parents and teachers speaking to young people early about how to respond positively to the online world, and what to do if they come across upsetting content.
“But most of all, this inquiry has shown loud and clear that it’s time social media companies sit up and take action to tackle cyberbullying and promote good mental health on their platforms.”
A spokeswoman for Snap said: “Snapchat is designed for a teen and adult audience and we use the best technology available to prevent someone who is under the age of 13 from creating an account or using the service.
“Our trust and safety team works around the clock to review abuse reports and take action when they become aware of a violation. In the vast majority of cases, they respond to reports and concerns well within 24 hours of a report.”
Simon Milner, Facebook’s policy director for Europe, said: “Our priority is to make Facebook a safe place for people of all ages, which is why we have spent a long time working with safety experts like the UK Safer Internet Centre, developing powerful tools including a Bullying Prevention Hub to help people have positive experiences on Facebook.
“Our work with Childnet International and The Diana Awards means we’re offering every UK secondary school a Digital Safety Ambassador this year and we’re members of the Duke of Cambridge’s cyber-bullying taskforce. We welcome close collaboration between industry, experts and government to continue our work in this area.”