https://www.theguardian.com/commentisfree/2023/jul/04/smoking-gambling-children-social-media-apps-snapchat-health-regulation


WhatsApp is for old people. That is, people over 30. At least that’s what my 
first-year medical students tell me. WhatsApp has become the realm of 
government backchat about Covid-19 policy, or family groups sharing graduation 
photos and past-their-prime memes. Young people, especially teenagers, 
communicate through Snapchat, an app that reaches 90% of 13- to 24-year-olds 
and 75% of 13- to 34-year-olds across the UK. It has more than 21 million 
monthly active users in Britain, almost one-third of the population, and each 
opens the app more than 50 times a day on average.

Back in the 1980s and 1990s, parenting was largely about keeping children 
physically safe. A moody 14-year-old wanting to be alone in their room was 
normal; in fact, being at home was protection enough. Parents had clear 
oversight of who their children spent time with and of the physical material 
they were consuming.

Fast-forward to 2023 and children are still in their rooms at home, but while 
physically safe they are exposed to an entire virtual world with almost no 
regulation of the content they see, the time they spend on devices, or the 
boundary between necessary social communication with friends and a much darker 
world of harmful or pornographic material and predatory influencers.

Compounding the problems facing teenagers and parents is the fact that apps 
are often deliberately designed to be addictive, or to be used compulsively. 
They regularly turn social interactions into games – such as Snapchat’s 
“Snapstreak” feature, which scores users on the number of consecutive days 
they share a picture with each contact, encouraging them to be on the app 
constantly. Apps also raise many safety and oversight concerns. Snapchat 
popularised disappearing messages, meaning parents can’t check what their 
child has been viewing or posting, and many apps share the user’s location 
with others by default. Teenagers and children may not even know that they are 
broadcasting their live location throughout the day.


The potential harms of these apps are increasingly being called out. The US 
surgeon general, Vivek Murthy, issued an “advisory” last month on the effects 
social media use has on young people’s mental health. He said: “The most common 
question parents ask me is, ‘is social media safe for my kids?’ The answer is 
that we don’t have enough evidence to say it’s safe, and in fact, there is 
growing evidence that social media use is associated with harm to young 
people’s mental health. We are in the middle of a national youth mental health 
crisis, and I am concerned that social media is an important driver of that 
crisis.”

The president of the American Medical Association, Jack Resneck Jr, used even 
stronger words: “With near-universal social media use by America’s young 
people, these apps and sites introduce profound risk and mental health harms in 
ways we are only now beginning to fully understand.” The evidence on social 
media’s mental health impact is indeed concerning.

The latest Centers for Disease Control and Prevention Youth Risk Behavior 
Survey, from 2021, illustrates the scale of the problem: 42% of high school 
students surveyed had experienced persistent feelings of sadness over the 
previous year, while 22% had seriously contemplated suicide. Research cited by 
Murthy shows that adolescents who spend more than three hours a day on social 
media face double the risk of depression and anxiety. Furthermore, 46% of 
adolescents aged 13 to 17 say social media makes them feel worse about their 
body image, and 64% of that age group are “often” or “sometimes” exposed to 
hate-based content. Yet children themselves feel unsure how to cope: a third 
or more of girls aged 11 to 15 say they feel “addicted” to certain apps, and 
more than half of teenagers say they would find it hard to give up social 
media.

A study by researchers at the University of Bath looked at the mental health 
effects of a week-long social media break, which on average freed up about 
nine hours of participants’ time. Even after only seven days, participants 
showed significant improvements in wellbeing, depression and anxiety compared 
with a control group.

There is growing consensus among health experts about the chronic negative 
health effects of social media use. And yet we are largely leaving it to 
parents, and other concerned adults, to find solutions to a side-effect of 
near-universal smartphone ownership. The burden is on government to recognise 
the negative health impact that certain social media apps have, to create 
policies that regulate companies so they are not selling a dangerous product 
to children, and to work towards collective solutions. This could include 
working with these companies on design and development decisions that 
prioritise safety and health in their products.

Companies may resist this because it is in their financial interest to keep 
users on their platforms for as long as possible. Screen time is their revenue 
stream: 99% of Snapchat’s revenue comes from displaying advertising to its 
users; the figure for Meta, which owns Facebook and Instagram, is about 98%. 
The longer kids spend on their platforms, the more money they make. And the 
cost, arguably, is paid in the mental health of the next generation.

In April 2023, a bipartisan group of US senators put forward new legislation, 
the Protecting Kids on Social Media Act, which would set a minimum age of 13 
for social media use, require parental consent for users between 13 and 18, 
and ban platforms from using certain algorithms on young users.

As Brian Schatz, one of the senators proposing the legislation, said: “The 
growing evidence is clear: social media is making more kids more depressed and 
wreaking havoc on their mental health. While kids are suffering, social media 
companies are profiting. This needs to stop.”

This is exactly what public regulation is for. As with policies on tobacco, 
alcohol or gambling, it can ensure that when private companies sell a product 
with potential negative health impacts, they do so within clear guidelines 
that shield vulnerable users from the worst harms.

Devi Sridhar is chair of global public health at the University of Edinburgh
