Limits on social media use could affect mental health help for teens


The effects of social media on young people’s mental health are not well understood. But that hasn’t stopped Congress, state legislatures and the U.S. surgeon general from moving ahead with age-based bans and warning labels for YouTube, Instagram and TikTok.

The emphasis on the harmful effects of social media may cause policymakers to overlook the mental health benefits it provides to teens, say researchers, pediatricians and the National Academies of Sciences, Engineering and Medicine.

In June, Surgeon General Vivek Murthy, the nation’s top physician, called for banner warnings on social media platforms. On July 30, the Senate passed the bipartisan Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act. And at least 30 states have pending legislation related to young people and social media, from age bans and parental consent requirements to new digital and media literacy courses for K-12 students.

Most research suggests that some features of social media can be harmful: algorithmically manipulated content can distort reality and spread misinformation; incessant notifications distract attention and disrupt sleep; and the anonymity the sites offer can be fertile ground for cyberbullies.

But the platforms can also be helpful for some young people, said Linda Charmaraman, a researcher and director of the Youth, Media and Well-Being Research Lab at the Wellesley Centers for Women.

According to Charmaraman’s research, published in the Handbook of Adolescent Digital Media Use and Mental Health, social media can reduce isolation for minority children and LGBTQ+ youth, as well as others who are not widely represented in society. Age-based bans, she said, could disproportionately affect these marginalized groups, who also spend more time on the platforms.

“At first you think, ‘This is terrible. We have to get them out of there,’” she explained. “But then you find out why they do it, and it’s because it helps them feel a sense of affirmation of identity when they’re missing something in real life.”

Arianne McCullough, 17, said she uses Instagram to connect with Black students like herself at Willamette University, where 2 percent of students are Black.

“I know how isolated it can feel when you’re the only Black person, or any minority person, in a space,” said McCullough, a freshman from Sacramento, California. “So having someone I can just send a quick text and say, ‘Let’s hang out,’ is important to me.”

After about a month at Willamette, which is in Salem, Oregon, McCullough created a social network with other Black students. “We’re all in a little chat room,” she said. “We talk and make plans.”

Social media hasn’t always been so helpful to McCullough. After California schools closed during the pandemic, she said, she stopped competing in football and track. She gained weight, and her social media feeds constantly promoted home workouts and fasting diets.

“That’s how I started comparing myself to other people,” McCullough said, noting that she felt more irritable, distracted and sad. “I was comparing myself to other people and things that I didn’t feel self-conscious about before.”

When her mother tried to take the phone away from her, McCullough responded with an emotional outburst. “It was clearly addictive,” said her mother, Rayvn McCullough, 38, of Sacramento.

Arianne said she ended up feeling happier and more like herself when she started using social media less.

But the fear of missing out came back to haunt Arianne. “I missed seeing what my friends were doing and having easy, quick communication with them.”

The number of young people struggling with their mental health had been rising for a decade before the Covid-19 pandemic sparked what the American Academy of Pediatrics and other medical groups declared “a national mental health emergency for children and adolescents.”

According to behavioral surveys conducted by the Centers for Disease Control and Prevention (CDC) among high school students, more and more young people were reporting feelings of hopelessness and sadness, as well as suicidal thoughts and behaviors.

Increased use of “immersive” social media, driven by the rise of video on YouTube, Instagram and TikTok, has been blamed for contributing to the crisis. But a committee of the national academies found that the relationship between social media and young people’s mental health is complex, with potential benefits as well as harms.

Evidence of the effect of social media on child well-being remains limited, the committee reported this year, while calling on the National Institutes of Health and other research groups to prioritize funding for such studies.

In its report, the committee cited legislation passed last year in Utah imposing age and time limits on young people’s use of social media and warned that such a policy could backfire.

Arianne McCullough (left) and her mother, Rayvn, of Sacramento, California, support social media legislation that would require platforms like YouTube, Instagram and TikTok to be more transparent about the effects of their products on teens’ mental health. (Rayvn McCullough)

“Lawmakers’ intent to protect time for sleep and homework and prevent at least some compulsive use could have unintended consequences, such as isolating young people from their support systems when they need them,” the report said.

Some states have considered policies that echo the recommendations of the national academies. For example, Virginia and Maryland have passed laws prohibiting social media companies from selling or disclosing minors’ personal data and requiring privacy protections for minors’ accounts to be turned on by default.

Other states, including Colorado, Georgia and West Virginia, have created curricula for public school students on the mental health effects of social media use, something also recommended by the national academies.

The Kids Online Safety Act, now in the House of Representatives, would require parental consent for users under 13 and impose a “duty of care” on companies to protect users under 17 from harms such as anxiety, depression and suicidal behavior. The second bill, the Children and Teens’ Online Privacy Protection Act, would prohibit platforms from targeting ads to minors and collecting personal data from young people.

Attorneys general from California, Louisiana, Minnesota and dozens of other states have filed lawsuits in federal and state courts alleging that Meta, the parent company of Instagram and Facebook, misled the public about the dangers of social media for young people and ignored the potential harms to their mental health.

Most social media companies require users to be at least 13 years old, and sites often include safety features, such as blocking adults from messaging minors and defaulting minors’ accounts to stricter privacy settings.

Despite existing policies, the Justice Department says some social media companies are not following their own rules. On August 2, it sued TikTok’s parent company for allegedly violating child privacy laws, alleging the company knowingly allowed children under 13 to access the platform and collected data about their use.

Polls show that age restrictions and parental consent requirements enjoy popular support among adults.

NetChoice, an industry group whose members include Meta and Alphabet, which owns Google and YouTube, has filed lawsuits against at least eight states, seeking to stop or overturn laws that impose age limits, verification requirements and other policies intended to protect children.

According to Jenny Radesky, a physician and co-director of the American Academy of Pediatrics’ Center of Excellence on Social Media and Youth Mental Health, much of the impact of social media may depend on the content children consume and the features that keep them “hooked” on a platform.

Age bans, parental consent requirements and other proposals may be well-intentioned, but they don’t address what she sees as “the real mechanism of harm”: the business models that aim to keep young people posting, scrolling and buying.

“We’ve created a system that is not well designed to promote mental health for young people,” Radesky said. “It’s designed for these platforms to make a lot of money.”

Chaseedaw Giles, digital strategy and audience engagement editor for KFF Health News, contributed to this report.
