California sues Facebook parent Meta, alleging harm to young people 

Los Angeles, Oct. 25, (tca/dpa/GNA) – California and other states on Tuesday sued Facebook parent company Meta over allegations that it “designed and deployed harmful features” on its main social network and on its Instagram platform.

“Our bipartisan investigation has arrived at a solemn conclusion: Meta has been harming our children and teens, cultivating addiction to boost corporate profits,” California Attorney General Rob Bonta said in a statement. 

“With today’s lawsuit, we are drawing the line. We must protect our children and we will not back down from this fight.” 

The 233-page lawsuit, filed in a federal court in Northern California, alleges the social media giant violated consumer protection laws and a federal law aimed at safeguarding the privacy of children under 13 years old. 

Bonta co-led a bipartisan coalition of 33 attorneys general in filing the federal lawsuit against Meta. Eight attorneys general also filed lawsuits against Meta on Tuesday in state courts, according to Bonta’s office.

In 2021, a bipartisan group of state attorneys general, including from California, Tennessee and Nebraska, announced they were investigating Meta’s promotion of its social media app Instagram to children and young people. 

Advocacy groups, lawmakers and even parents have criticized Meta, alleging the company hasn’t done enough to combat content about eating disorders, suicide and other potential harms.

As part of the investigation, the state attorneys general looked at Meta’s strategies for compelling young people to spend more time on its platforms. The lawsuit alleges that Meta failed to address its platforms’ harmful impact on young people.

Meta said it is committed to keeping teens safe, noting that it has rolled out more than 30 tools to support young people and families.

“We’re disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path,” a Meta spokesperson said in a statement. 

Scrutiny over Meta’s potential damage to the mental health of young people intensified in 2021 after Frances Haugen, a former Facebook product manager, disclosed tens of thousands of internal company documents. 

Some of those documents included internal research showing that Instagram is “toxic for teen girls,” worsening body image issues and suicidal thoughts, the Wall Street Journal reported in 2021.

Meta said its research was “mischaracterized” and that teens also reported Instagram made them feel better when dealing with issues such as loneliness and sadness.

That year, executives from the social media company, including Instagram head Adam Mosseri, testified before Congress. Instagram then paused development of a kids’ version of the app and rolled out more controls so parents could limit the amount of time teens spend on it.

Social media apps like Instagram require users to be at least 13 years old, but children have lied about their age to gain access to the platforms.

The photo- and video-sharing app Instagram is popular among US teens, according to a Pew Research Center survey released this year. About 62% of teens reported using Instagram in 2022. Google-owned YouTube, TikTok and Snapchat are also commonly used by teens.

The amount of time teens spend on social media has been a growing concern, especially as platforms use algorithms to recommend content they think users want to see. In 2022, attorneys general across the country also began investigating TikTok’s potential harm to young people.

This story originally appeared in the Los Angeles Times.

GNA