Khaberni - Documents in a class action lawsuit filed by U.S. school districts against Meta and other social media platforms reveal that the company halted internal research after it produced evidence of the harms of Facebook and Instagram to users' mental health.
According to Meta documents obtained through discovery, Meta scientists ran a 2020 research project, code-named "Project Mercury," to measure the effects of deactivating Facebook and Instagram. The documents show the company was disappointed to find that people who stopped using Facebook for a week reported lower levels of depression, anxiety, loneliness, and social comparison.
Rather than publishing these findings or pursuing further research, the lawsuit states, Meta canceled additional work and internally claimed that the study's negative results were tainted by the "existing media narrative" about the company.
Despite Meta's own work documenting a causal relationship between its products and negative mental health effects, the lawsuit says, the company told Congress that it was unable to determine whether its products were harmful to underage girls.
Concealing Dangers
Employees directly assured Nick Clegg, then Meta's Vice President of Global Public Policy, that the research conclusions were sound.
One employee expressed concern that staying silent about the negative findings would be akin to the conduct of tobacco companies, "which do research, know that cigarettes are harmful, and then keep that information to themselves."
Meta's alleged concealment of evidence of social media's harms is just one of several allegations in a lawsuit filed late last Friday by the law firm Motley Rice, which is suing Meta, Google, TikTok, and Snapchat on behalf of school districts across the country.
The plaintiffs argue that the companies deliberately concealed the known dangers of their products from users, parents, and teachers.
The allegations against Meta and its competitors include implicitly encouraging children under 13 to use their platforms, failing to address child sexual abuse content, and seeking to expand minors' use of social media while they are at school.
The plaintiffs also say the platforms sought to pay child-focused organizations to publicly defend the safety of their products.