If you're struggling with negative thoughts or suicidal feelings, resources are available to help. In the US, call the National Suicide Prevention Lifeline at 1-800-273-8255; in the UK, call the Samaritans at 116 123; and in Australia, call Lifeline at 13 11 14. Additional suicide and crisis intervention hotlines are available in other countries.
A mother is suing Snap and Meta, alleging that Snap's Snapchat app and Meta's Instagram photo-sharing service contributed to her 11-year-old daughter's death by suicide. The lawsuit, reported earlier by Engadget, was filed Thursday in the US District Court for the Northern District of California in San Francisco. It alleges wrongful death as well as violations of California's Unfair Competition Law.
The lawsuit alleges that the July 21, 2021, death of Selena Rodriguez was caused or contributed to by "Selena's addictive use of and exposure to defendants' unreasonably dangerous and defective social media products."
Meta and Snap "have invested billions of dollars to intentionally design their products to be addictive and encourage use that they know to be problematic and highly detrimental to their users' mental health," the lawsuit alleges. "Internal, non-public data collected by Instagram and Snapchat reveal large numbers of its users -- particularly teenage girls -- are engaging in problematic use of its products."
Snap said it can't comment on active litigation but that it is working "with many mental health organizations to provide in-app tools and resources for Snapchatters as part of our ongoing work to keep our community safe."
"We are devastated to hear of Selena's passing and our hearts go out to her family," a Snap spokesperson said in an emailed statement.
Meta didn't immediately respond to a request for comment.
The lawsuit alleges Snap and Meta designed their social media platforms to cause addiction through "psychological manipulation techniques."
The allegations brought against Meta and Snap include:
- Strict liability based on "defective design of their social media products that renders such products not reasonably safe for ordinary consumers in general and minor users in particular."
- Strict liability based on the companies' failure to provide adequate warning for minors and their parents of the mental, physical and emotional damage that could be caused by using their platforms.
- Common law negligence from "unreasonably dangerous social media products and their failure to warn of such dangers."
- Violation of California's Unfair Competition Law.
Selena's mother, Tammy Rodriguez, is demanding a trial by jury and seeks relief including compensation for Selena Rodriguez's past physical and mental pain and suffering and her loss of enjoyment of life. The suit also seeks relief for the loss of Selena Rodriguez's services, comfort, care, society and companionship; the loss of her future income and earning capacity; and punitive damages.
The suit also seeks an order "to stop the harmful conduct alleged herein, remedy the unreasonably dangerous algorithms in their social media products and provide warnings to minor users and their parents that Defendants' social media products are addictive and pose a clear and present danger to unsuspecting minors."
Instagram and Snapchat, along with YouTube and TikTok, have been investigated by Congress for their impact on teens.