Snapchat, Facebook And TikTok Are Being Sued: Here's Why It's Different This Time

Three popular social media networks are being sued over claims that they cause mental health problems in users. Here are the key details.

What Happened: Lawsuits have been filed by the Social Media Victims Law Center against Snapchat, owned by Snap Inc SNAP; Facebook, owned by Meta Platforms Inc META; and TikTok, owned by ByteDance.

The lawsuits accuse the social media platforms of causing mental health problems in minors, according to The Hollywood Reporter.

The complaints allege the platforms have led to injuries including eating disorders, anxiety and suicide, and liken the platforms to defective products. More than 20 lawsuits have been filed across the country.

“This is the business model utilized by all Defendants – engagement and growth over user safety – as evidenced by the inherently dangerous design and operation of their social media products,” one of the complaints read. “At any point any of these Defendants could have come forward and shared this information with the public.”

The lawsuits say the platforms' algorithms prioritize engagement over safety, leading users to take part in viral challenges that can be harmful to their health.

“Defendants chose to continue causing harm and concealed the truth instead.”

Social media platforms have rules against opening multiple accounts, but the complaints say Snapchat does not enforce its rule because multiple accounts help boost user growth. The result, the lawsuits say, is increased bullying.

The lawsuits also take issue with the age of users and the lack of parental controls. TikTok requires users to be at least 13 to join its platform but, according to the lawsuits, does not verify ages.

“Each of the Defendant’s products are designed in a manner intended to and that do prevent parents from exercising their right to protect and oversee the health and welfare of their child.”

The lawsuits allege liability, negligence and invasion of privacy, among other claims.

One plaintiff in the lawsuits is the mother of a child who died by suicide; her complaint accuses TikTok of “intentional infliction of emotional distress.”

Related Link: Google Wins Immunity Under This Section From Being Framed For Animal Cruelty 

Why It’s Important: Social media platforms have been the target of lawsuits before, with many revolving around liability issues. Platform companies often defeat such cases with Section 230 of the Communications Decency Act, which shields tech companies from liability for content posted by third parties.

The report says the lawsuits sidestep the typical claims centered on content posted to the platforms. Instead, they target the platforms' algorithms and their policies involving minors, which could make these cases unique in the legal battles against big technology companies.

“Plaintiff’s claims do not arise from third party content, but rather, Defendants’ product features and designs, including but not limited to algorithms and other product features that addict minor users, amplify and promote harmful social comparison, (and) affirmatively select and promote harmful content to vulnerable users based on their individualized demographic data and social media activity,” the complaint says.

The Hollywood Reporter notes a case from last year in which a federal appeals court ruled that Snapchat could not use Section 230 to protect itself in a lawsuit over a speedometer feature linked to a fatal crash.

The lawsuits could be closely monitored by social media users, parents, and investors moving forward. 

Image and article originally from www.benzinga.com.