Supreme Court to review federal law that protects websites from lawsuits over user-generated content

The Supreme Court hears arguments today in a major case that threatens the multi-trillion-dollar tech industry. The argument is over a federal law protecting social media platforms from lawsuits over online content that they recommend to their users. Jan Crawford spoke with a mother who says she paid a very high price indeed for the tech giants' freedom. Ten-year-old Nyla Anderson was always smiling. That's her twin brother, and there she is as a baby. She was Tawanna Anderson's only daughter.

She was the life of my life. She was smart. She was loving. She was caring. She was a shining star. She was my butterfly. She was everything any mother can ask for.

Nyla died in December 2021 after attempting the so-called blackout challenge, which she had seen on TikTok. Like other social media outlets, TikTok uses algorithms to recommend videos and other content to users. Anderson said that feature led to Nyla's death. They are actually feeding it to our children. They are sending them videos that they never even searched for. Anderson sued TikTok and has filed papers in a separate case now before the Supreme Court that, for the first time, could hold social media outlets accountable for some of the information and videos they recommend to users.

TikTok declined to comment on the lawsuit. The social media companies say a 1996 federal law shields them from liability and that the modern internet would not exist if companies couldn't sort and recommend third-party content to users. Free speech advocates say social media companies have rights similar to newspapers deciding what articles to publish. People criticize social media platforms today, and they have the right to criticize them, but they don't have the right to legally force them to publish certain content, to not recommend other types of content, and so on. But Anderson and other grieving parents argue that has to change. How many more kids until this comes to an end? How many more? That's my question to them. How many more children until this stops? For CBS Mornings, I'm Jan Crawford in Philadelphia.

So there's a lot in that case, and there are other related cases, such as terrorist videos being recommended to people. There was a case out in California where a federal officer was murdered after a social media platform connected two of the criminals who committed the crime. And the question is: should these tech companies be treated as neutral, passive platforms for all this content, or should they be held liable when they start suggesting it, making connections, making associations, placing it in the algorithm? This is such an interesting time, because we know that as adults we can be influenced, so you can only imagine a nine- or ten-year-old. Which raises the question: how do you police that? Children often need the technology for school or what have you, but the kids are growing up so fast, because you're addressing things like suicide and sex at nine or ten, because they're being exposed to it via social media or friends who have access to social media. It's tough.

My heart goes out to that mother. Yeah, and in reaction to that piece: for any parent who has lost a child or a family member to social media or its pressures, it's tough. My daughter was talking about the Blackout Challenge in her room, and I remembered it, because that was something kids were talking about in elementary school when we were younger: hold your breath while someone puts pressure on your chest. We had a conversation with her. All I'm saying is that as parents, we have to be overly involved.

We have to be overly involved. And the key is that the tech companies are making money off those suggestions, because they want to keep you on the site. The longer you're there interacting, the more money they make, right? And so these lawsuits are saying: no, if you're going to push this content, then you're going to be liable for it. And what TikTok and Instagram and some of these platforms will say is, well, you're supposed to be 13 or 16 to use them. But obviously there are ways around that.


