
The “fundamental architecture” of the internet might never be the same

Morton was studying to become a licensed therapist in California when her then-boyfriend, now her husband, suggested she post videos on YouTube to spread mental health information.

In 2011, Morton thought YouTube consisted mostly of videos of cats playing the piano and how-tos for putting on makeup. But after seeing what others were posting on the site, she decided to give it a try.

At first, only a handful of people watched her videos. Since then, though, Morton’s YouTube channel has grown to more than 1.2 million subscribers.

YouTube’s system for recommending content to users, which the company started building in 2008, has been crucial to the growth of Morton’s audience. It relies on a sophisticated algorithm to predict which videos users will want to watch and keep watching. Morton said that today, half of her viewership comes from those recommendations.

“If you could see the whole life of the channel, it was very, very slow and steady,” Morton told CBS News. “Then, through recommendations and collaborations, things have grown because you can reach a bigger audience and YouTube can understand the content better.”

The recommendation algorithms used by YouTube, TikTok, Facebook, and Twitter are at the center of a legal dispute that will be heard by the Supreme Court on Tuesday. The case involves the powerful legal shield that helped the internet grow.

Aaron Mackey, a senior staff attorney at the Electronic Frontier Foundation, told CBS News that what’s at stake in the case, known as Gonzalez v. Google, is a rewriting of the rules that govern the fundamental architecture of the internet.

“A key part of online life”

Section 230 of the Communications Decency Act protects internet companies from being held liable for content posted by third parties. It also lets platforms take down content they consider offensive or obscene. The dispute marks the first time the Supreme Court will consider how far the law reaches. The question before the justices is whether Section 230’s protections for platforms extend to targeted recommendations of information.

The case stems from the November 2015 terrorist attacks in Paris, in which ISIS members killed 129 people. Among the dead was Nohemi Gonzalez, a 23-year-old American college student studying abroad, who was killed at a bistro in the city.

In 2016, Gonzalez’s parents and other family members filed a civil lawsuit against Google, which owns YouTube. They alleged that by recommending videos posted by ISIS to users, Google aided the terrorist group in violation of a federal anti-terrorism law.

Google moved to dismiss the complaint, arguing that Section 230 shielded it from the claims. A federal district court in California agreed, finding that because the videos in question were created by ISIS, Google was protected by the law.

The U.S. Court of Appeals for the 9th Circuit upheld the lower court’s decision, and Gonzalez’s family asked the Supreme Court to weigh in. The high court agreed in October to hear the case.

Many outside parties have weighed in on the court battle, much of it in Google’s favor. Platforms like Twitter, Meta, and Reddit, which all rely on Section 230’s protections, say algorithmic recommendations help them organize the millions of pieces of third-party content on their sites, improving the experience for users who would otherwise have to sift through an enormous volume of posts, articles, photos, and videos.

Lawyers for Meta, the company that owns Facebook and Instagram, told the court that because there is so much content on the internet, it is important to organize, rank, and show content in ways that are useful and appealing to users.

Even Match Group, which runs the online dating platforms Match and Tinder, said Section 230 is “vital” to its mission of bringing singles together because it lets “its dating platforms recommend potential matches to its users without having to worry about overwhelming litigation.”

But conservatives are using the case to attack “Big Tech” companies and advance the claim that platforms censor content based on speakers’ political views.

A group of Republican senators and representatives told the Supreme Court that lower court decisions have resulted in a “broad grant of immunity,” arguing that platforms “have not been shy about restricting access and removing content based on the politics of the speaker.” They said the issue keeps recurring as Big Tech companies censor and remove content with conservative political views, conduct for which, in their view, Section 230 provides no immunity.

The case has given the justices a rare chance to hear directly from the authors of the law in question: Sen. Ron Wyden, a Democrat from Oregon, and former Rep. Chris Cox, a Republican from California, who wrote Section 230 in 1996. The bipartisan pair filed a “friend of the court” brief explaining the plain meaning of their law and the policy balance they sought to strike.

“Section 230 protects targeted recommendations in the same way it protects other ways of gathering and presenting content,” they wrote. “Any other way to read Section 230 would go against its goal of encouraging new ways to moderate and present content. Section 230 encourages the real-time transmission of user-generated content, which has become a key part of online life. Many internet users and platforms rely on this feature.”

They argued that Google should be shielded from liability under Section 230 because the platform’s recommendation algorithm merely matches users with the types of content they are seeking.

Wyden and Cox said, “The algorithm works in a way that isn’t very different from the many curatorial decisions platforms have always made about how to show third-party content.”

The fight also underscores competing views of how Section 230 has shaped the internet and how it operates today. For tech companies, the law set the stage for the growth of new online platforms, an industry of online creators, and free speech. For Gonzalez’s family and others, algorithmic recommendations have proven harmful, even deadly.

Like the Gonzalezes, Tawainna Anderson has tried to hold a social media platform accountable for the content it recommends to its users.

Last May, Anderson sued TikTok and its parent company, China-based ByteDance, after her 10-year-old daughter, Nylah, died in late 2021 while attempting the dangerous “Blackout Challenge,” in which users are urged to strangle themselves until they pass out and then share videos of the experience.

The challenge, which went viral on TikTok, was suggested to Nylah by the platform’s algorithmic recommendation system on her “For You” page.

Anderson’s lawsuit sought to hold TikTok responsible for deliberately putting dangerous challenge content in front of minors and encouraging them to engage in life-threatening behavior. Invoking Section 230, TikTok asked the federal district court in Pennsylvania to dismiss the case.

In October, U.S. District Judge Paul Diamond dismissed the case, writing that TikTok was immune under the law because it was promoting the work of others. But in a brief order, he noted that TikTok made the Blackout Challenge “easy to find” on its site and that its algorithm “was a way to get the challenge in front of the people who were most likely to be interested in it.”

“Congress, not the courts, is the right place to talk about how wise it is to give such immunity,” Diamond wrote.

Mackey, of the Electronic Frontier Foundation, said that if people disagree with how the courts have interpreted Section 230, it is Congress, not the Supreme Court, that should change the law.

“When they passed it, they set this balance and said that they didn’t think there wouldn’t be harmful content, but that on balance, the growth of the internet and development of a tool that became central to our lives, commerce, and political expression was more important,” Mackey said. “Congress can change that balance if it wants to.”

A new economy for creators

Since Section 230 became a law 27 years ago, the internet’s rapid growth has created a multi-billion-dollar industry of independent online creators who rely on large tech platforms to reach new audiences and make money from their work.

Morton’s YouTube channel has helped her reach patients all over the country, even in places with few mental health resources.

She said, “Because I can get over a million views on YouTube, I can reach so many more people, and information about mental health isn’t hidden behind a paywall.”

Alex Su, a lawyer by training, runs the TikTok account LegalTechBro. In 2016, he started sharing content on LinkedIn to raise awareness of the tech company he worked for. After building a following of lawyers and others in the legal field on LinkedIn, Su began experimenting with TikTok in 2020.

His TikTok videos about what it’s like to work at a law firm struck a chord with other lawyers and people connected to the field. He said LinkedIn’s recommendation system has greatly helped him reach his target audience and market his company’s services.

“These algorithms let me go viral with people who can relate to my jokes,” he told CBS News. “Most people probably wouldn’t find this kind of content as funny if I showed it to them.”

Internet companies and supporters of Section 230 say the law has allowed new and growing companies to become leaders in their fields without having to spend heavily fighting frivolous lawsuits.

Su was an early adopter of LinkedIn and TikTok among legal professionals. He said creators are often quick to move to new platforms where they can reach new audiences.

“I don’t think it’s a coincidence that there are shifts where new people come in. As a content creator, you can use this to your advantage because you can quickly go viral on that platform with a new audience,” he said. “I wouldn’t have been able to grow the way I did without those different platforms.”

Little help from the court

The Supreme Court hasn’t said much about how it might handle Section 230. Only Justice Clarence Thomas has written about how the legal shield has been interpreted by lower courts.

“Courts have long focused on non-textual arguments when interpreting [Section] 230, leaving behind questionable precedent,” Thomas wrote in a 2020 statement asking the court to look at whether the law’s text “aligns with the current state of immunity enjoyed by internet platforms.”

The Supreme Court could issue a decision endorsing lower courts’ interpretation of Section 230, or it could narrow the legal shield’s protections.

But internet companies told the court that narrowing the scope of Section 230 could significantly change how they handle the content users post to their sites. With weaker protection and a greater risk of costly litigation, companies may become reluctant to host potentially problematic content, allowing only material that has been vetted and poses little legal risk.

“If you worry about censorship, the last thing you want is a legal system that punishes platforms for keeping things online,” Mackey said. “There will be more censorship, more things will be taken down, and a lot of things won’t get out there in the first place.”

A decision from the Supreme Court is expected by the summer.

Donald Wolfe

Donald’s writings have appeared in HuffPost, the Washington Examiner, The Saturday Evening Post, and The Virginian-Pilot, among other publications. A graduate of the University of Virginia, he is the publisher of the Virginian Tribune.
