Is the internet giant responsible for a terrorist attack? A lawsuit could topple a main pillar of the internet

In the United States, a court is weighing whether the internet giant Google can be held partly responsible for a terrorist attack. The case could shake the foundations of the internet as we know it.

In February, there was laughter in the courtroom of the US Supreme Court.

“We are a court. We don’t really know about these things. These are not the nine greatest internet experts,” said Justice Elena Kagan.

The quip drew laughter in the hall – so says the transcript of the hearing – but its substance is serious. So is the subject of the case.

It’s a case that could break one of the fundamental pillars of the internet. It could permanently change the foundation on which internet companies, small and large, have been built in the United States.

At the same time, it is part of a wider discussion about the regulation and legal responsibilities of technology companies.

At its core, however, the case asks whether the internet giant Google can be held partly responsible for a terrorist attack because its algorithms also surface content produced by extremist groups.

In the United States, internet companies have been protected by a 26-word piece of legal text since the mid-1990s. American law professor Jeff Kosseff even wrote a book about the law’s origins, titled “The Twenty-Six Words That Created the Internet”.

Specifically, it is Section 230 of the Communications Decency Act of 1996. It states that a provider or user of an “interactive computer service” shall not be treated as the publisher or speaker of information provided by someone else.

In practice, the law means that a company or person maintaining an internet platform cannot, in principle, be sued for content that someone else publishes on its platform.

Under the section, platforms may also voluntarily restrict the availability of obscene, excessively violent or harassing content, as long as they act in “good faith” – in practice, moderating content on the platform as they see fit.

The provision has been considered one of the reasons for the rapid growth of internet companies, but it has also drawn criticism.

Some have seen it as giving internet companies an unfair get-out-of-jail-free card that lets them slip away from any responsibility. It is also old regulation, written before today’s algorithms and before the internet became part of everyday life.

“Everyone is trying their best to figure out how this pre-algorithm statute fits into the post-algorithm world,” Justice Kagan said at a February Supreme Court hearing.

Algorithms are also at the center of the lawsuit being considered by the Supreme Court.

The lawsuit has its roots in the November 2015 terrorist attacks in Paris, where 130 innocent people and seven terrorists died.

One of those killed in the attack was a 23-year-old American student, Nohemi Gonzalez.

Her family believes that YouTube, the video platform owned by Google, is partly responsible for what happened, because its algorithms recommend videos of the Isis terrorist organization to users who show interest in them. The lawsuit does not, however, claim that the terrorists behind the Paris attack were radicalized through YouTube.

Family members of Nohemi Gonzalez listened to a statement from attorney Eric Schnapper outside the Supreme Court House in Washington in late February.

At the February hearing, the Gonzalez family’s lawyer, Eric Schnapper, told the Supreme Court that the lawsuit focuses specifically on YouTube’s recommendation feature.

It is the system in which YouTube’s algorithm suggests the next videos to a viewer based on the current video and the user’s previous viewing habits.

According to the Gonzalez family’s interpretation, Section 230 does not protect the platform in this case, because the algorithm that recommends videos is YouTube’s own active speech, not the action of an external user independent of the platform.

According to Schnapper, the videos recommended by the algorithm are comparable to an email in which the recipient is urged to watch a particular video.

The Supreme Court justices were mostly skeptical of Schnapper’s argument. They seemed especially concerned about its breadth.

According to the lawyer, Section 230 also does not protect the algorithms that, for example, Facebook or Twitter use to build each user’s personal feed.

Those, too, Schnapper argued, are active recommendations by the social media companies – that is, the companies’ own speech.

Algorithms are everywhere on the modern internet, so the interpretation pushed by the Gonzalez family would change the foundations of the entire internet, at least in the United States.

The Supreme Court justices may not be the greatest experts on the matter, but even they noted at the hearing that the internet’s vast sea of content must be organized somehow.

The other party to the lawsuit, YouTube’s parent company Google, took this argument even further.

“Helping users find the needle in the haystack is existentially necessary on the Internet,” said Google’s lawyer Lisa Blatt in the February session.

In court, Blatt painted a stark picture of what the loss of this protection would mean.

According to her, it would force internet companies to choose between an extremely strict moderation line, in which even the slightest questionable content is scrubbed from the platform, and accepting all content uploaded to the platform, no matter how objectionable.

The justices also seemed skeptical about the breadth of Blatt’s interpretation. According to Blatt, Section 230 protects not only the general algorithms of large internet companies but even an algorithm that openly highlights or favors content produced by Isis.

The Supreme Court is expected to issue its decision in the case before its term ends in June.

The court is also currently considering another, closely related case. It asks the justices to decide whether companies such as Twitter, Facebook and Google can be sued by the family of a person who died in another terrorist attack, on the grounds that the companies allowed Isis to operate on their platforms.

If the justices decide that the companies cannot properly be sued in that case, the Gonzalez family’s case is likely to collapse as well.

In any case, the debate over the responsibilities of internet companies will not end with the Supreme Court’s decision.

In the opinion of many, the regulation of internet companies should be swiftly reformed in the United States as well. Last year, for example, the European Union approved the new Digital Services Act, which expands and specifies the responsibilities of internet companies.

At the Gonzalez v. Google hearing, the Supreme Court justices also noted several times that a legislative solution might serve the situation better than a judicial decision.

US President Joe Biden argued in a January op-ed in The Wall Street Journal that Section 230 should be thoroughly reformed.

“Major technology companies must take responsibility for the content they spread and the algorithms they use,” Biden wrote.

A representative of the Biden administration also spoke at the Gonzalez v. Google hearing in February, arguing – according to The New York Times – largely in line with the Gonzalez family’s view.

The mother and stepfather of Nohemi Gonzalez, who died in the terrorist attack, in front of the Supreme Court building in Washington in late February.

In early March, CNN reported that bipartisan support for Section 230 reform is growing in the US Senate.

“Here’s a message for big tech companies: reform is coming. I can’t predict that it will happen in the next couple of weeks or months, but we are hearing a growing consensus and demand from the American people that we need to act across party lines,” Democratic senator Richard Blumenthal said, according to CNN.

Reforming the old internet law is a significant matter, especially as technological development is already speeding deeper into the world of artificial intelligence applications. There was a foretaste of that during the Gonzalez v. Google hearing, when conservative justice Neil Gorsuch seemed, almost in passing, to place conversational AI applications such as ChatGPT outside the protection of Section 230.

“Artificial intelligence generates poetry, it generates polemics already today. That can be considered content that goes beyond selecting, analyzing or categorizing content. And it is not protected,” Gorsuch said.
