
The United States Supreme Court will rule on social networks’ responsibility for their users’ content

The Supreme Court of the United States today opened a term packed with important cases affecting affirmative action in universities, the environment, electoral rules, and discrimination against gay people, among many other issues. Just as oral arguments resumed in person for the first time since the outbreak of the pandemic, the Supreme Court announced that it would admit new cases. Two of them will measure the responsibility of big technology companies for the content their users publish on social networks.

Google, Facebook, Twitter, Amazon and other companies that run social networks thus face a major judicial test over the moderation of their content, an issue that has sparked controversy and is subject to various state-level regulations. One of the admitted cases, Gonzalez v. Google (brought by Reynaldo Gonzalez and others), will examine to what extent Google can be held responsible for the Bataclan massacre in Paris for allowing the spread of videos inciting Islamist violence on its YouTube platform. The second involves Twitter, Google and Facebook in connection with an attack on an Istanbul nightclub in 2017 that left 39 people dead.

Those suing Google are relatives of Nohemi Gonzalez, a 23-year-old US university student and one of the 131 people killed by Islamic State militants in the series of attacks that shook Paris on November 13, 2015, striking the Bataclan concert hall and other places in the French capital. Gonzalez was murdered in the restaurant where she was dining that day. Lower courts rejected the claim, but the family appealed to the Supreme Court, which has now agreed to hear the case.

US law holds that Internet companies are not responsible for the content posted by their users, but the issue has become controversial for a variety of reasons. The perpetrators of several massacres have broadcast their attacks live. Online content has also become an object of political dispute: while Democrats condemn the extremist propaganda and conspiracy theories spread on the networks, Republicans complain about the content moderation policies practiced by some Big Tech companies, which they consider censorship.

Nohemi Gonzalez’s family argues that YouTube does not limit itself to the passive role of letting users watch what they choose; rather, its algorithm recommends videos based on each user’s history, so viewers of Islamist propaganda videos were served more content of that type, fueling their radicalization. They complain that the Google subsidiary, whose parent company is now Alphabet, allowed the spread of radical propaganda videos inciting violence.

“Whether Section 230 [the rule that in principle shields companies from liability for their users’ content] applies to these algorithm-generated recommendations is of great practical importance,” the family argued in its petition, noting that the platforms continually direct such recommendations at users. The victim’s family contends that, by allowing the dissemination of these videos, Google violated anti-terrorism law.

Google counters that the only link between the Paris attackers and YouTube is that one of them was an active user of the platform and once appeared in an ISIS propaganda video. “This Court should not lightly adopt a reading of Section 230 that threatens the basic organizational decisions of the modern Internet,” Google argues.

In the second case, lower courts held that Twitter, Facebook and Google must bear some responsibility for material spread in connection with the massacre at the Reina Club, an Istanbul nightclub attacked during its New Year’s Eve party at the end of 2016. That dispute, Twitter v. Taamneh, has also been accepted by the Supreme Court. This second case has nothing to do with algorithmic recommendations; rather, it asks whether a social network can be sued for alleged complicity in an act of terrorism when the platform hosted user content that expressed general support for the group behind the violence, even without mentioning any specific attack.

Both cases will be the first round in a broader battle over whether companies should enjoy immunity for their users’ content and how much leeway they have in their moderation policies. Several Supreme Court justices, including conservatives Clarence Thomas and Samuel Alito, had already expressed interest in taking up cases on Internet content moderation. Last March, when the court declined to hear a lawsuit against Facebook, Thomas noted in a dissenting opinion: “Assuming Congress does not step in to clarify the scope of Section 230, we should do so in an appropriate case.”

The court rules on the cases it admits each year by late June or early July, when the term ends.

