Saturday, February 4, 2023

Twitter may need more than just good algorithms

Now that he’s back on Twitter, neo-Nazi Andrew Anglin wants someone to explain the rules to him.

The account of Anglin, founder of the infamous neo-Nazi website The Daily Stormer, was restored on Thursday. Anglin was one of several previously banned users who benefited from an amnesty offered by Twitter’s new owner, Elon Musk. The next day, Musk suspended Ye, the rapper formerly known as Kanye West, after he posted an image of a swastika with a Star of David.

“It was cool,” Anglin tweeted on Friday. “Whatever the rules are, people should follow them. All we need to know is what the rules are.”

That’s a question for Musk. Since the world’s richest man paid $44 billion for Twitter, the platform has struggled to define its standards on misinformation and hate speech, issuing conflicting statements and failing to fully address what researchers say is a worrying rise in hateful content.

As the self-styled “Chief Twit” is learning, running a social network with nearly 240 million daily active users requires more than good algorithms. Sticky situations often demand imperfect solutions: tough calls that ultimately must be made by a human being, and that are bound to upset someone.

Musk, a self-described free-speech absolutist, has said that he wants Twitter to be a global digital public square, but he also said he would not make major decisions about content or restore banned accounts before convening a “content moderation council” that would bring together diverse viewpoints.

He soon changed his mind after polling Twitter users, offering to restore the accounts of a long list of previously banned users, including former President Donald Trump, Ye, the satirical site The Babylon Bee, comedian Kathy Griffin and the neo-Nazi Anglin.

While Musk’s own tweets indicated that he would allow all legal content on the platform, Ye’s suspension showed that this is not entirely the case. According to Eric Goldman, a technology law expert and professor at the Santa Clara University School of Law, the swastika image posted by the rapper falls into the “lawful but awful” category that frequently vexes content moderators.

While Europe has implemented rules that force social media platforms to enact policies against misinformation and hate speech, Goldman stressed that, at least in the United States, looser rules allow Musk to run Twitter as he sees fit, however inconsistent his approach.

Goldman insisted, “What Musk is doing with Twitter is perfectly permitted under United States law.”

Pressure from the EU could force Musk to clarify his policies to ensure Twitter complies with new European legislation that takes effect next year. Last month, a senior EU official warned Musk that Twitter would need to step up its efforts to tackle hate speech and misinformation; failure to comply could result in hefty fines.

In another confusing move, Twitter announced in late November that it would end its policy prohibiting misinformation about COVID-19. A few days later, however, the company posted an update saying: “None of our policies have changed.”

On Friday, Musk revealed what he said was the story behind Twitter’s 2020 decision to limit the spread of a New York Post article based on information allegedly obtained from the laptop of Hunter Biden, an attorney and lobbyist who is President Joe Biden’s second son. Facebook also took steps to limit the article’s spread.

Twitter initially blocked the link to the article on its platform, citing concerns that it contained hacked material, but then-Twitter CEO Jack Dorsey later criticized the decision.

The material released by Musk included Twitter’s decision to remove a handful of tweets after receiving a request from the Biden campaign. The tweets contained nude photos of Hunter Biden shared without his consent, in violation of Twitter’s rules against revenge porn.

More than revealing nefarious conduct or collusion with Democrats, Musk’s revelations shed light on the kind of difficult decisions he will now face about content moderation.

“Difficult, confusing and thorny decisions” are inevitable, said Yoel Roth, Twitter’s former head of trust and safety, who resigned just weeks after Musk took ownership of the platform.

Roth said that, while the old Twitter wasn’t perfect, it strove to be transparent with users and consistent in enforcing its rules. That changed with Musk, he added, speaking at a recent Knight Foundation forum.

“When push comes to shove, when you buy a $44 billion thing, it’s ultimately your decision how to run the $44 billion thing,” Roth said.

While most attention has focused on Twitter’s decisions in the United States, activists with a campaign called #StopToxicTwitter say the layoffs of many people working in content moderation are affecting other parts of the world as well.

“We are not talking about people not being flexible enough to listen to hurt feelings,” said Thenmozhi Soundararajan, executive director of Equality Labs, which works to combat caste-based discrimination in South Asia. “We’re talking about stopping dangerous genocidal hate speech that could lead to mass atrocity.”

Soundararajan’s organization is part of Twitter’s Trust and Safety Council, which has not met since Musk took over. She added that “millions of Indians are horrified,” not knowing which accounts will be restored, and that Twitter has stopped responding to concerns raised by the group.

“So what happens if there is another call for violence? Do I have to tag Elon Musk and hope he deals with the pogrom?” Soundararajan asked.

Hate speech and racial slurs surged on Twitter after Musk bought the company, as some users tried to test the new owner’s limits. Since then, the number of tweets containing hateful terms has continued to rise, according to a report released Friday by the Center for Countering Digital Hate, a group that tracks hate and extremism online.

Musk says Twitter has reduced the reach of tweets containing hate speech, so that users are unlikely to see them unless they seek them out. But that hasn’t satisfied the center’s executive director, Imran Ahmed, who called the increase in hate speech “an abysmal failure to live up to its own self-proclaimed standards.”

Since Musk’s takeover and the layoffs that gutted Twitter’s staff, researchers who previously reported harmful hate speech or misinformation to the platform say their reports go unanswered.

Jesse Littlewood, vice president of campaigns for the advocacy group Common Cause, said his organization contacted Twitter last week about a tweet by Republican Representative Marjorie Taylor Greene alleging voter fraud in Arizona. Musk had restored Greene’s personal account after it was banned from Twitter for spreading misinformation about COVID-19.

This time, Twitter acted swiftly, telling Common Cause that the tweet didn’t violate any rules and would remain on the platform, even though Twitter’s policies state that false or misleading claims about election results may be labeled or removed.

Twitter did not provide any explanation to Littlewood as to why it was not following its rules.

“I find it quite confusing,” Littlewood said.

Twitter did not respond to messages seeking comment for this story. Musk has defended the platform’s sometimes jerky moves since taking over, arguing that mistakes are inevitable as it evolves. “We will do a lot of stupid things,” he tweeted.

For many of Musk’s online fans, the clutter is a feature, not a bug, under the site’s new ownership, and a reflection of the free-speech mecca they hope Twitter will be.

“Loving Elon’s Twitter so far,” one user tweeted. “Anarchy is glorious!”

Nation World News Desk