Saturday, October 16, 2021

Federal election 2021: why we shouldn’t always rely on ‘good’ political bots

During the 2019 federal election campaign, concerns about foreign interference and dreaded “Russian bots” were rife. In contrast, throughout the 2021 election cycle, new political bots are being touted for their potentially helpful contributions.

From detecting online toxicity to predicting election outcomes, political bot makers are experimenting with artificial intelligence (AI) for automated analysis of social media data. These types of political bots can be framed as “good” uses of AI, but even when they are helpful, we need to be critical of them.

The cases of SAMbot and Polly can help us understand what to expect and demand when people choose to use AI in their political activities.

SAMbot was created by Areto Labs in partnership with the Samara Centre for Democracy. It is a tool that automatically analyzes tweets to assess harassment and toxicity targeted at political candidates.

Advanced Symbolics Inc. has deployed a tool called Polly to analyze social media data and predict who will win an election.

Both are attracting media attention and impacting election coverage.

We know very little about how these tools work, yet we rely heavily on them because they are being used by non-partisan players. These bots are setting the stage and the standard for how such AI will be used going forward.

People make bots

It’s tempting to think of SAMbot or Polly as friends, giving us a sense of the confusing mess of political nonsense on social media. Samara, Areto Labs and Advanced Symbolics Inc. all promote what their bots do, the data their bots analyze and the findings their bots uncover.

SAMbot is depicted as an adorable robot with big eyes, five fingers on each hand, and a nametag.

Polly is introduced as a woman. However, these bots are still tools that require humans to use them. People decide what data to collect, what kind of analysis is appropriate and how to interpret the results.

But when we personify these bots, we risk losing sight of the agency and responsibility of bot creators and bot users. We need to think of these bots as tools used by people.

The black box approach is dangerous

AI is a catch-all phrase for a wide range of technology, and that technology is evolving. Explaining how these systems work is a challenge even in long academic articles, so it is not surprising that most political bots come with little information about how they work.

Bots are black boxes, meaning their inputs and operations aren’t visible to users or other interested parties, and right now bot makers are mostly asking us to take it on faith: “It’s doing what we want, trust us.”

The problem is that what goes on in those black boxes can be extremely varied and messy, and small choices can have massive knock-on effects. For example, Jigsaw’s (Google) Perspective API, which aims to identify toxic comments, notoriously and unintentionally incorporated racist and homophobic biases into its scoring.


It was only when people started asking questions about unexpected results that Jigsaw discovered and fixed the issues.
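To make the black box a little more concrete, the sketch below (in Python) shows what requesting a toxicity score from the Perspective API typically looks like from the outside. It is only an illustration based on Google’s public documentation, not SAMbot’s or Polly’s actual pipeline, and the API key and example tweet are placeholders.

```python
# Illustrative sketch: asking Google Jigsaw's Perspective API to score a piece of
# text for toxicity. This is not SAMbot's or Polly's pipeline; it only shows how
# little the caller sees of what happens inside the black box.
import requests

PERSPECTIVE_API_KEY = "YOUR_API_KEY"  # placeholder credential
ANALYZE_URL = (
    "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"
    f"?key={PERSPECTIVE_API_KEY}"
)


def toxicity_score(text: str) -> float:
    """Return Perspective's summary TOXICITY probability (0.0 to 1.0) for `text`."""
    payload = {
        "comment": {"text": text},
        "languages": ["en"],
        "requestedAttributes": {"TOXICITY": {}},
    }
    response = requests.post(ANALYZE_URL, json=payload, timeout=10)
    response.raise_for_status()
    result = response.json()
    return result["attributeScores"]["TOXICITY"]["summaryScore"]["value"]


if __name__ == "__main__":
    # Hypothetical tweet: the API returns a single number and nothing about the
    # training data, labels or thresholds that produced it.
    print(toxicity_score("Your platform is a disaster and so are you."))
```

Everything that determines that number, the training data, the annotators and the thresholds, stays on the provider’s side, which is exactly why the questions below matter.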

When we see new political bots, we need to establish a base set of questions to ask. We must develop digital literacy skills so that we can question the information we see on our screens.

Some questions we should ask

What data is being used? Does it really represent the population we think it does?

SAMbot only analyzes tweets that mention current candidates, and we know that better-known politicians tend to attract higher levels of negativity. The SAMbot website makes this clear, but much of the media coverage of its weekly reports throughout this election cycle has missed that nuance.

Polly is used to analyze social media content, but that data is not representative of all Canadians. Advanced Symbolics Inc. works hard to make the Canadians in its analysis reflect the general population, but people who never post on social media are still missing. This means there is an unavoidable bias that needs to be explicitly acknowledged so we can contextualize and interpret the findings.
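To make that sampling bias concrete, here is a small illustrative sketch in Python. The numbers are invented placeholders standing in for census figures and a platform sample; they are not Polly’s data.

```python
# Illustrative sketch: comparing who appears in a social media sample against
# population benchmarks. All figures are made-up placeholders, not real census
# data and not anything produced by Polly.

# Hypothetical share of each age group in the general population.
population_share = {"18-34": 0.27, "35-54": 0.33, "55+": 0.40}

# Hypothetical share of each age group among accounts in a social media sample;
# people who never post are invisible to the sample entirely.
sample_share = {"18-34": 0.45, "35-54": 0.35, "55+": 0.20}

for group, pop in population_share.items():
    gap = sample_share[group] - pop
    print(f"{group}: sample {sample_share[group]:.0%} vs population {pop:.0%} "
          f"({gap:+.0%} points)")
    # A persistent shortfall for a group means the tool cannot "hear" those
    # voters at all, no matter how cleverly it weights the posts it does see.
```

Reweighting can shrink gaps like these, but it cannot conjure up the views of people who never post, and that is the bias against which the findings need to be read.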

How was the bot trained to analyze the data? Are there regular checks to make sure the analysis is still doing what the creators intended?

Each political bot can be designed very differently. We should look for a clear explanation of what was done and how bot makers or users check that their automated tool is actually on target (validity) and consistent (reliability).

The training procedures used to develop SAMbot and Polly are not explained in detail on their respective websites. Methodological details have been added to the SAMbot website throughout the 2021 election campaign, but they are still limited. In both cases you can find a link to a peer-reviewed academic article that explains part of their approach, but not all of it.

While that is a start, linking to complex academic articles can make it difficult to really understand the tool. Plain-language explanations would help.

Some additional questions to consider: How do we know what counts as “toxic”? Are humans checking the results to make sure they’re still on target?
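One way to answer that last question, offered purely as an illustration and not as a description of how SAMbot or Polly are actually audited, is a periodic spot check of the bot’s calls against human reviewers, sketched below in Python with hypothetical labels.

```python
# Illustrative sketch of a basic validity spot check: compare a bot's toxic /
# not-toxic calls against human reviewers' labels on the same sample of tweets.
# The labels below are hypothetical placeholders, not SAMbot or Polly output.

human_labels = ["toxic", "ok", "ok", "toxic", "ok", "toxic", "ok", "ok"]
bot_labels   = ["toxic", "ok", "toxic", "toxic", "ok", "ok", "ok", "ok"]

matches = sum(h == b for h, b in zip(human_labels, bot_labels))
agreement = matches / len(human_labels)
print(f"Bot agrees with human reviewers on {agreement:.0%} of the sample")

# A reliability check repeats this over time: re-score the same tweets after each
# model update and flag any drift, so "toxic" keeps meaning the same thing from
# one weekly report to the next.
```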

A media microphone is pictured as Liberal leader Justin Trudeau prepares to take questions during a campaign stop in Montreal. We need to ask questions not only of our political leaders, but also of the creators of political bots. (Canadian Press/Sean Kilpatrick)

Next steps

SAMbot and Polly are tools created by non-partisan entities with no interest in sowing confusion or influencing who will win the election on Monday. But the same tools can be used for very different purposes. We need to know how to identify and critique these bots.

Whenever a political bot, or indeed any type of AI, is employed in politics, we need to know how it was built and tested.

It is important that we set expectations early for transparency and clarity. This will help everyone develop better digital literacy skills and allow us to differentiate between reliable and unreliable use of these types of tools.

This article is republished from The Conversation. Read the original article.
