Tuesday, August 9, 2022

The Siege of the Sexist Algorithm

The social gaps of the analog world are being digitized at the hands of artificial intelligence (AI), since this technology often works with data that is biased on gender issues. According to the experts consulted by EFE, the way to purge those biases is to improve how AI itself works and to demand that improvement through legal requirements.

Harvard University researcher Alexandra Przegalinska is convinced that, once public officials investigate artificial intelligence in depth, they “will immediately see that perhaps there is a certain abuse of power along with its potential.”

In this sense, she welcomes the pioneering legislation proposed by the European Commission, since it “considers AI from a risk perspective,” said the professor, who attended the School of Women’s Leadership organized by the Chinese technology company Huawei, which that week brought together young talent from across Europe in Prague.


“There is hope associated with AI, and we should uncover it. But there is also a risk associated with the misuse of artificial intelligence. And I think it’s good that we have this regulation,” she said.

The rules drawn up by Brussels grade uses of AI from those considered ‘low’ risk to those that clearly carry ‘high’ risk, a classification that varies, according to Przegalinska, depending on how the technology “impacts on human life.”

“Think of a field like weather forecasting. It’s not going to hurt anyone if you tell them it’s going to rain. That is a trivial problem compared to a situation where a biased algorithm says that you have a disease you don’t really have, or where a very complex algorithm tells you that you can’t get a loan and you don’t know why,” she says.

Women, more affected


In these situations, which Przegalinska points out, women are often the most affected, since their parameters are underrepresented in the databases on which algorithms operate; the algorithms therefore become a transmission belt for the very discrimination that women already suffer in the analog world.

“The technology is not the problem; people are. There is a saying in the AI world: ‘garbage in, garbage out’. So, if the data is biased, there will certainly be a bias in the system as well,” says Maria Postma, head of the cognitive science field at Tilburg University, who also spoke with EFE.
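As a rough illustration of Postma’s point (not taken from the article), here is a minimal, hypothetical sketch in Python: a simple classifier trained on data in which one group is heavily underrepresented and distributed differently ends up noticeably less accurate for that group. All group names, sizes, and numbers below are invented for the example.

```python
# Toy illustration of "garbage in, garbage out" (hypothetical data, not from the article):
# a model trained mostly on group A performs worse for the underrepresented group B.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def make_group(n, shift):
    # Two features; the label depends on their sum relative to a group-specific threshold.
    X = rng.normal(loc=shift, scale=1.0, size=(n, 2))
    y = (X.sum(axis=1) + rng.normal(scale=0.5, size=n) > 2 * shift).astype(int)
    return X, y

# Group A dominates the training data; group B is underrepresented ("biased" data).
Xa_train, ya_train = make_group(5000, shift=0.0)
Xb_train, yb_train = make_group(100, shift=1.5)
model = LogisticRegression().fit(
    np.vstack([Xa_train, Xb_train]),
    np.concatenate([ya_train, yb_train]),
)

# Evaluate on equally sized held-out sets for each group.
Xa_test, ya_test = make_group(2000, shift=0.0)
Xb_test, yb_test = make_group(2000, shift=1.5)
print("accuracy, group A:", accuracy_score(ya_test, model.predict(Xa_test)))
print("accuracy, group B:", accuracy_score(yb_test, model.predict(Xb_test)))
```

Running this sketch shows a clear accuracy gap between the two groups, even though the model was never told to discriminate: the imbalance in the data alone is enough to produce a biased system.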

The professor recalls that AI is an attempt to “create an artificial version of human intelligence”, for example in the simultaneous translation of languages or the autonomous driving of vehicles, so that the original idea was, in her words, to “create a sort of simulation of the human brain.”
