Bruno Fortea Miraso
Prague, 24 July (EFE). The social gaps of the analog world are also being digitized at the hands of artificial intelligence (AI), since this technology often works with data that is biased on gender issues; the solution, according to the experts consulted by EFE, lies both in purging that data and in a legal imperative for AI to improve.
Harvard University researcher Alexandra Przegalinska is convinced that, once public authorities thoroughly examine artificial intelligence, they “will immediately see that, along with its potential, there may be a certain abuse of power.”
In this sense, she welcomes the pioneering legislation proposed by the European Commission, as “it considers AI from a risk perspective”, said the professor, who took part in the Women’s Leadership School organized by the Chinese technology company Huawei, which brought young talents from all over Europe together in Prague this week.
“There is hope associated with AI, and we should discover it. But there is also a risk associated with artificial intelligence, or with the misuse of artificial intelligence. And I think it’s good that we have this regulation,” she says.
The rules drawn up by Brussels grade uses of AI from those considered ‘low’ risk to those that clearly carry ‘high’ risk, a criterion that, according to Przegalinska, varies depending on the “impact on human life” that this technology has.
“Let’s think about a field like weather forecasting. It’s not something that’s going to hurt anyone if you tell them it’s going to rain. It’s a trivial problem compared to a situation where a biased algorithm says you have a disease that you don’t actually have, or where a very complicated algorithm tells you that you can’t get a loan and you don’t know why,” she says.
Women, the hardest hit
In the situations Przegalinska describes, women are often the most affected, given that their parameters are underrepresented in the databases with which algorithms operate, databases that therefore become a transmission belt for the very discrimination women already suffer in the analog world.
“It’s not the technology that’s the problem, it’s the people. There’s a saying in the AI world: ‘garbage in, garbage out’. So if the data is biased, there will certainly be bias in the system as well,” says the head of Cognitive Science at Tilburg University, Maria Postma, who also spoke with EFE.
The professor recalls that AI is an attempt to “create an artificial version of human intelligence”, for example in simultaneous language translation or in the autonomous driving of vehicles, so the original idea was, according to her, to “create a kind of simulation of the human brain”.
AI feeds on databases and, from this information, in turn generates algorithms with mathematical models that guide its actions.
“Suppose a database contains information about the profiles of job candidates based on decisions made in the past. And, in the past, many decisions were made with gender bias and female candidates were not selected for a given position,” she cites as an example.
And she continues: “If the system works with this information, it will use gender as a variable in its decision-making process, because the AI will think that, by excluding women, it will arrive at the same decisions that humans made in the past,” she warns.
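Neither expert cites specific code, but a minimal, purely illustrative sketch of the mechanism Postma describes could look like the following, using synthetic hiring records and a scikit-learn logistic regression; the data, feature names and model choice are assumptions made for this example, not anything drawn from the article.

```python
# Purely illustrative sketch (assumed example, not from the article):
# a model trained on historically biased hiring decisions learns gender
# as a predictive variable.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
experience = rng.uniform(0, 10, size=n)   # years of relevant experience
is_woman = rng.integers(0, 2, size=n)     # 1 = woman, 0 = man

# Historical labels: qualification matters, but equally qualified women
# were systematically less likely to be selected.
qualified = experience > 5
penalised = (is_woman == 1) & (rng.random(n) < 0.5)
hired = (qualified & ~penalised).astype(int)

# Trained on these biased records, the model picks up gender as a signal.
X = np.column_stack([experience, is_woman])
model = LogisticRegression().fit(X, hired)
print("weight on experience:", model.coef_[0][0])   # positive
print("weight on 'is_woman':", model.coef_[0][1])   # clearly negative
```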
Both Postma and Przegalinska maintain that solutions exist to stop this spiral of bias, which involve either having the systems “exclude the gender variable”, according to Postma, or “deliberately tampering with” AI’s mathematical models; Przegalinska proposes scrambling the data to “increase the randomness” of the algorithm.
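The two remedies mentioned can likewise be approximated in a short sketch: dropping the gender column, and shuffling it so it carries no signal, as one possible reading of Przegalinska’s “scrambling”. Again, this is an assumed illustration, not the experts’ actual method.

```python
# Purely illustrative sketch (assumed example, not the experts' actual code):
# two ways to blunt the bias learned from historically skewed hiring data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
experience = rng.uniform(0, 10, size=n)
is_woman = rng.integers(0, 2, size=n)
hired = ((experience > 5) & ~((is_woman == 1) & (rng.random(n) < 0.5))).astype(int)

# Mitigation 1 (Postma): exclude the gender variable from the features.
model_no_gender = LogisticRegression().fit(experience.reshape(-1, 1), hired)

# Mitigation 2 (one reading of Przegalinska's "scrambling"): shuffle the
# gender column so it carries no usable signal about the label.
X_scrambled = np.column_stack([experience, rng.permutation(is_woman)])
model_scrambled = LogisticRegression().fit(X_scrambled, hired)
print("weight on shuffled gender column:", model_scrambled.coef_[0][1])  # near zero
```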
For Berta Herrero, the Spanish director of the Women’s Leadership School, the work of these two experts shows that “ethics experts are still needed” in the field of AI.
The Spanish representative at the school, Maitane González, a 22-year-old from Bilbao who has just graduated in law, is in favor of “preventive regulation” of new technologies such as AI. EFE
(This report is part of a series supported by Huawei. EFE’s editorial content is independent of this company’s positions.)