Buenos Aires ombudsman proposes creation of National Algorithm Agency
He made the proposal at the Supreme Court hearing in the Natalia Denegri case; the aim is to control the impact of algorithms on human rights
In the middle of a public hearing convened by the Supreme Court on Natalia Denegri's right to be forgotten, and on her request that Google block access to content about her past, Guido Lorenzino, ombudsman of the province of Buenos Aires and one of the speakers, called on "the national legislature to consider a bill guiding the creation of a National Algorithm Agency". At the end of his presentation, he noted that the agency would serve to "establish an enhanced dialogue and interaction with technology companies and bring their practices in line with human rights regulations".
As the ombudsman explained to LA NACION: "We took the idea from comparative European experience. It would be an independent state agency through which dialogue with technology companies could be generated and sustained, and in which algorithms could be registered and classified according to their use and purpose, so as to evaluate what they are for and how they are executed. Even though algorithms are often secret, we would at least know how an algorithm weights results, see its operating instructions and track its evolution," he noted.
According to the request that the Ombudsman's office submitted to the National Congress, "the agency would regulate the use of algorithms and personal data, ensure respect for human rights within the framework of the Fourth Industrial Revolution, and work in dialogue with Big Tech."
An algorithm is a precise and unambiguous sequence of instructions that produces a result; a recipe is a useful analogy. It takes inputs (data and signals) and produces an output that users then see after a search, in a social network timeline, or in a route suggested by a map app. Knowing the "ingredients" behind it helps us understand why one result appears instead of another.
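The idea that hidden "ingredients" and weights decide which result appears first can be sketched with a toy ranking function. This is purely illustrative: the feature names (`clicks`, `recency`) and the weight values below are invented, and real search or feed algorithms are vastly more complex and, as the article notes, usually secret.

```python
def rank_results(results, weights):
    """Score each result by a weighted sum of its features and
    return the results sorted from highest to lowest score."""
    def score(result):
        return sum(weights.get(feature, 0) * value
                   for feature, value in result["features"].items())
    return sorted(results, key=score, reverse=True)

# Two hypothetical results competing for the top spot in a search.
results = [
    {"title": "Old viral video",  "features": {"clicks": 0.9, "recency": 0.1}},
    {"title": "Recent biography", "features": {"clicks": 0.4, "recency": 0.9}},
]

# Weights favouring click popularity rank the old video first...
popular_first = rank_results(results, {"clicks": 1.0, "recency": 0.2})
# ...while weights favouring recency put the biography first.
recent_first = rank_results(results, {"clicks": 0.2, "recency": 1.0})
```

The point of the sketch is that the ordering users see is entirely determined by the weights, which is exactly the kind of information a registry of algorithms would aim to make inspectable.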
Among its other purposes, they noted that the agency could "verify" that "there is no discrimination in the use of algorithms". "In fact, there have been complaints that some ride-hailing apps cancel trips when they involve certain areas. The same goes for search weighting: what is the criterion by which a search for Natalia Denegri returns, among the first results, a video from the Coppola case rather than the career she has built in the US? All of these questions need answers, particularly to watch over what human rights law calls 'suspect categories', which suggest discrimination or segregation. It would be valuable if the agency could monitor these issues and receive complaints, claims or requests about them," they explained.
Earlier, and in connection with the Denegri case, he drew a comparison with the lawsuit brought by Cristina Fernández de Kirchner, who sued Google for describing her as a "thief of the Argentine nation". "The lack of algorithmic transparency made it possible for Google to index her name against false content, which obviously harmed her honour and dignity, as well as democratic institutions and the presidential office. The cases of Natalia Denegri and Cristina Kirchner are two similar cases that lead us to one conclusion: either we put limits on algorithms, or algorithms will determine our rights and institutions," Lorenzino said.
Source: La Nacion
John Cameron is a journalist at The Nation View specializing in world news and current events, particularly in international politics and diplomacy. With expertise in international relations, he covers a range of topics including conflicts, politics and economic trends.