
Source: ZDNet as of 21-04-2020


Technology: The development of a charter for inclusive AI is a first step before the creation, in the coming months, of an international GEEIS-AI label for companies.

Intelligent robots and facial recognition technology are regularly cited in public debates around algorithmic biases. The Arborus Association, which works for gender equality, has looked into the issue and, in collaboration with Orange, has developed a seven-point charter for artificial intelligence that is inclusive and does not present bias or stereotypes from its conception. Six large companies and one international SME are among the first signatories, including EDF, Danone and Sodexo.

While artificial intelligence is “a tremendous lever for development and progress in education, health, the environment and industry,” the technology can also be “a factor of gender inequality,” an Arborus spokesperson told ZDNet. “As of today, we are opening this charter for signature by all companies, regardless of their size, sector of activity or nationality. The charter is therefore open to all companies and organisations that wish to sign it, via an online platform created for the occasion.”

Among the principles, the charter mentions “promoting diversity in teams working on artificial intelligence-based solutions” and “organising to assess and respond to any form of discrimination that may result from biased or stereotypical data.” Today, nearly 12% of AI researchers worldwide are women, according to a study by Canadian company Element AI.

Learning from conception

The charter also advises “training to raise awareness of, and accountability for, stereotypes that can lead to discrimination among designers, developers and all stakeholders involved in building AI,” and to “ensure that suppliers are properly selected and evaluated in an iterative manner, so that the entire AI value chain is non-discriminatory.”

If algorithms can be problematic, it is because they are designed “in our image” and, “depending on the data that feeds them, can reproduce biases and stereotypes that already exist in society,” argues Delphine Pouponneau, director of diversity and inclusion at Orange. “These biases are the result of human intervention at the time of design, at the time of data collection, or in writing the code. If algorithms, as well as the choice of data used to feed them, are thought up, created and managed only by Caucasian men, machines will only translate and amplify a particular and therefore biased vision of society,” she says.
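The mechanism Pouponneau describes can be illustrated with a minimal sketch (the data and labels below are entirely hypothetical, not from the article): a naive model trained on historically skewed records simply reproduces the skew it was fed.

```python
from collections import Counter

# Hypothetical historical hiring records: the data itself is skewed,
# reflecting past human decisions rather than candidate merit.
training_data = ["hired_male"] * 90 + ["hired_female"] * 10

# A naive "model" that predicts the most frequent outcome in its
# training data reproduces whatever bias existed at collection time.
model_prediction = Counter(training_data).most_common(1)[0][0]

print(model_prediction)
```

The point is not the toy model itself but the pipeline: bias enters at data collection, and nothing downstream corrects for it unless the teams designing the system look for it.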

Orange says it is committed to this project in more ways than one. “First of all, we have been a French player in artificial intelligence for many years. We develop it in the areas of customer relations and finance, and we are beginning to exploit its potential in the supervision of our networks. Soon we will be able to do predictive maintenance to avoid incidents. Artificial intelligence is therefore one of the professions that we want to develop as a priority at Orange, with many recruitments planned in the coming years,” the group said in a statement.

A charter that lays the foundations for a future label

The European Union has also, in the past, reiterated its support for artificial intelligence applications considered reliable and ethical. In the context of Covid-19, the Council of Europe recently spoke on the subject through its “Recommendation on the Impacts of Algorithmic Systems on Human Rights”.

For Delphine Pouponneau, there is no need to further strengthen regulation of the sector. “My conviction is that we must be careful not to curb the dynamism of the sector, especially in the face of Chinese or American actors,” says the Orange executive, who supports “incentive-based and pedagogical, not coercive, approaches.”

This charter marks a first step before the creation of an international GEEIS-AI label in the coming months. “Based on the GEEIS (gender equality) label framework, we re-examined each of the criteria in light of the issues raised by AI. The challenge is to propose an HR and technical audit aimed at identifying biases,” says Arborus.
