As European countries prepare to launch their COVID-19 contact tracing applications, debates are crystallizing around mass surveillance, restrictions on freedoms and the future use of data by artificial intelligence (AI) devices whose purposes remain unclear.
The pandemic has revealed the impotence of AI when it lacks masses of data, and its inability to cope with the new. Adorned with fantasies of superpower, the supposedly triumphant march of AI met an unexpected limit: reality. Only in the future, facing some SARS-CoV-20XX, could an AI enriched with the data collected today deploy its potential. It is important to draw lessons from this for the present. Two are essential. On the one hand, it is reality that defines and limits the operating capabilities of AI devices. On the other hand, AI calls for, as the condition of its own deployment, a system of digital capture and monitoring of human beings and interactions reduced to data, in which a new ecosystem can be anticipated. A democratic one?
The daily world of technophiles is punctuated by the appearance of AI devices designed to revolutionize the real. At least it was, before. Since then, the pandemic has shattered the illusion. With the exception of BlueDot, the AI that "predicted" the trajectory of the coronavirus, the enchanting universe of AI, and with it its predictive ability and transhuman efficiency, has disappeared from the radar. The world of AI has found itself, like the others, confined and dependent on the real world: the one on which it relies and without which it collapses. None of the reputable GAFAM laboratories anticipated the pandemic, and none provided a direct solution to it.
Were they at least able to minimize its effects? Digital tools occupied the space of lockdown and allowed work to be reorganized as telework. But what about the impact of the digital outside this confined space? The case of Amazon is exemplary. Algorithmic systems for sales, order tracking and preference matching continued to work, and purchases exploded due to the closure of shops. But concrete operations were literally confined to, and by, reality, from the production line to distribution, through logistics. It was therefore necessary to invest in the real, to engage human arms or, in a more mature environment, to robotize it.
In the same way, tracing applications are bounded by the material reality of testing: the availability of swabs and reagents, the capacity of laboratories, the speed of testing and the sensitivity of PCR tests. Without the availability and congruence of all these elements, the usefulness of contact tracing systems and their effects on the spread of the virus are marginal. This is the fundamental argument of their critics: under the rule of law, restrictions can be imposed on freedoms only if they are proportionate, limited to what is strictly necessary to act more effectively in the name of a right deemed superior, here the right to health. When the effectiveness of the digital tracing system is not demonstrated, and depends, moreover, on parameters external to it, the restrictions on fundamental freedoms (privacy, protection of personal data) are disproportionate to the stated objective.
Foreign experiments show that digital tracing systems and, more generally, AI devices can do little for us in a novel pandemic situation. But we, and our data, can do something for them: let them experiment, perfect themselves and then spread for other, uncontrolled purposes.
In the absence of a real impact on the control of the pandemic, AIs (applications, drones, facial recognition, etc.) do what they always know how to do: monitor, capture and cross-reference data that will be useful for their own deployment. Thus, in order to deploy, the tracing applications reuse the data necessary for their emergence: the data captured during a contact are re-engaged to initiate a new process that refines the overall performance of the system applied to these same data and would allow, this is its fertility, another intelligence of the pandemic.
This typical feedback-loop and self-referential operation illustrates one of the ways in which a new ecosystem emerges, a new organization of the world that organizes itself by ordering itself. Digital and AI devices are not means, more or less compliant with current norms, inserted into our daily world to achieve certain ends. They are means that are their own ends, in other words, objects that generate their uses, their own extension. If GAFAM's algorithmic systems, AI devices and tracking applications are of concern, it is not only because they could be used for purposes not controlled by democratic or social norms. It is also because, on the one hand, these uncontrolled uses are immanent to them, internal to their own functioning, and, on the other hand, because these different devices converge, intersect, cross-check and communicate with each other, interoperating according to their own norms thanks to a common binary language. Digitization calls for the extension of digitization and robotization, and reconfigures, via a network of AI objects (the Internet of Things, IoT), the reality that limits them. It is important to be aware of these mechanisms of convergence between AI objects in order to conduct a democratic debate about the society we desire, and to maintain control over its ordering according to norms chosen and desired by us.
According to the "second cybernetics" (cybernetics: literally, the science of government, the science of the information mechanisms of complex systems), born with the projects of automation and artificial intelligence, and with the distrust of "human government" that accompanied them.