In France, and in Europe in general, we have some of the most protective legislation when it comes to data and privacy. However, the development of new technologies and their new uses threatens to jeopardize these legal protections. To simplify, one could say that there are combined phenomena. On the one hand, more and more data is being collected in different ways as we surf the Internet and as we use our mobile phones, and even more will be collected in the future because of what is known as the Internet of Things; large amounts of data are collected, often without the knowledge of the people involved. So data is collected massively. Besides that, there is a growing interest from various individuals and organizations in exploiting these data; that is called data analysis, and data analysis and everything involving big data is a very public issue these days.

At that point, one can really notice the tension between privacy protection as it stands under current law and what people intend to do in practice. For example, the logic of big data is to collect as much data as possible in order to analyze it later and infer new knowledge, but on matters and in directions that are not necessarily known at the time the data is collected. That goes completely against current law: in France, and generally in Europe, the collection of personal data must serve specific purposes which are defined at the time of collection, and once these objectives are met, the data must be deleted. So this is the complete opposite of what people who analyze data on a large scale intend to do.

I'm not suggesting that information technologies are inevitably a threat to privacy; we can also design new technologies aimed at protecting privacy, to fill the gap between the law and the big data and data analysis practices I mentioned earlier. So ultimately this can be considered a subject of research in information technology. For example, to address this gap around large-scale data collection and analysis, anonymization techniques may be designed. It is in fact explicit in the law that personal data protection rules do not cover anonymous data, because by definition anonymous data is not personal. So when exploiting data massively, it is convenient to first anonymize it. But data anonymization is a very complex matter: what anonymous data really means is a very elusive thing to define. So today a leading research topic in the field concerns robust anonymization techniques.

The right to control our own data is also recognized in France and in Europe in general; this is sometimes called informational self-determination. Like other rights, this one is being undermined in several ways, particularly through the widespread use of algorithms nowadays, especially predictive or decision-making algorithms. That concerns a wide range of fields, and even our everyday activities. For example, when it comes to information, it is well known that today more and more people get their news from social media, but we don't always realize that the information obtained through social media is ranked by algorithms, and we don't necessarily know how these algorithms work. This can have a major impact on the right to be informed, or raise concerns about censorship. It was even brought up during the U.S. presidential election: some social media platforms were accused of bias, of ranking information in a way that favored one candidate over another, and these accusations are reinforced by the fact that we don't know precisely how these algorithms work. But this is only one example among many.
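As a toy illustration of the kind of opaque ranking just described, consider the minimal sketch below; it is not any platform's actual algorithm, and the item fields and weight values are invented for the example. It shows how a handful of hidden weights fully determines what each user sees first, and why, without access to those weights, accusations of bias are so hard to confirm or refute.

```python
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    recency: float      # 0..1, newer is higher
    engagement: float   # 0..1, normalized likes/shares
    affinity: float     # 0..1, closeness to the user's past clicks

# Hypothetical hidden weights: users never see these values,
# yet they decide the order of the whole feed.
WEIGHTS = {"recency": 0.2, "engagement": 0.5, "affinity": 0.3}

def score(item: Item, w: dict) -> float:
    """Weighted linear score used to order the feed."""
    return (w["recency"] * item.recency
            + w["engagement"] * item.engagement
            + w["affinity"] * item.affinity)

def rank_feed(items: list, w: dict) -> list:
    return sorted(items, key=lambda it: score(it, w), reverse=True)

items = [
    Item("Candidate A rally", recency=0.9, engagement=0.4, affinity=0.2),
    Item("Candidate B speech", recency=0.5, engagement=0.8, affinity=0.7),
]

print([it.title for it in rank_feed(items, WEIGHTS)])
# A small, invisible change to the weights flips which story comes first:
print([it.title for it in rank_feed(items,
      {"recency": 0.8, "engagement": 0.1, "affinity": 0.1})])
```

With the first set of weights the second item ranks first; with the second set the order flips, even though nothing visible to the user has changed.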
Regarding employment, for instance, it is known that some companies in the US use algorithms to rank job applications, which allows them to hire people without having to interview everyone. Regarding insurance, we may receive premium offers calculated by user-profiling algorithms, and the same goes for bank loans. More seriously, perhaps, there is an emerging use of this kind of algorithm in police and judicial matters. On the police side, this is what is known as predictive policing. Many people saw the movie Minority Report; today that kind of dystopian scenario is conceivable with predictive policing, which is already implemented in certain counties in the United States. It is used to anticipate the areas, and even the days of the week, at greatest risk of crimes or misdemeanors, so that police forces can be deployed at those times and in those places based on the algorithms' output. But then again there are related risks, such as the risk of discrimination and of stigmatization of certain populations. So these rather opaque uses of algorithms are likely to create new risks and to jeopardize certain rights, like those related to non-discrimination and fair treatment, not to mention the right to be informed.

I have just spoken about decision-making algorithms, but there are other situations in which source code is embedded in systems, such as trains, subways, planes and, ever more frequently, self-driving cars. There, source code makes major decisions, and in those cases it can even be a matter of life and death, as a self-driving car decides whether to brake, accelerate or turn. Therefore the questions we face today concern liability. Who is responsible in case of malfunction? When dealing with gigantic software, including millions of lines of code provided perhaps by a range of actors, how do we determine whether the malfunction or the damage was caused by a given software component, or introduced by a particular subcontractor or supplier? Again, these are complicated issues involving legal notions like liability and compensation, as well as technical issues, since we must be able to analyze the relevant logs, pinpoint where the error occurred, and thereby identify the components and suppliers that must be held responsible.

All of this shows the linkage between legal and technical matters: there are many difficult questions about software, and the technical tools that we currently use on a daily basis can raise complex questions concerning the law, and even more broadly one can argue that these are societal or even ethical issues. We hear more and more about robotics in social contexts, in the medical field or even in military activity, which raises ethical questions. It is therefore crucial that not only technicians and policymakers understand these issues; the public in general must comprehend them as well, and most importantly, of course, the younger generations, the high school students. It really is an essential point in education today to make it clear that these digital tools and new services bring up truly complex issues: legal and political issues, and also, in a broader sense, ethical issues concerning the treatment of large quantities of data.

I referred earlier to two technical solutions, saying that anonymization might be one of them; the other is transparency, particularly transparency of algorithms. This is a subject that is driving significant research work and that raises highly important questions. For example, take an algorithm used to screen applicants and determine which candidates will be selected for a given position: how do we prove, how do we make sure, that this algorithm won't lead to discrimination? These are extremely complex questions, and to answer them it is not enough to disclose the source code; we must be able to analyze and understand it.
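One concrete form such an analysis can take, offered here only as a minimal sketch and not as the speaker's method, is a statistical audit of the screening algorithm's outcomes: compare selection rates across groups, a so-called demographic-parity check. The decision records and group labels below are entirely hypothetical.

```python
from collections import defaultdict

# Hypothetical screening decisions: (group, selected) pairs that an auditor
# might reconstruct from the algorithm's logged inputs and outputs.
decisions = [
    ("group_a", True), ("group_a", False), ("group_a", True), ("group_a", True),
    ("group_b", False), ("group_b", False), ("group_b", True), ("group_b", False),
]

def selection_rates(decisions):
    """Selection rate per group: number selected / number of applicants."""
    totals, selected = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        selected[group] += int(ok)
    return {g: selected[g] / totals[g] for g in totals}

rates = selection_rates(decisions)
print(rates)  # here: {'group_a': 0.75, 'group_b': 0.25}

# Demographic-parity gap: a large gap is a red flag that calls for closer
# scrutiny of the algorithm, not proof of discrimination by itself.
gap = max(rates.values()) - min(rates.values())
print(f"parity gap = {gap:.2f}")
```

An outcome audit of this kind complements, rather than replaces, inspection of the source code: it can flag disparities even when the code itself is too large or too opaque to read directly.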
There are now movements around what is called explainable AI, or XAI: how to manage to explain the results of an algorithm, especially the algorithms used in artificial intelligence, those based on machine learning. Nowadays we often use machine-learning algorithms that are able to make predictions, or to establish connections and classifications, that are extremely subtle and sometimes highly accurate, but whose results we are unable to explain. That raises major challenges for those who use the results of these algorithms, and evidently it also raises certain societal and political issues: can we trust these algorithms in situations where their results will be used to make important decisions that affect individuals? This too is an emerging topic today, and I have no doubt that it will become a major issue over the next decade.
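To give one minimal, hedged illustration of what an explanation technique can look like (a toy sketch, not a description of any particular XAI method): permutation importance treats the model as a black box and measures how much its accuracy drops when one input feature is shuffled, which gives a rough indication of which features the model actually relies on. The `black_box_predict` function and the data below are invented for the example.

```python
import random

# Invented dataset: each row has three features; labels depend on features 0 and 2.
random.seed(0)
rows = [[random.random() for _ in range(3)] for _ in range(200)]
labels = [1 if r[0] + 0.1 * r[2] > 0.5 else 0 for r in rows]  # feature 1 is irrelevant

def black_box_predict(row):
    """Stand-in for an opaque model whose internals we cannot read."""
    return 1 if row[0] + 0.1 * row[2] > 0.5 else 0

def accuracy(rows, labels):
    return sum(black_box_predict(r) == y for r, y in zip(rows, labels)) / len(rows)

def permutation_importance(rows, labels, feature):
    """Accuracy drop when one feature column is shuffled across rows."""
    baseline = accuracy(rows, labels)
    shuffled_col = [r[feature] for r in rows]
    random.shuffle(shuffled_col)
    permuted = [r[:feature] + [v] + r[feature + 1:] for r, v in zip(rows, shuffled_col)]
    return baseline - accuracy(permuted, labels)

for f in range(3):
    print(f"feature {f}: importance ~ {permutation_importance(rows, labels, f):.3f}")
# feature 0 should dominate, feature 1 should be near zero.
```

The open problem the speaker points to is precisely that real learned models are not this simple: their accuracy can be high while no short, human-readable account of an individual decision exists, which is why explainability has become a research field in its own right.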