Autonomous Intelligent Cars on the European intersection of liability and privacy




Interim observations


The regimes on (harmonized) product liability and on (non-harmonized) liability for damages caused by accidents in which motor vehicles are involved were not drafted with the emergence of Autonomous Intelligent Cars in mind. Most of these regimes use a concept of causality for determining and allocating liability. The fact that cars are becoming more ‘autonomous’, and thus rely less on the input of human operators, and that these cars are increasingly ‘intelligent’, able to operate in changing environments, poses a challenge in terms of determining causality. A higher level of autonomy may mean that it is harder to establish the exact cause of a damage-inflicting accident. In the sketched scenario, it is hard to prove the relationship between the failed self-update, the behavior of the driver and the damage to the bicyclists disobeying traffic rules. For a claimant, it will be no easy task to substantiate a claim against the producer of the car: to prove that the AIC was defective, and that the defect caused or contributed to the origination of the accident. Whereas in most EU jurisdictions the driver or holder of a vehicle will be held (strictly) liable for at least a portion of the damages caused by accidents in which AIC are involved, autonomous intelligence makes it difficult to establish the respective influences on the origination of the damage-causing event, and therefore the definitive apportionment of liability.

Uncertainty regarding the causation of damages could make developers reluctant to innovate in this field, since it will be harder to calculate liability risks; legal uncertainty has often been observed to form an impediment to innovation.60 The willingness of consumers to adopt AIC technology may also depend on whether liability for damages caused by autonomous vehicles they own or drive can be insured. The introduction of AIC thus poses challenges in terms of the current regulatory framework. Problems in this respect may interfere with the very objective of liability regimes: the fair apportionment of risks. A regulatory solution could be to install a regime of strict, no-fault liability of drivers or owners for damages caused by accidents in which AIC are involved, combined with a stricter product liability regime. With regard to the liability of owners and/or drivers of AIC, it would then be necessary to harmonize the respective liability regimes of the Member States, which has not proven successful on this specific topic in the past. Furthermore, an overly stringent liability regime for producers has also been observed to form an impediment to innovation.61



Besides optimizing the regulatory framework on liability for damages caused by accidents in which AIC are involved, a practical contribution to determining the causation of damages could be to equip these vehicles with tracing mechanisms that record and/or share the movements and control actions of these vehicles.

AIC tracing technology and information privacy

AIC tracing technology?


Insurance companies anticipate that the emergence of Autonomous Intelligent Cars could lead to a shift in motor car insurance. While the advent of AIC technology is promising in terms of increased safety on the roads, resulting in less damage to be covered, it is also thought that the severity of accidents in which AIC are involved, and thus the amount of damage per accident, may increase.62 They also observe that accidents caused by autonomous technology “would need extensive software and hardware analysis expertise in order to [establish] how and why it occurred”.63 One of the options for assessing the cause of an accident, and therefore for helping to answer the question of where liability lies, could be to equip vehicles with black boxes, or with telematics technologies connecting AIC to a dedicated infrastructure and/or to remote servers.64 The objectives of these types of technologies are, amongst other things, to record the movements of autonomous cars and the operational choices made by either the car itself or the driver controlling its movement, as well as data concerning events and objects in the vicinity of an autonomous vehicle. While black box technology would be broadly similar to the flight recorders that airplanes are currently equipped with, keeping the gathered data inside the vehicle and offering the potential for assessment, for instance after an accident has taken place, telematics technology may have wider applications. Telematics technology could not only be used for assessing errors and the causes of damage after accidents have happened; it could even contribute to preventing accidents from happening. Examples of telematics technology for that purpose are Vehicle-to-Vehicle communication (V2V), for collision prevention between cars, and Vehicle-to-Infrastructure communication (V2I), for “safety, mobility and environmental benefits” in general.65 V2I is to “provide continuous, real time connectivity”66 in the operation of AIC on the roads. Although black box technologies and telematics solutions such as V2V and V2I communication may be promising in terms of preventing accidents and distributing the damages caused by AIC accidents, the right to information privacy of people inside and around cars equipped with these kinds of technology (error tracing technology) should be carefully observed.
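
By way of illustration, the following is a minimal sketch of the kind of event record such a black box or telematics unit might log. The field names, log format and Python implementation are assumptions made purely for illustration; they do not describe any existing black box or V2V/V2I standard.

```python
# Minimal sketch of an AIC event record as it might be logged in a black box
# or shared via V2V/V2I telematics. Field names are illustrative assumptions,
# not a description of any existing standard.
from dataclasses import dataclass, asdict
import json
import time

@dataclass
class AICEvent:
    timestamp: float           # Unix time of the observation
    latitude: float            # GPS position of the vehicle
    longitude: float
    speed_kmh: float           # vehicle speed
    steering_angle_deg: float  # steering input by driver or system
    brake_applied: bool        # whether braking was active
    autonomy_engaged: bool     # True if the car, not the driver, was in control
    software_version: str      # relevant for tracing a failed self-update

def append_to_log(event: AICEvent, path: str = "blackbox.log") -> None:
    """Append one event as a JSON line to the on-board log file."""
    with open(path, "a") as log:
        log.write(json.dumps(asdict(event)) + "\n")

if __name__ == "__main__":
    append_to_log(AICEvent(time.time(), 52.0907, 5.1214, 48.0, -2.5,
                           False, True, "4.2.1"))
```

Each record ties a control action and the state of the autonomy system to a moment and a place, which is precisely the kind of information that could help reconstruct causation after an accident, and which, as discussed below, will often qualify as personal data.
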
Current framework


Privacy was defined by Warren and Brandeis in 1890 as “the right to be let alone”.67 It is a fundamental right of citizens that protects a personal sphere (both virtual and physical) in which neither the government nor other citizens should interfere without permission. Many forms and types of privacy have been recognized in the literature. These can be categorized, for instance, as relational privacy (comprising communicative, physical, medical and spatial privacy) versus information privacy.68 Information privacy is the right to control the use of personal information held by others than the individual it concerns. It is this kind of information that can be collected, processed and stored through AIC tracing technology. In the European Union, a regulatory framework is in place harmonizing the (informational) privacy of citizens. Article 7 of the Charter of Fundamental Rights of the European Union (Charter), which is an integral part of the Lisbon Treaty, states that “Everyone has the right to respect for his or her private and family life, home and communications”.69 Article 8(1) of the Charter adds that “Everyone has the right to the protection of personal data concerning him or her”. Article 8(2) specifies that such data may only be processed for specified purposes, and only with the consent of the person concerned or on some other legitimate basis “laid down by law”. Furthermore, everyone must have access to the personal data collected concerning him or her, and the right to have the data rectified. Article 8(3) adds that compliance with these rules is subject to control by an independent supervisory authority.
EU Data Protection Directive


A number of harmonization directives provide more detailed rules on information privacy, of which the Data Protection Directive dating from 1995 (DPD) forms the core.70 The Data Protection Directive is currently being revised, and will eventually be replaced by the General Data Protection Regulation, which will have direct effect in the Member States.71
Core values


The objectives of the DPD are mainly twofold: it seeks to enable the free movement of personal data between the Member States, which has positive effects for the EU internal market, while at the same time it sets strict rules for the processing of personal data, in line with the protection of the fundamental right to privacy of EU citizens.72 The protection of privacy in general, and of personal data specifically, is observed to be beneficial for consumer trust in, for instance, cross-border industry development, and thus for innovation and growth.73

The DPD applies to the processing of personal data by both analogue and digital means.74 Personal data are defined as “any information relating to an identified or identifiable natural person”, the data subject. A person is identifiable when he or she can be identified, directly or indirectly, “in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural or social identity”.75 Processing of personal data is defined very broadly: any action or “operation” performed upon personal data, such as collection, storage, adaptation and deletion, is considered processing.76 Persons (either legal or natural) who determine the purposes and means of the processing are “controllers” in the sense of the DPD; those who process personal data on behalf of controllers are called “processors”.77 Personal data may only be processed “fairly and lawfully”,78 and may only be collected for “specified, explicit and legitimate purposes”; processing should always be in line with these purposes.79 Processing may, for instance, be fair and lawful if the data subject has given his unambiguous consent, if it is necessary for the performance of a contract the data subject has entered into, or if it is necessary to comply with a legal obligation of the controller.80 Special categories of data, concerning inter alia the race, ethnicity, political opinions, religion, health and sex life of a data subject, may not be processed unless, for instance, the data subject has given his explicit consent, the controller is carrying out obligations under employment law, or the processing relates to data which have manifestly been made public by the data subject.81 As a remedy for damage suffered by the data subject as a result of unlawful processing of personal data, the DPD provides that the controller is liable to compensate that damage, unless he can prove that he is not responsible for the event giving rise to it.82 Furthermore, Member States are to set up supervisory authorities to, inter alia, enforce compliance with the DPD rules; these authorities are equipped with “effective powers” to investigate and to intervene in processing activities carried out contrary to the rules.83


4.3.1.2 Co-regulated: appropriate measures for protection and exporting personal data


With regard to the processing of personal data through networks such as the Internet, the Data Protection Directive provides that certain measures must be taken to protect the integrity of data, and at the same time it imposes limits on the export of data to countries with a lower level of protection than the EU Member States.

The DPD states that the controller must implement “appropriate technical and organizational measures to protect personal data against accidental or unlawful destruction or accidental loss, alteration, unauthorized disclosure or access, in particular where the processing involves the transmission of data over a network, and against all other unlawful forms of processing.”84 The state of the art in security and the costs of implementation must be taken into account when implementing such measures. Furthermore, the measures “shall ensure a level of security appropriate to the risks represented by the processing and the nature of the data to be protected”. The DPD does not define or indicate when measures are “appropriate”; this is largely left to the Member States. In the Netherlands, for example, the national law on the protection of personal data (Wet bescherming persoonsgegevens, which implements the DPD) has delegated this partly to co-regulation by the supervisory authority (College Bescherming Persoonsgegevens), which has drafted policy guidelines for assessing the appropriateness of technical and organizational measures for the protection of data.85
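
As an illustration of what one ‘appropriate technical measure’ could look like in practice, the sketch below encrypts logged AIC records before they are stored, so that loss of or unauthorized access to the storage medium does not directly expose personal data. This is a simplified example using the third-party Python cryptography package; the DPD itself does not prescribe any particular technique, and whether a given measure is ‘appropriate’ remains a matter of national and co-regulatory assessment.

```python
# Sketch: encrypting black-box records at rest, as one example of a technical
# measure against unauthorized disclosure or access. Requires the third-party
# "cryptography" package (pip install cryptography). Key management is
# deliberately omitted and would itself need appropriate safeguards.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice: kept in a secured key store
cipher = Fernet(key)

record = b'{"timestamp": 1476950400, "latitude": 52.0907, "longitude": 5.1214}'
token = cipher.encrypt(record)   # ciphertext is what gets written to storage

assert cipher.decrypt(token) == record  # only holders of the key can read back
```
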

With regard to the export of personal data, the DPD provides that the transfer of data to third countries may in principle only take place if that third country ensures an adequate level of protection. Should data be exported to non-EEA countries, the European Commission assesses whether or not the importing country offers adequate protection.86 The United States of America, for example, which does not have a regime similar to the DPD, does not guarantee an adequate level of protection within the meaning of the DPD, according to the EC. However, in collaboration with the US Department of Commerce, the EC has developed a “safe harbour” framework of principles and frequently asked questions.87 US companies and institutions complying with the safe harbour rules are considered to guarantee an adequate level of protection in the sense of the DPD, and are therefore allowed to ‘import’ personal data from the EU.

Qualification of AIC tracing technology


Are the data collected and processed using black box or telematics technology in Autonomous Intelligent Cars personal data? This is the case if these data identify, or can even indirectly identify, a natural person. Location data of an AIC do not directly identify a natural person. However, identification can (easily) be constructed on the basis of combined data: location data of the AIC on working days between 8.00 and 8.30, plotting the route from A (a home address) to B (a certain office address), can for instance identify the owner of a vehicle who happens to live at A and work at B. In that case, these location data, logged in a black box or communicated and stored somewhere in the cloud through for instance V2I communication, are considered personal data. If not only location data are processed, but the vehicle is, for example, equipped with cameras monitoring the behavior of the people inside, and perhaps even in the surroundings of the car, data are recorded that represent natural persons, who can be directly identified from the images. Since the race of the persons filmed can also be deduced, such images constitute special category data. Processing these data, for instance by recording and storing them (in a black box) and/or by communicating them in real time with other vehicles (V2V) or an infrastructure (V2I), is only allowed insofar as this is done on a legal basis and for a specified, lawful purpose. It is at least questionable whether general processing of “all available data” generated through AIC tracing technology would be proportionate to a purpose such as accident prevention or analysis.
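
The ease of such indirect identification can be illustrated with a small sketch: from nothing more than timestamped positions, recurring weekday-morning origins and destinations can be extracted and, once matched against address data, can single out an individual. The records, the coarse rounding of coordinates and the simple commute heuristic below are purely hypothetical assumptions made for illustration.

```python
# Sketch: inferring a likely home/work pair from hypothetical AIC location
# logs, illustrating how location data can indirectly identify the regular
# driver. The records and the crude heuristics are invented for this example.
from collections import Counter
from datetime import datetime

logs = [  # (ISO timestamp, latitude, longitude) - hypothetical records
    ("2016-10-17T08:02:00", 52.0907, 5.1214),  # Monday, departure from A
    ("2016-10-17T08:27:00", 52.0800, 5.1300),  # Monday, arrival at B
    ("2016-10-18T08:05:00", 52.0906, 5.1215),  # Tuesday, departure from A
    ("2016-10-18T08:29:00", 52.0801, 5.1299),  # Tuesday, arrival at B
]

def cell(lat: float, lon: float) -> tuple:
    """Round coordinates to roughly 100 m cells so nearby points group together."""
    return (round(lat, 3), round(lon, 3))

origins, destinations = Counter(), Counter()
for ts, lat, lon in logs:
    t = datetime.fromisoformat(ts)
    if t.weekday() < 5 and t.hour == 8:            # weekday morning commute
        (origins if t.minute < 15 else destinations)[cell(lat, lon)] += 1

home = origins.most_common(1)[0][0]        # most frequent morning origin (A)
work = destinations.most_common(1)[0][0]   # most frequent morning destination (B)
print("likely home cell:", home, "likely work cell:", work)
# Matching these cells against an address register would single out the person
# who lives at A and works at B, turning the location data into personal data.
```
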

A legal basis may, for example, stem from specific applicable legislation, which is to date not in place regarding autonomous vehicle information in the EU.88 Without a basis in legislation, controllers such as manufacturers, insurers, employers or road supervisory authorities should, for example, obtain the consent of the individuals concerned (drivers, but possibly also passengers and people outside the vehicle) in order to comply with the requisite “fair and lawful” basis for processing. Processing of special category data is more strictly regulated: in principle it may not take place, unless a legal exception allows the specific processing, which may be the case, inter alia, if data subjects have given their explicit consent.

Personal data must be stored (either in black boxes or on Internet servers) under such conditions that appropriate measures have been taken to protect them against, amongst other things, loss (deletion), alteration and unauthorized access (hacking). It is not clear beforehand which measures are deemed appropriate, as a harmonized approach to this subject is to date not available. National regulations can be relevant in assessing the appropriateness of technical and organizational measures. Furthermore, AIC data may only be processed in connection with a specified purpose. Before these data are collected, processed and stored, data subjects must be informed of the purpose(s) for which their personal data will be used.

Export of personal data, for example when these data are sent over the Internet or stored on (cloud) servers outside the country in which the processing originated, should in principle not take place outside the European Economic Area, unless the country to which the data are transferred offers an adequate level of protection. This implies that data may only be transferred to the United States if the company importing the data complies with the Safe Harbour rules.
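
Expressed as a simple decision rule, the export regime sketched above could look as follows. This is a hedged illustration only: the country sets are incomplete placeholders, the real adequacy assessment is made by the European Commission rather than by software, and a Safe Harbour check would in practice rely on the official certification list.

```python
# Sketch of the export rule described above: transfers of AIC personal data
# outside the EEA are only permitted to countries offering an adequate level
# of protection, or to US importers certified under the Safe Harbour framework.
# The sets below are incomplete, illustrative placeholders.
EEA = {"NL", "DE", "FR", "NO", "IS", "LI"}      # EEA members (incomplete)
ADEQUATE = {"CH", "CA", "AR"}                   # examples of EC adequacy findings

def transfer_permitted(destination: str, importer_safe_harbour: bool = False) -> bool:
    if destination in EEA or destination in ADEQUATE:
        return True
    if destination == "US":
        return importer_safe_harbour            # only Safe Harbour-certified importers
    return False                                # no adequate protection: no transfer

print(transfer_permitted("US", importer_safe_harbour=True))  # True
print(transfer_permitted("US"))                              # False
```
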


Forthcoming EU General Data Protection Regulation


Some observations can be made concerning the forthcoming General Data Protection Regulation (GDPR), although its text is not yet definitive.89 The GDPR will be applicable to every company that wants to operate on the European market. It will establish ‘privacy by design’ and ‘privacy by default’ as general principles: privacy must be taken into account from the beginning of the development of new services, and the default settings of these services must be ‘privacy friendly’. A risk assessment with regard to privacy issues must be made in advance. Furthermore, data subjects are to be equipped with more rights, for instance with regard to access to their data, and they will be given a (non-absolute) right to erasure of their data from systems. The rules will incorporate some flexibility, and are to be applied more strictly to, for instance, multinationals than to smaller enterprises. Supervisory authorities will gain more effective powers to enforce the rules: fines of up to 5% of the annual worldwide turnover of a non-compliant company or, where greater, € 100,000,000 can be imposed.

