Privacy controls should be developed for users of connected and autonomous vehicles, say data watchdogs

Out-Law News | 06 Oct 2017 | 9:34 am | 1 min. read

Users of connected and autonomous vehicles should be able to control who is given access to the data generated by those vehicles, data protection watchdogs from across the globe have said.

At a recent international conference of data protection and privacy commissioners in Hong Kong, the watchdogs passed a resolution on data protection in automated and connected vehicles which urged that vehicle users be provided with "granular and easy to use privacy controls" to allow them to "where appropriate, grant or withhold access to different data categories in vehicles".

Users should also be able to "restrict the collection of data" through "technical means", and "secure data storage devices" should be provided to give vehicle users "full control regarding the access to the data collected by their vehicles", it said.

The recommendations (5-page / 317KB PDF) were aimed at a range of organisations in the market, from vehicle and equipment manufacturers, providers of data-related services, and standard-setting bodies, to public authorities and car rental companies.

The resolution also "seriously urged" the organisations to ensure that individuals are given "comprehensive information as to what data is collected and processed in the deployment of connected vehicles, for what purposes and by whom", and backed the use of data anonymisation or pseudonymisation measures to "minimise" the amount of personal data that is recorded.

The watchdogs also said that technical means to "erase personal data" about vehicle users should be provided for cases where vehicles are sold on to new owners or returned to owners by temporary users.

Transparency over the functionality of "self-learning algorithms needed for automated and connected cars" was also advocated by the watchdogs, who said the algorithms should be "subject to prior assessment by an independent body in order to reduce the risk of discriminatory automated decisions".