Out-Law Analysis | 26 Nov 2012 | 8:07 am | 2 min. read
IBM estimates that 2.5 quintillion bytes of data are produced every day, but research by the Economist Intelligence Unit suggests that over 75% of businesses are wasting more than half of the data they already hold.
A lot of this wastage is caused by uncertainty and a lack of understanding about the precise requirements of EU data protection and security rules.
The overriding EU law on data protection provides that personal data rendered anonymous falls outside the scope of data protection laws.
So a key issue for businesses is to consider to what extent they can demonstrate through internal risk assessment processes that they have effectively anonymised data.
This raises the issue of possible 're-identification', or the 'un-anonymising' of data, through matching data released by one company with other data that may be in the public domain or in the possession of another organisation.
Official guidance to EU legislation states that if data is to fall outside data protection laws, re-identification must "no longer be possible".
But in new guidance published last week, the ICO stated in no uncertain terms that a business which wants to anonymise data need only show that it has assessed the risk of re-identification and can reasonably conclude that the risk is remote. This approach follows UK case law.
The ICO's opinion that data exists outside the scope of data protection laws even if there is a remote risk of re-identification is good for business.
Businesses do not want their investment in innovative technologies for harvesting and mining data sets to be written off as wasted because regulators take unrealistic approaches to data protection laws.
Privacy protection is no doubt an essential concern that must be respected. Privacy, respect for personal and family life and a person's reputation are all interests protected by EU human rights law.
But privacy is not a right that can be absolutely guaranteed, just as no other right can be.
We never like to think about it, but whenever an airplane carries passengers there is always a risk that a disaster may occur. We do not prohibit the carriage of passengers because the technology cannot guarantee each passenger's safety. To impose a higher standard on privacy protection processes would be extreme.
Other EU member state regulators should be encouraged to advise businesses to take a similar approach to the UK on the anonymisation of data. At present, though, it looks as if some may be tempted to take a more restrictive approach.
From Out-Law.com's initial discussions with German regulators, we gather that approaches suggested in the German academic legal literature differ greatly from the view taken by the ICO.
Some academics apparently suggest that the possibility of data being released and then matched to other data obtained by illegal means would count not merely as a remote risk, but as one that every business must assess before it can conclude that data has been truly anonymised.
Requiring companies that disclose anonymised data sets to prove that a partner in a cross-organisational data-sharing arrangement, a related company or a buyer will not turn to the black market to re-identify individuals imposes a standard of care well above that required in other legal contexts.
A high level of protection of privacy interests must be maintained, but it must also be balanced against the economic consequences of rigidly insisting on complete non-interference with data that could, in remote circumstances, identify someone.
The European economy in its current state cannot afford to miss out on the benefits of big data processing. Rigid interpretations of regulation hand non-European markets a significant first-mover advantage, and that is a loss the EU cannot afford.
Luke Scanlon is a technology lawyer with Pinsent Masons, the law firm behind Out-Law.com