Undoing 40 years of progress on information rights

"Those who fail to learn from history are condemned to repeat it" Winston Churchill.

As always happens in the wake of a disaster, experts search for explanations as to how such a thing could have happened. Following the 2007/8 financial crisis, the dominant theme pointed towards light-touch financial regulation as the leading cause. Left to its own devices and free from government restraints, the 'market' acted in its own best interests, leading to excessive risk-taking, instability, and the near collapse of the entire financial system. 

Given the negative consequences that can result from minimal regulation, those working in data protection are sounding alarm bells that history risks repeating itself, this time at the expense of people's hard-won information rights and the consumer trust generated when organisations invest in ethical data practices.

At issue are a number of concerns:

  • The Information Commissioner's Office's (ICO) inconsistent data protection enforcement track record,

  • A lack of understanding by politicians on the importance of information rights within a robust data protection legislative framework, including the need for an independent regulator,

  • The weakening of privacy protections contained within the Data Protection and Digital Information (DPDI) Bill and the reluctance of the UK government to regulate artificial intelligence (AI).

The ICO's lack of enforcement and inconsistency

The ICO has a wide range of enforcement powers available under the UK General Data Protection Regulation (GDPR), the Privacy and Electronic Communications Regulations (PECR) and the Data Protection Act 2018 (DPA18). These powers range from warnings and reprimands for lesser infringements to enforcement notices and substantial administrative fines. The ICO is a Competent Authority with the power to prosecute offenders and, within its GDPR powers, has powers of entry and inspection (where necessary, the ICO asks a District or Circuit Judge to issue a warrant in accordance with domestic law).

Despite being one of the better-resourced data protection authorities, the ICO has always taken a more pragmatic approach to regulatory enforcement than its international counterparts.

Under the direction of the current Information Commissioner, John Edwards, we are beginning to feel the consequences of an extreme version of that pragmatic approach. It takes the form of lax and inconsistent enforcement, driven by a stated desire to use discretion to reduce monetary penalty amounts or to apply a different regulatory action altogether. Monetary penalties are now reserved for only the most serious violations, an approach initially earmarked for public sector organisations that has, in fact, been applied liberally across the board.

The basic idea behind a Monetary Penalty is to penalise previous "bad" behaviour in the processing of personal data; this differs from an Enforcement Notice, which aims to ensure compliance with data protection rules in any future processing. If both types of enforcement are effectively set aside, the legislation's objective of protecting data subjects from inappropriate processing of their personal data is much diminished.

In March, the Open Rights Group (ORG) issued a stinging rebuke of the ICO's "poor track record" on enforcement, highlighting that "in 2021-22 it did not serve a single GDPR enforcement notice, secured no criminal convictions, and issued only four GDPR fines totalling just £633,000, even though it received over 40,000 data subject complaints."

This was followed by international law firm Mishcon de Reya's analysis of official personal data breach reports made to the ICO, which revealed an alarming 8,000% increase between 2019 and 2023 in the number of people affected by financial data breaches in central government. In 2023, approximately 195 million data subjects in the UK were likely put at risk by breaches of data security in central government relating to economic or financial data.

Last call for information rights departure lounge

While highly alarming, these reports lack the impact of an article in The Telegraph in late 2023, which confirmed that the ICO had closed its investigation into the low-cost airline EasyJet following a personal data breach in 2020. The breach was so severe that it resulted in the theft of the personal details and travel itineraries of nine million people, including the credit card details of 2,200 individuals.

Since the announcement was made, data protection officers (DPOs) and compliance professionals have started to observe a noticeable shift. In the absence of a genuine threat of regulatory enforcement, their advice, warnings and requests to colleagues and to their executive leadership are increasingly being ignored. Whether by design or as an unintended consequence, the message to industry practitioners is that their work is less important than the pursuit of innovation.

Commenting at the time, Dr Chris Pounder, director at Amberhawk Training Limited, said:

"The ICO is risking a return to the management attitudes under the Data Protection Act 1984, where controllers considered non-compliance with data protection obligations as an acceptable risk that could be taken. Its enforcement strategy; therefore, risks undermining compliance with the data protection regime".

Joyce Allen, director at Freevacy, highlighted the extent to which practitioners feel let down and abandoned:

"As educators, we receive regular comments from practitioners saying how difficult it has become to persuade other business functions to implement and adhere to compliance practices when all they can see is the watering down of data protection standards."

Trampling over the integrity and independence of the ICO

It should hardly require stating, but the purpose of independent regulatory enforcement of data protection law is to ensure actions and decisions are not swayed by pressure driven by short-term political strategy. 

Unfortunately, the DPDI Bill illustrates politicians' lack of understanding of the implications of weakening privacy and information rights and the ICO's independence. The focus of the Bill appears to be driven by a desire to weaken the UK data protection legislative framework in order to increase data sharing in the public sector, boost productivity in the use of personal data, ease the export of personal data to other countries, and expand the commercial reuse of personal data.

In making these changes, the Bill treats data protection as a barrier to progress, disregarding the fact that barriers (the Thames Barrier, for example) protect society from harm. In this case, data protection is the necessary buttress that protects the Article 8 right to respect for private and family life from threats arising from, for example, novel AI techniques, modern tracking technology and the use of facial recognition CCTV.

History has shown time and again that loosening regulations only results in executives taking greater risks and pushing to the limit of what is permissible to maximise profit. This fact, along with the destruction of people's lives that can result from personal data breaches and other regulatory infringements, seems to have escaped the attention of many in government.

The government should be acknowledged for removing a provision in the Bill granting the Secretary of State powers to effectively veto ICO codes of practice. However, recent analysis by Dr Pounder reveals other concerning powers contained in Schedule 13 that have the potential to transfer data protection enforcement to the Secretary of State through the appointment of Non-Executive Members to the newly formed Information Commission (IC), as it will be known. Under the proposed Commission's voting structure, everything from the direction of data protection policy to the appointment of the Chief Executive and other committee members will be subject to the approval of these ministerial appointees.

The Commissioner will not become the Chief Executive with day-to-day responsibility for executive actions, but will instead become Chair of the Board: a figurehead with no real powers.

Concerned about the UK's intentions, the European Parliament's Committee on Civil Liberties, Justice and Home Affairs has written to Lord Peter Ricketts, Chair of the UK House of Lords European Affairs Committee, warning that changes to the independence of the ICO could impact the EU-UK adequacy agreement.

Mixed messaging on AI

If there was ever a moment to consider whether the government is heading down the right path, a quick review of how its plans to regulate AI have changed in less than 18 months is a good place to start.

The UK government's white paper, AI regulation: a pro-innovation approach, made clear that politicians were reluctant to over-regulate the AI industry. However, in an apparent U-turn, the Financial Times reported that the government has begun crafting AI legislation only months after the prime minister vowed not to, for fear it might stifle innovation and growth. The mood has clearly changed in response to growing concerns about the harms an unregulated AI industry could unleash.

Although the above is positive news, it seems to contradict provisions in the DPDI Bill that lower data protection standards, particularly around automated decision-making in the context of AI and the definition of personal data. The Bill is an opportunity to address these discrepancies rather than risk further confusion in an already complex regulatory area.

Just as it did with the introduction of the GDPR, the European Union has set the gold standard with its proposed Artificial Intelligence Act (AI Act). As countries around the world race to enforce responsible AI through regulation, the UK would do well to pay attention to the approach taken by its closest neighbours.

If you agree with this sentiment, sign a recent petition calling for the Bill to be amended so that it mirrors the AI Act. 

Liz Taylor, Managing Director of Tkm Consulting and PhD student at University College London, said:

"The need to safeguard individuals and their personal data in increasingly complex operating environments with robust controls and regulation remains as critical as it ever has been. For those of us working in the sector, a comprehensive framework for regulation continues to be a necessity to develop appropriate and effective controls, including skills and competences."

Recommendations

While there is still time, the government is urged to: 

  1. Require the ICO to make use of its full range of enforcement powers and end the failed experiment with light-touch regulation,

  2. Remove any provisions in the DPDI Bill that compromise the independence of the ICO or shift its focus away from monitoring and enforcing data protection and privacy regulations,

  3. Remove any provisions in the DPDI Bill that weaken the definition of personal data,

  4. Revisit the DPDI Bill to include provisions for AI regulation,

  5. Remove the provisions in the DPDI Bill that weaken the Article 8 human rights regime.

In light of the government's announcement that a general election will be held on 4 July 2024, it is vitally important that the DPDI Bill is not rushed through as part of the 'wash-up' before Parliament is dissolved. Under these circumstances, the Bill should be scrapped, and any future government should consider the points raised above before introducing further legislation to regulate the use of personal information and artificial intelligence technologies.

This article has been collectively written by:

Thumbnail image for this blog is by Markus Spiske on Unsplash 
