The UK Information Commissioner’s Office (ICO) recently published draft guidance on privacy enhancing technologies (PETs). It forms the fifth chapter of a draft suite of guidance on anonymisation, pseudonymisation and PETs.

PETs are technologies intended to minimise the use of information about individuals, improve the security of data or give individuals greater control over data relating to them.

When deployed correctly, PETs can assist an organisation’s compliance with various requirements of UK data protection law, including:

  • demonstrating a ‘data protection by design and by default’ approach;
  • ensuring only the minimum necessary amount of personal data is processed (data minimisation principle);
  • restrictions on reusing data for new purposes that are incompatible with those for which it was originally collected (purpose limitation principle);
  • implementing appropriate security measures; and
  • minimising risks that may result from a breach of security or confidentiality.

PETs can be particularly helpful in contexts where sensitive data needs to be shared by organisations or where data will be collected or analysed on a large scale (such as in cloud computing services, artificial intelligence or the Internet of Things).

UK data protection law is heavily based on the EU’s GDPR regime, so the draft guidance may also be useful to EU and other international organisations.

Traps for the unwary

The draft guidance emphasises that PETs are not a ‘silver bullet’ for data protection compliance. The general requirements of data protection law continue to apply even if a PET is deployed, so organisations must consider their suitability on a case-by-case basis and watch out for the following pitfalls:

Lack of maturity

The technology may be insufficiently developed in terms of scalability or robustness. The draft guidance contains suggestions for maturity assessments (eg using ‘Technology Readiness Levels’ to classify PETs on a scale from conceptual products through to market-ready ones).

Lack of expertise

The organisation must ensure it has appropriate skills to utilise and configure the PET correctly.

Implementation errors

The PET must be implemented and maintained correctly; for example, new attacks and vulnerabilities affecting the PET should be monitored regularly.

Overestimating anonymisation

Truly anonymised data no longer relates to an identified or identifiable individual and therefore falls outside of the UK and EU data protection regimes. Many PETs do not result in anonymisation.

The UK’s Centre for Data Ethics and Innovation (CDEI) has also highlighted how the use of PETs may lull organisations into a false sense of security. By itself, implementing a PET will not prevent unethical data gathering or outcomes.

According to the ICO, organisations must therefore undertake a case-by-case assessment of whether the PET is suitable in a particular context based on factors such as the nature, scope, purpose and context of the data processing and the maturity of the PET. EU regulators have likewise warned controllers subject to the EU’s GDPR that they must assess whether any PET used is appropriate in the circumstances.

Examples of PETs

The draft guidance gives an overview of several types of PET, together with detailed commentary on when each may be appropriate, standards that apply, how the PET assists with data protection compliance, factors to consider during implementation, and associated risks and weaknesses. These PETs are:



Homomorphic encryption (HE)

Allows computation to be performed on encrypted data without decrypting it. HE may be ‘fully’ (FHE), ‘somewhat’ (SHE) or ‘partially’ (PHE) homomorphic, depending on the range of operations supported.
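The additive (PHE) variant can be illustrated with a toy version of the Paillier scheme, in which multiplying two ciphertexts produces an encryption of the sum of the plaintexts. This is purely a sketch with deliberately tiny primes, not part of the draft guidance; real deployments use key sizes of ~2048 bits.

```python
import math, random

# Toy Paillier keypair (an additively homomorphic cipher).
# Tiny primes for illustration only; real keys use ~2048-bit primes.
p, q = 17, 19
n, n2 = p * q, (p * q) ** 2    # public modulus and its square
g = n + 1                      # conventional generator choice
lam = math.lcm(p - 1, q - 1)   # private key
mu = pow(lam, -1, n)           # precomputed decryption constant

def encrypt(m):
    # Random blinding factor r coprime to n makes encryption probabilistic.
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return pow(g, m, n2) * pow(r, n, n2) % n2

def decrypt(c):
    return (pow(c, lam, n2) - 1) // n * mu % n

# Homomorphic property: multiplying ciphertexts adds the plaintexts.
c_sum = encrypt(5) * encrypt(7) % n2
assert decrypt(c_sum) == 12
```

The party holding the ciphertexts can compute the sum without ever holding the private key, which is what allows an untrusted processor to operate on data it cannot read.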

Secure multiparty computation (SMPC)

A protocol that allows multiple parties to jointly perform processing on their combined data without any party needing to reveal all of its own data to the other(s). Only the output of the processing is shared.
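One simple building block of SMPC is additive secret sharing, sketched below for a joint sum. This is an illustrative example of the general technique (not the guidance’s own example): each party splits its input into random shares, so no individual share reveals anything, yet the shares combine to the correct total.

```python
import random

P = 2**61 - 1  # prime modulus for share arithmetic

def share(value, n_parties):
    """Split a value into additive shares that sum to it mod P."""
    shares = [random.randrange(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

# Three organisations each hold a private input.
inputs = [120, 45, 300]
all_shares = [share(v, 3) for v in inputs]

# Party i receives the i-th share of every input and sums them locally;
# an individual share reveals nothing about the underlying inputs.
partial_sums = [sum(col) % P for col in zip(*all_shares)]

# Combining the partial sums reveals only the joint output.
total = sum(partial_sums) % P
assert total == sum(inputs)
```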

Private set intersection

A type of SMPC that allows two or more parties to identify the elements their data sets have in common without revealing the remainder of those data sets.
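A classic construction uses commutative ‘blinding’: each party hashes its items, raises them to a secret exponent, and exchanges the blinded values, so only matching items produce matching results. The sketch below is a toy Diffie–Hellman-style PSI over a small prime group, assumed for illustration; production protocols use elliptic-curve groups and additional safeguards.

```python
import hashlib, math, random

P = 2**61 - 1  # a small Mersenne prime; real deployments use elliptic curves

def h(item):
    """Hash an item into the multiplicative group mod P (never zero)."""
    d = int.from_bytes(hashlib.sha256(item.encode()).digest(), "big")
    return d % (P - 1) + 1

def keygen():
    """Pick a blinding exponent coprime to P-1 so blinding is a bijection."""
    while True:
        k = random.randrange(2, P - 1)
        if math.gcd(k, P - 1) == 1:
            return k

a_key, b_key = keygen(), keygen()

alice = {"ann@example.com", "bob@example.com"}
bob = {"bob@example.com", "cat@example.com"}

# Each party blinds its own hashed items and sends them to the other,
# who applies its own blinding on top. Blinding commutes: h^(ab) == h^(ba).
alice_double = {pow(pow(h(x), a_key, P), b_key, P) for x in alice}
bob_double = {pow(pow(h(x), b_key, P), a_key, P) for x in bob}

# Equal double-blinded values correspond to items present in both sets.
common = alice_double & bob_double
```

Here only the single shared email address produces a match; neither party learns the other’s remaining items.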

Federated learning (FL)

Allows multiple parties to train artificial intelligence models using their own local data and then share certain patterns those models have identified into a more accurate shared model (without revealing all the local data). The draft guidance explains how FL may rely on a centralised or decentralised approach.
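The centralised approach can be sketched with a FedAvg-style weighted average of locally trained parameters. The example below is a deliberately minimal illustration (each ‘model’ is just a mean), not the guidance’s own example: the point is that only parameters travel, never raw records.

```python
import math

# Each organisation's private records stay local; only parameters travel.
local_data = [
    [2.0, 4.0, 6.0],        # party 1
    [8.0, 10.0],            # party 2
    [1.0, 3.0, 5.0, 7.0],   # party 3
]

# Local "training": each party fits its own one-parameter model (a mean).
local_models = [sum(d) / len(d) for d in local_data]
local_sizes = [len(d) for d in local_data]

# FedAvg-style aggregation: a weighted average of the parameters,
# computed without any raw record leaving its owner.
global_model = (
    sum(m * s for m, s in zip(local_models, local_sizes)) / sum(local_sizes)
)

# The shared model matches what training on the pooled data would give.
pooled_mean = sum(sum(d) for d in local_data) / sum(local_sizes)
assert math.isclose(global_model, pooled_mean)
```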

Trusted execution environments

A secure part of the computer’s central processing unit that is isolated from the rest of the system and in which code can be run and data accessed. This PET requires specialist hardware and software elements.

Zero-knowledge proofs

A protocol in which one party can prove to another that it possesses particular data (eg information proving a person’s age) without revealing the underlying data itself; the other party receives only proof of possession.
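A standard textbook example is the Schnorr identification protocol, in which a prover demonstrates knowledge of a secret exponent without disclosing it. The sketch below uses deliberately tiny parameters for illustration; real systems use groups of ~256-bit order.

```python
import random

# Toy Schnorr proof of knowledge of a discrete logarithm.
p, q, g = 23, 11, 2   # g generates a subgroup of prime order q in Z_p*
x = 7                 # prover's secret ("the data it possesses")
y = pow(g, x, p)      # public value derived from the secret

# 1. Prover commits to a random nonce.
k = random.randrange(1, q)
t = pow(g, k, p)

# 2. Verifier sends a random challenge.
c = random.randrange(1, q)

# 3. Prover responds; s reveals nothing about x because k is random.
s = (k + c * x) % q

# 4. Verifier accepts without ever learning x.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
```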

Differential privacy

A method that injects randomised ‘noise’ into data with the aim of making it impossible to confidently determine whether information relating to a particular individual is present in the data. ‘Local differential privacy’ adds ‘noise’ to individual records before they are aggregated, whereas ‘global differential privacy’ adds noise during aggregation.
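The global variant is often implemented with the Laplace mechanism: noise scaled to the query’s sensitivity and a privacy parameter epsilon is added to the aggregate result. The sketch below illustrates this for a simple counting query; the data and function names are invented for illustration.

```python
import math, random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise via the inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def noisy_count(records, predicate, epsilon):
    """A counting query (sensitivity 1) under the Laplace mechanism."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1 / epsilon)

ages = [34, 29, 52, 41, 67, 23]
# Smaller epsilon means more noise: stronger privacy, less accuracy.
result = noisy_count(ages, lambda a: a > 40, epsilon=1.0)
```

Any single individual’s presence or absence changes the true count by at most 1, so the added noise masks each individual’s contribution.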

Synthetic data

‘Artificial’ data generated in a manner that replicates patterns found in real data, so that analysis of the synthetic data should yield similar results to analysis of the real data.
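In the simplest case, a statistical model is fitted to the real data and artificial records are sampled from that model. The sketch below assumes a single Gaussian-distributed attribute purely for illustration; real synthetic-data generators model far richer structure (and can still leak information if poorly built).

```python
import random, statistics

random.seed(42)

# "Real" sensitive attribute, eg salaries (simulated here for illustration).
real = [random.gauss(50_000, 8_000) for _ in range(5_000)]

# Fit a simple statistical model of the real data...
mu, sigma = statistics.mean(real), statistics.stdev(real)

# ...then generate artificial records from the model, so the real records
# never need to be shared.
synthetic = [random.gauss(mu, sigma) for _ in range(5_000)]

# Analysis of the synthetic data approximates analysis of the real data.
assert abs(statistics.mean(synthetic) - mu) < 1_000
```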

Global relevance and next steps

Regulators, industry and government bodies around the world are showing great interest in PETs and their ability to facilitate more secure, confidential and privacy-compliant use and sharing of data. For example:

  • the EU Agency for Cybersecurity (ENISA) has focused on PETs for many years and has produced a maturity assessment methodology for PETs;
  • in June 2022, the US and UK announced a collaboration on prize challenges to help advance the maturity of PETs to combat financial crime;
  • in July 2022 regulators in Singapore launched a ‘PET Sandbox’ to support businesses wishing to pilot PET projects addressing common business challenges; and
  • in September 2022, the US Office of Science and Technology Policy requested comments on PETs to help inform US national policy.

The ICO has requested feedback on all chapters of the draft guidance on anonymisation, pseudonymisation and PETs by 31 December 2022.

The ICO is also calling for the development of industry-led governance (eg codes and certification schemes) to encourage appropriate use and development of PETs.