
The Application of Data Privacy & Protection to Tech Platforms

Almost everything we do these days involves sharing personal data. Every time you buy something online, post on social media, search a website or send an email, you share some of your data. It is convenient and easy, but what are the risks to individuals and tech companies?

Potential employers might not like what they see on social media. Your carefully guarded social image can be severely damaged by “old information” that pops up in search engine results. If you have received negative coverage in the media or press, the impact on your personal and business life can be devastating. So, is there anything you can do to control how your data is used?

Do you have any protection for how your data is used or controlled?
In the UK, the Data Protection Act (DPA) came into effect in 2018 to retain the EU General Data Protection Regulation (GDPR) post-Brexit. The Act controls how organisations, businesses and the government use your personal information. The message to tech companies is clear – they must pay attention to how they process and control data. The Act also sets out the role of the Information Commissioner’s Office (ICO), which is responsible for regulating data protection in the UK.

What are the general data protection principles? 

Anyone responsible for using personal data must ensure that the information is:

  • used fairly, lawfully, and transparently;
  • used for specific, explicit purposes;
  • used in a way that is adequate, relevant, and limited to only what is necessary;
  • accurate and, where necessary, kept up to date;
  • kept for no longer than is necessary;
  • handled in a way that ensures appropriate security, including protection against unlawful or unauthorised processing, access, loss, destruction, or damage.

There is more substantial protection for sensitive information, like biometrics, political opinions, religious beliefs, and health. And there are separate safeguards for data relating to criminal convictions and offences.

These principles place a heavy burden on those who collect, control, and process data. Before we look at how this plays out in real life, we need to consider what rights the DPA provides for individuals to protect their data.

What are my rights as an individual under the DPA? 

In general, you have a right to know what information an organisation stores about you. This includes specific rights such as:

  • the right to be informed about how your data is used;
  • the right to access your personal data;
  • the right to have incorrect data updated;
  • the right to have data erased; and
  • the right to restrict or stop the processing of your data.

Since these rights were enshrined in statute law by the Data Protection Act in 2018, the courts have delivered a few interesting judgments that will undoubtedly impact tech companies and the further development of data protection in the UK.

“Loss of control” of data and the judgment in Lloyd v Google 

Last year the UK Supreme Court delivered a long-awaited judgment in Lloyd v Google LLC [2021] UKSC 50. The case was a representative action against Google, alleging that it breached its duties as a data controller under the DPA. It accused Google of bypassing the privacy settings of over 4 million Apple iPhone users to track their internet activity without their consent.

In a judgment that tech companies should welcome, the Supreme Court held that a claimant is not entitled under section 13 of the DPA to compensation for the unlawful processing, or “loss of control”, of data per se: “The term ‘damage’ refers to material damage (such as financial loss) or mental distress distinct from, and caused by, unlawful processing of personal data in contravention of the Act and not to such unlawful processing itself.”

Based on Lloyd, to succeed with a compensation claim, claimants must therefore prove:

  1. There was a breach of the data controller’s obligations under section 4(4) of the DPA 1998.
  2. The data subject suffered damage as a consequence of the breach.

The court concluded as follows: “The attempt to recover damages without proving either what, if any, unlawful processing of personal data occurred in the case of any individual or that the individual suffered material damage or mental distress as a result of such unlawful processing is therefore unsustainable.”

Other points of interest for tech companies in the Lloyd judgment:
  • The right to compensation is qualified: a claim will only succeed if the tech company failed to take reasonable care when processing the data. Liability is not strict.
  • Claims for trivial breaches of the DPA are excluded.
  • The judgment limits the scope of representative actions of this kind.

Although Lloyd provides some relief for tech companies and is now the leading authority on damages for breaching the DPA, it should be noted that the judgment dealt specifically with a claim under the DPA, not a claim under the tort of misuse of private information. It also dealt with the DPA 1998, the law that applied when the litigation started, rather than the DPA 2018. It will be interesting to see how the law around data protection develops in future.

Tech companies and the right to have data erased

The first delisting order based on the “right to be forgotten” was delivered in April 2018 by the England and Wales High Court in NT1 and NT2 v Google [2018] EWHC 799 (QB). Initially, Google rejected a request by the claimants to remove specific information.

The court then ordered Google to delist eleven URLs that came up in search results referring to the spent conviction of a businessman, NT2. A similar request by NT1, also a businessman, was rejected by the court.

Both men were convicted of criminal offences many years ago, and reports of the cases were published at the time. Links to these reports were available on Google searches. The men now claimed that the information in search results was old, inaccurate, irrelevant, and of no public interest. Alternatively, it was an interference with their data protection and/or privacy rights. They claimed under the DPA and the tort of misuse of private information.

The court gave judgment based on the “right to be forgotten” or, more accurately, the right to have personal information erased, delisted, or deindexed by search engine operators.

The right to be forgotten and public interest

The court rejected NT1’s claim because he was a public figure. His crime and punishment could not be regarded as a private matter. It was a matter of public interest, especially since it was a business crime. Members of the public had an interest in the information when assessing his honesty as a businessman.

However, the court did uphold NT2’s delisting claim since his crime did not involve dishonesty, and the information was now outdated and no longer relevant. The court found that there was no sufficient legitimate interest to users of Google to justify continued availability on searches. NT2’s past offending was of very little relevance to anyone wanting to assess his suitability for engaging in his current business. The current business was in a different field to the business he was engaged in at the time of the conviction.

The court further held that Google could not rely on the journalism exemption in section 32 of the DPA.

However, although the court upheld NT2’s “right to be forgotten”, it also accepted Google’s defence under sec 13(3) of the DPA. The court held that Google had taken reasonable care by having a process in place to comply with the relevant requirements for data processing. No damages were awarded.

Although right-to-be-forgotten claims against search engine operators have become increasingly common since the right was enshrined in statute law by the DPA as the “right to erasure”, such claims are never simple or straightforward. As NT1 & NT2 illustrates, the outcome will always depend on the particular circumstances of the case.

Can search engines deny a request to delist information? 

The right to be forgotten is always balanced with the search engine’s legitimate interest in processing data lawfully and the right to freedom of expression.

The search engine can deny a request to delist information if they can prove that:

  • it is in the public interest to keep the information available;
  • the information is still relevant; or
  • there is another legitimate reason to continue processing the information.

Google’s Transparency Report indicates that, of the 4,898,654 URLs requested to be delisted between May 2014 and April 2022, it delisted 48.6%.

Where does this leave tech companies? 

It is critical that tech companies that process and control personal data understand their obligations and comply with the UK GDPR and the DPA. Data controllers and processors must pay attention to the rights of data subjects and to the principles that personal data must be processed fairly and lawfully, used in a way that is adequate, relevant and limited to what is necessary, and kept accurate. Failure to do so can have costly consequences.

In practice, tech companies seem to be paying attention to privacy-enhancing technology; they are simplifying privacy notices and terms and conditions, for example. However, the value of private data to prominent tech platforms, and the insight they gain from analysing that data, cannot be overstated. From Netflix recommendations to the finely tuned, targeted advertising of Meta (formerly Facebook), the tension between access and legal rights is set to remain.

As data protection law continues to develop, both tech companies and individuals will have to keep track of changes in the law and in individual rights. Data protection is a complex area of law, and we can expect it to keep evolving to keep pace with a society that is ever more driven by data.

Individuals who are concerned about their rights and how to assert them can seek support from us here at CTT Law.