
How they get away with it: nuances in GDPR


GDPR is often cited as the strongest and most comprehensive data protection law ever enacted. But it’s not as clear-cut as you might think.

Like any law covering such a complex topic, it contains a few grey areas, or ‘nuances’, that can be, and have been, used to get away with some pretty questionable things. Let's take a look at the main ones.

Selling vs. sharing data

The ‘we don’t sell your data...but we do share it’ nuance

Google, Amazon and Facebook all say that they don’t ‘sell’ your data, but they do ‘share’ it with advertisers... to make tens of billions of pounds a year.

This phrasing is completely legal, but it’s also pretty disingenuous. Google, for example, acknowledges that a ‘sale’ of data does occur at some point, but says responsibility lies with the individual apps and websites where it happens. However, Google does two main things to facilitate the sale:

  1. It uses data to build ‘data profiles’ of individuals and lets advertisers target groups of people based on these profiles.
  2. It allows real-time bidding (RTB): sites and apps auction off ad space, sharing personal data, including geolocation, cookies and browsing history, with the advertisers who bid on individual ads.

But, again, this isn’t ‘selling’ data.
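To make the mechanics concrete, here’s a minimal sketch of the kind of payload an ad exchange might broadcast to bidders during an RTB auction. The field names are loosely modelled on OpenRTB conventions and are purely illustrative; this isn’t any company’s actual schema.

```python
import json

# Illustrative sketch of an RTB bid request (field names loosely based on
# OpenRTB conventions; not any exchange's real schema).
bid_request = {
    "id": "auction-8f3a",                               # one auction, one ad slot
    "device": {
        "geo": {"lat": 51.5074, "lon": -0.1278},        # user's approximate location
        "ua": "Mozilla/5.0 (Windows NT 10.0; ...)",     # browser user agent
    },
    "user": {
        "id": "cookie-or-device-id-123",                # pseudonymous profile key
        "interests": ["running", "debt-consolidation"], # inferred profile segments
    },
    "site": {"page": "https://example.com/article"},    # current browsing context
}

# This payload can go to dozens or hundreds of bidders in the milliseconds
# before the page loads; the highest bid wins the impression, but every
# bidder receives the data whether it wins or not. The data is 'shared',
# not 'sold'.
print(json.dumps(bid_request, indent=2))
```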


Inferring your sensitive data from your online behaviour

The ‘definition’ nuance

The definitions in GDPR can be up for debate. For example, mental health data is classed as ‘sensitive’ data and so cannot be sold to advertisers. But just because you search ‘what is depression’ or ‘am I depressed’ doesn’t necessarily mean you have depression, so your search history can be sold with the defence that it isn’t really health data.
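As a toy illustration of the loophole (all segment names and keywords below are made up): a platform need never record a diagnosis, only a behavioural ‘interest segment’ inferred from searches, which can then be defended as not being health data at all.

```python
# Toy sketch of sensitive-data inference: the platform stores no diagnosis,
# only an inferred behavioural segment (all segment names are hypothetical).
SEGMENT_KEYWORDS = {
    "mental-health-interest": ["depression", "depressed", "anxiety"],
    "fitness-interest": ["gym", "running", "protein"],
}

def infer_segments(search_history: list[str]) -> set[str]:
    """Map raw search queries to sellable interest segments."""
    segments = set()
    for query in search_history:
        for segment, keywords in SEGMENT_KEYWORDS.items():
            if any(word in query.lower() for word in keywords):
                segments.add(segment)
    return segments

print(infer_segments(["what is depression", "am I depressed?"]))
# -> {'mental-health-interest'}: sold as an 'interest', never a diagnosis
```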

You can read more about the selling of mental health data in our other blog.

What are the privacy rights of the dead?

The ‘natural persons’ nuance

GDPR protects the personal data of ‘natural persons’, meaning anyone who is a living, breathing person.

But public archives are an example of where a grey area can arise. Suppose you discovered, by looking through public archives, that a politician’s dad (who has passed away and so is not protected by GDPR) was a murderer. Releasing this information would certainly affect the politician, and so could count as his personal data. But there’s nothing stopping you from releasing it, as long as it’s for ‘journalistic, academic, artistic and literary’ purposes.

The grey area around the 'public interest'

The ‘journalistic’ nuance

If organisations process data for journalistic, academic, artistic or literary purposes, they have special exemptions from data privacy laws, because human rights law has always treated free speech as an essential pillar of democracy.

But it can be difficult to draw the line between the public’s right to information and the protection of an individual’s data privacy.

Wikileaks is a classic example of this. The people who leak or hack information commit a crime by breaching data, yet Wikileaks has to date never been prosecuted for publishing that breached information, because it can claim that publication is in the public interest. But if it publishes data without redacting, say, the full names of US intelligence workers, potentially putting their lives in danger, is that really necessary for the public interest? It’s another grey area that’s up for debate.

Another example is when Naomi Campbell brought an action against The Mirror for breach of confidence after it revealed she was receiving treatment for drug addiction, and how many times a week she went to therapy. Although this happened pre-GDPR, the old DPA 1998 had similar rules. The newspaper argued that since Campbell was a public figure and role model who had previously denied having a drug addiction, it was acting in the public interest by ‘setting the record straight’; that argument succeeded in the Court of Appeal, though the House of Lords later found for Campbell by a narrow majority.

Voice-controlled devices

The ‘phone listening in on conversations’ nuance

It’s not quite wire-tapping, but Apple has shared Siri voice recordings with human contractors, Facebook did the same with audio chats in Messenger, and so did Amazon (Alexa) and Google (Assistant). Tech companies tend to keep quiet about this, possibly because needing humans to check their processing makes their technology look weaker, or raises privacy concerns. The contractors transcribe a random selection of voice recordings to help improve performance.

What’s worrying is that although users consent to this in the terms and conditions, or sometimes even by opting in, many of these conversations are captured accidentally when the device mishears its ‘trigger word’ and activates. For example, a whistleblower working for Apple told the Guardian:

‘There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters...It wouldn’t be difficult to identify the person that you’re listening to, especially with accidental triggers – addresses, names and so on.’

Alarmingly, the same whistleblower said there are no procedures in place to report breaches of this sensitive data (‘we're encouraged to hit targets, and get through work as fast as possible’); contractors only have to report technical problems. The sharing of sensitive information from voice recordings happens frequently, and legally, and most people probably don’t realise this when they give their consent.
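To see why accidental activations are so common, here’s a toy sketch that treats wake-word detection as a similarity threshold. Real assistants use acoustic models rather than string matching, so this is purely illustrative, but the trade-off is analogous: a looser threshold misses fewer genuine wake words and catches more false triggers.

```python
import difflib

# Toy model of wake-word detection as fuzzy matching. Real assistants use
# acoustic models, but the false-trigger trade-off is analogous.
WAKE_WORD = "hey siri"
THRESHOLD = 0.6  # deliberately loose, so the wake word is rarely missed

def is_triggered(heard: str) -> bool:
    """Return True if the heard phrase is 'close enough' to the wake word."""
    score = difflib.SequenceMatcher(None, WAKE_WORD, heard.lower()).ratio()
    return score >= THRESHOLD

for phrase in ["hey siri", "hey seriously", "hey city", "good morning"]:
    print(f"{phrase!r}: triggered={is_triggered(phrase)}")
# Near-misses like 'hey seriously' cross the threshold, and the device
# starts recording whatever conversation follows.
```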


Difficulty enforcing GDPR

The ‘enforcing the law’ nuance

It can be quite hard to fully enforce GDPR. Remember, as a law it’s still relatively new.

As an everyday example, privacy policies should be easy to understand, using ‘clear and plain language’. In most cases, it should be clear enough for a child using the internet to understand, e.g. ‘Hi, we collect and sell this kind of data about you to advertisers xyz so they can strategically target you with ads. Do you consent? Yes or No.’ It’s difficult to ensure that every site and app does this, and whether something is ‘easy to understand’ is pretty subjective, so unfortunately many policies remain as they are.

It’s also pretty hard for the ICO to collect the money from the fines it issues. You may remember last year’s headlines about the £183m fine for British Airways and the £99m fine for the Marriott hotel chain, both for failing to keep customers’ data safe from hackers, but so far the ICO has only collected around £39 million from all the fines it has issued since GDPR came into force in 2018.

Many DPAs (Data Protection Authorities) are also overwhelmed and under-resourced. There’s a general lack of adequate skill sets, specialised lawyers and ‘techies’; they’re in high demand and not cheap. For example, the Met Police have a backlog of over 1,000 open and active SARs (Subject Access Requests), more than 600 of which are over three months old, well past the one-month deadline for responding. They cited understaffing as the reason. It’s especially important for the police to have skilled data protection staff, as releasing certain data in SARs could tip off a suspect or compromise an investigation.

Exceptions for the police

The ‘police power’ nuance

The police, or anyone who processes data for the purpose of the ‘security of the State’, are given certain exemptions from the law’s fair-processing requirements, for example around how ‘transparent’ their data processing must be and how long they can store data.

It makes sense for the police not to be transparent about their undercover work, or to hold data for years when it could be used as evidence in a related case. But transparency can be very important in making sure police power isn’t abused. A recent win in this area came in California, where the First Amendment Coalition won an appeal establishing the public’s right to see ‘police misconduct’ and ‘excessive force’ records that had been unavailable for decades.

As long as data processing is ‘fair and reasonable’, a phrase that’s always open to interpretation, the police have a lot of power over data. Unfortunately, in practice this can lead to racist or otherwise dangerous applications of the law. A related article in the Guardian called for national guidance ‘to oversee the police’s use of data-driven technology amid concerns that it could lead to discrimination’. You can read more about this in our other blog.

The same article revealed that the police take too much data from phones belonging to witnesses and victims. Whether due to an incorrect interpretation of the law, a lack of resources or an intentional abuse of power, the IC (Information Commissioner) says there are ‘excessive amounts of personal data often being extracted, stored, and made available to others, without an appropriate basis in existing data protection law’, with ‘little or no justification or demonstration of strict necessity and proportionality’. The article speculates that this could be deterring people from reporting crimes. ‘Hundreds’ of rape victims, mostly female, continued the IC, are ‘distressed’ at being forced to hand over their personal data after being warned that otherwise the suspect won’t be prosecuted. The practice has been dubbed the ‘digital strip search’ and has since been challenged.

Essentially, the problem is finding a balance between law enforcement and people’s data protection rights, which is particularly difficult when widespread technology use, and access to the data it generates, is still relatively new.

How anonymous is data, really?

The ‘anonymised data’ nuance

Personal data is any information that can identify you as an individual, so some researchers and organisations make data ‘anonymous’, e.g. by removing your name and address. They can then process the data without the constraints of GDPR.

But a recent study found that data can easily be de-anonymised: ‘99.98% of Americans’ could be ‘correctly re-identified in any dataset using 15 demographic attributes’. Fifteen data points isn’t actually that many. To put it into context, data broker Experian sold Alteryx (a data science and analytics company) ‘access to a de-identified dataset containing 248 attributes per household for 120M Americans’. The study concluded that it’s practically impossible to successfully anonymise any complex dataset.
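You can get an intuition for that finding with a short sketch (not the study’s actual method): generate a synthetic ‘anonymised’ dataset with no names in it, then count how many records become unique, and therefore re-identifiable, as more attributes are combined.

```python
import random
from collections import Counter

# Synthetic 'anonymised' dataset: no names or addresses, just attributes.
random.seed(0)

ATTRIBUTES = ["postcode_area", "birth_year", "gender", "cars", "children"]

def make_record() -> dict:
    return {
        "postcode_area": random.choice(["N1", "SW3", "E8", "M4"]),
        "birth_year": random.randint(1950, 2000),
        "gender": random.choice(["F", "M"]),
        "cars": random.randint(0, 3),
        "children": random.randint(0, 4),
    }

records = [make_record() for _ in range(10_000)]

# Count how many records are unique for each prefix of the attribute list:
# the more attributes you combine, the more people stand alone.
for k in range(1, len(ATTRIBUTES) + 1):
    keys = ATTRIBUTES[:k]
    combos = Counter(tuple(r[a] for a in keys) for r in records)
    unique = sum(1 for n in combos.values() if n == 1)
    print(f"{k} attribute(s): {unique} of {len(records)} records are unique")
```

Real demographic data is far more skewed and higher-dimensional than these uniform toy values, which pushes uniqueness, and so re-identifiability, even higher.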

For example, in 2018 researchers at the DEF CON hacking conference demonstrated how easy it was to acquire the ‘anonymous’ browsing history of 3 million Germans and then quickly, freely and legally de-anonymise parts of it. The researchers were able to uncover, for instance, the very personal browsing history of a German judge.


How 'reasonable care' is defined

The ‘reasonable care’ nuance

As long as a company takes ‘reasonable care’ in protecting data, it can sometimes successfully argue that it isn’t liable for the data breaches that occur, no matter how massive. For example, Morrisons was found not liable when an employee leaked the payroll data of about 100,000 employees. You can read more about this case in our class actions blog but, amongst other reasons, the ruling went this way because the employee’s motive was a grudge against Morrisons rather than furthering its business, and he acted outside his ‘field of activities’ because he wasn’t authorised to do this. The five Supreme Court justices unanimously found that Morrisons could not reasonably have been expected to prepare for this and so was not liable.

Nick McAleenan, a specialist data rights lawyer who represented the group of 9,000 claimants in the class action against Morrisons, said: ‘My clients entrusted their personal information to their employer, Morrisons, in good faith...my clients are of course hugely disappointed by the decision.’

'Similar products and services' leaves a lot of wiggle room

The ‘lack of clarification’ nuance

Another example relates to those seemingly random telemarketing messages you may be getting. Under GDPR you can only use a customer’s contact details to market ‘similar products and services’ to the ones they originally agreed to, but there’s no definition of what that phrase means, so it’s up for debate.

The law also says that you have to provide the identity of the person responsible for the telemarketing communication, but there’s no law against ‘disguised or concealed identities’, i.e. a fake identity.

Another lack of clarity is that GDPR and the ePrivacy Regulation say you should verify the identity of the person making a SAR, but they don’t specify how. An experiment found that after an individual made a SAR on behalf of his wife, ‘approximately one in four firms shared his partner’s personal information without consent’.
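For illustration only, here’s a hypothetical sketch of the kind of check the law leaves unspecified (every field and name below is made up, one possible interpretation rather than a prescribed procedure). Even this minimal verification would have refused the request in the experiment above.

```python
from dataclasses import dataclass

# Hypothetical SAR verification sketch: GDPR requires verifying the
# requester's identity but doesn't say how, so these checks are one
# possible interpretation, not a prescribed procedure.

@dataclass
class SubjectAccessRequest:
    requester_email: str      # address the request came from
    subject_email: str        # whose data is being requested
    identity_verified: bool   # e.g. photo ID matched against the account
    written_authority: bool   # signed consent when acting for someone else

def may_release_data(sar: SubjectAccessRequest) -> bool:
    """Only release data once identity (and authority, if needed) is confirmed."""
    if not sar.identity_verified:
        return False  # can't confirm the requester is who they claim to be
    if sar.requester_email != sar.subject_email:
        # Third-party request: insist on documented authority from the subject.
        return sar.written_authority
    return True

# The experiment's scenario: a husband requests his wife's data with no
# signed authority -- a firm running this check would refuse.
print(may_release_data(SubjectAccessRequest(
    requester_email="husband@example.com",
    subject_email="wife@example.com",
    identity_verified=True,
    written_authority=False,
)))  # -> False
```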

Final thoughts 💡

The main nuances in GDPR arise when trying to balance competing rights: in particular, an individual’s fundamental right to data privacy versus the public interest, the right to be informed, and freedom of expression. Other nuances arise when a term is hard to define or the law is hard to enforce.

But it’s important to remember that all of these examples are decided case by case. It’s extremely important for all businesses to put in place, and always document, strong data protection measures, and for all individuals to know their rights.

We’ve got lots of tips on protecting yourself online and learning more about your data in our blog. In the meantime, if you have any questions, our friendly support team would be happy to help.

If you’re concerned about your data and want to take control of it, you can find out where it is and tell companies to delete it, here.

By Klara Lee
