Analysis of privacy policies shows potential problems.
The General Data Protection Regulation (GDPR) was adopted by the EU in April 2016, with a two-year “grace” period before enforcement began in May 2018. Organizations had two years to make the policy and procedural changes required for compliance, and it seems that is exactly what many companies were focused on as the deadline approached.
According to a survey of C-level security executives, 35.8 percent estimated the cost of compliance to their businesses would be between $50,000 and $100,000, while 23.8 percent pegged the cost at $100,000 to $1 million, and over 10 percent said their organizations would spend more than a million dollars to comply with the Regulation.
With all that time and all that money, you would think it’s a good bet that the majority of companies have nothing to worry about in terms of GDPR enforcement. However, TechRepublic predicted just before the deadline, in April, that 60 percent of companies were not going to be compliant in time. It looks as though that number was not far off base.
MetricStream research said on June 8 that approximately 55 percent of enterprises missed the deadline, and the European consumer group Bureau Européen des Unions de Consommateurs, more commonly known as BEUC, analyzed the privacy policies of top Internet companies and found that over a month after the enforcement deadline, many of the top international companies still didn’t meet the requirements of the GDPR.
Consent is key
There are several lawful reasons for companies to collect, store, and process the personal data of individuals that are authorized by the GDPR in Article 6, Lawfulness of Processing. The criterion for lawfulness at the top of the list, and the one relied upon by many companies, is “the data subject has given consent to the processing of his or her personal data for one or more specific purposes.”
However, the regulators recognize that not all so-called consent is created equal. Criminal and civil statutes have long stipulated that consent given under duress or obtained by trickery is not valid consent. In the medical world, the term “informed consent” was formulated to specify that consent can be granted only when the consenting party has full and accurate knowledge of what he/she is giving permission for and what the possible consequences are.
Thus, the GDPR expands on the consent criteria by providing a whole separate article – Article 7, Conditions for consent – which lays out some strict requirements for the giving of consent to collect, store, and/or process one’s personal data.
Subsection 2 says “If the data subject’s consent is given in the context of a written declaration which also concerns other matters, the request for consent shall be presented in a manner which is clearly distinguishable from the other matters, in an intelligible and easily accessible form, using clear and plain language.” In other words, you can’t “bundle” or “hide” consent for the use of personal data in with something else, such as a long agreement to abide by the site’s terms of service, and have just one check box for both.
Subsection 4 references the requirement that consent must be freely given, and says that when assessing whether that is the case, “utmost account shall be taken of whether, inter alia, the performance of a contract, including the provision of a service, is conditional on consent to the processing of personal data that is not necessary for the performance of that contract.” That means if, for example, your phone number is not needed for you to be able to set up a social media page, but the site requires you to provide it in order to use the service, your consent to give them that information is not freely given.
The recitals are the part of the GDPR designed to explain the text of the law, and they are essential to fully understanding the Articles. Recital 43 makes it even clearer that “Consent is presumed not to be freely given if it does not allow separate consent to be given to different personal data processing operations despite it being appropriate in the individual case, or if the performance of a contract, including the provision of a service, is dependent on the consent despite such consent not being necessary for such performance.”
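To make the unbundling and conditionality rules concrete, here is a minimal sketch of how a sign-up form might be audited against them. The class and field names are illustrative assumptions, not part of any real library or of the GDPR itself; the checks simply encode the two failure modes described above (consent bundled with the terms of service, and service made conditional on unnecessary data).

```python
from dataclasses import dataclass

@dataclass
class ConsentCheckbox:
    """One consent request on a hypothetical sign-up form."""
    purpose: str                     # e.g. "marketing emails"
    bundled_with_tos: bool           # same box also accepts the terms of service
    required_to_sign_up: bool        # sign-up blocked unless this box is ticked
    data_necessary_for_service: bool # is the data needed to deliver the service?

def consent_problems(boxes: list[ConsentCheckbox]) -> list[str]:
    """Flag consent requests that Article 7 and Recital 43 would question."""
    problems = []
    for box in boxes:
        if box.bundled_with_tos:
            # Art. 7(2): consent must be clearly distinguishable from other matters
            problems.append(f"'{box.purpose}': consent bundled with terms of service")
        if box.required_to_sign_up and not box.data_necessary_for_service:
            # Art. 7(4) / Recital 43: consent conditioned on unnecessary data
            # is presumed not freely given
            problems.append(f"'{box.purpose}': service conditional on unnecessary data")
    return problems
```

For instance, the phone-number example above – a number demanded at sign-up though the service doesn’t need it – would be flagged by the second check, while a separate, optional marketing checkbox would pass both.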
Subtle manipulation is not allowed
Facebook, Google, and Amazon are among the technology companies whose privacy policies and the way they’re applied were found lacking. The first two were slapped with complaints on the same day that enforcement began. Although the companies implemented new privacy settings in response, consumer groups still accused both of using “dark pattern tactics.” This refers to the presentation of privacy options in a way that encourages users to choose one option over another.
For example, if the option to give consent to use your personal data is prominent in the process of setting up your account, but the option to NOT allow that use is difficult to find – only available to change after account setup is complete and only by clicking through numerous settings – that is manipulative.
But there’s more. Recital 32 states very plainly that “Consent should be given by a clear affirmative act establishing a freely given, specific, informed and unambiguous indication of the data subject’s agreement to the processing of personal data relating to him or her, such as by a written statement, including by electronic means, or an oral statement.”
Remember the old adage that “silence implies consent?” Well, that is decidedly not true in the context of the GDPR. The recital goes on to say “Silence, pre-ticked boxes or inactivity should not therefore constitute consent.”
You know all those times when you see the “yes” box already checked for your “convenience?” Under the GDPR that doesn’t constitute valid consent because it’s too likely that people in a hurry won’t even stop to think about it but will just accept that “default” answer.
Another example of a problematic policy clause would be statements that “By using our service, you agree to our collection, use, disclosure, and transfer of information you provide us.” The GDPR says you must consent by a “statement or clear affirmative action.”
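The “clear affirmative act” requirement can also be sketched in code. Below is an illustrative check on a stored consent record: it rejects consent that came from a pre-ticked default or was merely inferred from use of the service. The record structure and field names are hypothetical, chosen only to mirror the rules in Recital 32.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ConsentRecord:
    """Hypothetical record of how one consent decision was captured."""
    purpose: str
    granted: bool
    ui_default_was_checked: bool       # was the box pre-ticked for "convenience"?
    affirmative_action: Optional[str]  # e.g. "checkbox_clicked"; None if no action

def is_valid_consent(record: ConsentRecord) -> bool:
    """Recital 32: consent requires a clear affirmative act."""
    if not record.granted:
        return False
    if record.ui_default_was_checked:
        # Pre-ticked boxes do not constitute consent
        return False
    if record.affirmative_action in (None, "implied_by_use"):
        # Silence, inactivity, or "by using our service you agree" is not consent
        return False
    return True
```

The point of the sketch is that validity depends not just on the stored yes/no value but on how that value was obtained – which is exactly the distinction the regulators draw.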
Vague and insufficient
The BEUC’s analysis of current privacy policies (those in effect after the GDPR deadline) was conducted using artificial intelligence to scan the policies. In describing the privacy policies of tech giants Google, Amazon, Facebook, Apple, Twitter, Microsoft, Uber, Airbnb, Netflix, and more (fourteen policies in all), it used words like “unclear,” “vague,” and “insufficient,” charging that the companies don’t properly explain why and how they use personal data, and that they threaten that some features won’t be available if you don’t give up your data – without clarifying which features those are, so users can’t make an informed decision. Specifically, the BEUC report identified three broad problems with the companies’ policies:
“Companies do not provide all the information which is required under the GDPR transparency obligations. For example, companies do not always inform users properly regarding the third parties with whom they share or get data from.
“Processing of personal data often does not happen according to GDPR requirements. For instance, clauses stating that by simply using the website of the company, the user agrees to its privacy policy.
“Policies are formulated using vague and unclear language, which makes it very hard for consumers to understand the actual content of the policy and how their data is used in practice.”
The golden standard
The statements above are made in part by comparing the companies’ policies to a “golden standard” formulated by the BEUC that describes what the perfect privacy policy should look like. The golden standard measures comprehensiveness of the information, substantive compliance with the types of processing allowed by the GDPR, and clarity of expression – whether the policy is understandable by the average user.
All three of those criteria must be met for a policy to be considered perfectly in compliance with the GDPR. The BEUC report referenced above contains many examples of privacy policy clauses that do and don’t meet the golden standard, with explanations of why each was classified as meeting the standard, problematic, or failing to meet the standard.
Notably, in some cases policy clauses that originally met the standard ended up, after being revised in May 2018 – presumably with the goal of GDPR compliance – being merely problematic or failing to meet the standard altogether. The takeaway here is that in GDPR compliance, as in so many aspects of life, sometimes the actions you take to fix a problem only make it worse.
The report is 64 pages long, but if you make your way through it, you will find excellent guidance as to how to create GDPR compliant policies (and what to avoid).