Consent is only one of the six legal grounds for processing personal data under the GDPR, but it is certainly the most well-known. While it might look safe and solid at first sight, it is becoming the weakest link of the GDPR compliance chain.

First, consent can be withdrawn at any time, and the process for withdrawal must be as easy as the process for providing consent. Thus, a system built only on consent can fall apart quite quickly.

More importantly, consent can be considered invalid at any time, in which case the breakdown is immediate.

One example of consent being invalidated is a Belgian retailer that required the use of a customer’s e-ID as a prerequisite for the issuance of a loyalty card. While the merchant claimed consent as legal ground, the DPA ruled that such consent could not be freely given and that it was therefore invalid.

A second example is the recent judgment of the Court of Justice of the European Union holding that a pre-ticked checkbox cannot be considered an active, unambiguous consent of the user. The consent, required under the ePrivacy Directive for the use of tracking cookies, was therefore invalid.

The impact of such invalidation should not be underestimated, as it leaves you without a valid legal ground and, thus, no way to continue the processing of personal data. If you need the personal data for your core business processes, the operational consequences can be enormous.

So how can you fortify this weak link? Make sure that you can demonstrate that users have a real choice and are fully in control when providing consent. This is a crucial step both for the validity of the consent and the fairness of the processing.

Consent without such choice or control can never be solid, and you just can’t build a castle on quicksand and expect it not to sink.

On October 1, 2019, the Court of Justice of the European Union (CJEU) issued a final ruling in the Planet49 case (case C-673/17).

Following a request for preliminary ruling from the German Federal Court of Justice, the Bundesgerichtshof, the CJEU interpreted the consent requirement of Directive 2002/58/EC, as amended by Directive 2009/136/EC (hereafter the “e-Privacy Directive”) in light of former Directive 95/46/EC (hereafter the “Data Protection Directive”) as well as in light of its successor – the General Data Protection Regulation (GDPR).

The Court made it clear that the placing and reading of tracking cookies on a user’s terminal equipment requires an active and unambiguous consent of the user. A pre-ticked checkbox does not meet these requirements and therefore does not constitute a valid consent. Also, the Court underlined that consent must be specific. In the case at hand, the act of selecting a button to participate in a promotional online lottery cannot be construed as consent of the user to the storage of cookies.

Moreover, the Court clarified that these consent requirements for the use of cookies apply regardless of whether the information stored or consulted on the user’s device constitutes “personal data.”

Finally, the Court held that cookie consent must be “informed” as per the GDPR, which means that service providers must also provide information on the duration of the operation of cookies, as well as in relation to any third party access to those cookies.
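The Court’s cumulative criteria (active and unambiguous, specific, and informed) can be illustrated with a short sketch. The record shape and field names below are purely hypothetical, chosen for illustration; they do not come from the judgment, any statute, or any standard.

```typescript
// Hypothetical shape of a stored consent record; field names are illustrative.
interface ConsentRecord {
  checkboxPreTicked: boolean;         // was the box ticked before the user acted?
  userActivelyTicked: boolean;        // did the user perform an affirmative act?
  purposeSpecific: boolean;           // was consent requested for cookies specifically?
  cookieDurationDisclosed: boolean;   // duration of operation of the cookies disclosed?
  thirdPartyAccessDisclosed: boolean; // third-party access to the cookies disclosed?
}

// Returns the Planet49 criteria a consent record fails, or [] if it fails none.
function planet49Failures(c: ConsentRecord): string[] {
  const failures: string[] = [];
  if (c.checkboxPreTicked || !c.userActivelyTicked) {
    failures.push("not active/unambiguous");
  }
  if (!c.purposeSpecific) {
    failures.push("not specific");
  }
  if (!c.cookieDurationDisclosed || !c.thirdPartyAccessDisclosed) {
    failures.push("not informed");
  }
  return failures;
}
```

On this logic, the Planet49 facts fail twice: the checkbox was pre-ticked (not active), and the affirmative act (selecting the lottery button) related to a different purpose (not specific).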


Executive summary

On September 17, 2019, the Belgian Data Protection Authority (DPA) issued a fine of EUR 10,000 for a breach of the General Data Protection Regulation (GDPR). The case related to a merchant who required the use of an electronic identity card as the sole means for the issuance of loyalty cards.

The DPA found that this practice did not comply with GDPR’s standards on (a) data minimization, as the electronic identity card contains much more information about the holder than is necessary for the purposes of creating a loyalty card; and (b) consent, because customers were not offered a real choice on whether they should provide access to the data on their electronic identity card in exchange for a loyalty card. As a result, the customers’ consent was not considered as freely given and therefore invalid.

The DPA also found that the merchant had not done enough to inform customers about its data processing activities, thereby violating its information duties under the GDPR.


On July 29, 2019, the Court of Justice of the European Union (CJEU) issued its decision in the Fashion ID case, a case referred to it by a German court. In this blog post we focus on what this case means for joint controllership when you have social media plug-ins on your website.

Background to the Fashion ID case

The Fashion ID case was brought before the CJEU by means of a reference for preliminary ruling by the Higher Regional Court of Düsseldorf, Germany (Oberlandesgericht Düsseldorf).

The national case concerned a dispute between Fashion ID GmbH & Co. KG and Verbraucherzentrale NRW eV about Fashion ID’s embedding of a social plugin provided by Facebook Ireland Ltd on the website of Fashion ID.

The case was referred to the CJEU in January 2017, i.e., before the General Data Protection Regulation became applicable on May 25, 2018, and it was assessed with reference to the then applicable Directive 95/46. Nonetheless, the Court’s findings remain relevant.

Questions for preliminary ruling

In order to decide on the case, the Higher Regional Court of Düsseldorf referred the following questions to the CJEU.

  1. Do the rules in Articles 22, 23 and 24 of Directive [95/46] preclude national legislation which, in addition to the powers of intervention conferred on the data-protection authorities and the remedies available to the data subject, grants public-service associations the power to take action against the infringer in the event of an infringement in order to safeguard the interests of consumers?
  2. If Question 1 is answered in the negative: In a case such as the present one, in which someone has embedded a programming code in his website which causes the user’s browser to request content from a third party and, to this end, transmits personal data to the third party, is the person embedding the content the “controller” within the meaning of Article 2(d) of Directive [95/46] if that person is himself unable to influence this data-processing operation?
  3. If Question 2 is answered in the negative: Is Article 2(d) of Directive [95/46] to be interpreted as meaning that it definitively regulates liability and responsibility in such a way that it precludes civil claims against a third party who, although not a “controller”, nonetheless creates the cause for the processing operation, without influencing it?
  4. Whose “legitimate interests”, in a situation such as the present one, are the decisive ones in the balancing of interests to be undertaken pursuant to Article 7(f) of Directive [95/46]? Is it the interests in embedding third-party content or the interests of the third party?
  5. To whom must the consent to be declared under Articles 7(a) and 2(h) of Directive [95/46] be given in a situation such as that in the present case?
  6. Does the duty to inform under Article 10 of Directive [95/46] also apply in a situation such as that in the present case to the operator of the website who has embedded the content of a third party and thus creates the cause for the processing of personal data by the third party?

Answers given by the CJEU

The questions for preliminary ruling were answered as follows:

  1. Articles 22 to 24 of Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data must be interpreted as not precluding national legislation which allows consumer-protection associations to bring or defend legal proceedings against a person allegedly responsible for an infringement of the protection of personal data.
  2. The operator of a website, such as Fashion ID GmbH & Co. KG, that embeds on that website a social plugin causing the browser of a visitor to that website to request content from the provider of that plugin and, to that end, to transmit to that provider personal data of the visitor can be considered to be a controller, within the meaning of Article 2(d) of Directive 95/46. That liability is, however, limited to the operation or set of operations involving the processing of personal data in respect of which it actually determines the purposes and means, that is to say, the collection and disclosure by transmission of the data at issue.
  3. In a situation such as that at issue in the main proceedings, in which the operator of a website embeds on that website a social plugin causing the browser of a visitor to that website to request content from the provider of that plugin and, to that end, to transmit to that provider personal data of the visitor, it is necessary that that operator and that provider each pursue a legitimate interest, within the meaning of Article 7(f) of Directive 95/46, through those processing operations in order for those operations to be justified in respect of each of them.
  4. Articles 2(h) and 7(a) of Directive 95/46 must be interpreted as meaning that, in a situation such as that at issue in the main proceedings, in which the operator of a website embeds on that website a social plugin causing the browser of a visitor to that website to request content from the provider of that plugin and, to that end, to transmit to that provider personal data of the visitor, the consent referred to in those provisions must be obtained by that operator only with regard to the operation or set of operations involving the processing of personal data in respect of which that operator determines the purposes and means. In addition, Article 10 of that directive must be interpreted as meaning that, in such a situation, the duty to inform laid down in that provision is incumbent also on that operator, but the information that the latter must provide to the data subject need relate only to the operation or set of operations involving the processing of personal data in respect of which that operator actually determines the purposes and means.

Implications of the case as regards joint controllership

The decision contains interesting clarifications with regard to joint controllership that go far beyond the specifics of the case. Even though the case was assessed under Directive 95/46, it remains relevant for the interpretation of the notion of joint controller under the GDPR. Indeed, the definition of controller under Directive 95/46 already referred to ‘joint’ determination of the purposes and the means, and that definition remains unchanged under the GDPR.

What we can take away from this decision is the following:

  • An entity can be considered a joint controller without having access to the personal data concerned.

Fashion ID argued that it could not be considered a controller as it does not have influence “either over the data transmitted by the visitor’s browser from its website or over whether and, where applicable, how Facebook Ireland uses those data”.

The CJEU does not follow this argument. With reference to earlier case-law (C-210/16 and C-25/17) it states that “the joint responsibility of several actors for the same processing (…) does not require each of them to have access to the personal data concerned”.

This reasoning also applies to a single controller. Access to personal data is not a criterion to determine whether someone is a controller. The only relevant criterion is whether the entity determines the purposes and the means.

  • The liability/responsibility of the joint controller is limited to the processing activity for which it determines the purposes and the means and depends on the degree of involvement in the processing.

The CJEU recognizes that the degree of involvement of a joint controller may vary and that joint controllers are not necessarily involved in all stages of the relevant processing activity. As a consequence, the level of liability must be assessed on the basis of the specific circumstances of the case.

Furthermore, and this goes without saying, an entity will only be considered a controller, and hence be responsible/liable, for the processing activities for which it determines the purposes and the means. If the processing activity is part of a larger chain of preceding and subsequent processing activities for which the entity does not determine the purposes and the means, the entity will not be considered a controller for those preceding and subsequent processing activities.

  • Joint controllership may be inferred from a mutual commercial benefit in data sharing.

It appears that a mutual commercial benefit in data sharing may trigger joint controllership. In its decision, the CJEU expressly refers to the mutual commercial benefit of Fashion ID and Facebook: “As to the purposes of those operations involving the processing of personal data, it appears that Fashion ID’s embedding of the Facebook ‘Like’ button on its website allows it to optimise the publicity of its goods by making them more visible on the social network Facebook when a visitor to its website clicks on that button. The reason why Fashion ID seems to have consented, at least implicitly, to the collection and disclosure by transmission of the personal data of visitors to its website by embedding such a plugin on that website is in order to benefit from the commercial advantage consisting in increased publicity for its goods; those processing operations are performed in the economic interests of both Fashion ID and Facebook Ireland, for whom the fact that it can use those data for its own commercial purposes is the consideration for the benefit to Fashion ID”.

  • Each joint controller needs a legitimate ground for processing personal data.

It follows from the CJEU’s answer to the fourth question that both Fashion ID and Facebook need to demonstrate a legitimate interest. In other words, each controller in a joint controllership needs to have a valid legal ground for the processing activities performed as a joint controller. This makes perfect sense. The obligation only to process personal data on the basis of one of the legal grounds exhaustively listed in the law rests with the individual controller. There is no exception in the case of joint controllership.

  • Where consent is required, the joint controller who first enters into contact with the data subject must request consent. The same goes for the information required.

Where consent is required, it must be obtained before the start of the processing. Similarly, pursuant to the information requirement under Directive 95/46, the data subject must receive information about the processing of personal data at the latest at the time of collection of the personal data.

Given these time constraints, the consent and information requirements must be fulfilled by the joint controller who first enters into contact with the data subject. In the case at hand, this was the website operator, Fashion ID.

Under the GDPR, the roles and responsibilities having an impact on compliance with these requirements would typically need to be determined in the joint controller arrangement referred to in Article 26.

Practical guidance for website operators using social media plugins

As you will be considered a joint controller with the provider of the social media plugins, it is recommended that you seek assurance regarding the provider’s GDPR compliance and ensure that:

  • Your privacy policy contains appropriate information on data collection and data sharing via the social media plugins;
  • You request consent where tracking mechanisms are used;
  • You comply with any terms and conditions of the social media plugin provider; and
  • The agreement with the social media plugin provider contains appropriate co-controllership arrangements.
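One common way to implement the consent point in practice is to defer loading of the plugin script until consent exists, since the transmission of personal data to the provider begins as soon as the visitor’s browser requests the plugin content. The sketch below is illustrative only; the consent-store interface and the purpose name are assumptions, not any provider’s actual API.

```typescript
// Minimal consent-store abstraction (hypothetical; not a real provider API).
type ConsentStore = { hasConsent(purpose: string): boolean };

// Returns the plugin script URL to inject, or null when consent is absent,
// so that no request (and thus no data transmission to the plugin provider)
// occurs before the visitor has consented.
function socialPluginSrc(store: ConsentStore, pluginUrl: string): string | null {
  return store.hasConsent("social-media-plugins") ? pluginUrl : null;
}
```

In a real page, the `<script>` element would only be created when this returns a URL; the widely used “two-click” pattern shows a static placeholder button first and loads the genuine plugin only after the visitor clicks it.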

On August 8, 2019, the U.S. Court of Appeals for the Ninth Circuit issued yet another decision adopting relaxed standing requirements in privacy litigation, this time in a decision permitting a plaintiff to pursue claims under Illinois’s Biometric Information Privacy Act (BIPA). In Patel v. Facebook, the Ninth Circuit rejected arguments from Facebook Inc. (Facebook) that claims under the BIPA require assertions of real-world harm, and that BIPA claims only apply to conduct within Illinois. The ruling creates a circuit split on the standard for establishing Article III standing in BIPA litigation, which could prompt the U.S. Supreme Court to take up the issue.


The National Institute of Standards and Technology (“NIST”) has extended the comment period on its recently released draft documents, NIST SP 800-171 Revision 2 and NIST SP 800-171B. The comment period for both NIST SP 800-171 Revision 2 and NIST SP 800-171B was initially open until July 19, 2019. It was recently extended to August 2, 2019.

NIST SP 800-171 Revision 2 contains only minor editorial revisions from the previous version and does not make any changes to the basic and derived security requirements outlined in Chapter Three. In comparison, NIST SP 800-171B contains new security recommendations for protecting Controlled Unclassified Information (“CUI”) in nonfederal systems and organizations where there is a higher than usual risk of exposure. The risk of exposure is heightened when CUI is part of a high value asset (“HVA”) or a critical program because it can become a target for sophisticated adversaries. In recent years, attacks on these HVAs and critical programs have increased, thus spurring the Department of Defense to ask NIST for greater protections. The resulting NIST SP 800-171B is intended to be implemented in addition to the basic requirements laid out in NIST SP 800-171. These enhanced requirements are only applicable for a nonfederal system or organization when mandated by a federal agency in a contract, grant, or other agreement.

Oregon has recently passed a new cybersecurity statute, joining California in requiring manufacturers of “connected devices” to equip qualifying technology with “reasonable security features.” The new law will enter into force on January 1, 2020. For further analysis, visit our recent client alert.

The European Union’s (“EU”) General Data Protection Regulation (“GDPR”) turned one year old on May 25th. European data protection regulators celebrated by continuing to work through a rising number of complaints and infractions, and by stepping up their monitoring for violations. US companies are directly in the crosshairs. Whether based in the EU or not, a company is potentially subject to the GDPR (and its stiff fines up to 4% of annual global revenue) if it offers goods or services to data subjects located in the EU, or monitors individuals’ online behavior or personal information in the EU. This means that a US company engaged in the common business practice of collecting data from its EU customers must assess and implement business practices to ensure GDPR compliance.

The US and EU engaged in approximately $1.3 trillion in trade last year. With that level of economic activity, and accompanying data flows, many US companies should already have in place the basic structures for GDPR compliance. However, recent surveys suggest that a significant number of companies impacted by the GDPR are still grappling with compliance. In a recent Forrester Research study, “Security Through Simplicity,” over half of the responding IT decision-makers revealed that their companies had not yet carried out even basic GDPR compliance steps such as vetting third-party vendors, hiring data protection officers, training employees, setting up mechanisms for the “72-hour data breach notification” requirement, and collecting evidence and documenting efforts to address GDPR compliance risks. Further, only about 4,650 US companies are currently registered and self-certified with the EU-US Privacy Shield framework (compared to the over 100,000 mid- to large-sized companies in the US, according to business census data). Such certification goes a long way toward permitting a US company to receive certain EU data in a GDPR compliant manner.

The District of Columbia Bar Rules of Professional Conduct Review Committee (“Committee”) recently released recommended changes to D.C. Bar rules 1.1, 1.6, and 4.4 to address the increased focus and evolving landscape of E-Discovery and technology in law. All D.C. practitioners should take notice of these potential rule changes, and ensure they stay current—or engage those with appropriate expertise—on these quickly changing areas of practice.


As the country’s new Congress settles into its term, several technology issues are coming to the forefront. A number of Senators recently questioned the Department of Justice over how it is collecting cellphone-location data in the wake of the Supreme Court’s landmark Carpenter decision. Carpenter v. United States, 138 S. Ct. 2206 (2018). The House of Representatives is considering a renewed version of legislation that would strengthen the security of “Internet of Things” technologies used by the federal government. And politicians and pundits throughout Capitol Hill are asking whether this will be the year that comprehensive federal privacy legislation becomes law. As it turns out though, some of the nation’s top courts are already tackling these tough issues. In fact, the Seventh Circuit’s opinion last year in Naperville Smart Meter Awareness v. City of Naperville, 900 F.3d 521 (7th Cir. 2018), has received relatively little reporting, but its impact will be broad when it comes to how courts interpret the Fourth Amendment in the era of big data.

In Naperville, the Seventh Circuit heard an appeal concerning the city’s “smart meter” program. Without residents’ permission, Naperville had been replacing traditional energy meters on its grid with “smart meters” for homes. Each smart meter collected thousands of readings a month, as opposed to just the previous single monthly readings. According to the plaintiffs, the repeated readings of the smart meters collected data at such a granular level that they revealed what appliances were present in homes and when they were used. Considering the potential privacy impact, the Seventh Circuit found that Naperville’s collection of smart meter data from residents’ homes constituted a “search” under the Fourth Amendment.