Crowell & Moring has released Litigation Forecast 2020: What Corporate Counsel Need to Know for the Coming Year. The eighth annual Forecast provides forward-looking insights from leading Crowell & Moring lawyers to help legal departments anticipate and respond to challenges that might arise in the year ahead.

For 2020, the Forecast focuses on how the digital revolution is giving rise to new litigation risks, and it explores trends in employment non-competes, the future of stare decisis, the role of smartphones in investigations and litigation, and more.

The cover story, “A Tangled Web: How the Internet of Things and AI Expose Companies to Increased Tort, Privacy, and Cybersecurity Litigation,” explores how the digital revolution is transforming not only high-tech companies but also traditional industries, whose products, business models, and workforces are being affected by increased connectivity, artificial intelligence, and the ability to gather and use tremendous amounts of data.

Be sure to follow the conversation on Twitter with #LitigationForecast.

On January 13, 2020, U.S. District Court Judge Castel of the Southern District of New York in SEC v. Telegram Group Inc. et al., No. 19 Civ. 9439 (PKC) granted the motion of the U.S. Securities and Exchange Commission (“SEC”) to compel Telegram Group Inc., a technology company best known for its secure messaging app, to produce overseas bank records (Dkt. 67). The SEC had sought these records “fully unredacted” on an expedited basis in support of its claim that Telegram engaged in an unregistered securities offering (Dkt. 52). Telegram objected to any production, asserting that the records were of questionable relevance, that they contained banking and personal information protected by a host of foreign laws, and that it would be unduly burdensome “to cull through these records and redact the personal information of non-U.S. persons and entities subject to foreign data privacy law protections.” (Dkt. 55). In a short decision, the Court ordered Telegram to produce the records on a tight timeline, holding that “[o]nly redactions necessitated by foreign privacy laws shall be permitted, and a log stating the basis for any redaction shall be produced at the same time the redacted documents are produced.”

There are a few key takeaways from this decision. First, the Court recognized foreign data privacy laws as legitimate grounds for withholding otherwise discoverable information. Defendant was not given a blank check to redact; rather, the Court required Telegram to log the basis for any privacy assertions, and one can expect the SEC will closely question Telegram on the redactions. At the same time, the Court clearly did not agree with the SEC’s characterization of data privacy laws as “blocking statutes” to be ignored, and was not swayed by the SEC’s complaints that Telegram had not shown that such laws require deference. This is consistent with the generally heightened sensitivity to data privacy and data security interests observed in the U.S. and abroad.

Judge Castel’s approach represents a change from U.S. courts’ prior dismissive treatment of similar disclosure objections. Courts traditionally would apply a multi-factor comity analysis that generally prioritized U.S. discovery interests over those of conflicting foreign laws and ultimately required unredacted production. See, e.g., Laydon v. Mizuho Bank, Ltd., 183 F. Supp. 3d 409 (S.D.N.Y. 2016) (requiring unredacted production of data protected by the then-applicable EU privacy law, Directive 95/46/EC, based on the comity analysis set out in Société Nationale Industrielle Aerospatiale v. U.S. Dist. Court for S. Dist. of Iowa, 482 U.S. 522, 544 n.29 (1987) (hereinafter “Aerospatiale”)). Certainly, the SEC pushed for the customary approach, but Judge Castel appears implicitly to have resolved in short form (or skipped over) the Aerospatiale comity analysis and accepted the legitimacy of foreign restrictions on disclosure in U.S. proceedings.

Continue Reading Burden of Compliance With Foreign Data Privacy Laws Does Not Justify Withholding of Banking Records

On January 1, 2020, California’s landmark privacy law, the California Consumer Privacy Act (CCPA), took effect. The CCPA imposes various obligations on covered businesses and provides extensive rights to consumers with respect to controlling the collection and use of their personal information. While some companies have largely completed their CCPA compliance efforts, many others are still digesting the CCPA and draft proposed regulations, and taking steps to meet the CCPA’s myriad compliance obligations.

Confusion persists about how businesses can comply with certain provisions of the CCPA. In October 2019, the California Attorney General issued proposed regulations that provide guidance on a number of key areas, but the regulations are not yet final. If the regulations are adopted, violations of them will be treated the same as violations of the CCPA itself, subject to the same penalties. We have summarized the proposed regulations in previous alerts.

Comments on the proposed regulations can be viewed here.

Continue Reading California’s Landmark Privacy Law Now in Effect

GN Netcom, Inc. v. Plantronics, Inc., 930 F.3d 76 (3d Cir. 2019)

The Third Circuit’s decision in GN Netcom illustrates how Federal Rule of Civil Procedure 37(e) has elevated the bar for obtaining a default judgment based on spoliation, raising the question of what level of egregious conduct would justify that penalty. The decision also is notable for its exploration of the evidentiary support that aggrieved parties should be permitted to submit when the lesser penalty of a permissive adverse inference instruction is ordered. In a split decision, the appellate court granted a new trial because plaintiff’s expert was precluded from testifying as to the degree of spoliation, which might have impacted the outcome of the case.

Defendant’s Spoliation of Evidence

Continue Reading Prohibition on Expert Testimony Results in New Trial

This time of year, everything tends to be more scary and spooky, but one thing doesn’t have to be – creating a defensible privilege log! Creating a privilege log can be one of the most time-consuming, labor-intensive, and expensive parts of litigation. The last thing you want is to have to spend additional time and money defending or redoing work on your privilege log.

Federal Rule of Civil Procedure 26(b)(5) only requires that the party withholding material based on a claim of privilege “(i) expressly make the claim; and (ii) describe the nature of the documents, communications, or tangible things not produced or disclosed – and do so in a manner that, without revealing information itself privileged or protected, will enable other parties to assess the claim.” Although this seems simple enough, in practice this can actually be more trick than treat.

Here are some things to keep in mind when creating a privilege log to help make it more defensible and less likely to require spending additional time and money on extensive revisions to the privilege log entries.

Continue Reading Tips For Making Privilege Logs Less Scary

Consent is only one of the six legal grounds for processing personal data under the GDPR, but it is certainly the best known. While it might look safe and solid at first sight, it is becoming the weakest link in the GDPR compliance chain.

First, consent can be withdrawn at any time, and the process for withdrawal must be as easy as the process for providing consent. Thus, a system built only on consent can fall apart quite quickly.

More importantly, consent can be considered invalid at any time, in which case the breakdown is immediate.

One example of consent being invalidated involves a Belgian retailer that required the use of a customer’s e-ID as a prerequisite for the issuance of a loyalty card. While the merchant claimed consent as its legal ground, the Belgian Data Protection Authority (DPA) ruled that such consent could not be freely given and was therefore invalid.

A second example is the recent judgment of the Court of Justice of the European Union holding that a pre-ticked checkbox cannot be considered active, unambiguous consent of the user. The consent, which the ePrivacy rules require for the use of tracking cookies, was therefore invalid.

The impact of such invalidation should not be underestimated, as it leaves you without a valid legal ground and, thus, no way to continue the processing of personal data. If you need the personal data for your core business processes, the operational consequences can be enormous.

So how can you fortify this weak link? Make sure that you can demonstrate that users have a real choice and are fully in control when providing consent. This is a crucial step both for the validity of the consent and the fairness of the processing.

Consent without such choice or control can never be solid, and you just can’t build a castle on quicksand and expect it not to sink.

On October 1, 2019, the Court of Justice of the European Union (CJEU) issued a final ruling in the Planet49 case (case C-673/17 – available here).

Following a request for a preliminary ruling from the German Federal Court of Justice, the Bundesgerichtshof, the CJEU interpreted the consent requirement of Directive 2002/58/EC, as amended by Directive 2009/136/EC (hereafter the “e-Privacy Directive”), in light of the former Directive 95/46/EC (hereafter the “Data Protection Directive”) as well as in light of its successor, the General Data Protection Regulation (GDPR).

The Court made it clear that the placing and reading of tracking cookies on a user’s terminal equipment requires an active and unambiguous consent of the user. A pre-ticked checkbox does not meet these requirements and therefore does not constitute a valid consent. Also, the Court underlined that consent must be specific. In the case at hand, the act of selecting a button to participate in a promotional online lottery cannot be construed as consent of the user to the storage of cookies.

Moreover, the Court clarified that these requirements regarding the consent of the user for usage of cookies are applicable regardless of whether the information stored or consulted on the user’s device constitutes “personal data.”

Finally, the Court held that cookie consent must be “informed” as per the GDPR, which means that service providers must also provide information on the duration of the operation of cookies, as well as in relation to any third party access to those cookies.
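To make these requirements concrete, here is a minimal browser-side sketch, in TypeScript, of a cookie consent flow that would reflect the “active and unambiguous” standard as we read the judgment. The element IDs, cookie name, and values are hypothetical, and the sketch is illustrative rather than legal or implementation advice.

```typescript
// Minimal sketch of Planet49-style "active and unambiguous" cookie consent.
// All element IDs, the cookie name, and its value are hypothetical.
const consentBox = document.getElementById("tracking-consent") as HTMLInputElement;
const saveButton = document.getElementById("save-consent") as HTMLButtonElement;

// The checkbox is never pre-ticked: consent requires an affirmative act.
consentBox.checked = false;

saveButton.addEventListener("click", () => {
  if (consentBox.checked) {
    // The tracking cookie is only placed after the user has actively opted in.
    // Per the Court, the user must also be informed of the cookie's duration
    // and of any third parties that can access it.
    const thirtyDays = 60 * 60 * 24 * 30;
    document.cookie = `tracking_id=example; max-age=${thirtyDays}; path=/`;
  }
  // If the box is left unchecked, no tracking cookie is set at all.
});
```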

The facts

Continue Reading Court of Justice of the European Union Finds that Pre-Ticked Checkboxes Are Not Valid Consents under GDPR

Executive summary

On September 17, 2019, the Belgian Data Protection Authority (DPA) issued a fine of EUR 10,000 for a breach of the General Data Protection Regulation (GDPR). The case related to a merchant who required the use of an electronic identity card as the sole means for the issuance of loyalty cards.

The DPA found that this practice did not comply with the GDPR’s standards on (a) data minimization, as the electronic identity card contains much more information about the holder than is necessary for the purpose of creating a loyalty card; and (b) consent, because customers were not offered a real choice as to whether to provide access to the data on their electronic identity card in exchange for a loyalty card. As a result, the customers’ consent was not considered freely given and was therefore invalid.

The DPA also found that the merchant had not done enough to inform customers about its data processing activities, thereby violating its information duties under the GDPR.

The facts

Continue Reading Belgian Data Protection Authority Finds Merchant Violated GDPR by Requiring Customers to Provide Electronic ID to Receive Loyalty Card

On July 29, 2019, the Court of Justice of the European Union (CJEU) issued a decision in the Fashion ID case, a case referred to it by a German court. In this blog post we will focus on what this case means with regard to joint controllership when you have social media plug-ins on your website. To go directly to the section on the implications of this case, please click here.

Background to the Fashion ID case

The Fashion ID case was brought before the CJEU by means of a reference for preliminary ruling by the Higher Regional Court of Düsseldorf, Germany (Oberlandesgericht Düsseldorf).

The national case concerned a dispute between Fashion ID GmbH & Co. KG and Verbraucherzentrale NRW eV about Fashion ID’s embedding of a social plugin provided by Facebook Ireland Ltd on Fashion ID’s website.

The case was referred to the CJEU in January 2017, i.e., before the General Data Protection Regulation became applicable on May 25, 2018, and it was assessed with reference to the then-applicable Directive 95/46. Nonetheless, the Court’s findings remain relevant.
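For readers less familiar with how such plugins work technically, the sketch below (in TypeScript, with a placeholder third-party URL; it is not Facebook’s actual embed code) illustrates why embedding a social plugin already involves a transfer of personal data: the visitor’s browser, not the website operator’s server, fetches the plugin from the provider’s domain, and that request carries the visitor’s IP address and any cookies the provider has previously set, whether or not the visitor clicks the button or even has an account with the provider.

```typescript
// Illustrative sketch only -- not Facebook's actual embed code.
// A typical social plugin embed injects a script served from the provider's
// own domain into the operator's page.
const plugin = document.createElement("script");
plugin.src = "https://social-plugin.example/sdk.js"; // placeholder third-party URL
plugin.async = true;
document.body.appendChild(plugin);

// The moment the browser requests sdk.js, the visitor's IP address, user agent,
// and any cookies previously set for that third-party domain are transmitted to
// the plugin provider. What happens to those data afterwards is determined by
// the provider's code, which the embedding website operator cannot influence.
```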

Questions for preliminary ruling

In order to decide on the case, the Higher Regional Court of Düsseldorf referred the following questions to the CJEU.

  1. Do the rules in Articles 22, 23 and 24 of Directive [95/46] preclude national legislation which, in addition to the powers of intervention conferred on the data-protection authorities and the remedies available to the data subject, grants public-service associations the power to take action against the infringer in the event of an infringement in order to safeguard the interests of consumers?
  2. If Question 1 is answered in the negative: In a case such as the present one, in which someone has embedded a programming code in his website which causes the user’s browser to request content from a third party and, to this end, transmits personal data to the third party, is the person embedding the content the “controller” within the meaning of Article 2(d) of Directive [95/46] if that person is himself unable to influence this data-processing operation?
  3. If Question 2 is answered in the negative: Is Article 2(d) of Directive [95/46] to be interpreted as meaning that it definitively regulates liability and responsibility in such a way that it precludes civil claims against a third party who, although not a “controller”, nonetheless creates the cause for the processing operation, without influencing it?
  4. Whose “legitimate interests”, in a situation such as the present one, are the decisive ones in the balancing of interests to be undertaken pursuant to Article 7(f) of Directive [95/46]? Is it the interests in embedding third-party content or the interests of the third party?
  5. To whom must the consent to be declared under Articles 7(a) and 2(h) of Directive [95/46] be given in a situation such as that in the present case?
  6. Does the duty to inform under Article 10 of Directive [95/46] also apply in a situation such as that in the present case to the operator of the website who has embedded the content of a third party and thus creates the cause for the processing of personal data by the third party?

Answers given by the CJEU

The questions for preliminary ruling were answered as follows:

  1. Articles 22 to 24 of Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data must be interpreted as not precluding national legislation which allows consumer-protection associations to bring or defend legal proceedings against a person allegedly responsible for an infringement of the protection of personal data.
  2. The operator of a website, such as Fashion ID GmbH & Co. KG, that embeds on that website a social plugin causing the browser of a visitor to that website to request content from the provider of that plugin and, to that end, to transmit to that provider personal data of the visitor can be considered to be a controller, within the meaning of Article 2(d) of Directive 95/46. That liability is, however, limited to the operation or set of operations involving the processing of personal data in respect of which it actually determines the purposes and means, that is to say, the collection and disclosure by transmission of the data at issue.
  3. In a situation such as that at issue in the main proceedings, in which the operator of a website embeds on that website a social plugin causing the browser of a visitor to that website to request content from the provider of that plugin and, to that end, to transmit to that provider personal data of the visitor, it is necessary that that operator and that provider each pursue a legitimate interest, within the meaning of Article 7(f) of Directive 95/46, through those processing operations in order for those operations to be justified in respect of each of them.
  4. Articles 2(h) and 7(a) of Directive 95/46 must be interpreted as meaning that, in a situation such as that at issue in the main proceedings, in which the operator of a website embeds on that website a social plugin causing the browser of a visitor to that website to request content from the provider of that plugin and, to that end, to transmit to that provider personal data of the visitor, the consent referred to in those provisions must be obtained by that operator only with regard to the operation or set of operations involving the processing of personal data in respect of which that operator determines the purposes and means. In addition, Article 10 of that directive must be interpreted as meaning that, in such a situation, the duty to inform laid down in that provision is incumbent also on that operator, but the information that the latter must provide to the data subject need relate only to the operation or set of operations involving the processing of personal data in respect of which that operator actually determines the purposes and means.

Implications of the case as regards joint controllership

The decision contains interesting clarifications regarding joint controllership that go far beyond the specifics of the case. Even though the case was assessed under Directive 95/46, it is relevant for the interpretation of the notion of joint controller under the GDPR. Indeed, the definition of controller under Directive 95/46 already referred to ‘joint’ determination of the purposes and the means, and that definition remains unchanged under the GDPR.

What we can take away from this decision is the following:

  • An entity can be considered a joint controller without having access to the personal data concerned.

Fashion ID argued that it could not be considered a controller as it does not have influence “either over the data transmitted by the visitor’s browser from its website or over whether and, where applicable, how Facebook Ireland uses those data.”

The CJEU does not follow this argument. With reference to earlier case law (C-210/16 and C-25/17), it states that “the joint responsibility of several actors for the same processing (…) does not require each of them to have access to the personal data concerned”.

This reasoning also applies to a single controller. Access to personal data is not a criterion to determine whether someone is a controller. The only relevant criterion is whether the entity determines the purposes and the means.

  • The liability/responsibility of the joint controller is limited to the processing activity for which it determines the purposes and the means and depends on the degree of involvement in the processing.

The CJEU recognizes that the degree of involvement of a joint controller may vary and that joint controllers are not necessarily involved in all stages of the relevant processing activity. As a consequence, the level of liability must be assessed on the basis of the specific circumstances of the case.

Furthermore, and this goes without saying, an entity will only be considered a controller, and hence be responsible/liable, for the processing activities for which it determines the purposes and the means. If the processing activity is part of a larger chain of preceding and subsequent processing activities for which the entity does not determine the purposes and the means, the entity will not be considered a controller for those preceding and subsequent processing activities.

  • Joint controllership may be inferred from a mutual commercial benefit in data sharing.

It appears that a mutual commercial benefit in data sharing may trigger joint controllership. In its decision, the CJEU expressly refers to the mutual commercial benefit of Fashion ID and Facebook: “As to the purposes of those operations involving the processing of personal data, it appears that Fashion ID’s embedding of the Facebook ‘Like’ button on its website allows it to optimise the publicity of its goods by making them more visible on the social network Facebook when a visitor to its website clicks on that button. The reason why Fashion ID seems to have consented, at least implicitly, to the collection and disclosure by transmission of the personal data of visitors to its website by embedding such a plugin on that website is in order to benefit from the commercial advantage consisting in increased publicity for its goods; those processing operations are performed in the economic interests of both Fashion ID and Facebook Ireland, for whom the fact that it can use those data for its own commercial purposes is the consideration for the benefit to Fashion ID”.

  • Each joint controller needs a legitimate ground for processing personal data.

It follows from the CJEU’s answer to the fourth question that both Fashion ID and Facebook need to demonstrate a legitimate interest. In other words, each controller in a joint controllership needs to have a valid legal ground for the processing activities it performs as a joint controller. This makes perfect sense. The obligation to process personal data only on the basis of one of the legal grounds exhaustively listed in the law rests with each individual controller. There is no exception in the case of joint controllership.

  • Where consent is required, the joint controller who first enters into contact with the data subject must request consent. The same goes for the information required.

Where consent is required, it must be acquired before the start of the processing. Similarly, pursuant to the information requirement under Directive 95/46, the data subject must receive information about the processing of personal data at the latest at the time of collection of the personal data.

Given these time constraints, the consent and information requirements must be fulfilled by the joint controller who first enters into contact with the data subject. In the case at hand, this was the website operator, Fashion ID.

Under the GDPR, the roles and responsibilities having an impact on compliance with these requirements would typically need to be determined in the joint controller arrangement referred to in Article 26.

Practical guidance for website operators using social media plugins

As you will be considered a joint controller with the provider of the social media plugins, it is recommended that you seek assurance regarding the provider’s GDPR compliance and ensure that:

  • Your privacy policy contains appropriate information on data collection and data sharing via the social media plugins;
  • You request consent where tracking mechanisms are used (see the sketch after this list);
  • You comply with any terms and conditions of the social media plugin provider; and
  • The agreement with the social media plugin provider contains appropriate co-controllership arrangements.
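As a concrete illustration of the consent point, here is a minimal TypeScript sketch of loading a social media plugin only after the visitor has actively opted in, so that no data is transmitted to the plugin provider before consent is given. The element ID and plugin URL are placeholders, and the sketch is illustrative rather than a compliance blueprint.

```typescript
// Minimal sketch: the social plugin is only injected after an affirmative
// opt-in, so the visitor's browser sends nothing to the plugin provider
// before consent. The element ID and URL are placeholders.
const pluginConsent = document.getElementById("social-plugin-consent") as HTMLInputElement;

// Consent is never assumed: the box starts unchecked.
pluginConsent.checked = false;

pluginConsent.addEventListener("change", () => {
  if (pluginConsent.checked) {
    const script = document.createElement("script");
    script.src = "https://social-plugin.example/sdk.js"; // plugin provider's script (placeholder)
    script.async = true;
    document.body.appendChild(script);
  }
});
```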

On August 8, 2019, the U.S. Court of Appeals for the Ninth Circuit issued yet another decision adopting relaxed standing requirements in privacy litigation, this time in a decision permitting a plaintiff to pursue claims under Illinois’s Biometric Information Privacy Act (BIPA). In Patel v. Facebook, the Ninth Circuit rejected arguments from Facebook Inc. (Facebook) that claims under the BIPA require assertions of real-world harm, and that BIPA claims only apply to conduct within Illinois. The ruling creates a circuit split on the standard for establishing Article III standing in BIPA litigation, which could prompt the U.S. Supreme Court to take up the issue.

Background

Continue Reading Ninth Circuit Rejects Facebook’s Article III Argument; Biometric Lawsuit Will Proceed