This has not been a joyful winter for energy industry executives. They have repeatedly awoken to alerts that substations in the Northwest and Southeast have been physically attacked and that a major engineering firm was the subject of a ransomware cyberattack that may have compromised utility data.

Federal regulators are taking notice. On December 7, the Federal Energy Regulatory Commission (FERC) and the Department of Energy’s Office of Cybersecurity, Energy Security, and Emergency Response (CESER) held a joint technical conference to discuss supply chain risk management in light of increasing threats to the Bulk Power System. Multiple government participants identified the possible need to normalize the use of software bills of materials (SBOMs) and hardware bills of materials (HBOMs) in the electric industry. Several days later, FERC directed the North American Electric Reliability Corporation (NERC) to re-examine its Physical Security Reliability Standard, CIP-014-1. Congress, for its part, responded to growing cybersecurity threats to energy infrastructure by increasing CESER’s budget by almost 7.5% in the recent omnibus appropriations bill and appropriating $20 million for the Cyber Testing for Resilient Industrial Control Systems program.

Cybersecurity attacks on distributed energy resources (DERs), including electric vehicles, are also proliferating. In its recent report, Cybersecurity Considerations for Distributed Energy Resources on the U.S. Electric Grid, CESER identified the cybersecurity threat to DER operators, vendors, developers, owners and aggregators as a significant and growing risk. The Department of Energy will also soon release a report, mandated by Congress in the Infrastructure Investment and Jobs Act, identifying policies and procedures for enhancing the physical security and cybersecurity of distributed resources and the electric distribution system.

The recent physical and cybersecurity incidents targeting critical infrastructure have exposed significant vulnerabilities at some companies, and both customers and the federal government are pushing the private sector to mitigate those threats as a condition of doing business. The federal government, in particular, expects its private sector partners to adopt better security hygiene, assess supply chain risks, and prepare for quick responses to incidents, including rapid notifications to customers, regulators and the public. Here are some best practices for energy sector companies to have on their radar for 2023:

  • Compliance with NERC’s Critical Infrastructure Protection (CIP) Standards. Violations of applicable NERC CIP reliability standards subject users, owners and operators of bulk power system facilities to civil penalties of up to $1,496,035 per violation, per day.
  • Comprehensive Assessments of Key IT and OT Systems. Conducting comprehensive assessments of current and potential system vulnerabilities is a leading cybersecurity industry practice that energy sector companies may consider adopting. They can do so by, for example, maintaining a regular inventory of Information Technology (IT) and Operational Technology (OT) systems, assessing patch management processes, performing information security and physical risk assessments, and documenting and regularly reviewing system security plans and related operational documents.
  • Clear Roles and Responsibilities. Establishing clear cybersecurity-related roles and responsibilities can help position the enterprise to respond efficiently and effectively to cyber risk, for example by ensuring that corporate executives, the legal team, and key personnel such as the Chief Information Security Officer, the Chief Information Officer, the Chief Compliance Officer, and the Chief Privacy Officer are on notice of their respective roles and have clear guidance as to their duties both during “business as usual” operations and in the event of a potential cybersecurity incident.
  • Cybersecurity Incident Response Plans. Developing a cybersecurity Incident Response Plan (or “IRP”) is a leading cybersecurity industry practice and may even be a regulatory requirement for certain companies. IRPs are “playbooks” developed before a cybersecurity incident occurs to guide responsible stakeholders through the response to a potential incident in an organized and effective way. IRPs typically include key components, such as individuals’ and teams’ roles and responsibilities, contact lists, details about the internal escalation process (e.g., regarding notifications to government entities), and guideposts for technical teams. Companies may supplement their IRPs with supporting materials, for example checklists for key executives and personnel, and take steps to integrate their IRPs with other related policies, such as all-hazards crisis management plans and communications plans.
  • Cybersecurity Tabletop Exercises. Tabletop exercises are simulations designed to test a company’s response to a potential cybersecurity incident and its application of the Incident Response Plan. These exercises are often facilitated by counsel and conducted under privilege. Notably, the Ponemon Institute, in a report issued by IBM Security, reported that companies that had incident response teams and tested their plans with tabletop exercises or simulations incurred an average of $2.66 million less in data breach-related costs than those that did not.
  • Supply Chain Risk Mitigation. A company’s supply chain can heighten exposure to cyber threats, including data leaks, supply chain breaches, and malware attacks. Strategies to mitigate these risks are available, for example implementing protocols to continually assess and monitor third-party risk, understanding and controlling who has access to the company’s most valuable and sensitive data, and ensuring that third-party contracts include cybersecurity requirements. The federal government has acknowledged the importance of addressing such supply chain risk: Executive Order 14028 of 2021, Improving the Nation’s Cybersecurity, and a 2022 OMB Memorandum both impose standards on governmental entities for the security and integrity of the software supply chain, and both require third-party software suppliers to comply with standards issued by the National Institute of Standards and Technology (NIST) whenever their software is used on government information systems or affects government information, including information shared with government contractors.
  • Information Sharing Opportunities. Last March, Congress passed the Cyber Incident Reporting for Critical Infrastructure Act of 2022 (CIRCIA), which requires critical infrastructure entities to report significant cyber incidents and ransomware payments to the Cybersecurity & Infrastructure Security Agency (CISA) within tight time frames. Although CISA has not yet promulgated the rules to implement CIRCIA, it has provided stakeholders with guidance about sharing cyber event information that emphasizes the importance of information sharing to our collective defense and to strengthening the nation’s cybersecurity. In addition to federally mandated information sharing requirements, companies may also consider sharing information in a trusted setting, including with their Information Sharing and Analysis Centers (ISACs).


The European Commission launched the formal process to adopt an adequacy decision for the EU-U.S. Data Privacy Framework on December 13, 2022. The framework will replace the Privacy Shield, which was invalidated by the Court of Justice of the European Union’s (“CJEU”) Schrems II ruling on July 16, 2020 (CJEU C-311/18, discussed in this client alert). The draft adequacy decision aims to foster transatlantic data flows and to address the concerns raised in Schrems II. The draft adequacy decision is therefore important for businesses on both sides of the Atlantic.

An adequacy decision is a formal decision by the European Commission which recognizes a comparable level of personal data protection to that of the European Union in a non-EU country, territory, or international organization. As a result of such decision, personal data can flow freely and safely from the European Economic Area (“EEA”) to that recognized location without being subject to any further conditions or authorizations.

The EU’s proposal to launch a formal process to adopt an adequacy decision follows President Biden’s decision to sign an Executive Order in October 2022 introducing new binding safeguards that address the concerns raised in Schrems II. In Schrems II, the CJEU held that the U.S. Privacy Shield did not provide protection that was “essentially equivalent” to that of the EU because EU residents did not have effective remedies for privacy violations and because U.S. intelligence agencies’ access to the data was too broad. In response to the invalidation of the Privacy Shield, the Executive Order imposes limitations and safeguards on access to data by U.S. intelligence agencies and establishes an independent and impartial redress mechanism.

President Biden’s Executive Order forms an essential element of the draft adequacy decision and the European Commission’s assessment that the U.S. legal framework now ensures an adequate level of protection of personal data transferred from EU organizations to U.S. certified organizations.

More specifically, the European Commission considers that:

  • The EU-U.S. Data Privacy Framework Principles, including the Supplemental Principles, issued by the U.S. Department of Commerce (“Principles”, see annex I of the draft adequacy decision) ensure effective protection that is essentially equivalent to the protection guaranteed by the GDPR;
  • The effective application of the Principles is guaranteed by transparency obligations and the administration of the EU-U.S. Data Privacy Framework by the U.S. Department of Commerce;
  • The oversight mechanisms and redress avenues in U.S. law enable infringements of data protection rules to be identified and punished in practice and offer legal remedies to data subjects (including EU residents) to exercise their data subject rights; and that
  • Any interference by U.S. public authorities with the fundamental rights of data subjects in the public interest, particularly for criminal law enforcement and national security purposes, will be limited to what is necessary and proportionate to protect national security, and effective legal protection against such interference exists.

To benefit from the draft adequacy decision, U.S. companies will have to certify annually that they participate in the EU-U.S. Data Privacy Framework.

The draft adequacy decision will now be reviewed by the European Data Protection Board and by a committee composed of representatives of EU Member States under the comitology procedure. The European Parliament also has a right to scrutinize the draft adequacy decision. The European Commission can adopt the final version of the adequacy decision only after these stakeholders have given the draft a green light. Once the final decision is published, which is not expected before spring 2023, European companies will be able to rely on this framework for sharing data with certified companies in the U.S.

One final note: an adequacy decision is not the only mechanism to legitimize international data transfers. Companies can still rely on other transfer tools for transfers to the U.S., such as the standard contractual clauses for international data transfers adopted by the European Commission last year. The European Commission emphasizes that the safeguards the U.S. Government has put in place in the Executive Order, namely the limitations and safeguards on access to data by U.S. intelligence agencies, will be available for all EU transfers to U.S. organizations, regardless of the mechanism used for the specific transfer. Companies relying on the standard contractual clauses for their international transfers to the U.S. will consequently benefit from these provisions as well.

Crowell & Moring will continue to follow developments on these issues and provide ongoing updates.

On November 10, 2022, the European Parliament adopted a resolution on esports and video games. In this resolution the European Parliament calls on the Commission and the Council to acknowledge the value of the video game ecosystem as a major cultural and creative industry (“CCI”) with strong potential for further growth and innovation. The video game ecosystem has become a leading CCI all over the world, with an estimated European market size of EUR 23.3 billion in 2021, including more than 4,900 game studios and 200 game publishers. It has great potential for growth, innovation, creativity and triggering positive change for the whole CCI sector but, the resolution suggests, would benefit from the additional harmonized data, definitions and legal frameworks required to enable the sector to reach its full potential.

The European Parliament envisages a long-term European video game strategy, which should benefit all actors involved fairly and adequately, while considering the particularities of video game competitions in order to support EU actors and EU start-ups in the sector. The resolution notes that the European video game industry is mainly made up of small and medium-sized enterprises, which are of vital importance to the European economy. In 2020, the industry employed approximately 98,000 people in Europe, of whom only an estimated 20% are women. Getting more women into video games and esports is a strategic priority for the European Parliament.

Definition of esports

The resolution defines ‘esports’ as “competitions where individuals or teams play video games – typically in front of spectators – either in-person or online, for entertainment, prizes or money”. The definition of esports encompasses a human element (the players), a digital element (the games), and a competitive element (the competition).

Benefits of esports and video games

Esports are an increasingly popular entertainment activity. Owing to their wide audience and digital component, video gaming and esports have significant social and cultural potential to connect Europeans of all ages, genders and backgrounds, including older people and people with disabilities. Moreover, video games and esports have great potential to further promote European history, identity, heritage, values and diversity through immersive experiences, and the European Parliament believes that they also have the potential to contribute to the EU’s soft power.

Furthermore, the European Parliament recognizes the great potential of video games and esports for use in EU educational policies and lifelong learning. Video games in the classroom often encourage students to pursue careers in science, technology, engineering, arts and mathematics, and esports can help to develop several skills that are essential in a digital society. The European Parliament insists that video games and esports can be a valuable teaching tool for actively involving learners in a curriculum and for developing digital literacy, soft skills and creative thinking.

Challenges for a truly integrated European esports and video game sector

The European Parliament sets out different areas that could be addressed by the European Commission and the Council for the creation of a truly integrated European esports and video games sector. These include, amongst others:

  1. The need to safeguard esports from problems with match-fixing, illegal gambling and performance enhancement, including doping;
  2. The protection of data privacy and the handling of cybersecurity challenges, without losing sight of the esports phenomenon;
  3. The fair monetization of video games through micro-transactions, in-game currencies and loot boxes, to ensure robust consumer protection;
  4. The protection of video game IP and the cross-border enforcement of IP rights of game producers;
  5. The ongoing battle against stereotypical representation of women in video games, and in general, the promotion of a framework for attaining greater equality for women in all positions in the value chain.

Need for a charter to promote European values in esports

Finally, the European Parliament distinguishes esports from sports, not least because the video games used for competitive gaming (i.e. esports) are played in a digital environment and belong to private entities that enjoy full legal control and all exclusive and unrestricted rights over the video games themselves.

However, the European Parliament stresses that it believes that both sectors can complement and learn from each other and promote similar positive values and skills, such as fair play, non-discrimination, teamwork, leadership, solidarity, integrity, antiracism, social inclusion and gender equality. To this end, the European Parliament calls on the Commission to develop a charter to promote European values in esports competitions, in partnership with publishers, team organizations, clubs and tournament organizers.

Crowell & Moring will continue to follow (e)sports-related initiatives and provide ongoing updates.

In a judgment of August 1, 2022, the Court of Justice of the European Union (CJEU) provided further guidance on two important aspects of the General Data Protection Regulation (GDPR) (CJEU C-184/20). In summary, the CJEU held, first, that for a national law imposing a legal obligation to process personal data to constitute a legal basis for processing, the law itself must be lawful, meaning that it must meet an objective of public interest and be proportionate to the legitimate aim pursued. Second, it held that non-sensitive data that are liable to reveal sensitive personal data must be protected by the strengthened protection regime for the processing of special categories of personal data.

The judgment followed a request for a preliminary ruling from the Vilnius Regional Administrative Court (Lithuania) concerning a Lithuanian anti-corruption law that, to safeguard the public interest, required individuals working in the public service to declare their private interests by lodging a declaration of private interests. The declarant was obliged to provide details about him- or herself and his or her spouse, cohabitee or partner, such as name, personal identification number, employment status, membership of undertakings, and information about certain financial transactions. Most of this information, including the name of the declarant’s partner, was published by the Chief Official Ethics Commission on a public website.

The main take-aways from the judgment can be summarized as follows.

I. A national law that imposes a legal obligation to process personal data can only constitute a legal basis for processing when it meets an objective of public interest and is proportionate to the legitimate aim pursued

The CJEU recognizes that the Lithuanian law that required the declaration of private interests serves an objective of public interest, i.e. guaranteeing the proper management of public affairs and public property, by ensuring that public sector decision makers perform their duties impartially and objectively and preventing them from being influenced by considerations relating to private interests. Combating corruption is an objective of public interest and, accordingly, legitimate.

On the other hand, the CJEU emphasizes that Member States need to consider the principle of proportionality in setting out the requirements for achieving such a legitimate objective. This means that the measures to achieve the objective need to be appropriate, adequate and strictly necessary.

While the measure, i.e. the declaration of private interests, is appropriate for contributing to the achievement of the objectives of general interest that it pursues, it is not strictly necessary to publish the content of the declarations of private interest on a public website. The objective could be achieved just as effectively if the Chief Official Ethics Commission reviewed the content of the declarations instead of publishing them. A lack of sufficient human resources to check all the declarations effectively cannot justify their publication.

Moreover, an objective of general interest may not be pursued without having regard to the fact that it must be reconciled with the fundamental rights affected by the measure. This means that, for the purpose of assessing the proportionality of the processing, it is necessary to measure the seriousness of the interference with the fundamental rights to respect for private life and to the protection of personal data that that processing involves and to determine whether the importance of the objective of general interest pursued by the processing is proportionate to the seriousness of the interference.

In this context, the CJEU stresses a number of contextual elements. First, the online public disclosure of name-specific data relating to the declarant’s partner, or to persons who are close relatives of the declarant, is liable to reveal information on certain sensitive aspects of the data subjects’ private life, including, for example, their sexual orientation. Second, the declaration also concerns persons who are not public sector decision makers but who are related to the declarant in a capacity other than his or her public sector role, and in respect of whom the objectives pursued by the law are not imperative in the same way as for the declarant. Third, the cumulative effect of the published personal data further increases the seriousness of the interference, since combining them enables a particularly detailed picture of the data subjects’ private lives to be built up. The CJEU further points out that the publication of the content of the declaration means that the personal data are made freely accessible on the internet to the general public as a whole and, accordingly, to a potentially unlimited number of persons.

All this leads to a serious interference with the fundamental rights of data subjects to respect for private life and to the protection of personal data. The seriousness of that interference must be weighed against the importance of the objectives of preventing conflicts of interest and corruption in the public sector. In that regard, the CJEU confirms again the great importance of the objective of combating corruption, but concludes that the online publication of the majority of the personal data contained in the declaration of private interests of any head of an establishment receiving public funds does not meet the requirement of a proper balance. The interference resulting from the publication of the declaration is considerably more serious than the interference that would result from a declaration coupled with a check of its content by the Chief Official Ethics Commission. The court stresses that it is up to the Member State to ensure the effectiveness of such a check by providing the means necessary for that purpose.

II. Non-sensitive data that are liable to reveal sensitive personal data need to be protected by the strengthened protection regime for processing of special categories of data

As set out above, the declaration of private interests also contained details about individuals that are related to the declarant. Some of these details, such as the name of the partner of the declarant, are liable to reveal information on certain sensitive aspects of the data subjects’ private life, such as their sexual orientation. The CJEU recognizes that non-sensitive personal data may reveal indirectly, following an intellectual operation involving deduction or cross-referencing, sensitive personal data that are protected by a strengthened protection regime.

In this regard, the CJEU first confirms the wide interpretation of the terms “special categories of personal data” and “sensitive data”. It consequently rules that personal data that are liable to disclose indirectly special categories of personal data of a natural person must be protected by the strengthened protection regime for the processing of special categories of personal data, if the effectiveness of that regime and the protection of the fundamental rights and freedoms of natural persons that it is intended to ensure are not to be compromised.

III. Key points to remember

  1. Even where processing can be based on a legal obligation to which the controller is subject, the legal obligation may not constitute a legal basis if it, in itself, is not lawful.
  2. A lack of resources cannot justify a controller’s choice to achieve a legitimate aim by more intrusive means.
  3. Non-sensitive data may reveal indirectly, following an intellectual operation involving deduction or cross-referencing, sensitive personal data.
  4. Personal data that are liable to reveal sensitive data need to be protected by the strengthened protection regime for processing of special categories of personal data.

Crowell & Moring will continue to follow developments on these issues and provide ongoing updates.

The DOJ has long expressed concern about the impact of personal messaging – in particular of encrypted and ephemeral messaging apps – on its ability to effectively conduct investigations (and rely on the results of company investigations). Close on the heels of the well-publicized SEC enforcement sweeps of financial industry message retention practices, Deputy Attorney General Lisa Monaco recently issued a Corporate Crime Advisory Group Memo (the “Monaco Memo”) that articulates heightened DOJ expectations for companies to retain and disclose employee personal device data. The DOJ’s expectations, however, may clash with practical limits on companies’ ability to control personal devices and with international data protection laws, and may increase companies’ preservation and disclosure risks in other proceedings.

Implementation of Personal Device and Third-Party Messaging Policies

In providing guidance to prosecutors on evaluating individual and corporate accountability, the Monaco Memo devotes an entire subsection to the “Use of Personal Devices and Third-Party Applications”. The Memo notes that the explosive growth in the use of personal smartphones, computers and other devices for business purposes poses “significant corporate compliance risks” to a company’s and regulators’ ability to monitor misconduct and recover relevant data for an investigation. A similar risk is posed by third-party messaging platforms, which may feature ephemeral and encrypted messaging.

A primary factor in prosecutors’ assessments of compliance is whether the corporation has taken sufficient steps to “ensure” it can timely preserve, collect and disclose “all non-privileged responsive documents … including … data contained on phones, tablets, or other devices that are used by its employees for business purposes.” Compliance programs must consider how that may be accomplished “given the proliferation of personal devices and messaging platforms that can take key communications off-system in the blink of an eye.” Markers of a robust compliance program include meaningful personal use policies, clear training and effective enforcement.  

Importance of Self-Disclosure

The DOJ wants to investigate and move to charging decisions quickly, and urges companies to structure their systems, processes and responses to this end. From the Miller Keynote: “Collectively, this new guidance should push prosecutors and corporate counsel alike to feel they are ‘on the clock’ to expedite investigations.… If a cooperating company discovers hot documents or evidence, its first reaction should be to notify the prosecutors”. Such “self-disclosure is often only possible when a company has a well-functioning Compliance Program that can serve as an early warning system and detect the misconduct early.” Ironically, the DOJ reportedly is simultaneously instructing prosecutors to “collect less evidence” because it purportedly is drowning in data. The DOJ seems to be looking to square this circle by increasing reliance on companies to review the expected torrent of personal device data that requires collection and assessment, and make rapid self-disclosures.

Impact of Foreign Data Privacy Laws

The Monaco Memo also makes clear that companies are expected to work hard to overcome any impediments to full disclosure posed by international and regional data privacy and protection laws. When faced with such conflicts, “the cooperating corporation bears the burden of establishing the existence of any restriction on production and of identifying reasonable alternatives to provide the requested facts and evidence, and is expected to work diligently to identify all available legal bases to preserve, collect, and produce such documents, data, and other evidence expeditiously.”

While not instructing companies to ignore foreign laws, the DOJ will credit companies that can successfully navigate such issues and produce relevant documents. Moreover, it cautions any company that “actively seeks to capitalize on data privacy laws and similar statutes to shield misconduct inappropriately from detection and investigation by U.S. law enforcement” that prosecutors may draw “an adverse inference as to the corporation’s cooperation … if such a corporation subsequently fails to produce foreign evidence.” Companies in this predicament are well advised to consult proactively with experienced cross-border data transfer counsel as to their obligations and options for response.

Does this mean companies have to be in control of their employees’ phones?

Companies revisiting their BYOD and compliance policies in light of the Monaco Memo will need to be alert for unintended consequences. There can be tension between expectations of aggressive corporate compliance measures and companies’ actual ability to control and access personal devices, as well as litigation risks and duties that may accompany such control. In some jurisdictions there may be no obligation to preserve and collect data from employee phones absent a “legal right” to obtain it (e.g., through contract or policy), while other courts hold that a company’s “practical ability” to obtain the data from the employee may suffice. See generally The Sedona Conference, Commentary on Rule 34 and Rule 45 “Possession, Custody, or Control,” 17 Sedona Conf. J. 467 (2016). For example, the court in In re Pork Antitrust Litig., No. 18-CV-1776 (JRT/HB), 2022 WL 972401 (D. Minn. Mar. 31, 2022) recently refused to compel a defendant to produce employee text messages because, inter alia, its BYOD policy did not expressly provide for company ownership of the texts or its right to access personal phones to obtain them. The court also reasoned that defendant “should not be compelled to terminate or threaten employees who refuse to turn over their devices for preservation or collection”. After the Monaco Memo, that is perhaps not the approach a prosecutor would take to a company looking for cooperation credit.

Takeaways

This wave of regulatory guidance and activity (more is forecast to be issued soon) reflects the DOJ’s emphasis on holding individuals accountable for corporate misconduct, and its need to fill off-channel gaps in its ability to perform such assessments. Cooperating corporations are expected to show sustained and comprehensive efforts to ensure that even occluded data sources like personal devices and messaging applications used for business are available for monitoring, review and disclosure. Companies should consider updating their policies to limit business communications to onboarded systems and platforms that are subject to retention; providing a process for spotting and reviewing business messages that nevertheless go through non-conforming channels; and providing enhanced training, auditing and enforcement. Compliance programs should be tested to confirm their effectiveness in the field, and not just on paper. To really motivate action, the DOJ is urging that executives have skin in the game – that compensation and promotion decisions be tied to their fidelity to corporate use and retention policies. This would occasion a significant change in culture for many companies.

On October 7, 2022, President Biden signed an executive order implementing the EU-U.S. Data Privacy Framework. Announced in March, this framework replaces the Privacy Shield program that the EU Court of Justice invalidated in July 2020 with its Schrems II decision. That decision held that the United States did not provide a level of data protection “essentially equivalent” to that provided within the EU because signals intelligence surveillance by U.S. agencies was considered too broad and EU residents were not provided with effective remedies.

The new framework is intended to facilitate the cross-border transfer of personal information from the EU to the U.S. in compliance with the EU’s General Data Protection Regulation (GDPR).  The executive order specifically addresses the process by which the U.S. intelligence community handles the personal data of EU residents and responds to complaints from EU residents.  Detailing the commitments made in the March announcement, the executive order provides the basis for the EU to proceed with an “adequacy” decision under the GDPR regarding cross-border data transfers.  With these additional protections in place, it is expected that a revised cross-border transfer framework can be finalized in the next few months.

According to the White House Fact Sheet accompanying the March announcement, the new framework requires that U.S. intelligence agencies may only conduct data-gathering operations that are necessary to advance legitimate national security objectives, and which do not disproportionately impact individual privacy and civil liberty interests.   The independent Privacy and Civil Liberties Oversight Board is charged with reviewing the U.S. intelligence community’s implementation of the new principles and procedures, including the outcome of redress decisions, and conducting annual compliance reviews.

The revised framework establishes a multi-tiered process by which EU residents can seek redress for alleged violations, replacing the government “ombudsperson” process rejected as inadequate by the EU court. As a first step, EU residents can lodge complaints with the Civil Liberties Protection Officer (CLPO) in the Office of the Director of National Intelligence, who will perform an initial investigation and make binding decisions. As a second level of review, the U.S. Department of Justice will establish an independent Data Protection Review Court comprised of independent judges who will review the CLPO’s decisions and “have full authority to adjudicate claims and direct remedial measures as needed.” The court will also appoint “special advocates” to represent complainants’ interests.

More than 5,300 companies participated in the Privacy Shield program before it was invalidated. Further, the decision invalidating Privacy Shield raised concerns about the adequacy of alternative data transfer mechanisms, including standard contractual clauses and binding corporate rules.  The safeguards and provisions contained in the March announcement and October 7 executive order would also apply to data transferred under these alternative mechanisms.

The next step is for the EU to determine whether the U.S. commitments meet the GDPR’s “adequacy” standard for the transfer of personal data, a process anticipated to take about six months. Once the adequacy decision is adopted by the European Commission, participation in the revised framework will require companies to self-certify their adherence with the U.S. Department of Commerce. Although any adequacy determination is likely to be challenged in the EU courts, the new framework will create much greater certainty for the many organizations that depend on cross-border data flows to drive the trillions of dollars in annual cross-border commerce.

Crowell & Moring will continue to follow developments on these issues and provide ongoing updates.

This is Part 4 in a series of blog posts on recent developments in the EU’s data strategy, which aims to establish EU leadership in our data-driven society by creating a single market for data and encouraging data sharing. The series looks in particular at the recently adopted Data Governance Act (DGA) and the proposed Data Act (DA). (See also Parts 1, 2, and 3).

The DGA introduces two new types of “intermediaries” – data intermediation service providers and data altruism organizations – to help with the legal and technical practicalities and facilitate data sharing between data holders and data users. These new intermediaries will be able to garner the necessary expertise to establish a contractual and technical framework that fosters trust among data holders, data subjects and users. 

Both types of organization are intended to support data holders or data subjects in making their data available for re-use by third parties. However, data intermediation service providers may operate in a commercial context, while data altruism organizations are not-for-profit entities pursuing general interest objectives.

Although it is not yet entirely clear exactly what types of organizations may qualify as these intermediaries (both new notions in the European legal order), the purpose and contours of the regulation are becoming apparent. The DGA does provide a general description of the type of organization that will qualify as a “data intermediation service” or a “data altruism organization”. It also imposes some restrictions regarding the conditions for data re-use and, importantly, it introduces new regulatory mechanisms handled by national authorities.

Data intermediation services

Providers of data intermediation services help data subjects and data holders establish commercial relationships with data users for the purpose of “data sharing” (i.e., the provision of data for the purpose of joint or individual use, based on voluntary agreements or Union or national law, in this case through an intermediary, under commercial or open license terms). 

The intermediation service may organize data pooling or the bilateral exchange of data. On the data provider side, the permitted number of data subjects or data holders is undetermined. Data cooperatives are covered but closed groups, such as consortia, are not. Only actual “intermediaries” are targeted: entities that aggregate, enrich, or otherwise add value to datasets in order to exploit the result for their own purposes, such as data brokers or consultancies, are not within the DGA’s scope. Similarly, providers of copyright protected content (such as streaming services) are not considered to be data intermediaries.

Data intermediation service providers will put in place the technical, legal or other means for the data holders/data subjects and the data users to enter into a commercial relationship. The DGA explicitly mentions the case of data subjects exercising their rights regarding their personal data through a data intermediation service: before the data subject gives consent to the data user, the intermediary should inform and even advise on the intended use of the data and the conditions of such use. It may then also provide tools to facilitate the giving and withdrawing of consent.

Because of their role as intermediaries, providers of these services may not use the data for any purpose other than putting them at the disposal of data users. They may not use the data holders’/data subjects’ data for their own purposes, nor may they make the data intermediation service dependent on other services they may offer. Similarly, the meta-data relating to the use of their services may only be used for developing the data intermediation service. These restrictions are intended to foster a climate of trust, which would be jeopardized were the trusted intermediary at the same time a data user.

Data intermediation service providers must offer access to their services on transparent, non-discriminatory terms (including price). Where the data contain personal data, the DGA explicitly provides that the intermediaries should pursue the data subjects’ best interests. 

Data intermediation service providers also have a role to play on the technical level, in particular as concerns the data’s format and the tools made available to the data holders and data subjects (e.g., conversion, curation, anonymization or pseudonymization).
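As a purely technical aside (the DGA itself does not prescribe any particular method), pseudonymization in this context generally means replacing a direct identifier with a stable token whose mapping only the intermediary can reverse. A minimal sketch using Node's built-in crypto module, with hypothetical inputs:

```typescript
import { createHmac } from "node:crypto";

// Keyed hashing replaces a direct identifier with a stable pseudonym.
// The key stays with the intermediary, so data users cannot reverse the
// mapping, yet records about the same person remain linkable across
// datasets. (Illustrative only; a real deployment needs key management,
// rotation policies, and a broader data protection assessment.)
function pseudonymize(identifier: string, secretKey: string): string {
  return createHmac("sha256", secretKey).update(identifier).digest("hex");
}

const key = "intermediary-held-secret"; // hypothetical; never hard-code keys in practice
console.log(pseudonymize("jane.doe@example.com", key));
// Same input + same key always yields the same pseudonym.
```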

As far as the data intermediation service itself is concerned, the providers must take sufficient security measures, ensure interoperability with other service providers (e.g., open standards) and ensure a continuity of service (and the possibility for the data subjects/data holders to retrieve their data, in case of insolvency).

Data intermediation service providers are subject to new regulatory obligations: they must notify the (new) national authority of their intent, according to a procedure set out in the DGA, before they are allowed to start offering their services. Although no permit or prior authorization is required, data intermediation service providers may obtain a declaration from the competent national authority confirming compliance with the notification obligations. Much like the GDPR, this notification procedure targets service providers with activities in several Member States and service providers established in third countries (which must then designate a representative in the EU).

Data Altruism Organizations

Immense quantities of data (including health data) are needed in order to advance research into technologies that can be used for the public good (such as AI-based health tech applications). At the same time, the GDPR imposes a strict framework for the processing of personal data, which complicates the use and especially the re-use of personal data (for secondary purposes), even if a data subject consents and even if the processing operations pursue non-commercial or public interest purposes.

For example, a data subject may agree to the re-use of their medical results in the context of non-commercial, scientific research, without knowing in advance for which precise research projects the data will be used. GDPR data processing principles, such as purpose limitation or data minimization, complicate such open-purpose processing.

To address this issue, the DGA has introduced data altruism organizations. These organizations may organize the sharing of personal or non-personal data, for general interest purposes (e.g., healthcare, climate change, mobility), scientific research or statistics, without financial compensation for the data subject or data holder (beyond compensation related to the costs that they incur). Importantly, the sharing of such data is voluntary and based on the consent of the data subject or the permission of the data holder. 

However, the DGA does not specify how the data altruism organizations should collect the data from the data subjects and data holders, or which conditions must be met. It merely imposes some conditions and restrictions as to the use of the data in the general interest.

Data altruism organizations must comply with specific requirements to safeguard the rights and interests of both data subjects and data holders. They have certain information obligations (e.g., to provide information, before the data processing, concerning the purposes and location of the intended processing, and to inform data holders and data subjects about a data breach), and they may not use the data for objectives other than the general interest objectives for which the data processing is allowed. From a technical point of view, they must provide tools for obtaining and withdrawing consent, in addition to meeting their security obligations.

The DGA obliges data altruism organizations to register with a Member State “competent authority”, which must verify whether the organization meets the requirements as to its activities, its legal personality, its general interest objectives, and the organization of its activities (in an entity that is independent and functionally separate from its other activities). Like the GDPR, the DGA provides rules on the registration of data altruism organizations with activities in several Member States, or with an establishment outside the EU.

Data altruism organizations are subject to transparency obligations, meaning that they have to keep extensive records of the data users and the data use (date, period, purposes, fees), and draft an annual activity report.

Yesterday, the Office of Management and Budget (OMB) released Memorandum M-22-18, implementing software supply chain security requirements that will have a significant impact on software companies and vendors, in accordance with Executive Order 14028, Improving the Nation’s Cybersecurity. The Memorandum requires all federal agencies and their software suppliers to comply with the NIST Secure Software Development Framework (SSDF), NIST SP 800-218, and the NIST Software Supply Chain Security Guidance whenever third-party software is used on government information systems or otherwise affects government information. The term “software” includes firmware, operating systems, applications, and application services (e.g., cloud-based software), as well as products containing software. It is critical to note that these requirements apply whenever the government begins using new software and whenever existing software receives a major version update.

The Memorandum requires agencies to take the following actions:

  • within 90 days, agencies must inventory all software subject to the Memorandum;
  • within 120 days, agencies must develop a process to communicate requirements to vendors and ensure that vendor attestation letters can be collected in a central agency system;
  • within 180 days, agencies must assess training needs and develop plans for the review and validation of attestation documents;
  • within 270 days for critical software and within 365 days for all other software, agencies must obtain self-attestations from all software producers; and
  • as needed, obtain from software producers a Software Bill of Materials (SBOM) or other artifact(s) that demonstrate conformance to secure software development practices. 

To comply with the Memorandum, software producers must attest that they adhere to the NIST software supply chain frameworks and guidance.  In lieu of a self-attestation, software producers may also submit third-party assessments of compliance with the software security standards conducted by a certified FedRAMP assessor or an assessor approved by the agency.
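By way of illustration: the Memorandum does not mandate any particular SBOM format, but CycloneDX and SPDX are the standards most often referenced in this context. A minimal CycloneDX-style SBOM for a hypothetical product might look like the TypeScript sketch below (all component names and versions are invented; real SBOMs carry far more detail, such as hashes, licenses and dependency graphs):

```typescript
// Sketch of a minimal CycloneDX-style SBOM document. Field names follow the
// CycloneDX JSON schema, but this is not a complete or validated document.
interface SbomComponent {
  type: "application" | "library" | "firmware";
  name: string;
  version: string;
  purl?: string; // "package URL" recording where the component came from
}

// Hypothetical third-party components bundled with the product. Enumerating
// them lets an agency check newly disclosed vulnerabilities against its
// software inventory.
const components: SbomComponent[] = [
  { type: "library", name: "example-http-client", version: "4.2.1", purl: "pkg:npm/example-http-client@4.2.1" },
  { type: "library", name: "example-crypto-lib", version: "1.0.9", purl: "pkg:npm/example-crypto-lib@1.0.9" },
];

const sbom = {
  bomFormat: "CycloneDX",
  specVersion: "1.4",
  version: 1,
  metadata: {
    component: { type: "application", name: "example-vendor-app", version: "2.0.0" }, // the product itself
  },
  components,
};

// Successive SBOMs can be diffed to spot new or changed dependencies when a
// major version update triggers a fresh attestation.
console.log(JSON.stringify(sbom, null, 2));
```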

Software producers or vendors providing software to the federal government should begin reviewing their security practices and their overall software development lifecycle immediately to ensure that they can attest to compliance with the applicable NIST standards in the very near future.   


This is Part 3 in a series of blog posts on recent developments in the EU’s data strategy, which aims to establish EU leadership in our data-driven society by creating a single market for data and encouraging data sharing. The series looks in particular at the recently adopted Data Governance Act (DGA) and the proposed Data Act (DA). (See also Part 1 and Part 2).

In this post we will consider business to government (B2G and G2B) relations and examine how the European legislature intends to facilitate data sharing here.

As a general rule, data holders are free to decide whether to share their data with public authorities – except where specific legal obligations require the legal or natural person to provide information to tax, administrative or public prosecution authorities. 

The Commission gave some guidance on the conditions for the re-use by public authorities of voluntarily shared private sector data in its 2018 communication and associated staff working document.

The DA and the DGA build on this approach: they contain provisions enabling public authorities to gain access to data held by private entities in cases of “exceptional need”, and they allow certain data to become available to third parties (such as researchers) even when the Open Data Directive does not apply.

B2G data sharing in case of “exceptional need”

The DA imposes a new obligation upon data holders (except SMEs) to make data available if public sector bodies or EU institutions, agencies or bodies have an “exceptional need” for the data. The data may also be re-used for non-commercial research or statistical purposes in this context.

Such “exceptional need” may exist in case of a “public emergency”, defined as an “exceptional situation negatively affecting the population of the Union, a Member State or part of it, with a risk of serious and lasting repercussions on living conditions or economic stability, or the substantial degradation of economic assets in the Union or the relevant Member State(s).” A pandemic or a war may qualify as a “public emergency”.

More broadly, an “exceptional need” may exist where a public authority does not have the data it needs to fulfil a specific task in the public interest, despite having tried to obtain such data in accordance with market conditions or by virtue of other legal provisions. Although data made available in response to a public emergency must be provided free of charge, compensation can be claimed for data provided in other cases of exceptional need.

To take a concrete example, during an emergency like the COVID-19 pandemic, a government agency responsible for public health would be able to collect aggregated telecom data if these data were necessary to respond to or recover from the epidemic (e.g., to predict or analyze its development). What’s more, the public authority would be able to share such data with researchers working on an urgent vaccine who needed access to medical data, provided that this re-use remained within the purposes for which the public authority had requested the data.

The DA sets out an elaborate procedure by which public authorities must request data, and data holders must comply, decline or modify such requests.

Once a public authority has gained access to the requested data, the data may be used for the stated purposes (this principle is similar to the purpose limitation principle contained in the GDPR). Public authorities may not use the DA data sharing obligations to gain access to or re-use data in the context of criminal, customs or tax proceedings. Moreover, the acquired data may not be made available to the public as “open data”, although its re-use for non-commercial research or statistical purposes is permitted in the context of exceptional need. Public authorities must take all necessary measures to protect personal data and trade secrets, and they must destroy data after use (this is analogous to the “storage limitation” principle in the GDPR).

G2B – access to public sector data

It has long been acknowledged that public sector information must be accessible to the public, citizens and undertakings alike. Not only does such access safeguard the transparency of public administrations and governments, information obtained through the investment of public means can also be a considerable asset to the private sector. 

The 2019 Open Data Directive (which replaced the 2003 Directive on the re-use of public sector information) requires Member States to promote the use of open data and stimulate innovation in products and services by establishing minimum rules for the re-use of public sector information. As a result, a national meteorological institution, for example, if financed by public means, may be under an obligation to make “high value” sets of weather data available to the public in a machine-readable form, via an application programming interface, and, where possible, for download. However, the Open Data Directive contains important exceptions covering, for example, information protected under intellectual property rights and trade secrets, and personal data: public authorities in the Member States are under no obligation to make such information accessible to the public.
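As a purely illustrative sketch of what “machine-readable, via an application programming interface” can mean in practice, a re-user of such a high-value dataset might fetch JSON records from an open-data endpoint like the hypothetical one below (the URL and response schema are invented):

```typescript
// Hypothetical consumer of an open-data weather API. Real portals publish
// their own schemas, rate limits and license terms alongside the data.
interface WeatherObservation {
  station: string;
  timestamp: string; // ISO 8601
  temperatureC: number;
}

async function fetchObservations(): Promise<WeatherObservation[]> {
  const res = await fetch("https://opendata.example-met-institute.eu/v1/observations?format=json");
  if (!res.ok) throw new Error(`API error: ${res.status}`);
  return (await res.json()) as WeatherObservation[];
}

fetchObservations().then((obs) => console.log(`Retrieved ${obs.length} observations`));
```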

Although the DGA does not oblige Member State public authorities to allow the re-use of information that is outside the Open Data Directive, it does create a legal framework for the re-use of “data” in general (which includes data protected on grounds of commercial or statistical confidentiality, third-party intellectual property or personal data).

Where a public sector body (PSB) agrees to make such data available for re-use, the data should normally be made available to all third parties, without restrictions or exclusivity. Only if the exclusivity is required for the provision of a service or product in the general interest may the PSB consider granting an exclusive right – which should in any event be limited to a maximum of 12 months.

The PSB may impose conditions for the re-use of data upon the re-user (e.g., fees, measures to protect personal data or creations subject to intellectual property rights or trade secrets) but there must be transparency, and the PSB must make sure that the conditions, which must be fair, non-discriminatory, proportionate and objectively justified, are publicly accessible.

A re-user who agrees to such conditions will be bound by a confidentiality obligation, must comply with intellectual property rights, and may not identify data subjects. Importantly, a re-user who intends to make international data transfers must notify the PSB (even if no personal data are involved).

The DA and DGA thus acknowledge both the importance of data for the public sector and the secondary use of public sector data by the private sector, while attempting to safeguard third-party rights. This could result in a complex web of legal and contractual restrictions, which could make it difficult for both the PSB and the data acquirer to understand which use is permitted and under which conditions. Much will depend on whether the PSBs can adapt to their new role: clearing all third-party rights and formulating such rights and interests in clear contractual conditions (and warranties) for the data users.

Part 4 in this series of blog posts will look at the role of the newly defined data intermediaries that are intended to facilitate data sharing.

On August 24, 2022, the California Attorney General’s Office announced a settlement with Sephora, Inc. (Sephora), a French multinational personal care and beauty products retailer. The settlement resolved Sephora’s alleged violations of the California Consumer Privacy Act (CCPA) for allegedly failing to: disclose to consumers that the company was selling their personal information, process user requests to opt out of sale via user-enabled global privacy controls, and cure these violations within the 30-day period currently allowed by the CCPA.

As part of the settlement, Sephora is required to pay $1.2 million in penalties and comply with injunctive terms, specifically:

  • Clarifying its online disclosures and privacy policy to include an affirmative representation that it sells personal information;
  • Providing mechanisms for consumers to opt out of the sale of personal information, including via the Global Privacy Control (GPC);
  • Conforming its service provider agreements to the CCPA’s requirements; and 
  • Providing reports to the Attorney General relating to its sale of personal information, the status of its service provider relationships, and its efforts to honor GPC.

The settlement is among the most significant enforcement actions taken to ensure that businesses comply with California’s privacy law, the first of its kind in the United States. Through the CCPA, consumers can ask businesses to stop selling their personal information to third parties, including via requests signaled by the GPC. The GPC is a third-party tool that consumers can use to opt out of the sale of their personal information by automatically sending a signal to any site the consumer visits.
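On the technical side, the GPC specification conveys the opt-out preference through an HTTP request header (Sec-GPC: 1) and a corresponding JavaScript property (navigator.globalPrivacyControl). A minimal sketch of how a site might detect and honor the server-side signal, assuming an Express application and a hypothetical disableThirdPartySale() helper:

```typescript
import express from "express";

const app = express();

// Hypothetical helper: in a real deployment this would suppress the
// third-party tags and pixels whose data flows constitute a "sale".
function disableThirdPartySale(res: express.Response): void {
  res.locals.allowThirdPartyTrackers = false;
}

app.use((req, res, next) => {
  // The GPC spec signals the opt-out via the `Sec-GPC: 1` request header;
  // client-side scripts can also read `navigator.globalPrivacyControl`.
  if (req.header("Sec-GPC") === "1") {
    disableThirdPartySale(res); // treat the signal as a valid opt-out request
  }
  next();
});

app.get("/", (_req, res) => {
  res.send(
    res.locals.allowThirdPartyTrackers === false
      ? "Opt-out honored: no third-party trackers loaded."
      : "Standard page with third-party tags."
  );
});

app.listen(3000);
```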

People of the State of California v. Sephora USA, Inc.

The complaint filed by the California Office of the Attorney General (OAG) stated that the Attorney General commenced an enforcement sweep of large retailers to determine whether they continued to sell personal information when a consumer signaled an opt-out via the GPC. According to the complaint, the Attorney General found that activating the GPC signal had no effect when a consumer visited the Sephora website and that data continued to flow to third-party companies, including advertising and analytics providers. This led to the Attorney General’s conclusion that Sephora’s website allegedly was not configured to detect or process any global privacy control signals, such as the GPC, and that Sephora allegedly took no action to block the sharing of personal information when a California consumer signaled an opt-out using the GPC. The complaint further highlighted the need for businesses to be transparent regarding their use of third-party trackers on their websites and mobile applications.

The complaint further alleged that when Sephora sells products online, it collects personal information about consumers, including products that consumers view and purchase, consumers’ geolocation data, cookies and other user identifiers, and technical information about consumers’ operating systems and browser types. It then makes this data available to third parties such as advertising networks, business partners, and data analytics providers by installing (or allowing the installation of) third-party trackers in the form of cookies, pixels, software development kits, and other technologies, which automatically send data about consumers’ online behavior to the third-party companies.
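To make that mechanism concrete: a tracking “pixel” of the kind the complaint describes is typically just an image or script request whose URL carries data about the visitor to a third-party server. The snippet below is a hypothetical illustration only; the hostname, query parameters, and `firePixel` helper are invented for this example.

```ts
// Hypothetical sketch of the tracking mechanism described in the complaint:
// a third-party "pixel" is an image request whose URL automatically sends
// data about the visitor's activity to the third party's server.
function firePixel(productViewed: string): void {
  const img = new Image();
  img.src =
    "https://analytics.example.com/collect?" +
    new URLSearchParams({
      event: "product_view",
      product: productViewed,
      page: location.href, // browsing context disclosed to the third party
    }).toString();
}

firePixel("lipstick-123");
```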

The complaint alleged that, by allowing third-party companies access to its customers’ online activities, Sephora received discounted or higher-quality analytics and other services derived from that data, including the option to target advertisements to customers who had merely browsed for products online. The complaint further alleged that Sephora’s website and mobile app failed to inform consumers that the company sells their personal information and that they have the right to opt out of this sale, failed to provide a clear and conspicuous “Do Not Sell My Personal Information” link on its site, and failed to provide two or more designated methods for submitting opt-out requests. Under Cal. Civ. Code § 1798.140, the CCPA defines a “sale” of personal information to include a disclosure for monetary or other valuable consideration.

Sephora also allegedly lacked valid service provider contracts with each third party that collected personal information through the cookies or code Sephora installed or allowed on its website or app – an arrangement that, when properly documented, is one exception to a “sale” under the CCPA. Once notified of the alleged CCPA violations, Sephora had 30 days to cure them, as outlined under the law. The company allegedly failed to do so within that period, prompting the Attorney General’s investigation and, ultimately, the enforcement action.

Key Takeaways

The settlement makes clear that a “sale” of personal information includes trading consumers’ personal information to third parties in exchange for analytics services, as well as allowing third parties to place advertising cookies and other automatic data collection technologies on a website, giving them access to consumers’ online activities, in exchange for advertising or analytics services. Such activities will also be considered either a “sale” or a “share” of information under the California Privacy Rights Act (CPRA), effective January 1, 2023. The settlement further drives home the importance of honoring a customer’s request to opt out of the sale of information, particularly one made through the GPC.

The Attorney General’s enforcement action in the Sephora case aligns with many of the CCPA Enforcement Case Examples previously published by the OAG, which revolve around the disclosure of material terms, consumer consent, cookie options, opt-out mechanisms, and the need to maintain an up-to-date privacy policy. In this action, the OAG focused particularly on compliance with a consumer’s exercise of their privacy rights.

Businesses should take note of the heightened scrutiny devoted to the treatment of consumer data and make efforts to comply with California’s privacy laws, including by:

  • Assessing whether they use cookies or other technologies that may be considered a “sale” or “sharing” of personal information for targeted advertising, analytics, or in exchange for other forms of value.
  • Ensuring that their privacy policies are transparent about the collection, processing, sale, and sharing of personal information. A company’s privacy policy should clearly state whether personal information is sold.
  • Confirming that they have established opt-out mechanisms that allow consumers to exercise their opt-out rights. This can take the form of a “Do Not Sell My Personal Information” link at the bottom of the company’s website. Just as importantly, when a consumer exercises their opt-out rights, a business should have an established mechanism to process the request, including website capabilities that recognize Global Privacy Control signals issued by a consumer’s browser (see the sketch after this list). The settlement makes clear that a business must treat any user with “user-enabled global privacy controls” the same as a user who has clicked the “Do Not Sell My Personal Information” link. The impetus behind this requirement is to let consumers stop their data from being sold everywhere in one fell swoop, without having to click an opt-out link on each site they visit. Businesses should assess their website’s capability to recognize GPC signals and recognize that an enforcement action is possible if they do not implement adequate mechanisms to honor consumers’ opt-out requests.
  • Reviewing their obligations under the California Privacy Rights Act, which will be effective January 1, 2023.
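As referenced in the third bullet above, the first step is simply detecting the signal in the browser. The following is a minimal sketch: `navigator.globalPrivacyControl` comes from the GPC specification, while `suppressThirdPartyTags()` is a hypothetical placeholder for whatever opt-out processing a business implements.

```ts
// Minimal browser-side sketch: detecting a user-enabled GPC signal.
// navigator.globalPrivacyControl is defined by the GPC specification;
// suppressThirdPartyTags() is a hypothetical placeholder.
declare global {
  interface Navigator {
    globalPrivacyControl?: boolean;
  }
}

// Hypothetical: skip loading third-party advertising/analytics scripts
// and record the opt-out for this visitor.
function suppressThirdPartyTags(): void {
  console.log("GPC detected: treating visitor as opted out of sale/share.");
}

if (navigator.globalPrivacyControl === true) {
  // Treat the signal as a valid CCPA opt-out, the same as a click on
  // the "Do Not Sell My Personal Information" link.
  suppressThirdPartyTags();
}

export {};
```

Detection alone is not compliance, of course: as the complaint illustrates, the decisive question is whether data actually stops flowing to third parties once the signal is received.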

Accordingly, businesses should be diligent in assessing their compliance with California’s privacy laws. Looking to the future, businesses may also want to review the recently introduced American Data Privacy and Protection Act, federal legislation aimed at creating a comprehensive consumer privacy framework. While not yet adopted, the bill may offer insight into how privacy regulation at the federal level may unfold in the coming years.

* * *

Crowell & Moring LLP has a robust California Consumer Privacy Act Practice and is highly experienced at advising companies of all sizes on compliance with state privacy laws. Crowell also has an extensive library of alerts and resources associated with California’s privacy laws, including: CCPA 2.0? California Adopts Sweeping New Data Privacy Protections, California AG Interprets Inferences Under CCPA, and Enforcement of The California Consumer Privacy Act Via Letters Noticing Noncompliant Loyalty Programs and Online Tool for Consumers to Notify Businesses of Potential Violations. If you have questions about this alert or similar issues, please contact one of the Crowell & Moring attorneys listed below, or your regular Crowell & Moring contact.