In a judgment of August 1, 2022, the Court of Justice of the European Union (CJEU) provided further guidance on two important aspects of the General Data Protection Regulation (GDPR) (CJEU C-184/20). In summary, the CJEU held, first, that for a national law imposing a legal obligation to process personal data to constitute a valid legal basis for processing, the law itself must be lawful, meaning that it must meet an objective of public interest and be proportionate to the legitimate aim pursued; and, second, that non-sensitive data that are liable to reveal sensitive personal data must be protected by the strengthened protection regime for the processing of special categories of personal data.

The judgment followed a request for a preliminary ruling from the Vilnius Regional Administrative Court (Lithuania) concerning a Lithuanian anti-corruption law that required individuals working in the public service to declare their private interests by lodging a declaration of private interests. The declarant was obliged to provide details about him- or herself and his or her spouse, cohabitee or partner, such as name, personal identification number, employment status, membership of undertakings, and information about certain financial transactions. Most of this information, including the name of the declarant’s partner, was published by the Chief Official Ethics Commission on a public website.

The main take-aways from the judgment can be summarized as follows.

I. A national law that imposes a legal obligation to process personal data can only constitute a legal basis for processing when it meets an objective of public interest and is proportionate to the legitimate aim pursued

The CJEU recognizes that the Lithuanian law that required the declaration of private interests serves an objective of public interest, i.e. guaranteeing the proper management of public affairs and public property, by ensuring that public sector decision makers perform their duties impartially and objectively and preventing them from being influenced by considerations relating to private interests. Combating corruption is an objective of public interest and, accordingly, legitimate.

At the same time, the CJEU emphasizes that Member States must observe the principle of proportionality in setting out the requirements for achieving such a legitimate objective. This means that the measures taken to achieve the objective must be appropriate, adequate and strictly necessary.

While the measure—the declaration of private interests—is appropriate for contributing to the achievement of the objectives of general interest that it pursues, it is not strictly necessary to publish the content of the declarations of private interests on a public website. The objective could be achieved just as effectively if the Chief Official Ethics Commission reviewed the content of the declarations instead of publishing them. A lack of sufficient human resources to effectively check all the declarations cannot justify their publication.

Moreover, an objective of general interest may not be pursued without having regard to the fact that it must be reconciled with the fundamental rights affected by the measure. This means that, for the purpose of assessing the proportionality of the processing, it is necessary to measure the seriousness of the interference with the fundamental rights to respect for private life and to the protection of personal data that that processing involves and to determine whether the importance of the objective of general interest pursued by the processing is proportionate to the seriousness of the interference.

In this context, the CJEU stresses a number of contextual elements. First, the public disclosure, online, of name-specific data relating to the declarant’s partner, or to persons who are close relatives of the declarant, is liable to reveal information on certain sensitive aspects of the data subjects’ private life, including, for example, their sexual orientation. Second, the declaration also concerns persons who are not public sector decision makers, but who are related to the declarant otherwise than through his or her public sector capacity, and in respect of whom the objectives pursued by the law are not imperative in the same way as for the declarant. Third, the cumulative effect of the personal data that are published may further increase the seriousness of the interference, since combining them enables a particularly detailed picture of the data subjects’ private lives to be built up. The CJEU further points out that the publication of the content of the declaration means that the personal data are made freely accessible on the internet to the whole of the general public and, accordingly, to a potentially unlimited number of persons.

All this leads to a serious interference with the fundamental rights of data subjects to respect for private life and to the protection of personal data. The seriousness of that interference must be weighed against the importance of the objectives of preventing conflicts of interest and corruption in the public sector. In that regard, the CJEU again confirms the great importance of the objective of combating corruption, but concludes that the online publication of the majority of the personal data contained in the declaration of private interests of any head of an establishment receiving public funds does not meet the requirement of a proper balance. The interference resulting from the publication of the declaration is considerably more serious than the interference that would result from a declaration coupled with a check of its content by the Chief Official Ethics Commission. The court stresses that it is up to the Member State to ensure the effectiveness of such a check by providing the means necessary for that purpose.

II. Non-sensitive data that are liable to reveal sensitive personal data need to be protected by the strengthened protection regime for processing of special categories of data

As set out above, the declaration of private interests also contained details about individuals who are related to the declarant. Some of these details, such as the name of the declarant’s partner, are liable to reveal information on certain sensitive aspects of the data subjects’ private life, such as their sexual orientation. The CJEU recognizes that non-sensitive personal data may indirectly reveal, following an intellectual operation involving deduction or cross-referencing, sensitive personal data that are protected by a strengthened protection regime.

In this regard, the CJEU first confirms the wide interpretation of the terms “special categories of personal data” and “sensitive data”, and consequently rules that personal data that are liable to disclose indirectly special categories of personal data of a natural person, need to be protected by the strengthened protection regime for processing of special categories of personal data, if the effectiveness of that regime and the protection of the fundamental rights and freedoms of natural persons that it is intended to ensure are not to be compromised.

III. Key points to remember

  1. Even where processing can be based on a legal obligation to which the controller is subject, the legal obligation may not constitute a legal basis if it, in itself, is not lawful.
  2. A lack of resources cannot justify a controller’s choice to pursue a legitimate aim by more intrusive means.
  3. Non-sensitive data may reveal indirectly, following an intellectual operation involving deduction or cross-referencing, sensitive personal data.
  4. Personal data that are liable to reveal sensitive data need to be protected by the strengthened protection regime for processing of special categories of personal data.

Crowell & Moring will continue to follow developments on these issues and provide ongoing updates.

The DOJ has long expressed concern about the impact of personal messaging – in particular encrypted and ephemeral messaging apps – on its ability to conduct investigations effectively (and to rely on the results of company investigations). Close on the heels of the well-publicized SEC enforcement sweeps of financial industry message retention practices, Deputy Attorney General Lisa Monaco recently issued a Corporate Crime Advisory Group Memo (the “Monaco Memo”) that articulates heightened DOJ expectations for companies to retain and disclose employee personal device data. The DOJ’s expectations, however, may clash with practical limits on companies’ ability to control personal devices and with international data protection laws, and may increase companies’ preservation and disclosure risks in other proceedings.

Implementation of Personal Device and Third-Party Messaging Policies

In providing guidance to prosecutors on evaluating individual and corporate accountability, the Monaco Memo devotes an entire subsection to the “Use of Personal Devices and Third-Party Applications”. The Memo notes that the explosive growth in the business use of personal smartphones, computers and other devices poses “significant corporate compliance risks” to a company’s and regulators’ ability to monitor misconduct and recover relevant data for an investigation. A similar risk is posed by third-party messaging platforms, which may feature ephemeral and encrypted messaging.

A primary factor in prosecutors’ assessments of compliance is whether the corporation has taken sufficient steps to “ensure” it can timely preserve, collect and disclose “all non-privileged responsive documents … including … data contained on phones, tablets, or other devices that are used by its employees for business purposes.” Compliance programs must consider how that may be accomplished “given the proliferation of personal devices and messaging platforms that can take key communications off-system in the blink of an eye.” Markers of a robust compliance program include meaningful personal use policies, clear training and effective enforcement.  

Importance of Self-Disclosure

The DOJ wants to investigate and move to charging decisions quickly, and urges companies to structure their systems, processes and responses to this end. From the Miller Keynote: “Collectively, this new guidance should push prosecutors and corporate counsel alike to feel they are ‘on the clock’ to expedite investigations.… If a cooperating company discovers hot documents or evidence, its first reaction should be to notify the prosecutors”. Such “self-disclosure is often only possible when a company has a well-functioning Compliance Program that can serve as an early warning system and detect the misconduct early.” Ironically, the DOJ reportedly is simultaneously instructing prosecutors to “collect less evidence” because it purportedly is drowning in data. The DOJ seems to be looking to square this circle by increasing reliance on companies to review the expected torrent of personal device data that requires collection and assessment, and make rapid self-disclosures.

Impact of Foreign Data Privacy Laws

The Monaco Memo also makes clear that companies are expected to work hard to overcome any impediments to full disclosure posed by international and regional data privacy and protection laws. When faced with such conflicts, “the cooperating corporation bears the burden of establishing the existence of any restriction on production and of identifying reasonable alternatives to provide the requested facts and evidence, and is expected to work diligently to identify all available legal bases to preserve, collect, and produce such documents, data, and other evidence expeditiously.”

While not instructing companies to ignore foreign laws, the DOJ will credit companies that can successfully navigate such issues and produce relevant documents. Moreover, it cautions against any company that “actively seeks to capitalize on data privacy laws and similar statutes to shield misconduct inappropriately from detection and investigation by U.S. law enforcement,” noting that prosecutors may draw “an adverse inference as to the corporation’s cooperation … if such a corporation subsequently fails to produce foreign evidence.” Companies in this predicament are well advised to proactively consult with experienced cross-border data transfer counsel as to their obligations and options for response.

Does this mean companies have to be in control of their employees’ phones?

Companies revisiting their BYOD and compliance policies in light of the Monaco Memo will need to be alert for unintended consequences. There can be tension between expectations of aggressive corporate compliance measures and companies’ actual ability to control and access personal devices, as well as litigation risks and duties that may accompany such control. In some jurisdictions there may be no obligation to preserve and collect data from employee phones absent a “legal right” to obtain it (e.g., through contract or policy), while other courts hold that a company’s “practical ability” to obtain the data from the employee may suffice. See generally The Sedona Conference, Commentary on Rule 34 and Rule 45 “Possession, Custody, or Control,” 17 Sedona Conf. J. 467 (2016). For example, the court in In re Pork Antitrust Litig., No. 18-CV-1776 (JRT/HB), 2022 WL 972401 (D. Minn. Mar. 31, 2022) recently refused to compel a defendant to produce employee text messages because, inter alia, its BYOD policy did not expressly provide for company ownership of the texts or its right to access personal phones to obtain them. The court also reasoned that defendant “should not be compelled to terminate or threaten employees who refuse to turn over their devices for preservation or collection”. After the Monaco Memo, that is perhaps not the approach a prosecutor would take to a company looking for cooperation credit.

Takeaways

This wave of regulatory guidance and activity (more is forecast to be issued soon) reflects the DOJ’s emphasis on holding individuals accountable for corporate misconduct, and its need to fill off-channel gaps in its ability to perform such assessments. Cooperating corporations are expected to show sustained and comprehensive efforts to ensure that even occluded data sources like personal devices and messaging applications used for business are available for monitoring, review and disclosure. Companies should consider updating their policies to limit business communications to onboarded systems and platforms that are subject to retention; providing a process for spotting and reviewing business messages that nevertheless go through non-conforming channels; and providing enhanced training, auditing and enforcement. Compliance programs should be tested to confirm their effectiveness in the field, and not just on paper. To really motivate action, the DOJ is urging that executives have skin in the game – tying compensation and promotion decisions to their fidelity to corporate use and retention policies. This would occasion a significant change in culture for many companies.

On October 7, 2022, President Biden signed an executive order implementing the EU-U.S. Data Privacy Framework. Announced in March, this framework replaces the Privacy Shield program that the EU Court of Justice invalidated in July 2020 with its Schrems II decision. That decision held that the United States did not provide a level of data protection “essentially equivalent” to that provided within the EU, because signals intelligence surveillance by U.S. agencies was considered too broad and EU residents were not provided with effective remedies.

The new framework is intended to facilitate the cross-border transfer of personal information from the EU to the U.S. in compliance with the EU’s General Data Protection Regulation (GDPR).  The executive order specifically addresses the process by which the U.S. intelligence community handles the personal data of EU residents and responds to complaints from EU residents.  Detailing the commitments made in the March announcement, the executive order provides the basis for the EU to proceed with an “adequacy” decision under the GDPR regarding cross-border data transfers.  With these additional protections in place, it is expected that a revised cross-border transfer framework can be finalized in the next few months.

According to the White House Fact Sheet accompanying the March announcement, the new framework provides that U.S. intelligence agencies may conduct data-gathering operations only where necessary to advance legitimate national security objectives, and only where the operations do not disproportionately impact individual privacy and civil liberty interests. The independent Privacy and Civil Liberties Oversight Board is charged with reviewing the U.S. intelligence community’s implementation of the new principles and procedures, including the outcome of redress decisions, and conducting annual compliance reviews.

The revised framework establishes a multi-tiered process by which EU residents can seek redress for alleged violations, replacing the government “ombudsperson” process rejected as inadequate by the EU court. As a first step, EU residents can lodge complaints with the Civil Liberties Protection Officer (CLPO) in the Office of the Director of National Intelligence, who will perform an initial investigation and make binding decisions. As a second level of review, the U.S. Department of Justice will establish an independent Data Protection Review Court composed of independent judges who will review the CLPO’s decisions and “have full authority to adjudicate claims and direct remedial measures as needed.” “Special advocates” will represent complainants’ interests before that court.

More than 5,300 companies participated in the Privacy Shield program before it was invalidated. Further, the decision invalidating Privacy Shield raised concerns about the adequacy of alternative data transfer mechanisms, including standard contractual clauses and binding corporate rules.  The safeguards and provisions contained in the March announcement and October 7 executive order would also apply to data transferred under these alternative mechanisms.

The next step is for the EU to determine whether the U.S. commitments meet the GDPR’s “adequacy” standard for the transfer of personal data, a process anticipated to take about six months. Once the European Commission adopts an adequacy decision, participation in the revised framework will require companies to self-certify their adherence with the U.S. Department of Commerce. Although any adequacy determination is likely to be challenged in the EU courts, the new framework will create much greater certainty for the many organizations that depend on cross-border data flows to drive the trillions of dollars in annual cross-border commerce.

Crowell & Moring will continue to follow developments on these issues and provide ongoing updates.

This is Part 4 in a series of blog posts on recent developments in the EU’s data strategy, which aims to establish EU leadership in our data-driven society by creating a single market for data and encouraging data sharing. The series looks in particular at the recently adopted Data Governance Act (DGA) and the proposed Data Act (DA). (See also Parts 1, 2, and 3).

The DGA introduces two new types of “intermediaries” – data intermediation service providers and data altruism organizations – to help with the legal and technical practicalities and facilitate data sharing between data holders and data users. These new intermediaries will be able to garner the necessary expertise to establish a contractual and technical framework that fosters trust among data holders, data subjects and users. 

Both types of organization are intended to support data holders or data subjects in making their data available for re-use by third parties. However, data intermediation service providers may operate in a commercial context, while data altruism organizations are not-for-profit entities pursuing general interest objectives.

Although it is not yet entirely clear exactly what types of organizations may qualify as these intermediaries – new notions in the European legal order – the purpose and the contours of the regulation are becoming apparent. The DGA does provide a general description of the type of organization that will qualify as a “data intermediation service” or a “data altruism organization”. It also imposes some restrictions regarding the conditions for data re-use and, importantly, it introduces new regulatory mechanisms handled by national authorities.

Data intermediation services

Providers of data intermediation services help data subjects and data holders establish commercial relationships with data users for the purpose of “data sharing” (i.e., the provision of data for the purpose of joint or individual use, based on voluntary agreements or Union or national law, in this case through an intermediary, under commercial or open license terms). 

The intermediation service may organize data pooling or the bilateral exchange of data. On the data provider side, the service must be open to an undetermined number of data subjects or data holders. Data cooperatives are covered but closed groups, such as consortia, are not. Only actual “intermediaries” are targeted: entities that aggregate, enrich, or otherwise add value to datasets in order to exploit the result for their own purposes, such as data brokers or consultancies, are not within the DGA’s scope. Similarly, providers of copyright-protected content (such as streaming services) are not considered to be data intermediaries.

Data intermediation service providers will put in place the technical, legal or other means for the data holders/data subjects and the data users to enter into a commercial relationship. The DGA explicitly mentions the case of data subjects exercising their rights regarding their personal data through a data intermediation service: before the data subject gives consent to the data user, the intermediary should inform and even advise on the intended use of the data and the conditions of such use. It may then also provide tools to facilitate the giving and withdrawing of consent.

Because of their role as intermediaries, providers of these services may not use the data for any purpose other than putting them at the disposal of data users. They may not use the data holders’/data subjects’ data for their own purposes, nor may they make the data intermediation service dependent on other services they may offer. Similarly, the metadata relating to the use of their services may only be used for developing the data intermediation service. These restrictions are intended to foster a climate of trust, something that would be jeopardized were the trusted intermediary to be at the same time a data user.

Data intermediation service providers must offer access to their services on transparent, non-discriminatory terms (including price). Where the data contain personal data, the DGA explicitly provides that the intermediaries should pursue the data subjects’ best interests. 

Data intermediation service providers also have a role to play on the technical level, in particular as concerns the data’s format and the tools available to the data holders and data subjects (e.g., conversion, curation, anonymization or pseudonymization).

As far as the data intermediation service itself is concerned, the providers must take sufficient security measures, ensure interoperability with other service providers (e.g., open standards) and ensure a continuity of service (and the possibility for the data subjects/data holders to retrieve their data, in case of insolvency).

Data intermediation service providers are subject to new regulatory obligations: they must notify the (new) national authority of their intent, according to a procedure set out in the DGA, before they are allowed to start offering their services. Although no permit or prior authorization is required, data intermediation service providers may obtain a declaration from the competent national authority confirming compliance with the notification obligations. Much like the GDPR, this notification procedure targets service providers with activities in several Member States and service providers established in third countries (which must then designate a representative in the EU).

Data Altruism Organizations

Immense quantities of data (including health data) are needed in order to advance research into technologies that can be used for the public good (such as AI-based health tech applications). At the same time, the GDPR imposes a strict framework for the processing of personal data, which complicates the use and especially the re-use of personal data (for secondary purposes), even if a data subject consents and even if the processing operations pursue non-commercial or public interest purposes.

For example, a data subject may agree to the re-use of their medical results in the context of non-commercial, scientific research, without knowing in advance for which precise research projects the data will be used. GDPR data processing principles, such as purpose limitation or data minimization, complicate such open-purpose processing.

To address this issue, the DGA has introduced data altruism organizations. These organizations may organize the sharing of personal or non-personal data, for general interest purposes (e.g., healthcare, climate change, mobility), scientific research or statistics, without financial compensation for the data subject or data holder (beyond compensation related to the costs that they incur). Importantly, the sharing of such data is voluntary and based on the consent of the data subject or the permission of the data holder. 

However, the DGA does not specify how the data altruism organizations should collect the data from the data subjects and data holders, or which conditions must be met. It merely imposes some conditions and restrictions as to the use of the data in the general interest.

Data altruism organizations must comply with specific requirements to safeguard the rights and interests of both data subjects and data holders. They have certain information obligations (e.g., to provide information, before the data processing, concerning the purposes and location of the intended processing, and to inform data holders and data subjects about a data breach) and they may not use the data for objectives other than the general interest objectives for which the data processing is allowed. From a technical point of view, they must provide tools for obtaining and withdrawing consent, in addition to their security obligations.

The DGA imposes an obligation upon data altruism organizations to register with a Member State “competent authority”, which must verify whether the organization meets the requirements as to its activities, its legal personality and its general interest objectives, and the organization of its activities (in an independent entity that is functionally separate from its other activities). Like the GDPR, the DGA provides rules on the registration of data altruism organizations with activities in several Member States, or with an establishment outside the EU.

Data altruism organizations are subject to transparency obligations, meaning that they have to keep extensive records of the data users and the data use (date, period, purposes, fees), and draft an annual activity report.

Yesterday, the Office of Management and Budget (OMB) released Memorandum M-22-18, which implements software supply chain security requirements under Executive Order 14028, Improving the Nation’s Cybersecurity, and will have a significant impact on software companies and vendors. The Memorandum requires all federal agencies and their software suppliers to comply with the NIST Secure Software Development Framework (SSDF), NIST SP 800-218, and the NIST Software Supply Chain Security Guidance whenever third-party software is used on government information systems or otherwise affects government information. The term “software” includes firmware, operating systems, applications, and application services (e.g., cloud-based software), as well as products containing software. Critically, these requirements will apply both to new software that the government will use and whenever existing software undergoes a major version update.

The Memorandum requires agencies to take the following actions:

  • within 90 days, agencies must inventory all software subject to the Memorandum;
  • within 120 days, agencies must develop a process to communicate requirements to vendors and ensure that vendor attestation letters can be collected in a central agency system;
  • within 180 days, agencies must assess training needs and develop plans for the review and validation of attestation documents;
  • within 270 days for critical software and within 365 days for all other software, agencies must require self-attestations from all software producers; and
  • as needed, agencies must obtain from software producers a Software Bill of Materials (SBOM) or other artifact(s) that demonstrate conformance to secure software development practices (see the illustrative sketch following this list).
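
For illustration only, the following sketch shows the general shape of an SBOM artifact, loosely modeled on the CycloneDX JSON format. The field names reflect that specification, but the component listed is hypothetical, and real SBOMs are generated automatically by build tooling; vendors should consult the NIST guidance and the SPDX or CycloneDX standards for authoritative formats.

```typescript
// Minimal, illustrative SBOM fragment loosely following the CycloneDX JSON shape.
// The component below is hypothetical.

interface SbomComponent {
  type: string;    // e.g., "library" or "application"
  name: string;    // component name
  version: string; // exact version shipped
  purl?: string;   // package URL identifying the component
}

interface Sbom {
  bomFormat: string;   // "CycloneDX" for CycloneDX documents
  specVersion: string; // CycloneDX specification version, e.g., "1.4"
  version: number;     // revision number of this BOM
  components: SbomComponent[];
}

const exampleSbom: Sbom = {
  bomFormat: "CycloneDX",
  specVersion: "1.4",
  version: 1,
  components: [
    {
      type: "library",
      name: "example-logging-lib", // hypothetical third-party dependency
      version: "2.3.1",
      purl: "pkg:npm/example-logging-lib@2.3.1",
    },
  ],
};

console.log(JSON.stringify(exampleSbom, null, 2));
```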

To comply with the Memorandum, software producers must attest that they adhere to the NIST software supply chain frameworks and guidance.  In lieu of a self-attestation, software producers may also submit third-party assessments of compliance with the software security standards conducted by a certified FedRAMP assessor or an assessor approved by the agency.

Software producers or vendors providing software to the federal government should begin reviewing their security practices and their overall software development lifecycle immediately to ensure that they can attest to compliance with the applicable NIST standards in the very near future.   

For more information, please contact the professional(s) listed below, or your regular Crowell & Moring contact.

This is Part 3 in a series of blog posts on recent developments in the EU’s data strategy, which aims to establish EU leadership in our data-driven society by creating a single market for data and encouraging data sharing. The series looks in particular at the recently adopted Data Governance Act (DGA) and the proposed Data Act (DA). (See also Part 1 and Part 2).

In this post we will consider business to government (B2G and G2B) relations and examine how the European legislature intends to facilitate data sharing here.

As a general rule, data holders are free to decide whether to share their data with public authorities – except where specific legal obligations require the legal or natural person to provide information to tax, administrative or public prosecution authorities. 

The Commission gave some guidance on the conditions for the re-use by public authorities of voluntarily shared private sector data in its 2018 communication and associated staff working document.

The DA and the DGA build on this approach: they contain provisions making it possible for public authorities to gain access to data held by private entities in case of “exceptional need”, and they allow certain data to become available to third parties (such as researchers) even when the Open Data Directive does not apply.

B2G data sharing in case of “exceptional need”

The DA imposes a new obligation upon data holders (except SMEs) to make data available if public sector bodies or EU institutions, agencies or bodies have an “exceptional need” for the data. The data may also be re-used for non-commercial research or statistical purposes in this context.

Such “exceptional need” may exist in case of a “public emergency”, defined as an “exceptional situation negatively affecting the population of the Union, a Member State or part of it, with a risk of serious and lasting repercussions on living conditions or economic stability, or the substantial degradation of economic assets in the Union or the relevant Member State(s).” A pandemic or a war may qualify as a “public emergency”.

More broadly, an “exceptional need” may exist where a public authority does not have the data it needs to fulfil a specific task in the public interest, despite having tried to obtain such data in accordance with market conditions or by virtue of other legal provisions. Although data made available in response to a public emergency must be provided free of charge, compensation can be claimed for data provided in other cases of exceptional need.

To take a concrete example, during an emergency like the COVID-19 pandemic, a government agency competent for public health would be able to collect aggregated telecom data if these data were necessary in order to respond to or recover from the epidemic (e.g., to predict or analyse its development). What’s more, the public authority would be able to share such data with researchers working on an urgent vaccine who needed access to medical data – provided that this data re-use remained within the purposes for which the public authority had requested the data.

The DA sets out an elaborate procedure by which public authorities must request data, and data holders must comply, decline or modify such requests.

Once a public authority has gained access to the requested data, the data may be used for the stated purposes (this principle is similar to the purpose limitation principle contained in the GDPR). Public authorities may not use the DA data sharing obligations to gain access to or re-use data in the context of criminal, customs or tax proceedings. Moreover, the acquired data may not be made available to the public as “open data”, although its re-use for non-commercial research or statistical purposes is permitted in the context of exceptional need. Public authorities must take all necessary measures to protect personal data and trade secrets, and they must destroy data after use (this is analogous to the “storage limitation” principle in the GDPR).

G2B – access to public sector data

It has long been acknowledged that public sector information must be accessible to the public, citizens and undertakings alike. Not only does such access safeguard the transparency of public administrations and governments, but information obtained through the investment of public means can also be a considerable asset to the private sector.

The 2019 Open Data Directive (which replaced the 2003 Directive on the re-use of public sector information) requires Member States to promote the use of open data and stimulate innovation in products and services by establishing minimum rules for the re-use of public sector information. As a result, a national meteorological institution, for example, if financed by public means, may be under an obligation to make “high value” sets of weather data available to the public in a machine-readable form, via an application programming interface, and, where possible, for download. However, the Open Data Directive contains important exceptions covering, for example, information protected under intellectual property rights and trade secrets, and personal data: public authorities in the Member States are under no obligation to make such information accessible to the public.
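
As a purely hypothetical sketch of what such machine-readable access might look like in practice (the endpoint URL below is invented for illustration; real institutions publish their own documented interfaces), a re-user could retrieve a “high value” dataset programmatically:

```typescript
// Hypothetical example of consuming a "high value" open dataset via an API.
// The URL is invented for illustration only.
async function fetchWeatherObservations(): Promise<unknown> {
  const response = await fetch(
    "https://opendata.example-met-institute.eu/v1/observations?format=json"
  );
  if (!response.ok) {
    throw new Error(`Open data request failed: ${response.status}`);
  }
  return response.json(); // machine-readable payload (JSON)
}
```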

Although the DGA does not oblige Member State public authorities to allow the re-use of information that is outside the Open Data Directive, it does create a legal framework for the re-use of “data” in general (which includes data protected on grounds of commercial or statistical confidentiality, third-party intellectual property or personal data).

Where a public sector body (PSB) agrees to make such data available for re-use, the data should normally be made available to all third parties, without restrictions or exclusivity. Only if exclusivity is required for the provision of a service or product in the general interest may the PSB consider granting an exclusive right – which should in any event be limited to a maximum of 12 months.

The PSB may impose conditions for the re-use of data upon the re-user (e.g., fees, measures to protect personal data or creations subject to intellectual property rights or trade secrets) but there must be transparency, and the PSB must make sure that the conditions, which must be fair, non-discriminatory, proportionate and objectively justified, are publicly accessible.

A re-user who agrees to such conditions will be bound by a confidentiality obligation, must comply with intellectual property rights, and may not identify data subjects. Importantly, a re-user who intends to make international data transfers must notify the PSB (even if no personal data are involved).

The DA and DGA thus acknowledge both the importance of data for the public sector and the secondary use of public sector data by the private sector, while attempting to safeguard third-party rights. This could result in a complex web of legal and contractual restrictions, which could make it difficult for both the PSB and the data acquirer to understand which use is permitted and under which conditions. Much will depend on whether the PSBs can adapt to their new role: to clear all third-party rights and to formulate such rights and interests in clear contractual conditions (and warranties) for the data users.

Part 4 in this series of blog posts will look at the role of the newly defined data intermediaries that are intended to facilitate data sharing.

On August 24, 2022, the California Attorney General’s Office announced a settlement with Sephora, Inc. (Sephora), a French multinational personal care and beauty products retailer. The settlement resolved allegations that Sephora violated the California Consumer Privacy Act (CCPA) by failing to: disclose to consumers that the company was selling their personal information, process user requests to opt out of sale via user-enabled global privacy controls, and cure these violations within the 30-day period currently allowed by the CCPA.

As part of the settlement, Sephora is required to pay $1.2 million in penalties and comply with injunctive terms, specifically:

  • Clarifying its online disclosures and privacy policy to include an affirmative representation that it sells personal information;
  • Providing mechanisms for consumers to opt out of the sale of personal information, including via the Global Privacy Control (GPC);
  • Conforming its service provider agreements to the CCPA’s requirements; and 
  • Providing reports to the Attorney General relating to its sale of personal information, the status of its service provider relationships, and its efforts to honor GPC.

The settlement is among the most significant enforcement actions taken to ensure that businesses comply with California’s privacy law – the first of its kind in the United States. Through the CCPA, consumers can ask businesses to stop selling their personal information to third parties, including via requests signaled by the GPC. The GPC is a tool, implemented in certain browsers and browser extensions, that consumers can use to opt out of the sale of their personal information by automatically sending a signal to each site they visit.
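
As a technical aside (not drawn from the settlement itself), the GPC specification defines two ways a site can detect the signal: a “Sec-GPC: 1” request header sent with each request, and a “globalPrivacyControl” property exposed on the browser’s navigator object. The sketch below illustrates both checks; recordOptOutOfSale is a hypothetical placeholder for a site’s own opt-out processing (e.g., suppressing third-party advertising trackers):

```typescript
// Illustrative sketch of detecting a Global Privacy Control (GPC) signal.

// Hypothetical placeholder for a site's own opt-out processing logic.
function recordOptOutOfSale(): void {
  console.log("Opt-out of sale recorded for this visitor.");
}

// Client-side: supporting browsers expose the signal on the navigator object.
const nav = navigator as Navigator & { globalPrivacyControl?: boolean };
if (nav.globalPrivacyControl === true) {
  // Treat the signal the same as a click on "Do Not Sell My Personal Information".
  recordOptOutOfSale();
}

// Server-side: user agents asserting GPC send the "Sec-GPC: 1" request header.
function requestSignalsGpc(headers: Record<string, string | undefined>): boolean {
  return headers["sec-gpc"] === "1";
}
```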

People of the State of California v. Sephora USA, Inc.

The complaint filed by the California Office of the Attorney General (OAG) stated that the Attorney General commenced an enforcement sweep of large retailers to determine whether they continued to sell personal information when a consumer signaled an opt-out via the GPC. According to the complaint, the Attorney General found that activating the GPC signal had no effect when a consumer visited the Sephora website and that data continued to flow to third-party companies, including advertising and analytics providers. That led to the Attorney General’s conclusion that Sephora’s website allegedly was not configured to detect or process any global privacy control signals, such as the GPC, and that Sephora allegedly took no action to block the sharing of personal information when a California consumer signaled their opt-out using the GPC. The complaint further highlighted the need for businesses to be transparent regarding their use of third-party trackers on their websites and mobile applications.

The complaint further alleged that when Sephora sells products online, it collects personal information about consumers, including products that consumers view and purchase, consumers’ geolocation data, cookies and other user identifiers, and technical information about consumers’ operating systems and browser types. It then makes this data available to third parties such as advertising networks, business partners, and data analytics providers by installing (or allowing the installation of) third-party trackers in the form of cookies, pixels, software development kits, and other technologies, which automatically send data about consumers’ online behavior to the third-party companies.

By allowing third-party companies access to its customers’ online activities, the complaint alleged, Sephora received discounted or higher-quality analytics and other services derived from the data about consumers’ online activities, including the option to target advertisements to customers who had merely browsed for products online. The complaint alleged that Sephora’s website and mobile app failed to inform consumers that it sells their personal information and that they have the right to opt out of this sale, that it failed to provide a clear and conspicuous “Do Not Sell My Personal Information” link on its site, and that it failed to provide two or more designated methods for submitting requests to opt out. Under Cal. Civ. Code § 1798.140, the CCPA defines a “sale” of personal information to include a disclosure for monetary or other valuable consideration.

Sephora also allegedly did not have valid service provider contracts in place with each third party that collected personal information when Sephora installed or allowed the use of cookies or relevant code on its website or app – the existence of such contracts being one exception to a “sale” under the CCPA. Once notified of its alleged CCPA violations, Sephora had 30 days to cure, as provided under the law. The company, however, allegedly failed to cure the violations within that period, prompting the Attorney General to initiate the investigation that led to the enforcement action.

Key Takeaways

The settlement makes clear that the “sale” of personal information includes trading consumers’ personal information to third parties in exchange for analytics or advertising services, including by placing third-party advertising cookies or other automatic data collection technologies on a website that give those parties access to consumers’ online activities. Moreover, such activities will subsequently be considered either a “sale” or a “share” of information under the California Privacy Rights Act (CPRA), effective January 1, 2023. The settlement also drives home the importance of complying with a customer’s request to opt out of the sale of information, particularly through the GPC.

The Attorney General’s enforcement action in the Sephora case aligns with many of the CCPA Enforcement Case Examples previously published by the OAG, which revolve around the disclosure of material terms, consumer consent, cookie options, opt-out mechanisms, and the need to maintain an up-to-date privacy policy. In this enforcement action, the OAG focuses in particular on compliance with a consumer’s exercise of their privacy rights.

Businesses should take note of the heightened scrutiny devoted to the treatment of consumer data and make efforts to comply with the California privacy laws, including:

  • Assessing whether it uses cookies or other technologies that may be considered a “sale” or “sharing” of personal information for targeted advertising, analytics, or in exchange of other forms of value.
  • Ensuring that its privacy policies are transparent as to the collection, processing, sale and sharing of personal information. A company’s privacy policy should clearly state whether personal information is sold.
  • Confirming that it has established opt-out mechanisms to allow consumers the ability to exercise their opt-out rights. This can take the form of a “Do Not Sell My Personal Information” link at the bottom of the company’s website. More importantly, should a consumer exercise their opt-out rights, a business should ensure that it has an established mechanism to process the request. This would include reviewing website capabilities to recognize any Global Privacy Control signals issued by a consumer’s browser. The settlement makes clear that a business must ensure that any user who has “user-enabled global privacy controls” is treated the same as a user who has clicked the “Do Not Sell My Personal Information” link. The impetus behind this requirement stems from the desire to give consumers the ability to stop their data from being sold and to allow consumers to opt out of all online sales universally, in one fell swoop, without the need to click an opt-out link each time. Businesses should assess their website’s capability to recognize signals triggered by the GPC and recognize that an enforcement action is possible if the business does not implement adequate mechanisms to comply with consumers’ opt-out requests.
  • Reviewing the obligations under the California Privacy Rights Act, which will be effective January 1, 2023.

Accordingly, businesses should be diligent in assessing their compliance with the California privacy laws. Looking to the future, businesses may also want to review the recently introduced American Data Privacy and Protection Act, federal legislation aimed at creating a comprehensive federal consumer privacy framework. While not yet adopted, it may provide an indication of how privacy regulation at the federal level may unfold in the coming years.

* * *

Crowell & Moring LLP has a robust California Consumer Privacy Act Practice and is highly experienced at advising companies of all sizes on compliance with state privacy laws. Crowell also has an extensive library of alerts and resources associated with California’s privacy laws, including: CCPA 2.0? California Adopts Sweeping New Data Privacy Protections, California AG Interprets Inferences Under CCPA, and Enforcement of The California Consumer Privacy Act Via Letters Noticing Noncompliant Loyalty Programs and Online Tool for Consumers to Notify Businesses of Potential Violations. If you have questions about this alert or similar issues, please contact one of the Crowell & Moring attorneys listed below, or your regular Crowell & Moring contact.

This is Part 2 in a series of blog posts on recent developments in the EU’s data strategy, which aims to establish EU leadership in our data-driven society by creating a single market for data and encouraging data sharing. The series looks in particular at the recently adopted Data Governance Act (DGA) and the proposed Data Act (DA). (See also Part 1).

Broadly speaking, the purpose of both the DGA and the DA is to encourage “data sharing” and create a level playing field in this area. This concept covers several types of acts such as: “making data accessible”, “accessing” or “using” data, “sharing” data with third parties, and “receiving” data.

It is the proposed DA that sets out the specific data sharing provisions and provides a framework for other laws that impose data sharing. It considers data sharing according to different models in B2C and B2B relations, and relies on generic personas such as the “user”, the “data holder”, “data recipient” (each with their own abstract definition). In particular, it imposes an obligation to share data that is generated by the use of connected devices, and it creates an obligation, in case of exceptional need, to share data with certain public authorities. This “exceptional need” obligation will be examined in more detail in Part 3 of this blog series.

B2C and B2B Data Sharing – Connected Devices

The DA looks at data sharing according to different models for B2C and B2B relations. Its main purpose is to make data that are generated by connected devices (“products” in DA terminology) available to the users of the devices. Widely diverse situations are targeted, from a company using Internet of Things (IoT) devices for tracking shipped goods, to the owner of a wind power plant, to a person measuring their heart rate with a medical tracker and its associated app. 

All these different “users” are entitled to have access to the data generated by the use of the connected device and any indispensable digital services. The design of the IoT device should, if possible, allow the data to be directly accessed by the user. Alternatively, the data holder must ensure that the data are available either to the user, or, upon the user’s request, to a chosen third party. 

“Third party” is not defined in the DA, but would cover, for example, a doctor who reads the data from a glucose monitor to get a more detailed view of their diabetic patient’s condition, or the provider of maintenance services (e.g., for connected cars) who may seek to optimize the planning and performance of the maintenance services using the data generated by the car. 

As a result of the DA, the user, the data holder and third parties could have simultaneous access to the same data, generated by the use of the connected device. This would leave them vulnerable to each other: e.g., access to the data could reveal technical details about the IoT services, or sensitive information about the operations of the IoT user.

In order to control these risks and to establish trust within the IoT ecosystem, the DA imposes certain restrictions upon the use of the IoT data. 

  • The data holder must share the data, thus losing its privileged position regarding data exclusivity. However, if the data holder itself produces connected devices and related services, it is protected to the extent that neither the user, nor their chosen third party, may use the data to develop a competing product.
  • Conversely, the data holder may not generate any insights regarding the user’s or third party’s economic situation, assets or production methods that could undermine their commercial position in the market.
  • The user’s interests are protected in the sense that the data holder and the third party may only use the user’s non-personal data if the user agrees. The DA is also wary of the power that a third party may wield over the user: it explicitly prohibits the third party’s use of “dark patterns” and the profiling of natural persons, and the data (even non-personal, raw, aggregated or derived data) may not be made available to other third parties. User lock-in is also limited, since a third party may not preclude the user from making the same data available to other third parties.

B2B – Mandatory Data Sharing (Legal Obligation)

In some situations, data holders may be subject to a legal obligation to make data from connected devices available to “data recipients” (this broad term covers, but is not limited to, a user’s chosen third party). Specific legal obligations may appear in sector regulation (e.g., repair and maintenance information concerning connected motor vehicles).

If the data holder is legally obliged to share data (but not if it does so as a result of a voluntary agreement), it must make the data available on “fair, reasonable and non-discriminatory terms.” 

The data holder must conclude an agreement (covering issues such as access, use, liability, termination and “reasonable” compensation) with the data recipient. Micro, small or medium-sized enterprises are protected as data recipients against abusive practices inter alia by a black and grey list of unfair contractual terms relating to data sharing. Where no agreement can be reached, the parties should have access to a national dispute settlement mechanism.

New Legal Restrictions on the Use of Information (Far-Reaching Sanctions)

Although the DA does not create a new exclusive right to data, it does provide for new legal restrictions on the use and re-use of “data”, without requiring that any substantive threshold be met. This means that contracts governing any IoT ecosystem must be adapted to reflect this protection of the various interests involved.

Moreover, the DA provides that if a data recipient makes unauthorized use or disclosure of data (e.g., where it does not meet the legal conditions to qualify for re-use or does not comply with contractual restrictions on the use of the data), unless the user or data holder instructs otherwise, the data recipient must destroy the data and all copies, and, in addition, end the production and/or commercialization of any goods, derivative data or services produced on the basis of knowledge obtained through the unauthorized use of the data (the DA even speaks of “infringing” goods). These redressive measures can be avoided only if the data holder suffers no significant harm or the sanction is deemed disproportionate.

These legal sanctions are far-reaching. They resemble the measures available to a holder of an intellectual property right or trade secret in case of infringement, and they go beyond the remedies or sanctions available in case of breach of contract. Indeed, they could protect a data source that is not a party to the data sharing contract. It is therefore vital that data users be aware of both the contractual and extra-contractual risks to which they are exposed should they fail to respect the conditions for access or re-use.

Part 3 in this series of blog posts will look in more detail at the concept of “exceptional need” and at data sharing between businesses and government (B2G and G2B).

Back in February 2020, the European Commission communicated its European strategy for data, with the aim of establishing EU leadership in our data-driven society by creating a single market for data and encouraging data sharing. To make this strategy concrete, it put forward two legislative proposals: the Data Governance Act (DGA) and the Data Act (DA). The final version of the DGA was published on 3 June 2022 and will be applicable from September 2023. The DA is currently still at the proposal stage.

Together, the DGA and DA are intended to (i) give individuals, research institutions, public authorities and companies – and, in particular, small and medium-sized enterprises – access to certain data; and (ii) harmonize the framework for data use and re-use.

In a series of blog posts, we will examine the contours of these new data regulations and take a look at some of their data sharing aspects.

Context

The impetus behind these legislative initiatives was the undisputed importance of data in today’s digital economy. The Internet of Things (i.e., connected objects such as fitness trackers, windmills or electric vehicles) is producing tremendous quantities of data, which are useful to the person or company producing the data, but also to service providers and public authorities.

Also important to bear in mind is that the DGA and DA cannot be read without reference to European ambitions in the field of artificial intelligence (AI) – notably the attempt to start regulating AI through the proposed AI Act – and the promise that AI technologies hold for solving significant challenges in key sectors such as health care, mobility and energy. For example, allowing researchers working on rare diseases EU-wide access to hospital and research data could significantly enhance their ability to find appropriate AI solutions where their Member State data is insufficient. It would also help streamline public investment.

“Data”

For the first time, the concept of “data” is defined in a legislative act: data means “any digital representation of acts, facts or information and any compilation of such acts, facts or information, including in the form of sound, visual or audio-visual recording.” (Art. 2(1) DGA and Art. 2(1) DA). 

This definition is broad: it covers both personal and non-personal data generated in the public or private sector, regardless of the data’s meaning, content or function. Telecom-carried communications and audio-visual productions viewed through a streaming service are “data” within the meaning of this definition.

Complex Legislative Landscape

Importantly, the DGA and the DA are not all-encompassing regulations: other regulations, such as those concerning personal data protection or intellectual property, will continue to apply. This creates a complex legislative landscape, which is likely to be difficult to navigate for undertakings, public sector bodies and individuals alike.

In order to determine which rules are applicable, you will first need to classify the information under consideration as (i) containing personal data, (ii) containing non-personal data or (iii) constituting a mixed dataset. You will also have to decide whether it is public sector information or private sector information. Depending on this classification, the access to and re-use of the data will be governed by different legal instruments.

Moreover, there may be additional restrictions because of the data’s protection under intellectual property rights (such as copyright, database rights or other related rights) or trade secrets. 

The DGA and the DA are thus the latest additions to a regulation-heavy field, and it remains to be seen whether they will make it easier for companies, individuals and public sector bodies to understand their rights and obligations as “data subjects”, “data holders” or “data users”, and whether they will provide sufficient legal certainty to incentivize the sharing of data.

Part 2 in this series of blog posts, which will be released next week, will look in more detail at how the EU proposes to harmonize the rules on data sharing.

After much anticipation, the Cyber AB, formerly known as the Cybersecurity Maturity Model Certification (CMMC) Accreditation Body, recently released its pre-decisional draft CMMC Assessment Process (CAP).  The CAP describes the overarching procedures and guidance that CMMC Third-Party Assessment Organizations (C3PAOs) will use to assess entities seeking CMMC certification.  The current version of the CAP applies to contractors requiring CMMC Level 2 certification, which will likely be most contractors handling Controlled Unclassified Information (CUI) based on the Department of Defense’s (DoD) provisional scoping guidance for CMMC 2.0.

Aimed at increasing the accuracy and consistency of assessments conducted by C3PAOs, the CAP is segmented into four distinct phases:

Phase 1:  Plan and Prepare the Assessment;
Phase 2:  Conduct the Assessment;
Phase 3:  Report Assessment Results; and
Phase 4:  Close-Out Plan of Action and Milestones (POAMs) and Assessment.

While the assessment process is still in draft form, DoD contractors should familiarize themselves with the proposed structure and conduct of CMMC assessments, as these parameters will be critical to companies attaining CMMC certification at the level requisite for future government contract awards.

The Cyber AB is currently accepting comments on the draft CAP. 

For more information, please contact the professional(s) listed below, or your regular Crowell & Moring contact.