This is Part 4 in a series of blog posts on recent developments in the EU’s data strategy, which aims to establish EU leadership in our data-driven society by creating a single market for data and encouraging data sharing. The series looks in particular at the recently adopted Data Governance Act (DGA) and the proposed Data Act (DA). (See also Parts 1, 2, and 3).

The DGA introduces two new types of “intermediaries” – data intermediation service providers and data altruism organizations – to help with the legal and technical practicalities and facilitate data sharing between data holders and data users. These new intermediaries will be able to garner the necessary expertise to establish a contractual and technical framework that fosters trust among data holders, data subjects and users. 

Both types of organization are intended to support data holders or data subjects in making their data available for re-use by third parties. However, data intermediation service providers may operate in a commercial context, while data altruism organizations are not-for-profit entities pursuing general interest objectives.

Although it is not yet entirely clear exactly what types of organizations may qualify as these intermediaries (new notions in the European legal order), the purpose and the contours of the regulation are becoming apparent. The DGA does provide a general description of the type of organization that will qualify as a “data intermediation service” or a “data altruism organization”. It also imposes some restrictions regarding the conditions for data re-use and, importantly, it introduces new regulatory mechanisms handled by national authorities.

Data intermediation services

Providers of data intermediation services help data subjects and data holders establish commercial relationships with data users for the purpose of “data sharing” (i.e., the provision of data for the purpose of joint or individual use, based on voluntary agreements or Union or national law, in this case through an intermediary, under commercial or open license terms). 

The intermediation service may organize data pooling or the bilateral exchange of data. On the data provider side, the DGA does not limit the number of data subjects or data holders that may be involved. Data cooperatives are covered but closed groups, such as consortia, are not. Only actual “intermediaries” are targeted: entities that aggregate, enrich, or otherwise add value to datasets in order to exploit the result for their own purposes, such as data brokers or consultancies, are not within the DGA’s scope. Similarly, providers of copyright protected content (such as streaming services) are not considered to be data intermediaries.

Data intermediation service providers will put in place the technical, legal or other means for the data holders/data subjects and the data users to enter into a commercial relationship. The DGA explicitly mentions the case of data subjects exercising their rights regarding their personal data through a data intermediation service: before the data subject gives consent to the data user, the intermediary should inform and even advise on the intended use of the data and the conditions of such use. It may then also provide tools to facilitate the giving and withdrawing of consent.

Because of their role as intermediaries, providers of these services may not use the data for any purpose other than making them available to data users. They may not use the data holders’/data subjects’ data for their own purposes, nor may they make the data intermediation service dependent on other services they may offer. Similarly, the meta-data relating to the use of their services may only be used for developing the data intermediation service. These restrictions are intended to foster a climate of trust, something that would be jeopardized were the trusted intermediary to be at the same time a data user.

Data intermediation service providers must offer access to their services on transparent, non-discriminatory terms (including price). Where the data contain personal data, the DGA explicitly provides that the intermediaries should pursue the data subjects’ best interests. 

Data intermediation service providers also have a role to play on the technical level, in particular as concerns the data’s format and the tools available to the data holders and data subjects (e.g., conversion, curation, anonymization or pseudonymization).

As far as the data intermediation service itself is concerned, the providers must take sufficient security measures, ensure interoperability with other service providers (e.g., through open standards) and ensure continuity of service (including the possibility for the data subjects/data holders to retrieve their data in case of insolvency).

Data intermediation service providers are subject to new regulatory obligations: they must notify the (new) national authority of their intent, according to a procedure set out in the DGA, before they are allowed to start offering their services. Although no permit or prior authorization is required, data intermediation service providers may obtain a declaration from the competent national authority confirming compliance with the notification obligations. Much like the GDPR, this notification procedure targets service providers with activities in several Member States and service providers established in third countries (which must then designate a representative in the EU).

Data Altruism Organizations

Immense quantities of data (including health data) are needed in order to advance research into technologies that can be used for the public good (such as AI-based health tech applications). At the same time, the GDPR imposes a strict framework for the processing of personal data, which complicates the use and especially the re-use of personal data (for secondary purposes), even if a data subject consents and even if the processing operations pursue non-commercial or public interest purposes.

For example, a data subject may agree to the re-use of their medical results in the context of non-commercial, scientific research, without knowing in advance for which precise research projects the data will be used. GDPR data processing principles, such as purpose limitation or data minimization, complicate such open-purpose processing.

To address this issue, the DGA has introduced data altruism organizations. These organizations may organize the sharing of personal or non-personal data, for general interest purposes (e.g., healthcare, climate change, mobility), scientific research or statistics, without financial compensation for the data subject or data holder (beyond compensation related to the costs that they incur). Importantly, the sharing of such data is voluntary and based on the consent of the data subject or the permission of the data holder. 

However, the DGA does not specify how the data altruism organizations should collect the data from the data subjects and data holders, or which conditions must be met. It merely imposes some conditions and restrictions as to the use of the data in the general interest.

Data altruism organizations must comply with specific requirements to safeguard the rights and interests of both data subjects and data holders. They have certain information obligations (e.g., to provide information, before the data processing, concerning the purposes and location of the intended processing, and to inform data holders and data subjects about a data breach) and they may not use the data for objectives other than the general interest objectives for which the data processing is allowed. From a technical point of view, they must provide tools for obtaining and withdrawing consent, in addition to their security obligations.

The DGA imposes an obligation upon data altruism organizations to register with a Member State “competent authority”, which must verify whether the organization meets the requirements as to its activities, its legal personality and its general interest objectives, and whether its data altruism activities are organized in an entity that is independent and functionally separate from its other activities. Like the GDPR, the DGA provides rules on the registration of data altruism organizations with activities in several Member States, or with an establishment outside the EU.

Data altruism organizations are subject to transparency obligations, meaning that they have to keep extensive records of the data users and the data use (date, period, purposes, fees), and draft an annual activity report.

Yesterday, the Office of Management and Budget (OMB) released Memorandum M-22-18, implementing software supply chain security requirements that will have a significant impact on software companies and vendors in accordance with Executive Order 14028, Improving the Nation’s Cybersecurity.  The Memorandum requires all federal agencies and their software suppliers to comply with the NIST Secure Software Development Framework (SSDF), NIST SP 800-218, and the NIST Software Supply Chain Security Guidance whenever third-party software is used on government information systems or otherwise affects government information.  The term “software” includes firmware, operating systems, applications, and application services (e.g., cloud-based software), as well as products containing software.  It is critical to note that these requirements will apply whenever there is a major version update or new software that the government will be using.

The Memorandum requires agencies to take the following actions:

  • within 90 days, agencies must inventory all software subject to the Memorandum;
  • within 120 days, agencies will have developed a process to communicate requirements to vendors and ensure that vendor attestation letters can be collected in a central agency system;
  • within 180 days, agencies must assess training needs and develop plans for the review and validation of attestation documents;
  • within 270 days for critical software and within 365 days for all others, agencies will require self-attestations from all software producers; and
  • as needed, agencies must obtain from software producers a Software Bill of Materials (SBOM) or other artifact(s) that demonstrate conformance to secure software development practices. 

To comply with the Memorandum, software producers must attest that they adhere to the NIST software supply chain frameworks and guidance.  In lieu of a self-attestation, software producers may also submit third-party assessments of compliance with the software security standards conducted by a certified FedRAMP assessor or an assessor approved by the agency.
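
By way of illustration, an SBOM enumerates the components that make up a piece of software and is commonly expressed in a standard format such as SPDX or CycloneDX. The following minimal sketch (not taken from the Memorandum itself) shows what a CycloneDX-style SBOM payload might look like, built here in TypeScript; the component listed is a hypothetical dependency.

    // Minimal sketch of a CycloneDX-style SBOM payload.
    // The component is a hypothetical dependency, for illustration only.
    const sbom = {
      bomFormat: "CycloneDX",
      specVersion: "1.4",
      version: 1,
      components: [
        {
          type: "library",
          name: "example-logging-lib", // hypothetical third-party component
          version: "2.1.0",
          purl: "pkg:npm/example-logging-lib@2.1.0", // package URL identifier
        },
      ],
    };

    // Serialize for submission alongside the producer's attestation.
    console.log(JSON.stringify(sbom, null, 2));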

Software producers or vendors providing software to the federal government should begin reviewing their security practices and their overall software development lifecycle immediately to ensure that they can attest to compliance with the applicable NIST standards in the very near future.   

For more information, please contact the professional(s) listed below, or your regular Crowell & Moring contact.

This is Part 3 in a series of blog posts on recent developments in the EU’s data strategy, which aims to establish EU leadership in our data-driven society by creating a single market for data and encouraging data sharing. The series looks in particular at the recently adopted Data Governance Act (DGA) and the proposed Data Act (DA). (See also Part 1 and Part 2).

In this post we will consider business to government (B2G and G2B) relations and examine how the European legislature intends to facilitate data sharing here.

As a general rule, data holders are free to decide whether to share their data with public authorities – except where specific legal obligations require the legal or natural person to provide information to tax, administrative or public prosecution authorities. 

The Commission gave some guidance on the conditions for the re-use by public authorities of voluntarily shared private sector data in its 2018 communication and associated staff working document.

The DA and the DGA build on this approach: they contain provisions making it possible for public authorities to gain access to data held by private entities in case of “exceptional need”, and they allow certain data to become available to third parties (such as researchers) even when the Open Data Directive does not apply.

B2G data sharing in case of “exceptional need”

The DA imposes a new obligation upon data holders (except SMEs) to make data available if public sector bodies or EU institutions, agencies or bodies have an “exceptional need” for the data. The data may also be re-used for non-commercial research or statistical purposes in this context.

Such “exceptional need” may exist in case of a “public emergency”, defined as an “exceptional situation negatively affecting the population of the Union, a Member State or part of it, with a risk of serious and lasting repercussions on living conditions or economic stability, or the substantial degradation of economic assets in the Union or the relevant Member State(s).” A pandemic or a war may qualify as a “public emergency”.

More broadly, an “exceptional need” may exist where a public authority does not have the data it needs to fulfil a specific task in the public interest, despite having tried to obtain such data in accordance with market conditions or by virtue of other legal provisions. Although data made available in response to a public emergency must be provided free of charge, compensation can be claimed for data provided in other cases of exceptional need.

To take a concrete example, during an emergency like the COVID-19 pandemic, a government agency competent for public health would be able to collect aggregated telecom data if these data were necessary in order to respond to or recover from the epidemic (e.g., to predict or analyse its development). What’s more, the public authority would be able to share such data with researchers working on an urgent vaccine who needed access to medical data – provided that this data re-use remained within the purposes for which the public authority had requested the data.

The DA sets out an elaborate procedure by which public authorities must request data, and data holders must comply, decline or modify such requests.

Once a public authority has gained access to the requested data, the data may be used for the stated purposes (this principle is similar to the purpose limitation principle contained in the GDPR). Public authorities may not use the DA data sharing obligations to gain access to or re-use data in the context of criminal, customs or tax proceedings. Moreover, the acquired data may not be made available to the public as “open data”, although its re-use for non-commercial research or statistical purposes is permitted in the context of exceptional need. Public authorities must take all necessary measures to protect personal data and trade secrets, and they must destroy data after use (this is analogous to the “storage limitation” principle in the GDPR).

G2B – access to public sector data

It has long been acknowledged that public sector information must be accessible to the public, citizens and undertakings alike. Not only does such access safeguard the transparency of public administrations and governments, but information obtained through the investment of public means can also be a considerable asset to the private sector.

The 2019 Open Data Directive (which replaced the 2003 Directive on the re-use of public sector information) requires Member States to promote the use of open data and stimulate innovation in products and services by establishing minimum rules for the re-use of public sector information. As a result, a national meteorological institution, for example, if financed by public means, may be under an obligation to make “high value” sets of weather data available to the public in a machine-readable form, via an application programming interface, and, where possible, for download. However, the Open Data Directive contains important exceptions covering, for example, information protected under intellectual property rights and trade secrets, and personal data: public authorities in the Member States are under no obligation to make such information accessible to the public.

Although the DGA does not oblige Member State public authorities to allow the re-use of information that is outside the Open Data Directive, it does create a legal framework for the re-use of “data” in general (which includes data protected on grounds of commercial or statistical confidentiality, third-party intellectual property or personal data).

Where a public sector body (PSB) agrees to make such data available for re-use, the data should normally be made available to all third parties, without restrictions or exclusivity. Only if the exclusivity is required for the provision of a service or product in the general interest may the PSB consider granting an exclusive right – which should in any event be limited to a maximum of 12 months.

The PSB may impose conditions for the re-use of data upon the re-user (e.g., fees, measures to protect personal data or creations subject to intellectual property rights or trade secrets) but there must be transparency, and the PSB must make sure that the conditions, which must be fair, non-discriminatory, proportionate and objectively justified, are publicly accessible.

A re-user who agrees to such conditions will be bound by a confidentiality obligation, must comply with intellectual property rights, and may not identify data subjects. Importantly, a re-user who intends to make international data transfers must notify the PSB (even if no personal data are involved).

The DA and DGA thus acknowledge both the importance of data for the public sector and the secondary use of public sector data by the private sector, while attempting to safeguard third party rights. This could result in a complex web of legal and contractual restrictions, which could make it difficult for both the PSB and the data acquirer to understand which use is permitted and under which conditions. Much will depend on whether the PSBs can adapt to their new role: to clear all third-party rights and to formulate such rights and interests in clear contractual conditions (and warranties) for the data users.

Part 4 in this series of blog posts will look at the role of the newly defined data intermediaries that are intended to facilitate data sharing.

On August 24, 2022, the California Attorney General’s Office announced a settlement with Sephora, Inc. (Sephora), a French multinational personal care and beauty products retailer. The settlement resolved Sephora’s alleged violations of the California Consumer Privacy Act (CCPA) for allegedly failing to: disclose to consumers that the company was selling their personal information, process user requests to opt out of sale via user-enabled global privacy controls, and cure these violations within the 30-day period currently allowed by the CCPA.

As part of the settlement, Sephora is required to pay $1.2 million in penalties and comply with injunctive terms, specifically:

  • Clarifying its online disclosures and privacy policy to include an affirmative representation that it sells personal information;
  • Providing mechanisms for consumers to opt out of the sale of personal information, including via the Global Privacy Control (GPC);
  • Conforming its service provider agreements to the CCPA’s requirements; and 
  • Providing reports to the Attorney General relating to its sale of personal information, the status of its service provider relationships, and its efforts to honor GPC.

The settlement is among the most significant enforcement actions taken in the effort to ensure businesses comply with California’s privacy law – the first of its kind in the United States. Through the CCPA, consumers can ask businesses to stop selling their personal information to third parties, including via requests signaled by the GPC.  GPC is a third-party tool that can be used by consumers to opt out of the sale of their personal information by automatically sending a signal to any site that the consumer visits.
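
On the technical level, the GPC specification conveys a consumer’s opt-out preference through an HTTP request header (“Sec-GPC: 1”) and a corresponding JavaScript property (navigator.globalPrivacyControl). As a minimal sketch, assuming an Express-style Node.js server, a site might detect the signal along the following lines; the suppressSaleTrackers flag is a hypothetical placeholder for a site’s own opt-out handling.

    // Minimal sketch: detecting a Global Privacy Control (GPC) signal.
    // Assumes an Express-style server; `suppressSaleTrackers` is a
    // hypothetical flag standing in for a site's own opt-out logic.
    import express from "express";

    const app = express();

    app.use((req, res, next) => {
      // Per the GPC specification, browsers with GPC enabled send "Sec-GPC: 1".
      // (Client-side scripts can check navigator.globalPrivacyControl instead.)
      if (req.get("Sec-GPC") === "1") {
        // Treat the signal like a "Do Not Sell My Personal Information" request:
        // do not load third-party advertising or analytics trackers.
        res.locals.suppressSaleTrackers = true;
      }
      next();
    });

    app.listen(3000);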

People of the State of California v. Sephora USA, Inc.

The complaint filed by the California Office of the Attorney General (OAG) stated that the Attorney General commenced an enforcement sweep of large retailers to determine whether they continued to sell personal information when a consumer signaled an opt-out via the GPC. According to the complaint, the Attorney General found that activating the GPC signal had no effect when a consumer visited the Sephora website and that data continued to flow to third party companies, including advertising and analytics providers.  That led to the Attorney General’s conclusion that Sephora’s website allegedly was not configured to detect or process any global privacy control signals, such as GPC, and that Sephora allegedly took no action to block the sharing of personal information when a California consumer signaled their opt-out using the GPC.  The complaint further highlighted the need for businesses to be transparent regarding their use of third-party trackers on their websites and mobile applications.

The complaint further alleged that when Sephora sells products online, it collects personal information about consumers, including products that consumers view and purchase, consumers’ geolocation data, cookies and other user identifiers, and technical information about consumers’ operating systems and browser types. It then makes this data available to third parties such as advertising networks, business partners, and data analytics providers by installing (or allowing the installation of) third-party trackers in the form of cookies, pixels, software development kits, and other technologies, which automatically send data about consumers’ online behavior to the third-party companies.

By allowing third-party companies access to its customers’ online activities, the complaint alleged that Sephora received discounted or higher-quality analytics and other services derived from the data about consumers’ online activities, including the option to target advertisements to customers that had merely browsed for products online. The complaint alleged that Sephora’s website and mobile app failed to inform consumers that it sells their personal information and that they have the right to opt out of this sale, that it failed to provide a clear and conspicuous “Do Not Sell My Personal Information” link on its site, and that it failed to provide two or more designated methods for submitting requests to opt out.  Under Cal. Civ. Code § 1798.140, the CCPA defines a “sale” of personal information to include a disclosure for monetary or other valuable consideration.

Sephora also allegedly did not have valid service provider contracts in place with each third party that collected personal information when Sephora installed or allowed the use of cookies or relevant code on its website or app, which is one exception to “sale” under the CCPA. Once notified of its CCPA violations, Sephora had 30 days to cure as outlined under the law. However, the company allegedly failed to cure the alleged violations within the time period, thereby prompting the Attorney General to initiate an investigation which led to the enforcement action.

Key Takeaways

The settlement outlines that the “sale” of personal information includes the trade of consumers’ personal information with third parties in exchange for analytics services or placing third party advertising cookies on a website, and other automatic data collection technologies that allow access to consumers’ online activities in exchange for advertising or analytic services. Moreover, such activities will subsequently be considered as either a “sale” or “share” of information under the California Privacy Rights Act (CPRA), effective January 1, 2023. The settlement also drives home the importance of complying with a customer’s request to opt-out of the sale of information, particularly through GPC.

The Attorney General’s enforcement action in the Sephora case aligns with many of the CCPA Enforcement Case Examples previously published by the OAG, which revolve around the disclosure of material terms, consumer consent, cookie options, opt-out mechanisms, and the need to maintain an up-to-date privacy policy. In this enforcement action, the OAG focuses in particular on compliance with a consumer’s exercise of their privacy rights.

Businesses should take note of the heightened scrutiny applied to the treatment of consumer data and make efforts to comply with the California privacy laws, including:

  • Assessing whether it uses cookies or other technologies that may be considered a “sale” or “sharing” of personal information for targeted advertising, analytics, or in exchange for other forms of value.
  • Ensuring that its privacy policies are transparent as to the collection, processing, sale and sharing of personal information. A company’s privacy policy should clearly state whether personal information is sold.
  • Confirming that it has established opt-out mechanisms to allow consumers the ability to exercise their opt-out rights. This can take the form of a “Do Not Sell My Personal Information” link at the bottom of the company’s website. More importantly, should a consumer exercise their opt-out rights, a business should ensure that it has an established mechanism to process the request. This would include reviewing website capabilities to recognize any Global Privacy Control signals issued by a consumer’s browser. The settlement makes clear that a business must ensure that any user who has “user-enabled global privacy controls” is treated the same as users who have clicked the “Do Not Sell My Personal Information” link. The impetus behind this requirement stems from the desire to give consumers the ability to stop their data from being sold and allow such consumers to universally opt out of all online sales in one fell swoop, without the need to click each time on an opt-out link. Businesses should assess their website’s capability to recognize signals triggered by GPC and recognize that an enforcement action is possible if the business does not implement adequate mechanisms to comply with consumers’ opt-out requests.
  • Reviewing the obligations under the California Privacy Rights Act, which will be effective January 1, 2023.

Accordingly, businesses should be diligent in assessing their compliance with the California privacy law. Looking to the future, businesses may also want to review the recently introduced American Data Privacy and Protection Act, federal legislation aimed at creating a comprehensive federal consumer privacy framework. While not yet adopted, it may provide an indication of how privacy regulation at the federal level may unfold in the coming years.

* * *

Crowell & Moring LLP has a robust California Consumer Privacy Act Practice and is highly experienced at advising companies of all sizes on compliance with state privacy laws. Crowell also has an extensive library of alerts and resources associated with California’s privacy laws, including: CCPA 2.0? California Adopts Sweeping New Data Privacy Protections, California AG Interprets Inferences Under CCPA, and Enforcement of The California Consumer Privacy Act Via Letters Noticing Noncompliant Loyalty Programs and Online Tool for Consumers to Notify Businesses of Potential Violations. If you have questions about this alert or similar issues, please contact one of the Crowell & Moring attorneys listed below, or your regular Crowell & Moring contact.

This is Part 2 in a series of blog posts on recent developments in the EU’s data strategy, which aims to establish EU leadership in our data-driven society by creating a single market for data and encouraging data sharing. The series looks in particular at the recently adopted Data Governance Act (DGA) and the proposed Data Act (DA). (See also Part 1).

Broadly speaking, the purpose of both the DGA and the DA is to encourage “data sharing” and create a level playing field in this area. This concept covers several types of acts such as: “making data accessible”, “accessing” or “using” data, “sharing” data with third parties, and “receiving” data.

It is the proposed DA that sets out the specific data sharing provisions and provides a framework for other laws that impose data sharing. It considers data sharing according to different models in B2C and B2B relations, and relies on generic personas such as the “user”, the “data holder”, “data recipient” (each with their own abstract definition). In particular, it imposes an obligation to share data that is generated by the use of connected devices, and it creates an obligation, in case of exceptional need, to share data with certain public authorities. This “exceptional need” obligation will be examined in more detail in Part 3 of this blog series.

B2C and B2B Data Sharing – Connected Devices

The DA looks at data sharing according to different models for B2C and B2B relations. Its main purpose is to make data that are generated by connected devices (“products” in DA terminology) available to the users of the devices. Widely diverse situations are targeted, from a company using Internet of Things (IoT) devices for tracking shipped goods, to the owner of a wind power plant, to a person measuring their heart rate with a medical tracker and its associated app. 

All these different “users” are entitled to have access to the data generated by the use of the connected device and any indispensable digital services. The design of the IoT device should, if possible, allow the data to be directly accessed by the user. Alternatively, the data holder must ensure that the data are available either to the user, or, upon the user’s request, to a chosen third party. 

“Third party” is not defined in the DA, but would cover, for example, a doctor who reads the data from a glucose monitor to get a more detailed view of their diabetic patient’s condition, or the provider of maintenance services (e.g., for connected cars) who may seek to optimize the planning and performance of the maintenance services using the data generated by the car. 

As a result of the DA, the user, the data holder and third parties could have simultaneous access to the same data, generated by the use of the connected device. This would leave them vulnerable to each other: e.g., access to the data could reveal technical details about the IoT services, or sensitive information about the operations of the IoT user.

In order to control these risks and to establish trust within the IoT ecosystem, the DA imposes certain restrictions upon the use of the IoT data. 

  • The data holder must share the data, thus losing its privileged position regarding data exclusivity. However, if the data holder itself produces connected devices and related services, it is protected to the extent that neither the user, nor their elected third party, may use the data to develop a competing product.
  • Conversely, the data holder may not generate any insights regarding the user’s or third party’s economic situation, assets or production methods that could undermine their commercial position in the market.
  • The user’s interests are protected in the sense that the data holder and the third party may only use the user’s non-personal data if the user agrees. The DA is also wary of the power that a third party may wield over the user: it explicitly prohibits the third party’s use of “dark patterns” and the profiling of natural persons, and the data (even non-personal, raw, aggregated or derived data) may not be made available to other third parties. User lock-in is also limited, since a third party may not preclude the user from making the same data available to other third parties.

B2B – Mandatory Data Sharing (Legal Obligation)

In some situations, data holders may be subject to a legal obligation to make data from connected devices available to “data recipients” (this broad term covers, but is not limited to, a user’s chosen third party). Specific legal obligations may appear in sector regulation (e.g., repair and maintenance information concerning connected motor vehicles).

If the data holder is legally obliged to share data (but not if it does so as a result of a voluntary agreement), it must make the data available on “fair, reasonable and non-discriminatory terms.” 

The data holder must conclude an agreement (covering issues such as access, use, liability, termination and “reasonable” compensation) with the data recipient. Micro, small or medium-sized enterprises are protected as data recipients against abusive practices inter alia by a black and grey list of unfair contractual terms relating to data sharing. Where no agreement can be reached, the parties should have access to a national dispute settlement mechanism.

New Legal Restrictions on the Use of Information (Far-Reaching Sanctions)

Although the DA does not create a new exclusive right to data, it does provide for new legal restrictions on the use and re-use of “data”, without requiring that any substantive threshold be met. This means that contracts governing any IoT ecosystem must be adapted to reflect this protection of the various interests involved.

Moreover, the DA provides that if a data recipient makes unauthorized use or disclosure of data (e.g., they do not meet the (legal) conditions to qualify for re-use or they do not comply with the (contractual) restrictions on the use of the data), unless the user or data holder instructs otherwise, the data recipient must destroy the data and all copies, and, in addition, bring to an end the production and/or commercialization of any goods, derivative data or services that have been produced on the basis of knowledge obtained through the unauthorized use of the data (the DA even speaks of “infringing” goods). These remedial measures can be avoided only if the data holder suffers no significant harm, or the sanction is deemed disproportionate.

These legal sanctions are far reaching. They resemble the measures available to a holder of an intellectual property right or trade secret in case of infringement, and they go beyond the remedies or sanctions available in case of breach of contract. Indeed, they could protect a data source that is not a party to the data sharing contract. It is therefore vital that data users be aware of both the contractual and extra-contractual risks to which they are exposed in case they fail to respect the conditions for access or re-use.

Part 3 in this series of blog posts will look in more detail at the concept of “exceptional need” and at data sharing between businesses and government (B2G and G2B).

Back in February 2020, the European Commission communicated its European strategy for data, with the aim of establishing EU leadership in our data-driven society by creating a single market for data and encouraging data sharing. To make this strategy concrete, it came up with two legislative proposals: the Data Governance Act (DGA) and the Data Act (DA). The final version of the DGA was published on 3 June 2022 and will be applicable from September 2023. The DA is currently still at proposal stage.

Together, the DGA and DA are intended to (i) give individuals, research institutions, public authorities and companies – and, in particular, small and medium-sized enterprises – access to certain data; and (ii) harmonize the framework for data use and re-use.

In a series of blog posts, we will examine the contours of these new data regulations and take a look at some of their data sharing aspects.

Context

The impetus behind these legislative initiatives was the undisputed importance of data in today’s digital economy. The Internet of Things (i.e., connected objects such as fitness trackers, windmills or electric vehicles) is producing tremendous quantities of data, which are useful to the person or company producing the data, but also to service providers and public authorities.

Also important to bear in mind is that the DGA and DA cannot be read without reference to European ambitions in the field of artificial intelligence (AI) – notably the attempt to start regulating AI through the proposed AI Act – and the promise that AI technologies hold for solving major challenges in important sectors such as health care, mobility and energy. For example, allowing EU-wide access to hospital and research data to researchers working on rare diseases could significantly enhance their ability to find appropriate AI solutions where their Member State data is insufficient. It would also help streamline public investment.

“Data”

For the first time, the concept of “data” is defined in a legislative act: data means “any digital representation of acts, facts or information and any compilation of such acts, facts or information, including in the form of sound, visual or audio-visual recording.” (Art. 2(1) DGA and Art. 2(1) DA). 

This definition is broad: it covers both personal and non-personal data generated in the public or private sector, regardless of the data’s meaning, content or function. Telecom-carried communications and audio-visual productions viewed through a streaming service are “data” within the meaning of this definition.

Complex Legislative Landscape

Importantly, the DGA and the DA are not all-encompassing regulations: other regulations, such as those concerning personal data protection or intellectual property, will continue to apply. This creates a complex legislative landscape, which is likely to be difficult to navigate for undertakings, public sector bodies and individuals alike.

In order to determine which rules are applicable you will first need to classify the information under consideration as (i) containing personal data, (ii) containing non-personal data or (iii) containing a mixed dataset. You will also have to decide whether it is public sector information or private sector information. Depending on this classification, the access to and re-use of the data will be governed by different legal instruments.

Moreover, there may be additional restrictions because of the data’s protection under intellectual property rights (such as copyright, database rights or other related rights) or trade secrets. 

The DGA and the DA are thus the latest additions to a regulation-heavy field, and it remains to be seen whether they will make it easier for companies, individuals and public sector bodies to understand their rights and obligations as “data subjects”, “data holders” or “data users”, and whether they will provide sufficient legal certainty to incentivize the sharing of data.

Part 2 in this series of blog posts, which will be released next week, will look in more detail at how the EU proposes to harmonize the rules on data sharing.

After much anticipation, the Cyber AB, formerly known as the Cybersecurity Maturity Model Certification (CMMC) Accreditation Body, recently released its pre-decisional draft CMMC Assessment Process (CAP).  The CAP describes the overarching procedures and guidance that CMMC Third-Party Assessment Organizations (C3PAOs) will use to assess entities seeking CMMC certification.  The current version of the CAP applies to contractors requiring CMMC Level 2 certification, which will likely be most contractors handling Controlled Unclassified Information (CUI) based on the Department of Defense’s (DoD) provisional scoping guidance for CMMC 2.0.

Aimed at increasing the accuracy and consistency of assessments conducted by C3PAOs, the CAP is segmented into four distinct phases:

Phase 1:  Plan and Prepare the Assessment;
Phase 2:  Conduct the Assessment;
Phase 3:  Report Assessment Results; and
Phase 4:  Close-Out Plan of Action and Milestones (POAMs) and Assessment.

While the assessment process is still in draft form, DoD contractors should familiarize themselves with the proposed structure and conduct of CMMC assessments, as these parameters will be critical to companies attaining CMMC certification at the level requisite for future government contract awards.

The Cyber AB is currently accepting comments on the draft CAP. 

For more information, please contact the professional(s) listed below, or your regular Crowell & Moring contact.

On May 3, 2022, the European Commission published a proposed regulation (the “EHDS Proposal”) for the establishment of a European Health Data Space (or “EHDS”). This is the first proposal for establishing domain-specific common European data spaces following the European strategy for data and an important step in building a European “Health Union”.

In short, the proposed regulation establishes the EHDS, a common space for health data where natural persons can control their electronic health data (primary use) and where researchers, innovators and policy makers have access to these electronic health data in a trusted and secure way that preserves the individual’s personal data (secondary use). Data holders (such as health care providers, including private and public hospitals, and research institutions) may be subject to new, burdensome obligations to make their data available for secondary use through the EHDS.

In this client alert we summarize the main principles the European legislature proposes to facilitate the primary and secondary use of health data in the EHDS and examine the consequences of this proposal for the different actors involved with the EHDS (individuals, health professionals, researchers, policy makers and the health care industry).

The starting point of the EHDS Proposal is the finding that health data are fundamental for advancing scientific research and medical innovation, patient well-being and public health (as the COVID-19 pandemic has demonstrated), as well as more efficient policy making and regulatory oversight. At the same time, the patient needs to have better control over their health data, which are protected as personal data. The EHDS Proposal aims to reconcile the regulation of the primary use of the health data by the individual and health professionals and the secondary use by researchers, innovators and policy makers.

The EHDS Proposal is not an isolated piece of legislation: it sits on top of a patchwork of relevant legislation, such as the General Data Protection Regulation, the NIS Directive and, specifically for the medical sector, the Medical Devices Regulation, the In Vitro Diagnostics Regulation and the Cross-Border Health Care Directive. Moreover, the proposal cannot be read without considering the proposed Data Governance Act, the proposed Data Act and the proposed Artificial Intelligence Act. While the Data Governance Act and Data Act would provide a generic, horizontal framework for the sharing of data, the EHDS Proposal would make these principles more concrete for health data.

Considering this complex legal framework, the EHDS Proposal is intended to offer some guidance on how electronic health data may be used for various purposes, not least because health data are protected under the GDPR as a “special category of data”, subject to additional safeguards for their processing. It does so through substantive rules, through technical regulation (e.g., formats of electronic health records or “EHRs” and interoperability requirements) and through regulatory oversight by dedicated national authorities.

The EHDS Proposal consists of two main components: the primary and the secondary use of electronic health data.

Primary Use of Electronic Health Data

The first purpose of the EHDS Proposal is to strengthen the rights of natural persons in relation to the availability and control of their “electronic health data”, a notion that covers both personal and non-personal electronic health data, i.e. data concerning health and genetic data in electronic format within or outside the scope of the GDPR.

The rights of the data subjects regarding the “primary use” of electronic health data would be clarified in the EHDS Proposal, with “primary use” defined as the processing of such data “for the provision of health services to assess, maintain or restore the state of health of the natural person to whom that data relates, including the prescription, dispensation and provision of medicinal products and medical devices, as well as for relevant social security, administrative or reimbursement services”.

The EHDS Proposal would also provide more detailed guidance on how the data subject rights under the GDPR (e.g. rights to access, to obtain a copy in a standardized format or to rectify the data) may be exercised in relation to electronic health data, as well as on how to restrict such rights (e.g. delay the exercise of the rights to allow the health care professional the time to communicate with the patient). Individuals would be able to easily access and share these data (e.g. with the healthcare professionals of their choice) in and across Member States. They may even require a data holder to transmit their electronic health data to a “data recipient” in the health or social security sector. They would also be able to exercise better control over their data, in the sense that they would have the right to know which health care professionals have access to their data and to restrict their access to all or part of their data.

The health care professionals, on their end, would also have the right under the EHDS Proposal to access the electronic health data of individuals under their treatment (in particular patient summaries, prescriptions, dispensations, medical images and image reports, lab results and discharge reports, i.e. the “priority categories of personal electronic health data”). At the same time, they would be obligated to ensure that the electronic health data are updated in an electronic health record (“EHR”) system with the information concerning the health services they provided.

Secondary Use

Acknowledging the importance of health data for research, innovation, policy making, regulatory purposes, patient safety or the treatment of other patients, the EHDS Proposal would explicitly implement the possibilities to reuse personal data for secondary purposes authorized under the GDPR.

Under the proposal, the “data holder” (a notion similar to the one in the proposed Data Act) would be under the obligation to make certain categories of electronic data available for secondary use. These categories cover a wide variety of data, including EHRs but also data impacting on health, genomic data, socio-economic data, etc., from various sources (generated using connected devices, administrative data, data from clinical trials, questionnaires, biobanks, etc.).

The obligation to make these data available for secondary use would apply even where the data may be protected under intellectual property rights or trade secrets, in which case measures must be taken to maintain this protection (although the EHDS Proposal does not indicate who would be responsible for these measures).

Access to these data would be managed by a “health data access body”, which would grant requests for access (in the form of a “data permit”) only for the broad objectives of scientific research, innovation, policy-making and regulatory activities.

In particular, the EHDS Proposal would authorize the processing of data for one of the following limited purposes: (a) public interest activities in public and occupational health (e.g., epidemics or pandemics); (b) supporting various public authorities in the health or care sector; (c) producing statistics; (d) education or teaching in the health or care sectors; (e) scientific research related to the health or care sectors; (f) development and innovation in relation to products or services in public health or social security, medicinal products or medical devices; (g) training, testing and evaluating algorithms (including in medical devices, AI systems and digital health applications) for medical applications (public health or social security, medicinal products or medical devices); or (h) providing personalised healthcare.

Inversely, the EHDS Proposal would explicitly prohibit the use of data for a number of prejudicial secondary uses. It would forbid the use of the data for taking decisions that are detrimental to the natural person based on their electronic health data, for decisions that exclude natural persons from their insurance contracts or modify the terms to their detriment, or for developing harmful products or services. The data may not be used for advertising or marketing activities, and the data may not be transferred in any way to a third party which is not mentioned in the data permit.

Interestingly, the “data users” may include any person who has lawful access to electronic health data – although some purposes are reserved for public authorities. This means that members of the pharmaceutical industry may request access to the data, even if they have a commercial purpose, as long as they intend to pursue one of the legitimate purposes, such as scientific research, innovation or the use of data to develop and train selected algorithms.

Whether this “permit-based approach” will be sufficient to facilitate the sharing of health data for secondary use, while at the same time guaranteeing the rights of individuals, remains to be seen: the success will largely depend on the practice and staffing of these national health data access bodies. It is noted that the GDPR follows a risk-based approach, creating more flexibility due to self-assessments and sufficient documentation.

Technical Provisions

The EHDS Proposal not only contains substantive provisions on the use and reuse of health data but also organizes Europe’s technical infrastructure to support the primary and secondary uses of health data.

In order to make electronic health data accessible and transmissible, they should be processed using a common, interoperable format, the “European electronic health record exchange format” for which the Commission will determine the technical specifications. The natural person, the health care provider and the data recipient should be able to use this format to read and access the health data.

In order to guarantee a minimum level of security and interoperability, the EHDS Proposal would impose a self-certification scheme for EHR systems. The proposal also introduces a voluntary label for wellness applications to ensure transparency for users (and procurers) regarding the interoperability and security requirements (so the data generated by these apps can be added to the EHR). This scheme should also reduce cross-border market barriers for manufacturers (which must be established in the EU or have an authorized representative in the EU, prior to making an EHR system available in the EU). In the same vein, importers and distributors have specific obligations (e.g. verification of the conformity of the EHR system). A system of market surveillance of EHR systems is also provided, as Regulation 2019/1020 on market surveillance and compliance of products also applies to EHR systems. These rules apply in addition to compliance obligations resulting from the AI or medical device regulations.

Furthermore, a cross-border infrastructure at the European level would be set up under the name ‘MyHealth@EU’. It will bring together the “national contact points for digital health” and the “central platform for digital health”, with a view to facilitating the exchange of electronic health data for primary use. The EHDS Proposal designates the Member States as joint controllers and the Commission as a processor.

Similarly, a cross-border infrastructure at the European level would be set up for the secondary use of electronic health data, under the name “HealthData@EU”. The Member States must designate a national contact point for secondary use of electronic health data, which will be responsible for facilitating such use by “authorised participants” in a cross-border context.

To optimize the secondary use of the health data, the EHDS Proposal contains some technical requirements to ensure the health data quality and utility for secondary use: a description of the available data sets, a data quality and utility label, an EU datasets catalogue and minimum specifications for cross-border data sets for secondary use.

Regulatory Supervision

The EHDS Proposal would introduce new regulatory authorities, with distinct responsibilities for the primary and the secondary use of the electronic health data.

Member States will be required to set up a digital health authority responsible for monitoring and guaranteeing the rights of individuals under the primary use component.

The health data access bodies, to be created by the Member States, will decide whether access for secondary use is permissible and issue a “data permit”. Interestingly, they will also collect the data from the various data holders (who must inform the health data access body about the data sets they hold) and prepare and disclose the data to the data user, only for the permitted purposes, while preserving IP rights and trade secrets and allowing data subjects to exercise their rights. They would also have support, documentation, publicity and technical management obligations. They should also facilitate cross-border access to electronic health data for secondary use hosted in other Member States through HealthData@EU. Finally, they would monitor and supervise the compliance of data users and data holders with their respective obligations.

The EHDS Proposal contains detailed provisions on the content of the data permit, the application process and the access to the data (in a secure processing environment).

Opportunities

The EHDS Proposal introduces an ambitious framework for facilitating the access to and (re-)use of health data. Its first purpose is to improve access to health data for data subjects and health care providers, while at the same time strengthening data subjects’ rights (primary use).

The harmonization of technical requirements and the self-certification scheme for EHRs may reduce the barriers for EHR-developers, importers and distributors and facilitate access to the EU-wide market.

It is, however, the incentives to unlock these sensitive data for secondary purposes that show the Commission’s ambitions.

Importantly, research and innovation in data-intensive applications (including training algorithms for AI-applications, medical devices or medicinal products) are explicitly mentioned as authorized secondary purposes, meaning that data users can apply for a data permit for such intended purposes. As the EHDS Proposal intends to assure a certain data quality and the availability of large quantities of data from different sources, research institutions and industry actors should be able to leverage this new regulation to pursue faster and better innovations than if they only had access to their own data sets.

Health professionals should benefit from the EHDS as well, in particular with the secondary use of “providing personalized healthcare consisting in assessing, maintaining or restoring the state of health of natural persons, based on the health data of other natural persons”.

Finally, data holders (such as healthcare providers, including private or public hospitals, and research institutions) may be subject to new, burdensome obligations to make their data available for secondary use through the health data access bodies. The definition of “data holder” in the EHDS Proposal could use some clarification, as the current description covers any entity or body in the health or care sectors (or researchers in these sectors) that has the right or the legal obligation to make available certain data (in the case of non-personal data, control of the technical design of a product or service suffices). On the other hand, they may also develop additional sources of revenue: data holders are indeed entitled to a fee, which is based on the cost of conducting the access procedure but (except for public sector bodies) may also include compensation for part of the cost of collecting and formatting the data.

We also note that entities that are operating in the US and the EU will likely need to navigate rules regarding health data that may not be harmonized, including US regulations governing health data privacy, interoperability, certification of EHRs, and oversight of medical devices.

For more information, please contact the professional(s) listed below, or your regular Crowell & Moring contact.

The California Office of the Attorney General issued its first opinion interpreting the California Consumer Privacy Act (CCPA) on March 10, 2022, addressing the issue of whether a consumer has a right to know the inferences that a business holds about the consumer. The AG concluded that, unless a statutory exception applies, internally generated inferences that a business holds about the consumer are personal information within the meaning of the CCPA and must be disclosed to the consumer, upon request. The consumer has the right to know about the inferences, regardless of whether the inferences were generated internally by the business or obtained by the business from another source. Further, while the CCPA does not require a business to disclose its trade secrets in response to consumers’ requests for information, the business cannot withhold inferences about the consumer by merely asserting that they constitute a “trade secret.”

Under the CCPA, the definition of “personal information” includes “inferences drawn from any of the information identified in this subdivision to create a profile about a consumer reflecting the consumer’s preferences, characteristics, psychological trends, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes.” (Civ. Code, § 1798.140, subd. (o)). The CCPA gives consumers the right to know what personal information a business collects about them. As such, a consumer has the right to request and receive the specific pieces of information “collected about” them. (Civ. Code, § 1798.110, subd. (a)). The precise question that the opinion addressed was whether a consumer’s right to receive the specific pieces of personal information that a business has collected about that consumer applies to internally generated inferences.

The opinion explained that an inference is a personal “characteristic deduced about a consumer,” such as “married” or “likely voter.” For purposes of the CCPA, “inferences” means “the derivation of information, data, assumptions, or conclusions from facts, evidence, or another source of information or data.” (Civ. Code, § 1798.140, subd. (m)). The opinion held that inferences are deemed “personal information” for purposes of the CCPA when two conditions are met.

First, the inference must be drawn from any information listed in the definition of “personal information.”

California Civil Code section 1798.140(o) lists the following as personal information:

  • personal identifiers (such as names, addresses, account numbers, or identification numbers);
  • customer records;
  • characteristics of protected classifications (such as age, gender, race, or religion);
  • commercial information (such as property records or purchase history);
  • biometric information;
  • online activity information;
  • geolocation data;
  • “audio, electronic, visual, thermal, olfactory, or similar information”;
  • professional or employment information;
  • education information.

Second, the inference must be used to create a profile about the consumer (that is, the business uses the inference to predict, target, or affect consumer behavior).

In its reasoning, the opinion rejected the argument that the statute’s wording “about the consumer” is limited to personal information collected directly from the consumer. Inferences can be gathered directly from the consumer, found in public repositories, created internally using proprietary technology, bought, or collected from another source. The AG opinion made clear that, irrespective of their origin, inferences constitute a part of the consumer’s unique identity and become part of the information that the business has “collected about” the consumer. As such, in responding to a consumer’s request to know and receive the information collected about them, the business must disclose inferences, regardless of how they were obtained or generated. The AG opinion clarified that, if an inference was based on public information, such as government identification numbers, vital records, or tax rolls, the inference must be disclosed to the consumer, even though the underlying public information itself need not be disclosed.

The opinion offered an example of inferences that may not need to be disclosed: inferences used solely for internal purposes and not used to predict a consumer’s propensities or to create a profile. A business may, for instance, combine information obtained from a consumer with online postal information to derive a nine-digit zip code to facilitate a delivery. Such a zip code would not need to be disclosed to the consumer because it is not used to identify the consumer or predict the consumer’s characteristics.
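To make the structure of the test concrete, the two conditions can be sketched as simple decision logic. The snippet below is only an illustration, not anything defined in the statute or the opinion: the `Inference` record, the category labels, and the example values are hypothetical stand-ins for what is ultimately a fact-specific legal analysis.

```python
# Illustrative sketch of the AG's two-part test -- not legal advice.
from dataclasses import dataclass

# Shorthand labels for the categories enumerated in Civ. Code § 1798.140(o)
PI_CATEGORIES = {
    "identifiers", "customer_records", "protected_classifications",
    "commercial_information", "biometric_information", "online_activity",
    "geolocation", "sensory_information", "professional_information",
    "education_information",
}

@dataclass
class Inference:
    label: str                # e.g., "likely voter"
    source_categories: set    # categories the inference was drawn from
    used_for_profiling: bool  # used to predict, target, or affect behavior

def is_personal_information(inference: Inference) -> bool:
    """Apply the two conditions described above."""
    # Condition 1: drawn from information listed in § 1798.140(o)
    drawn_from_pi = bool(inference.source_categories & PI_CATEGORIES)
    # Condition 2: used to create a profile about the consumer
    return drawn_from_pi and inference.used_for_profiling

# The "likely voter" inference satisfies both conditions...
assert is_personal_information(Inference("likely voter", {"online_activity"}, True))

# ...while a zip+4 derived solely to route a delivery fails the
# profiling condition and so would not need to be disclosed.
assert not is_personal_information(Inference("zip+4", {"identifiers"}, False))
```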


The opinion recognized that a consumer’s right to know about inferences is not absolute and that a business may rely on a number of exceptions to the CCPA. For example, the CCPA excludes information that is freely available from government sources, and there are specific exceptions for certain categories of information, such as medical records, credit reporting, banking, and vehicle safety records. Further, a business’s obligation to respond to a request for personal information may be relieved by several carve-out provisions of Section 1798.145:

  1. The obligations imposed on businesses by this title shall not restrict a business’ ability to:
    • Comply with federal, state, or local laws.
    • Comply with a civil, criminal, or regulatory inquiry . . .
    • Cooperate with law enforcement agencies . . .
    • Exercise or defend legal claims.
    • Collect, use, retain, sell, or disclose information that is deidentified . . .
    • Collect or sell a consumer’s personal information if every aspect of that conduct takes place solely outside California. . . .

(Civ. Code, § 1798.145, subd. (a)(1)).

Importantly, the opinion clarified that businesses are not required to disclose their trade secrets in response to consumers’ requests for information. The opinion recognized that, while the algorithm a company uses to derive its inferences might be a protected trade secret, the CCPA requires a business to disclose only the output of its algorithm, not the algorithm itself. The AG further clarified that, although the CCPA does not require a business to disclose trade secrets, a business that wishes to withhold consumers’ inferences on the ground that they are protected trade secrets bears the burden of demonstrating that the inferences are in fact trade secrets under applicable law. The opinion also recognized that whether a particular inference can be protected as a “trade secret” is fact-specific.

Ramifications of the opinion.

The opinion made clear that the California AG sees inferences as another piece of personal information in the bundle of consumer information that may be the subject of commercial exploitation, and thus subject to disclosure. While opinions of the Office of the Attorney General interpreting a statute are not controlling or binding on a court, they have generally been treated as persuasive authority. The opinion also made clear that the California Privacy Rights Act, which becomes effective on January 1, 2023, will not change the AG’s position on this issue.

This opinion has an impact on the privacy practices of advertisers, data brokers, and other businesses that use behavioral analytics tools or artificial intelligence to derive personal characteristics, build profiles about consumers, and target consumers based on those characteristics. Such businesses need to work through the two-part test described above to determine whether inferences drawn in the course of their business are pieces of personal information and thus subject to the consumer right-to-know provisions of the CCPA. If the answer is yes, those inferences must be disclosed upon request.

If a business would like to withhold an inference on the basis that it is a trade secret, the business will also need to analyze whether the inference can in fact be protected as one. The business would need to show that the inference itself derives “independent economic value” from not being generally known to the public or to others who could obtain economic value from its use or disclosure. The business would also need to demonstrate that it has used reasonable efforts to maintain the secrecy of the inference and must identify the inference with “reasonable particularity.” If a business denies a consumer’s request to know “in whole or in part, because of a conflict with federal or state law, or an exception to the CCPA,” the business would need to explain the basis of its denial, as broad assertions of “trade secret” or “proprietary information” will not suffice. (Cal. Code Regs., tit. 11, § 999.313(c)(4)).
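The trade-secret analysis, by contrast, is explicitly fact-specific, so it cannot be reduced to a mechanical check. Purely as an illustration of the elements listed above, however, the showing a business must make can be laid out as a checklist; the field names below are our own hypothetical labels, and a real determination would turn on evidence rather than booleans.

```python
# Illustrative checklist of the trade-secret showing -- not a real test.
from dataclasses import dataclass
from typing import Optional

@dataclass
class TradeSecretShowing:
    independent_economic_value: bool  # value from not being generally known
    reasonable_secrecy_efforts: bool  # efforts to keep the inference secret
    reasonable_particularity: bool    # inference identified specifically

def respond_to_request(showing: Optional[TradeSecretShowing]) -> str:
    """Sketch of the disclosure decision described above."""
    if showing is None:
        # A bare assertion of "trade secret" is not a showing at all.
        return "disclose the inference"
    if (showing.independent_economic_value
            and showing.reasonable_secrecy_efforts
            and showing.reasonable_particularity):
        # Even a successful showing requires explaining the basis of the
        # denial (Cal. Code Regs., tit. 11, § 999.313(c)(4)).
        return "withhold the inference and explain the basis for the denial"
    return "disclose the inference"
```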

When water cooler chatter became less common as the pandemic hit in 2020, chat platforms and text messages (IM) filled the gap. Collaboration tools like Zoom, Microsoft Teams, Slack, Bloomberg Chat, and IM are now ubiquitous, with more than 67% of white-collar employees still “working from home to some degree.”[1] Indeed, a survey of IT managers reported that 91% of all companies now use at least two messaging apps.[2]

As more companies integrate these channels into their typical business practices, more and more legal matters will involve the review of chat message conversations. It is imperative that companies have processes and systems in place to control, retain, monitor, and review such business communications.

There are numerous challenges for businesses in reviewing chat data, including identifying and accessing chat platforms, handling ephemeral data, identifying participants (who may use various aliases or usernames), decoding the cryptic nature of some messages, matching attachments and responses to the messages they belong to, and making sense of notices when parties enter or leave a conversation. People also often write differently in a chat setting (more tersely, and using shorthand, emojis, slang, abbreviations, and images) than in other communication forms. Thus, external context may be even more essential to understanding the nuances of the matter being discussed.
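As a small illustration of one of these challenges, the sketch below maps the aliases and usernames found in a message export back to a single custodian before review. The export format, field names, and alias table are hypothetical assumptions for the sketch, not features of any particular platform.

```python
# Minimal sketch: normalizing chat participants across aliases.
import json

# Hypothetical alias table, e.g. compiled during custodian interviews
ALIAS_TO_CUSTODIAN = {
    "jdoe": "Jane Doe",
    "jane.d": "Jane Doe",
    "j_doe_mobile": "Jane Doe",
    "bsmith": "Bob Smith",
}

def normalize_participants(raw_export: str) -> list:
    """Tag each message in a JSON export with a canonical custodian name."""
    messages = json.loads(raw_export)
    for message in messages:
        alias = message.get("sender", "")
        # Flag unmapped aliases for follow-up rather than guessing
        message["custodian"] = ALIAS_TO_CUSTODIAN.get(alias, f"UNKNOWN: {alias}")
    return messages

sample = '[{"sender": "jdoe", "text": "pls review b4 EOD"}]'
print(normalize_participants(sample))
# [{'sender': 'jdoe', 'text': 'pls review b4 EOD', 'custodian': 'Jane Doe'}]
```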

Continue Reading From The Water Cooler to the DMs – Tips and Tricks for Efficiently Reviewing Chat Communications