On November 9, 2023, the European Parliament adopted the final version of the Data Act, marking a significant milestone in the evolving landscape of digital regulation. The Data Act is part of the European Commission’s broader strategy to shape Europe’s digital future (see our earlier posts here and here).

The widespread use of internet-connected products (the so-called Internet of Things or “IoT”) has notably increased the volume and potential value of data for consumers, businesses, and society at large. The recognition that barriers to data sharing hinder the optimal allocation of data for societal benefit led to the drafting of the Data Act. Initially proposed by the European Commission in February 2022, the Data Act is designed to regulate data sharing and usage within the EU.

The Data Act, which applies to both personal and non-personal data, encompasses several key elements designed to foster an efficient, fair, and innovative data economy:

  • It facilitates data sharing, particularly data generated by connected devices and used by related services. This spans all sectors, underscoring the significance of non-personal data sharing for societal and economic benefits;
  • It establishes mechanisms for data transfer and usage rights, with a special focus on cloud service providers and data processing services. This facilitates a more fluid and secure data-sharing environment;
  • It introduces interoperability standards to ensure data can be accessed, transferred, and used across different sectors, which is crucial for innovation and competitive markets;
  • It reinforces the right to data portability, allowing users to move their data across different service providers, which enhances user autonomy and promotes competition;
  • It mandates that providers of data processing services, such as cloud and edge services, implement reasonable measures against unauthorized third-party access to non-personal data, thereby fostering trust in data sharing;
  • It aims to balance the availability of data with the protection of trade secrets;
  • It recognizes the need for public sector bodies, the Commission, the European Central Bank or Union bodies to use existing data to respond to public emergencies or in other exceptional cases; and
  • It provides protections against unfair contractual terms that are unilaterally imposed.

These elements collectively aim to enhance data accessibility and utility, protect individual and business interests, and foster a more competitive and innovative digital market in the EU.

The adopted text now needs formal approval by the Council to become law. Once finalized, the Data Act will enter into force on the 20th day following its publication in the Official Journal of the European Union and will apply between 20 and 32 months after its entry into force, depending on the provision. The timeline for full application is thus expected to span several years, giving businesses and stakeholders adequate time to adapt to the new requirements.

As always, we will continue to monitor the developments in this matter and keep you informed of any further updates.

On October 24, 2023, the European Data Protection Supervisor (EDPS), the supervisory authority for the EU institutions, bodies, offices and agencies (EUIs), published a new opinion on the widely discussed proposal for an EU Regulation laying down harmonized rules on artificial intelligence (commonly known as the AI Act Proposal). Although the EDPS does not supervise the private sector, it plays an influential role in both the European and global regulatory community, and this new opinion is thus a valuable addition to the current legislative debate.

The AI Act Proposal was published by the European Commission in April 2021 with a view to establishing harmonized rules for AI within the EU. In short, the purpose of this AI Act is to regulate the development and use of AI-based systems, based on the risks they entail. 

The EDPS previously issued a joint opinion in collaboration with the European Data Protection Board (in June 2021). However, this new opinion is solely the EDPS’ initiative, and it aims to provide further recommendations to the EU co-legislators as the negotiations on the AI Act Proposal enter their final stage. The opinion focuses on a number of institutional, legal and technical aspects related to the role and tasks of the EDPS as future AI supervisor of the EUIs.

Key takeaways:

  • The EDPS supports the establishment of a legal framework for AI systems based on EU values as enshrined in both the EU Charter of Fundamental Rights and the European Convention on Human Rights;
  • The EDPS confirms the “red lines” set out in the earlier EDPB-EDPS joint opinion, stating that several uses of AI should be prohibited because they pose unacceptable risks to fundamental rights – including the use of AI for social scoring, the categorization of individuals on the basis of biometric data, and individual risk assessments for law enforcement purposes;
  • The new opinion also reiterates the view that national Data Protection Authorities should be designated as the national supervisory authorities, on account of their expertise in AI-related legislation. Simultaneously, the EDPS emphasizes the need for cooperation between these national authorities and other oversight authorities to ensure that AI systems are trustworthy, safe, and compliant with EU legislation in the field of their deployment;
  • The EDPS stresses the need for clarity with regard to “high-risk AI systems” and other notions, such as “provider” and “development”;
  • The EDPS emphasizes that AI systems already in use at the date of applicability of the AI Act should not be exempted from its scope and should be required to comply with the AI Act requirements from its date of applicability;
  • The EDPS welcomes its various proposed roles as notified body in the context of pre-market control (conformity assessment), market surveillance authority in the context of post-market control, and competent authority for the supervision of the development, provision or use of AI systems by EUIs;
  • The EDPS emphasizes that the proposed AI Office, a new EU body that would support the harmonized application of the AI Act, must be independent if it is to strengthen enforcement in the EU and prevent providers of AI systems from engaging in ‘forum shopping’. It highlights the need to strengthen cooperation between the AI Office and the EDPS in its role as AI supervisor. It expresses regret that it currently lacks voting rights on the AI Office’s management board, while national supervisory authorities have these voting rights;
  • The EDPS recommends introducing the right to lodge a complaint before the competent supervisory authority, and to an effective judicial remedy against a decision of the authority before which a complaint has been brought (specifying the competences of the EDPS as a supervisory authority); and finally,
  • The EDPS suggests providing, notably in the case of the use of high-risk AI systems, the right to obtain human intervention and to contest the output of the decision-making process, as well as a right to an explanation from the deployer of the AI system regarding any decision significantly affecting the user.

In short, with its recent opinion, the EDPS aims to provide more specific advice as to how its tasks, duties and powers set out in the AI Act could be further clarified, as well as further recommendations on how to ensure effective enforcement of the AI Act through a true “European approach”.

We will continue to monitor the developments in this matter and keep you informed of any further updates.

We would like to thank Arthur Focquet, Associate, for his contribution to this alert.

On October 30, 2023, the Securities and Exchange Commission (the “SEC”) filed a civil lawsuit charging SolarWinds Corporation (“SolarWinds” or the “Company”) and its chief information security officer, Timothy G. Brown (“Brown”), with securities fraud, internal controls failures, misleading investors about cyber risk, and disclosure controls failures, among other violations.  The SEC’s claims arise from allegedly known cybersecurity risks and vulnerabilities at SolarWinds associated with the SUNBURST cyberattack that occurred between 2018 and 2021.

Continue Reading Uncharted Territory: The SEC Sues SolarWinds and its CISO for Securities Laws Violations in Connection with SUNBURST Cyberattack

The summer has been anything but slow in the People’s Republic of China. China is leaning into its regulation of emerging technologies, while attempting to strike a balance with its domestic economic priorities. In just the past few weeks, state authorities have issued a slew of draft measures and announced new initiatives – all with significant ramifications for businesses processing data within the PRC. From personal information processing to facial recognition to cross-border data transfers, what follows is a highlight reel of what you may have missed while you were away on vacation, with the comment period for many of these developments closing within the next few weeks.

Continue Reading Catch Up Fast: The “Data Days” of Summer in China

On July 25, 2023, the Senate Judiciary Committee held its fifth hearing this year on artificial intelligence (AI). This is the second hearing held by the Subcommittee on Privacy, Technology, and the Law, and it highlighted the “bipartisan unanimity” in regulating AI technology.

Continue Reading The Future is Here: Senate Judiciary Committee’s Oversight of AI and Principles for Regulation

On July 24, 2023, an en banc Eleventh Circuit joined the majority of circuits in finding that just one text is sufficient to establish standing to bring a Telephone Consumer Protection Act (“TCPA”) claim. The decision, Drazen v. Pinto, — F.4th —, 2023 WL 4699939 (11th Cir. July 24, 2023), not only undoes the panel’s original holding, but also reverses course from the Eleventh Circuit’s prior decision in Salcedo v. Hanna, 936 F.3d 1162 (11th Cir. 2019), which held that a plaintiff who received a single text message did not have TCPA standing.

Continue Reading The First Text Cuts the Deepest: Eleventh Circuit Aligns with Other Circuits on TCPA Standing

On July 26, 2023, the SEC finalized long-awaited disclosure rules (the “Final Rules”) regarding cybersecurity risk management, strategy, governance, and incidents by public companies that are subject to the reporting requirements of the Securities Exchange Act of 1934.  While the end results are substantially similar to rules proposed by the SEC in March 2022, there are some key distinctions. 

Continue Reading Five Key Takeaways from the SEC’s Final Cybersecurity Rules for Public Companies

In an unconventional opening to the normally staid proceedings of the United States Senate, the voice of Frank Sinatra introduced the July 12, 2023 Senate Judiciary Subcommittee hearing on artificial intelligence (AI) and intellectual property. More accurately, an AI-generated version of Frank Sinatra’s voice sang about regulating AI to the tune of New York, New York, which Senator Chris Coons (D-DE), Chairman of the Senate Judiciary Subcommittee on Intellectual Property, used to illustrate both the possibilities and the risks of the use of AI in creative industries.

Continue Reading Senate Judiciary Subcommittee on Intellectual Property Hearing on Artificial Intelligence and Intellectual Property – Part II: Copyright

On June 18, 2023, the Biden-Harris administration announced the launch of a new “U.S. Cyber Trust Mark” program (hereinafter the “Program”). First proposed by Federal Communication Commission (“FCC”) Chairwoman Jessica Rosenworcel, the Program aims to increase transparency and competition across the smart devices sector and to assist consumers in making informed decisions about the security of the devices they purchase.

Continue Reading Biden Admin Eyes IoT Cyber Practices

In a June 30, 2023 decision, the Superior Court of California, County of Sacramento, issued a ruling delaying agency enforcement of the final regulations under the California Privacy Rights Act (CPRA) until March 2024. California Chamber of Commerce v. California Privacy Protection Agency, Case No. 34-2023-80004106-CU-WM-GDS (Sacramento Superior Court, June 30, 2023).

The California Consumer Privacy Act of 2018 (CCPA) and the California Privacy Rights Act of 2020 (CPRA) provisions adopted by California voters in the 2020 ballot initiative remain in effect. However, enforcement of the final regulations implementing the CPRA, which were finalized on March 29, 2023 by the California Privacy Protection Agency (Agency) and were set to go into effect on July 1, 2023, has been stayed by the court until March 2024 (one year after the final CPRA regulations were finalized). Assuming the ruling is not overturned on appeal, it gives businesses another nine months to come into compliance with the final CPRA regulations. In the meantime, businesses must still comply with the prior CCPA regulations that were in effect before the final CPRA regulations, as well as the CPRA provisions from the 2020 ballot initiative. The Agency has set a public meeting for July 14 to discuss enforcement and other topics.

Notably, on March 29, 2023, the Agency issued final regulations with respect to only 12 of the 15 areas required by Section 1798.185 of the CPRA. The Court ruled that enforcement of these regulations is delayed until March 29, 2024. Enforcement of regulations in the remaining three areas (cybersecurity audits, risk assessments, and automated decision-making technology) will not begin until a year after the Agency finalizes those rules. The Court did not mandate any specific date by which the Agency must finalize these remaining regulations.