Text messages and other non-email electronic communications have become increasingly important in securities fraud matters. Often sent from personal mobile devices, these communications frequently provide key evidence. It has become clear that the most interesting, and sometimes most problematic, communications often do not take place via email.

Continue Reading Text Messages Lead to $4.47B Liability in Securities Fraud Case

The U.S. Securities and Exchange Commission (“SEC”) adopted a final rule on July 26, 2023 that requires public companies to disclose material cybersecurity incidents under new Item 1.05 of Form 8-K. Since its adoption, public companies have faced practical challenges in determining whether and when a cybersecurity incident warrants disclosure under Item 1.05.

On May 21, 2024, roughly six months after the final rule’s effective date, Erik Gerding, Director of the SEC’s Division of Corporation Finance, issued a statement signaling that public companies should consider disclosing cybersecurity incidents that are determined to be immaterial under Item 8.01 of Form 8-K rather than under Item 1.05. Specific points of note:

Continue Reading SEC “Encourages” Public Companies to Disclose “Immaterial” Cybersecurity Incidents Under Item 8.01 of Form 8-K

“Browsing and location data are sensitive . . . . Full stop,” says the Federal Trade Commission. As is all granular data that can reveal “insights” that “can be attributed to particular people” through a “re-identification” procedure. This principle is one basis of the complaints the FTC filed against Avast, X-Mode Social, and InMarket. A March 4, 2024 FTC blog post titled FTC Cracks Down on Mass Data Collectors: A Closer Look at Avast, X-Mode, and InMarket describes why these three companies’ collection of consumers’ browsing and location data raised concerns for the agency, and looks at two other data governance practices by those companies that also concerned the agency. All companies operating in the United States that collect and use consumer data should understand the themes emerging from the proposed settlements and orders and heed the agency’s admonitions moving forward.

Continue Reading “Browsing and location data are sensitive . . . . Full stop”

On December 26, 2023, the Department of Defense (DoD) released the highly anticipated proposed rule for the Cybersecurity Maturity Model Certification Program (CMMC), a cybersecurity regulatory program that will likely impact most of the government contractor community. Every contractor who handles sensitive data such as Controlled Unclassified Information (CUI) or Federal Contract Information (FCI) during DoD contract performance will be covered by this regulation. While the CMMC program builds upon the security requirements included in Defense Federal Acquisition Regulation Supplement (DFARS) clause 252.204-7012, CMMC will bring greater scrutiny to contractors’ cybersecurity compliance and potentially greater consequences for failure to comply in the era of the Department of Justice’s Civil Cyber Fraud Initiative and False Claims Act litigation. If finalized as proposed, the rule will significantly impact the CMMC regime, notably by requiring senior company officials to complete an affirmation for every CMMC level self-assessed or certified, thus increasing legal compliance risks.

Continue Reading DoD’s New Year Resolution: A Cybersecurity Maturity Model Certification Program (CMMC) Proposed Rule

Public companies now have a pathway to request a delay in their cybersecurity incident disclosure to the U.S. Securities and Exchange Commission (“SEC”). On December 6, 2023, the Federal Bureau of Investigation (“FBI”) Cyber Division published the “Cyber Victim Requests to Delay Securities and Exchange Commission Public Disclosure Policy Notice” (the “Policy Notice”) in response to the SEC’s finalized disclosure rules (the “Final Rules”). Published on July 26, 2023, the Final Rules established guidelines around cybersecurity risk management, strategy, governance, and incidents for public companies subject to the Securities Exchange Act of 1934. Among several requirements under the Final Rules, companies are required to disclose cybersecurity incidents within four days of a materiality determination by filing an SEC Form 8-K.

Continue Reading FBI Offers Pathway to Request Delay of SEC Cybersecurity Incident Disclosures

On November 9, 2023, the European Parliament adopted the final version of the Data Act, marking a significant milestone in the evolving landscape of digital regulation. The Data Act is part of the European Commission’s broader strategy to shape Europe’s digital future (see our earlier posts here and here).

The widespread use of internet-connected products (the so-called Internet of Things or “IoT”) has notably increased the volume and potential value of data for consumers, businesses, and society at large. The recognition that barriers to data sharing hinder the optimal allocation of data for societal benefit led to the drafting of the Data Act. Initially proposed by the European Commission in February 2022, the Data Act is designed to regulate data sharing and usage within the EU.

The Data Act, which applies to both personal and non-personal data, encompasses several key elements designed to foster an efficient, fair, and innovative data economy:

  • It facilitates data sharing, particularly data generated by connected devices and used by related services. This spans all sectors, underscoring the significance of non-personal data sharing for societal and economic benefits;
  • It establishes mechanisms for data transfer and usage rights, with a special focus on cloud service providers and data processing services. This facilitates a more fluid and secure data sharing environment;
  • It introduces interoperability standards to ensure data can be accessed, transferred, and used across different sectors, which is crucial for innovation and competitive markets;
  • It reinforces the right to data portability, allowing users to move their data across different service providers, which enhances user autonomy and promotes competition;
  • It mandates that providers of data processing services, such as cloud and edge services, implement reasonable measures against unauthorized third-party access to non-personal data, thereby fostering trust in data sharing;
  • It aims to balance the availability of data with the protection of trade secrets;
  • It recognizes the need for public sector bodies, the Commission, the European Central Bank or Union bodies to use existing data to respond to public emergencies or in other exceptional cases; and
  • It provides protections against unfair contractual terms that are unilaterally imposed.

These elements collectively aim to enhance data accessibility and utility, protect individual and business interests, and foster a more competitive and innovative digital market in the EU.

The adopted text now needs formal approval by the Council to become law. Once finalized, the Data Act will enter into force on the 20th day following its publication in the Official Journal of the European Union, and its provisions will apply between 20 and 32 months after that date, depending on the provision. The timeline for complete enforcement is thus expected to span several years, allowing businesses and stakeholders adequate time to adapt to the new requirements.

As always, we will continue to monitor the developments in this matter and keep you informed of any further updates.

On October 24, 2023, the European Data Protection Supervisor (EDPS), the supervisory authority for the EU institutions, bodies, offices and agencies (EUIs), published a new opinion on the widely discussed proposal for an EU Regulation laying down harmonized rules on artificial intelligence (commonly known as the AI Act Proposal). Although the EDPS does not supervise the private sector, it plays an influential role in both the European and global regulatory community, and this new opinion is thus a valuable addition to the current legislative debate.

The AI Act Proposal was published by the European Commission in April 2021 with a view to establishing harmonized rules for AI within the EU. In short, the purpose of the AI Act is to regulate the development and use of AI-based systems according to the risks they entail.

The EDPS previously issued a joint opinion in collaboration with the European Data Protection Board (in June 2021). However, this new opinion is solely the EDPS’ initiative, and it aims to provide further recommendations to the EU co-legislators as the negotiations on the AI Act Proposal enter their final stage. The opinion focuses on a number of institutional, legal and technical aspects related to the role and tasks of the EDPS as future AI supervisor of the EUIs.

Key takeaways:

  • The EDPS supports the establishment of a legal framework for AI systems based on the EU values as enshrined in both the EU Charter of Fundamental Rights and the European Convention on Human Rights;
  • The EDPS confirms the “red lines” set out in the earlier EDPB-EDPS joint opinion, stating that several uses of AI should be prohibited because they pose unacceptable risks to fundamental rights – including the use of AI for social scoring, the categorization of individuals based on biometric data, and individual risk assessments for law enforcement purposes;
  • The new opinion also reiterates the view that national Data Protection Authorities should be designated as the national supervisory authorities, on account of their expertise in AI-related legislation. At the same time, the EDPS emphasizes the need for cooperation between these national authorities and other oversight authorities to ensure that AI systems are trustworthy, safe, and compliant with EU legislation in the fields in which they are deployed;
  • The EDPS stresses the need for clarity with regard to “high-risk AI systems” and other notions, such as “provider” and “development”;
  • The EDPS emphasizes that AI systems already in use at the date of applicability of the AI Act should not be exempted from its scope and should be required to comply with the AI Act requirements from its date of applicability;
  • The EDPS welcomes its various proposed roles as notified body in the context of pre-market control (conformity assessment), market surveillance authority in the context of post-market control, and competent authority for the supervision of the development, provision or use of AI systems by EUIs;
  • The EDPS emphasizes that the proposed AI Office, a new EU body that would support the harmonized application of the AI Act, must be independent if it is to strengthen enforcement in the EU and prevent providers of AI systems from engaging in ‘forum shopping’. It highlights the need to strengthen cooperation between the AI Office and the EDPS in its role as AI supervisor. It expresses regret that it currently lacks voting rights on the AI Office’s management board, while national supervisory authorities have these voting rights;
  • The EDPS recommends introducing the right to lodge a complaint before the competent supervisory authority, and to an effective judicial remedy against a decision of the authority before which a complaint has been brought (specifying the competences of the EDPS as a supervisory authority); and finally,
  • The EDPS suggests providing, notably in case of the use of high-risk AI systems, the right to obtain human intervention and to contest the output of the decision-making, as well as a right to an explanation from the deployer of the AI system regarding any decisions significantly affecting the user.

In short, with its recent opinion, the EDPS aims to provide more specific advice as to how its tasks, duties and powers set out in the AI Act could be further clarified, as well as further recommendations on how to ensure effective enforcement of the AI Act through a true “European approach”.

We will continue to monitor the developments in this matter and keep you informed of any further updates.

We would like to thank Arthur Focquet, Associate, for his contribution to this alert.

On October 30, 2023, the Securities and Exchange Commission (the “SEC”) filed a civil lawsuit charging SolarWinds Corporation (“SolarWinds” or the “Company”) and its chief information security officer, Timothy G. Brown (“Brown”), with securities fraud, internal controls failures, misleading investors about cyber risk, and disclosure controls failures, among other violations. The SEC’s claims arise from allegedly known cybersecurity risks and vulnerabilities at SolarWinds associated with the SUNBURST cyberattack, which occurred between 2018 and 2021.

Continue Reading Uncharted Territory: The SEC Sues SolarWinds and its CISO for Securities Laws Violations in Connection with SUNBURST Cyberattack

The summer has been anything but slow in the People’s Republic of China. China is leaning into its regulation of emerging technologies, while attempting to strike a balance with its domestic economic priorities. In just the past few weeks, state authorities have issued a slew of draft measures and announced new initiatives – all with significant ramifications for businesses processing data within the PRC. From personal information processing to facial recognition to cross-border data transfers, what follows is a highlight reel of what you may have missed while you were away on vacation, with the comment period for many of these developments closing within the next few weeks.

Continue Reading Catch Up Fast: The “Data Days” of Summer in China

On July 25, 2023, the Senate Judiciary Committee held its fifth hearing this year on artificial intelligence (AI). This is the second hearing held by the Subcommittee on Privacy, Technology, and the Law, and it highlighted the “bipartisan unanimity” in regulating AI technology.

Continue Reading The Future is Here: Senate Judiciary Committee’s Oversight of AI and Principles for Regulation