
Metaverse Law Speaks at PrivSec Global

On September 23, 2021, attorney Lily Li spoke at PrivSec Global: The Largest Data Protection, Privacy and Security Event of 2021. The Global Live Stream Experience was a two-day event from September 22 to September 23, 2021.

The topic of discussion was “Why Most CCPA Cases Will Fail: Five Hurdles Plaintiffs Must Clear.” For more details on the topic and to watch the presentation on-demand, click here.


Guidance on Artificial Intelligence and Data Protection

Image by geralt from Pixabay.

For many of us, Artificial Intelligence (“AI”) represents innovation, opportunities, and potential value to society.

For data protection professionals, however, AI also represents a range of risks involved in the use of technologies that shift processing of personal data to complex computer systems with often opaque processes and algorithms.

Data protection and information security authorities as well as governmental agencies around the world have been issuing guidelines and practical frameworks to offer guidance in developing AI technologies that will meet the leading data protection standards.

Below, we have compiled a list* of official guidance recently published by authorities around the globe.

Canada:

  • 1/17/2022 – Government of Ontario, “Beta principles for the ethical use of AI and data enhanced technologies in Ontario”
    https://www.ontario.ca/page/beta-principles-ethical-use-ai-and-data-enhanced-technologies-ontario
The Government of Ontario released six beta principles for the ethical use of AI and data enhanced technologies in Ontario. In particular, the principles set out objectives for aligning the government's use of data enhanced technologies in its processes, programs, and services with ethical considerations, which are to be prioritized.

China:

  • 12/12/2022 – Cyberspace Administration of China, Regulations on the Administration of Deep Synthesis of Internet Information Services
    http://www.cac.gov.cn/2022-12/11/c_1672221949354811.htm (in Chinese) and
    http://www.cac.gov.cn/2022-12/11/c_1672221949570926.htm (in Chinese)
The Regulations target deep synthesis technology: algorithms that synthetically generate text, audio, video, virtual scenes, and other network information. The accompanying Regulations FAQs state that providers of deep synthesis technology must provide safe and controllable safeguards and comply with data protection obligations.
  • 9/26/2021 – Ministry of Science and Technology (“MOST”), New Generation of Artificial Intelligence Ethics Code
    http://www.most.gov.cn/kjbgz/202109/t20210926_177063.html (in Chinese)
The Code aims to integrate ethics and morals into the full life cycle of AI systems; promote fairness, justice, harmony, and safety; and avoid problems such as prejudice, discrimination, privacy violations, and information leakage. The Code sets out specific ethical requirements for AI technology design, maintenance, and use.
  • 1/5/2021 – National Information Security Standardisation Technical Committee of China (“TC260”), Cybersecurity practice guide on AI ethical security risk prevention
    https://www.tc260.org.cn/upload/2021-01-05/1609818449720076535.pdf (in Chinese)
    The guide highlights ethical risks associated with AI, and provides basic requirements for AI ethical security risk prevention.

E.U.:

  • European Telecommunications Standards Institute (“ETSI”) Industry Specification Group Securing Artificial Intelligence (“ISG SAI”)
    https://www.etsi.org/committee/1640-sai
    The ISG SAI has published standards to preserve and improve the security of AI. Its work focuses on using AI to enhance security, mitigating attacks that leverage AI, and securing AI itself from attack.
  • 4/21/2021 – European Commission, “Proposal for a Regulation of the European Parliament and of the Council Laying Down Harmonised Rules on Artificial Intelligence (Artificial Intelligence Act) and Amending Certain Union Legislative Acts”
    https://ec.europa.eu/newsroom/dae/document.cfm?doc_id=75788
    The EU Commission proposed a new AI Regulation – a set of flexible and proportionate rules that will address the specific risks posed by AI systems, intending to set the highest global standard. As an EU regulation, the rules would apply directly across all European Member States. The regulation proposal follows a risk-based approach and calls for the creation of a European enforcement agency.

France:

Germany:


Strengthening the U.S. Government Supply Chain: Cybersecurity under Executive Order 14028

Image Credit: Michael Jowen from Unsplash.

U.S. government agencies have a reputation for occasionally clinging to outdated technology. Illustrative examples include the U.S. Department of Defense (DoD) paying Microsoft $9 million in 2015 to continue supporting the defunct Windows XP, and a 2019 U.S. Government Accountability Office (GAO) report documenting multiple agencies using legacy systems with components 8 to 50 years old. In its findings, the GAO unsurprisingly concluded that legacy systems running outdated or unsupported software languages and hardware pose a cybersecurity risk.

In the wake of the SolarWinds, Microsoft Exchange, and Colonial Pipeline security incidents that impacted U.S. government agencies and/or U.S. critical infrastructure, President Biden issued Executive Order 14028 to update minimum cybersecurity standards for all software sold to the federal government and throughout the supply chain.

Existing Requirements under FedRAMP, DFARS, and CMMC

The new obligations arising out of Executive Order 14028 add to existing security regulations for certain government contractors and subcontractors.

The Federal Risk and Authorization Management Program (FedRAMP) oversees the safe provisioning of cloud products and services from a Cloud Service Provider (CSP) to any government agency. As part of the FedRAMP authorization process, an accredited Third-Party Assessment Organization (3PAO) assesses the CSP’s controls under NIST SP 800-53, a security framework for federal government information systems. The 3PAO also assesses additional controls above the NIST baseline that are unique to cloud computing.

Contractors who supply products or services specifically to the DoD are subject to the Defense Federal Acquisition Regulation Supplement (DFARS). The DFARS standards require compliance with fourteen families of cybersecurity requirements under NIST SP 800-171, meant to protect Controlled Unclassified Information (CUI).

In November 2020, the DoD released the Cybersecurity Maturity Model Certification (CMMC) framework, which builds upon DFARS. Contractors undergo an audit by a CMMC Third Party Assessment Organization (C3PAO), which issues a certification for the contractor's assessed cybersecurity maturity level. Certifications range from CMMC Level 1, indicating a low, ad-hoc maturity, to CMMC Level 5, indicating a high, optimized maturity. As contractors progress further up the DoD supply chain toward prime contractors (those working directly with the DoD), the DoD requires them to meet higher certification levels. Meeting all DFARS controls and the 110 controls in NIST SP 800-171 roughly correlates to CMMC Level 3.

Cybersecurity Requirements of Executive Order 14028


Will the Courts Treat Foreign Data Privacy Laws as Fact or Farce in U.S. Contracts? Whose Law Will Prevail in Privacy Disputes?

Image Credit: MarkThomas from Pixabay.

[Originally published as a Feature Article: Will the Courts Treat Foreign Data Privacy Laws as Fact or Farce in U.S. Contracts?, by Amira Bucklin and Lily Li, in Orange County Lawyer Magazine, May 2021, Vol. 63 No.5, page 40.]

by Amira Bucklin and Lily Li

In 2020, when lockdown and shelter-at-home orders were implemented, the world moved online. Team meetings, conference calls, even court hearings entered the cloud. More than ever, consumers used online shopping instead of strolling through malls, and online learning platforms instead of classrooms. “Zoom” became a way to meet up with friends over a glass of wine, or conduct job interviews in a blouse, suit jacket, and yoga pants.

This has had vast consequences for personal privacy and cybersecurity. While most consumers might recognize the brand of their online learning platform, ecommerce store, or video conference tool of choice, few notice the network of service providers working in the background: a whole ecosystem of connected businesses and platforms that collect, store, and transfer data and software, all governed by a new set of international privacy rules and contractual commitments. Yet many of these rules have not been tested in the courts, and they carry significant implications for privacy.

The Privacy Conundrum

This month marks the three-year anniversary of the EU’s General Data Protection Regulation (GDPR). As expected, its consequences have been far-reaching, and fines for violations have been staggeringly high.

The GDPR requires companies in charge of personal data (“data controllers”) to enter into data processing agreements with their service providers (or “data processors”), including, at times, standard data protection clauses drafted by the EU Commission. These data processing mega-contracts (ranging from 1-100+ pages) impose a series of foreign data protection and security obligations on the parties.

A unique challenge presented by these contracts is that the data processing agreements and model data protection clauses often include their own choice of law provisions, calling for the application of EU member state law and requiring the parties to grant third-party beneficiary rights to individuals in a wholly different country.

This challenge is not just limited to parties contracting with EU companies, either. Due to the GDPR’s extraterritorial scope, two U.S.-based companies can enter into a contract subject to the laws of the State of California, but which includes a data processing addendum or security schedule that is subject to the laws of the United Kingdom, France, or Germany.

What happens if there is a dispute between these parties regarding their rights and responsibilities, which are subject to foreign data protection laws? How will U.S. courts treat these disputes? How much deference will—and should—a U.S. court provide to foreign interpretations of law?


State Privacy Laws in the Wake of the CCPA: A Tough Act to Follow

Image Credit: Free-Photos from Pixabay.

Hard on the heels of the California Consumer Privacy Act of 2018 (CCPA) and updated state privacy laws in Nevada and Maine, which took effect in 2019, state data privacy legislation is still on the rise.

In November 2020, California voters approved the California Privacy Rights Act (CPRA), further amending the CCPA. The CPRA is intended to strengthen privacy regulations in California by creating new requirements for companies that collect and share sensitive personal information. It also creates a new agency, the California Privacy Protection Agency, which will be responsible for enforcing the law.

Most recently, the Virginia Governor signed the Consumer Data Protection Act into law, making Virginia the second U.S. state, after California, to enact a comprehensive privacy law.

As momentum builds for state privacy laws, 2021 could be the year that privacy laws gain footing across the country, helping Americans exercise control over their digital lives.

Washington’s Privacy Act 2021, SB 5062
**Update: The WPA did not pass the House by the April 11 deadline. On April 12, however, Senator Carlyle tweeted that the “bill remains alive through the end of the session.” The legislature will close on April 25.

*** Update 4/26: The WPA did not pass for the third year in a row, due to the late introduction of a limited private right of action (for injunctive relief). Jump to the bottom of the page for links to other pending state legislation.

The most notable, having progressed furthest through a state legislature, is the current draft of the Washington Privacy Act 2021 (“WPA”). This draft bill is the third version of the act introduced by Washington state Sen. Reuven Carlyle (D-Seattle) in as many years.

Scope

The WPA would apply to legal entities that:
