Metaverse Law to Speak at WSJ Cybersecurity Symposium

Metaverse Law will be among the speakers at the Wall Street Journal’s Cybersecurity Symposium, where its presentation will focus on the laws and regulations that apply to different types of businesses.

It is a two-day event in San Diego, CA, running from Thursday, January 9 through Friday, January 10, 2020. The agenda for both days includes registration and breakfast, several speakers, networking breaks, and lunch, with a cocktail reception on the ninth and a cybersecurity strategy development bootcamp on the tenth.

A detailed itinerary as well as registration details can be found at https://cybersecurity.wsj.com/symposium/san-diego/#schedule

Schrems II: No Privacy Shield for EU-US Data Transfers, but Don’t Put Your Eggs into Standard Contractual Clauses Either

Image Credit: Capri23auto from Pixabay

On July 16, 2020, privacy professionals scrambled after the Court of Justice of the European Union (CJEU) handed down its decision in Schrems II. The ruling invalidated the EU-US Privacy Shield framework, which had authorized transfers of personal data from the EU to the US for Privacy Shield-certified companies. Though the ruling on Privacy Shield was unexpected, since the framework was not directly at issue in the case, the decision is not without precedent: Privacy Shield itself replaced the Safe Harbor framework, which the CJEU invalidated in 2015 in Schrems I.

Now that the Privacy Shield framework has been invalidated, both data controllers and data processors are likely concerned about the next steps to take to ensure that data transfers integral to their operations can continue. Although the U.S. Department of Commerce has indicated that it will continue processing Privacy Shield certifications, affected companies, such as U.S. data importers and EU data exporters, should quickly explore and adopt other transfer-legitimizing mechanisms with their service providers and vendors in order to prevent any gaps in compliance.

Alternative Mechanism: Standard Contractual Clauses

Under the GDPR, transfers of personal data to “third countries” outside the EU and to international organizations are restricted unless validated by an approved mechanism ensuring that GDPR-level protection follows the data.

Under GDPR Article 45, data transfers may be valid on the basis of an “adequacy decision,” where the European Commission has previously evaluated and determined that a third country provides “an adequate level of protection.”

GDPR Article 46(1) provides that, in the absence of an adequacy decision for the third country, other possible transfer mechanisms include Standard Contractual Clauses (SCCs). SCCs, also known as “model clauses,” are sets of pre-approved, non-negotiable contractual provisions that both the data importer and the data exporter must agree to.

SCCs are the primary mechanism for data transfers between EU and non-EU entities. This is because binding corporate rules (BCRs) are traditionally reserved for intraorganizational transfers of data within multinational corporations, Article 49 derogations should typically be used only for limited, non-repetitive situations, and the other mechanisms listed under Article 46(1) (codes of conduct and certification mechanisms) have not yet been tested.

Evaluate on a “Case-by-Case” Basis

Even when using SCCs, the importer and exporter must complete a “case-by-case” analysis to determine whether the laws of the third country provide an adequate level of protection or whether additional safeguards are necessary to meet the standards of the GDPR and the Charter of Fundamental Rights.

For instance, laws that permit broad law-enforcement surveillance of personal data without judicial review will likely be incompatible with the GDPR.

Given China’s recently enacted Cryptography Law, which provides for an encryption backdoor accessible to government actors, China may serve as an example of a third country where SCCs might not automatically validate a cross-border data transfer. Because businesses operating in China may be legally required to provide data to the government without judicial approval, that obligation would defeat the adequacy of SCCs as a transfer mechanism in such instances.

A similar analysis may have to be completed for US service providers. For instance, many cloud providers may fall under Section 702 of the Foreign Intelligence Surveillance Act (FISA) and Executive Order 12333, both of which govern surveillance programs such as PRISM and UPSTREAM. The CJEU heavily scrutinized these programs in its decision to strike down Privacy Shield, finding that they were not subject to adequate judicial oversight and that EU individuals are especially vulnerable because the protections of the Fourth Amendment to the U.S. Constitution do not extend to them.

Moving Forward

What’s next on the horizon? Perhaps the third time is the charm.

It is foreseeable that the European Commission and the U.S. Department of Commerce will negotiate a third agreement. Any new agreement will need to provide additional checks, balances, and reassurances for EU individuals whose data is transferred to the US for processing, beyond the level provided in the invalidated Privacy Shield.

In an Opinion dated April 13, 2016, the Article 29 Working Party (WP29), the predecessor to the current European Data Protection Board (EDPB), had already determined that one of Privacy Shield’s deficiencies was its failure to address the “massive and indiscriminate collection of personal data originating from the EU” by US intelligence agencies. WP29 also expressed concern that the Privacy Shield Ombudsperson was neither sufficiently independent nor powerful enough to serve as an adequate tribunal. It concluded by urging the Commission to improve Privacy Shield so that it provided protections equivalent to those in the EU. Given that these concerns were telegraphed well in advance of Privacy Shield’s invalidation, the next framework must address them if it is to survive scrutiny. In the meantime, businesses should review their data transfer flows, remain agile in responding to developing law, and ensure that transfers are validated by multiple mechanisms as a contingency.

China’s 2020 Cryptography Law in the Context of China’s Burgeoning Data Privacy and Security Regime

[Originally published as a Feature Article: China’s 2020 Cryptography Law in the Context of China’s Burgeoning Data Privacy and Security Regime, by Carolyn K. Luong, in Orange County Lawyer Magazine, April 2020, Vol. 62, No. 4, page 31.]

By Carolyn Luong

U.S.-China relations have been a trending topic throughout the past year due to several conflicts involving alleged encroachment upon free-speech principles and perceived threats to U.S. national security. The NBA and Activision Blizzard, both U.S.-based organizations, fielded criticism in October 2019 for supposed political censorship motivated by the fear of losing Chinese customers. Furthermore, as the U.S. races to build out its 5G infrastructure, the U.S. government has explicitly restricted U.S. corporations from conducting business with Chinese technology manufacturer Huawei out of concern that Huawei equipment may contain backdoors enabling surveillance by the Chinese government.[1]

Dr. Christopher Ford, Assistant Secretary of the U.S. State Department’s Bureau of International Security and Nonproliferation, remarked in September that, “Firms such as Huawei, Tencent, ZTE, Alibaba, and Baidu have no meaningful ability to tell the Chinese Communist Party ‘no’ if officials decide to ask for their assistance—e.g., in the form of access to foreign technologies, access to foreign networks, useful information about foreign commercial counterparties . . . .”[2] In response, these Chinese firms firmly deny any allegation of contemplated or actual cooperation with the Chinese government to compromise user information or equipment.

Should Bar Associations Vet Technology Service Providers for Attorneys?

[Originally published in GPSOLO, Vol. 36, No. 6, November/December 2019, by the American Bar Association. Reproduced with permission. All rights reserved.]

Image Credit: Gerd Altmann from Pixabay

Bar associations across the country have similar goals: advance the rule of law, serve the legal profession, and promote equal access to justice. Technology can readily support these goals. From online research and billing software to virtual receptionist and SEO services, technology vendors make attorneys more efficient and more accessible. It is no wonder, then, that bar associations around the country are promoting technology solutions for their members.

Despite the obvious benefits, bar associations need to be diligent about vetting technology vendors. By promoting one technology provider over another, bar associations could run afoul of advertising laws, tax requirements, and software agreements. In addition, bar associations and their members need to pay close attention to technology vendors’ cybersecurity safeguards to protect client confidences.

This article will briefly address each of these issues in turn and provide a non-exhaustive checklist of considerations before choosing a legal technology provider.

Bar Associations as Influencers

When we think of product endorsements today, we think of social media influencers, bloggers, and vloggers, not bar associations. Yet bar associations wield considerable influence over their members’ purchasing decisions. Given this influence, bar associations should stay mindful of laws addressing unfair and deceptive advertising, such as Section 5 of the Federal Trade Commission Act (FTC Act), state false advertising laws, and state unfair trade practices acts (so-called “little FTC Acts”).

What Is Happening in Children’s Online Privacy?

Children’s online privacy has always been an important topic, but a number of recent developments around the world have many businesses taking it more seriously. In September, Google agreed to pay a record $170 million fine to the U.S. Federal Trade Commission for violating the Children’s Online Privacy Protection Act (COPPA) by illegally collecting personal information from children without parental consent and using it to profit through targeted ads. A few weeks later, China’s own version of COPPA, the “Measures on Online Protection of Children’s Personal Data,” came into force, providing further clarity on protecting children’s personal data online under China’s Cyber Security Law. On October 7, the FTC hosted a public workshop to explore whether to update COPPA, which is over 20 years old and in need of a refresh given the emergence of new technologies. (Just think of all those smart devices, social media platforms, and educational apps and technologies that were not around in 1998.) Finally, the California Attorney General recently released proposed regulations under the California Consumer Privacy Act, which goes into effect in January 2020, that would require a business that knowingly collects the personal information of children under the age of 13 to establish, document, and comply with a reasonable method for determining that the person affirmatively authorizing the sale of the child’s personal information is the child’s parent or guardian.

Many children start using the Internet at an early age, raising privacy issues distinct from those affecting adults. First, children may not understand what data is being collected about them or how it is used. Second, children can easily fall victim to criminal behavior online by providing seemingly innocuous information to strangers who can appropriate it for malicious purposes. Third, children cannot give the same meaningful consent to data collection and use as adults can.

In the U.S., Congress passed COPPA in 1998 to protect children’s use of the Internet, particularly on websites and services targeted toward children. COPPA requires website operators to provide clear and conspicuous notice of the data collection methods employed by the website, including functioning hyperlinks to the website privacy policy on every web page where personal information is collected. It also requires affirmative parental consent prior to the collection of personal information from children under the age of 13. Recognizing that teenagers between the ages of 13 and 18 are not protected under COPPA, many individual states have made efforts to address privacy issues for this age group.

Recognizing the need to update COPPA to keep up with the times, the FTC considered the following topics at the October workshop, among others:

  • How the development of new technologies, the evolving nature of privacy harms, and changes in the way parents and children use websites and online services affect children’s privacy today;
  • Whether COPPA should permit general audience platforms to rebut the presumption that all users of child-directed content are children, and if so, under what circumstances;
  • Whether COPPA should be amended to better address websites and online services that do not include traditionally child-oriented activities, but that have large numbers of child users.

It remains unclear how these issues and others will be resolved. Eager to tap into the new revenue streams that children represent, many tech companies will try to carve out exceptions to COPPA—openly or not. On the other side, child advocates and politicians such as Senator Edward Markey, one of the original authors of COPPA, are pushing back and even trying to tighten restrictions related to children’s online privacy. 

Sometimes the issues are not so black and white. For instance, many well-intentioned companies, tech and otherwise, that have no interest in marketing to children might still be unable to verify the age of users who visit their websites, resulting in inadvertent marketing to minors. Even those that attempt to verify the age of users may face challenges, given the thousands of websites dedicated to helping users bypass age gates and parental controls. Finally, some age verification techniques may run counter to data minimization and other privacy principles, e.g., the collection of credit card data to verify age when it is not necessary for the provision of the service.

Regardless of what happens with COPPA at the FTC and with the new privacy laws springing up across the world, companies will need to be extra cautious about how they approach children’s online privacy, continually reviewing their practices and policies to ensure that they are not running afoul of the multitude of laws and regulations out there. Those that do not will risk both regulatory and legal action.
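
For readers who want the data-minimization point made concrete, the sketch below (written in Python purely for illustration) shows a neutral age gate that asks only for a date of birth, answers the single question a site operator needs, and never stores or transmits the input. The function names, the threshold constant, and the choice of a bare date-of-birth prompt are hypothetical assumptions for this example, not a technique described in this article and not a COPPA compliance recipe.

```python
from datetime import date
from typing import Optional

# COPPA's parental-consent requirement applies to children under 13.
COPPA_AGE_THRESHOLD = 13


def is_at_least(dob: date, years: int, today: Optional[date] = None) -> bool:
    """Return True if a person born on `dob` is at least `years` old."""
    today = today or date.today()
    # Count whole years, subtracting one if the birthday has not yet
    # occurred in the current calendar year.
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return age >= years


def age_gate(dob: date) -> bool:
    """Answer only the question the operator needs: may the standard
    experience proceed without verifiable parental consent? The date of
    birth is used once, in memory, and is never persisted or transmitted."""
    return is_at_least(dob, COPPA_AGE_THRESHOLD)


if __name__ == "__main__":
    # Hypothetical visitor input from a neutral age screen.
    visitor_dob = date(2010, 6, 15)
    if age_gate(visitor_dob):
        print("Visitor is 13 or older; proceed with the standard experience.")
    else:
        print("Visitor is under 13; obtain verifiable parental consent first.")
```

The design choice worth noticing is that nothing beyond the yes-or-no answer leaves the check: collecting a credit card number, government ID, or full profile to answer the same question is precisely the kind of over-collection that the data minimization principle counsels against.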
