
Reasonable Security: Implementing Appropriate Safeguards in the Remote Workplace

Photo by Franck on Unsplash

In 2020, with large portions of the global workforce abruptly sent home indefinitely, IT departments nationwide scrambled to equip the workers of unprepared companies for remote work.

This presented a problem. Many businesses, particularly small businesses, barely have the minimum network defenses in place to prevent hacks and attacks in a centralized office. When everyone must suddenly become their own IT manager at home, the variance in security practices, enforcement, and accountability only grows.

“Reasonable Security” Requirements under CCPA/CPRA and Other Laws

Under the California Consumer Privacy Act (CCPA), the implementation of “reasonable security” is a defense against a consumer’s private right of action to sue for data breach. A consumer who suffers an unauthorized exfiltration, theft, or disclosure of personal information can only seek redress if (1) the personal information was not encrypted or redacted, and (2) the breach resulted from the business’s failure to implement and maintain reasonable security. See Cal. Civ. Code § 1798.150.

Theoretically, this means that a business that has implemented security measures—but nevertheless suffers a breach—may be insulated from liability if those measures could be considered reasonable protections for the data. Therefore, while reasonable security is not technically an affirmative obligation under the CCPA, the reduced risk of consumer liability made reasonable security a de facto requirement.

However, under the recently passed California Privacy Rights Act (CPRA), the implementation of reasonable security is now an affirmative obligation. Under revised Cal. Civ. Code § 1798.100, any business that collects a consumer’s personal information shall implement reasonable security procedures and practices to protect personal information. See our CPRA unofficial redlines.

Continue Reading Reasonable Security: Implementing Appropriate Safeguards in the Remote Workplace

Deidentified Health Info under HIPAA: Deconstructing Dinerstein v. Google, LLC

Image Credit: DarkoStojanovic from Pixabay.


Health data is an increasingly fraught area of privacy. Outside of sectoral health privacy laws like HIPAA, many regulations such as the GDPR and the California Privacy Rights Act (CPRA) rightly treat health or biometric information as a sensitive or special category of data deserving of more protections than many other types of data.

The amount of electronic health data collected by companies is also increasing at a staggering rate. DNA testing kits and wearable fitness trackers are everywhere, and telehealth has proliferated in the wake of COVID-19.

Healthcare data controllers are just as likely to be big tech companies as traditional covered entities. Consequently, courts now need to consider a variety of privacy frameworks, not just HIPAA and HITECH, when they adjudicate healthcare claims.

In September 2020, the U.S. District Court for the Northern District of Illinois dismissed a lawsuit brought against the University of Chicago and the University of Chicago Medical Center (collectively referred to as “the University”) and Google for allegations that the University improperly disclosed healthcare data to Google as part of a research partnership. Dinerstein v. Google, LLC, No. 19-cv-04311 (N.D. Ill. 2020).

Even though the University and Google were able to shake off this lawsuit, this case touched upon several interesting questions at the intersection of HIPAA and other privacy laws:

Continue Reading Deidentified Health Info under HIPAA: Deconstructing Dinerstein v. Google, LLC

California Privacy Rights Act Highlights With Lily Li and DPO Advisor

Permalink to video here: https://vimeo.com/484360790

Mike: Hi everyone, if you’ve been following data privacy at all, you’ve probably already heard of California’s new landmark privacy law, the California Consumer Privacy Act, or CCPA as it is widely known.

The CCPA was the biggest data privacy shakeup in United States history. However, on November 3rd, California passed the California Privacy Rights Act, or the CPRA, which adds teeth to the CCPA and further strengthens the rights of California consumers.

Here to talk about the upcoming CPRA is Lily Li, who is a Data Privacy Attorney and the founder of Metaverse Law.

Lily, thanks so much for joining us today.

Lily: Hey, thanks for having me.

Mike: Well, let’s jump right in. Can you please explain to everyone what the CPRA is?

Lily: Well, the CPRA is a law that amends the existing law on the books. As you mentioned, there is this law called the California Consumer Privacy Act. It was passed by the California Legislature in 2018 and went into effect January 1st of this year.

Now we have the CPRA, which is a ballot initiative that passed in the latest election, and it amends the CCPA even further to make it more protective of privacy rights, both in terms of how companies use sensitive data and how they use children’s data. We can definitely go more into the different changes that the CPRA made to the CCPA, but this is a little bit of background on how it started.

Mike: That’s great. What do you feel are some of the key changes that the CPRA brings?

Lily: Well, the CPRA brings in this idea of sensitive personal information or sensitive personal data. And this aligns with a lot of other global privacy laws like GDPR and the new Brazilian Data Protection law.

Previously, the CCPA treated all types of personal information the same with respect to data subject requests. So people could get copies of their data, people could delete their data, and people still have those rights with respect to companies.

Now, under the CPRA there’s a new category of data: sensitive personal data, or sensitive personal information. These categories of data include things like health care information, precise geolocation, and information about people’s genetics or biometric data.

And what’s important about these categories of data is that not only does the law prevent you from sharing this data without providing certain notices, it also allows consumers to limit how a company uses sensitive data for its own purposes.

So even if you’re collecting geolocation information and not giving it out to third parties, if you’re using it for purposes within the company that aren’t related to why you’re collecting it from the consumer, the consumer has the right to ask you to limit your use of that sensitive data.

A good example of this is precise geolocation data. Uber got in trouble a little while ago because it would collect geolocation data from people using its rideshare app—even after people had stopped using the app. And so Uber could track people’s location in their homes or while they were still waiting for the right transit service.

This is a big no-no—especially if you are not disclosing it. But now, customers and consumers have the right to say, hey, only use these sensitive pieces of information to provide me the services that I’ve requested; don’t use it for anything else.

Another big change that the CPRA makes (some people now like to call it “CIPRA”) is that it increases the penalties for misuse of children’s data.

So previously, you could suffer fines if you were using children’s data in violation of how you disclosed the uses of data in your privacy policy, or if you refused to respond to consumer requests regarding children’s data, and the fining regime was the same: $2,500 to $7,500 per violation.

The difference between the CPRA and the CCPA is that under the CCPA you could be fined $2,500 per violation or $7,500 per intentional violation. So you had to intentionally violate the law to trigger the higher fine, not just accidentally violate it because you didn’t know about the rules.

What “CIPRA”, or the CPRA, does is remove that intentionality requirement when you’re dealing with children’s data. So if you are using children’s data in ways that you haven’t disclosed in your privacy policy, or you are not fulfilling consumer requests regarding children’s data, then you are subject to that higher fine of $7,500 per violation without any showing that you did it on purpose.

And there are a lot of other changes in the CPRA that affect businesses. One of them concerns behavioral advertising.

Under the CCPA there was a lot of debate about whether or not remarketing, retargeting, and other types of cookies that track users across websites counted as sales of consumer data. And if something counted as a sale of consumer data under California law, you needed to put a lot of disclosures on your website, like a “Do Not Sell My Personal Information” link.

Some companies were arguing that targeted ads and behavioral advertising weren’t a sale, because there was no real exchange of money for personal information.

But the CPRA removes that ambiguity. Under the CPRA it is very clear that cross-context behavioral advertising (that is to say, cookies that you set on a device that track users across different platforms in order to create a profile of a user and target them) counts as a sale of data under the CCPA, and so triggers a lot of the same disclosure requirements as if you were selling data in more traditional formats. So that’s another big change due to the CPRA.

Mike: What do you think are the most important steps for businesses to take to comply with the CPRA?

Continue Reading California Privacy Rights Act Highlights With Lily Li and DPO Advisor

EU-US Data Transfers After Schrems II: European Commission Publishes New Draft Standard Contractual Clauses

Image Credit: GregMontani from Pixabay.

Update: On June 4, 2021, the European Commission formally adopted the new standard contractual clauses (“SCCs”) for international personal data transfers. Businesses will have a grace period of 18 months from the effective date of the European Commission’s decision to update all existing SCCs for transfers outside the European Union with the new SCCs.

In the meantime, businesses will be allowed to keep using the old SCCs for “new” data transfers over a transition period of three months from the effective date of the European Commission’s decision, giving organizations the chance to make any changes necessary for compliance with the new SCCs before incorporating them into their contracts. Such contracts, however, will also need to be updated within the 18-month grace period.

On November 12, 2020, roughly four months after the European Court of Justice’s “Schrems II” decision, which invalidated the EU-US Privacy Shield, the European Commission released a draft set of new Standard Contractual Clauses (“SCCs” or “model clauses”).

These updated SCCs allow transfers of personal data from the EU to third countries, as well as transfers by controllers when engaging processors located inside the EU. (For a further analysis of the Schrems II judgment, and the motivation for these new clauses, see our prior blog post).

Who can use the new SCCs?

The Commission’s draft, which includes the new SCCs in its Annex, covers two new types of international transfers and contains important updates in order to bring the text of the model clauses in line with the General Data Protection Regulation (“GDPR”).

The current SCCs, approved by the Commission in 2001 and 2010, only addressed two data flow scenarios:

  • An EU-based controller exporting data outside of the EU to other controllers (controller-controller SCCs)
  • An EU-based controller exporting data outside of the EU to processors (controller-processor SCCs).

In this new draft, the Commission addressed a gap which frequently occurred in practice: EU processors exporting data to controllers and processors outside of the EU. This addition further reflects the expanded territorial scope of the GDPR.

Continue Reading EU-US Data Transfers After Schrems II: European Commission Publishes New Draft Standard Contractual Clauses