
Reasonable Security: Implementing Appropriate Safeguards in the Remote Workplace


In 2020, with large portions of the global workforce abruptly sent home indefinitely, IT departments nationwide scrambled to equip the employees of unprepared companies to work remotely.

This presented an issue. Many businesses, particularly small businesses, barely have the minimum network defenses in place to prevent hacks and attacks in the centralized office. When suddenly everyone must become their own IT manager at home, the variance in secure practices, enforcement, and accountability grows even greater.

“Reasonable Security” Requirements under CCPA/CPRA and Other Laws

Under the California Consumer Privacy Act (CCPA), the implementation of “reasonable security” is a defense against a consumer’s private right of action to sue for data breach. A consumer who suffers an unauthorized exfiltration, theft, or disclosure of personal information can only seek redress if (1) the personal information was not encrypted or redacted, or (2) the business otherwise failed its duty to implement reasonable security. See Cal. Civ. Code § 1798.150.
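The statute's emphasis on encryption and redaction can be made concrete. The following is a minimal, purely illustrative sketch in Python; the regex patterns and function name are hypothetical, and real redaction programs require far broader coverage than two patterns:

```python
import re

def redact_pii(text: str) -> str:
    """Mask two common personal identifiers (illustrative only)."""
    # Social Security numbers formatted like 123-45-6789
    text = re.sub(r"\b\d{3}-\d{2}-\d{4}\b", "[REDACTED SSN]", text)
    # A simple email-address pattern
    text = re.sub(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b", "[REDACTED EMAIL]", text)
    return text

record = "Contact Jane at jane.doe@example.com, SSN 123-45-6789."
print(redact_pii(record))
# Prints: Contact Jane at [REDACTED EMAIL], SSN [REDACTED SSN].
```

Under the statute's framework, personal information that has been redacted (or encrypted) in this manner would fall outside the first trigger for the private right of action.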

Theoretically, this means that a business that has implemented security measures—but nevertheless suffers a breach—may be insulated from liability if the security measures could be considered reasonable measures to protect data. Therefore, while reasonable security is not technically an affirmative obligation under the CCPA, the reduced risk of consumer liability made reasonable security a de facto requirement.

However, under the recently passed California Privacy Rights Act (CPRA), the implementation of reasonable security is now an affirmative obligation. Under revised Cal. Civ. Code § 1798.100, any business that collects a consumer’s personal information shall implement reasonable security procedures and practices to protect personal information. See our CPRA unofficial redlines.


Deidentified Health Info under HIPAA: Deconstructing Dinerstein v. Google, LLC



Health data is an increasingly fraught area of privacy. Outside of sectoral health privacy laws like HIPAA, many regulations such as the GDPR and the California Privacy Rights Act (CPRA) rightly treat health or biometric information as a sensitive or special category of data deserving of more protections than many other types of data.

The amount of electronic health data collected by companies is also increasing at a staggering rate. DNA testing kits and wearable fitness trackers are everywhere, and telehealth has proliferated in the wake of COVID-19.

Healthcare data controllers are now just as likely to be big tech companies as traditional covered entities. Consequently, courts need to consider a variety of privacy frameworks, not just HIPAA and HITECH, when they adjudicate healthcare claims.

In September 2020, the U.S. District Court for the Northern District of Illinois dismissed a lawsuit brought against the University of Chicago and the University of Chicago Medical Center (collectively referred to as “the University”) and Google for allegations that the University improperly disclosed healthcare data to Google as part of a research partnership. Dinerstein v. Google, LLC, No. 19-cv-04311 (N.D. Ill. 2020).

Even though the University and Google were able to shake off this lawsuit, this case touched upon several interesting questions at the intersection of HIPAA and other privacy laws:


California Privacy Rights Act Highlights With Lily Li and DPO Advisor

Permalink to video here: https://vimeo.com/484360790

Mike: Hi everyone, if you’ve been following data privacy at all, you’ve probably already heard of California’s new landmark privacy law, the California Consumer Privacy Act, or CCPA as it is widely known.

The CCPA was the biggest data privacy shakeup in United States history. However, on November 3rd, California voters passed the California Privacy Rights Act, or the CPRA, which adds teeth to the CCPA and further strengthens the rights of California consumers.

Here to talk about the upcoming CPRA is Lily Li, who is a Data Privacy Attorney and the founder of Metaverse Law.

Lily, thanks so much for joining us today.

Lily: Hey, thanks for having me.

Mike: Well, let’s jump right in. Can you please explain to everyone what the CPRA is?

Lily: Well, the CPRA is a law that amends the existing law on the books. As you mentioned, there is this law called the California Consumer Privacy Act. It was passed by the California Legislature in 2018 and went into effect January 1st of this year.

Now we have the CPRA, which is a ballot initiative that passed in the latest election, and it amends the CCPA even further to make it more protective of privacy rights, both in how companies use sensitive data and in how they use children's data. We can definitely go more into the different changes the CPRA made to the CCPA, but this is a little bit of background on how it started.

Mike: That’s great. What do you feel are some of the key changes that the CPRA brings?

Lily: Well, the CPRA brings in this idea of sensitive personal information or sensitive personal data. And this aligns with a lot of other global privacy laws like GDPR and the new Brazilian Data Protection law.

Previously CCPA treated all types of personal information the same with respect to data subject requests. So people could get copies of their data. People could delete their data and a lot of people still have those rights with respect to companies.

Now, in the CPRA there's a new category of data: sensitive personal data, or sensitive personal information. These categories of data include things like healthcare information, precise geolocation, and information about people's genetics or biometric data.

And what's important about these categories of data is that not only does the law prevent you from sharing this data without providing certain notices; it also allows consumers to limit how a company uses sensitive data for its own purposes.

So even if you're collecting geolocation information and not giving it out to third parties, if you're using it for purposes at the company that aren't related to why you're collecting it from the consumer, the consumer has the right to ask you to limit your use of that sensitive data.

A good example of this is precise geolocation data. Uber got in trouble a while ago because it would collect geolocation data from people using its rideshare app, even after people had stopped using the app. And so Uber could track people's location in their homes or while they were still waiting for their ride.

This is a big no-no, especially if you are not disclosing it. But now consumers have the right to say: hey, only use these sensitive pieces of information to provide me the services that I've requested. Don't use it for anything else.

Another big change that the CPRA makes (some people now like to pronounce it "CIPRA") is that it increases the penalties for violations involving children's data.

So previously, you could suffer fines if you were using children's data in violation of the uses disclosed in your privacy policy, or if you refused to respond to consumer requests regarding children's data, and the fining regime was the same: $2,500 to $7,500 per violation.

The difference between the CPRA and the CCPA is that under the CCPA you could be fined $2,500 per violation or $7,500 per intentional violation. So you had to intentionally violate the law, not just accidentally violate it because you didn't know about the rules.

What the CPRA does is remove the intentionality requirement when you're dealing with children's data. So if you are using children's data in ways that you haven't disclosed in your privacy policy, or you are not fulfilling consumer requests regarding children's data, then you are subject to that higher fine of $7,500 per violation without any showing that you did it on purpose.
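The fine structure described above reduces to simple conditional logic. As a rough sketch only (the function name is invented, and this is an illustration of the interview's description, not statutory text):

```python
def fine_per_violation(intentional: bool, involves_childrens_data: bool) -> int:
    """Per-violation fine as described in the interview: $2,500 base,
    $7,500 if the violation was intentional, and, under the CPRA,
    $7,500 for children's-data violations regardless of intent."""
    if intentional or involves_childrens_data:
        return 7500
    return 2500

print(fine_per_violation(intentional=False, involves_childrens_data=True))
# Prints: 7500  (no showing of intent needed for children's data)
```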

And there are a lot of other changes in the CPRA that affect businesses. One of them concerns behavioral advertising.

Under the CCPA there was a lot of debate about whether remarketing, retargeting, and other types of cookies that track users across websites counted as sales of consumer data. And if something counted as a sale of consumer data under California law, you needed to put a lot of disclosures on your website, like a "Do Not Sell My Personal Information" link.

Some companies were arguing that targeted ads and behavioral advertising weren't a sale, because there was no real exchange of money for personal information.

But the CPRA removes that ambiguity. Under the CPRA it is very clear that cross-contextual behavioral advertising, that is to say, cookies set on a device that track users across different platforms in order to build a profile of a user and target them, counts as a sale of data under the CCPA, and so triggers a lot of the same disclosure requirements as selling data in more traditional formats. So that's another big change due to the CPRA.
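The tracking mechanism Lily describes, one third-party cookie ID recognized across many unrelated sites, can be sketched in a few lines of Python (the cookie ID and site names below are invented for illustration):

```python
from collections import defaultdict

# One ad server sets a single cookie ID in the browser, then logs
# every site where an ad request carrying that ID appears.
profiles: dict = defaultdict(list)

def log_ad_request(cookie_id: str, site: str) -> None:
    """Record that this cookie ID was seen on this site."""
    profiles[cookie_id].append(site)

# The same browser (same cookie) visits several unrelated sites,
# and the ad server stitches the visits into one interest profile.
log_ad_request("abc123", "news.example")
log_ad_request("abc123", "shoes.example")
log_ad_request("abc123", "travel.example")

print(profiles["abc123"])
# Prints: ['news.example', 'shoes.example', 'travel.example']
```

It is exactly this cross-context stitching, with no money changing hands for the data itself, that the CPRA now treats as a sale.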

Mike: What do you think are the most important steps for businesses to take to comply with the CPRA?


Facebook, Patents, and Privacy: Social Media Innovations to Mine Personal Data


[©2016. Published in GPSOLO, Vol. 37, No. 5, September/October 2020, by the American Bar Association. Reproduced with permission. All rights reserved. This information or any portion thereof may not be copied or disseminated in any form or by any means or stored in an electronic database or retrieval system without the express written consent of the American Bar Association or the copyright holder]

* Updated November 25 to include references to the CPRA/Prop 24.

The episode “Nosedive” of the television series Black Mirror envisions a society built on social credit scores. In this dystopia, all social media networks have converged into one platform—think Facebook, TikTok, Yelp, and Equifax combined.

This umbrella social platform allows users to rate each other on a five-point scale after each social interaction. Those with a high score gain access to job opportunities, favorable zip codes, and even high-status relationships. Those with a low score have the social ladder kicked out from under them, leading to a downward cycle of estrangement—and in the case of Black Mirror’s protagonist, jail time.

While the society in “Nosedive” seems far-fetched, is the technology behind it plausible?

Facebook Patents That Impact Privacy

According to Facebook’s patents, the answer is a resounding “yes.”

In a series of filings spanning almost a decade, Facebook has obtained several patents that allow social media platforms to track, identify, and classify individuals in new and innovative ways. Below are just a few.

Tracking individuals via dust. U.S. Patent No. 9485423B2, “associating cameras with users and objects in a social networking system” (filed September 16, 2010, patented June 25, 2013), allows social media networks to identify an individual’s friends and relationships by correlating users who share the same camera. To do so, an algorithm analyzes the metadata of a photo to find a camera’s “signature.”
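The patent's general approach, grouping users by a shared camera, can be illustrated with a toy sketch. Everything below is a hypothetical stand-in: the metadata fields, the hashing step, and the names are invented, and the patent's actual fingerprinting method is not reproduced here.

```python
import hashlib
from collections import defaultdict

def camera_signature(metadata: dict) -> str:
    """Derive an illustrative 'signature' from metadata fields a camera
    tends to embed consistently in every photo it takes."""
    key = "|".join(str(metadata.get(f, "")) for f in ("make", "model", "serial"))
    return hashlib.sha256(key.encode()).hexdigest()[:12]

# Photos uploaded by two different accounts, taken with the same camera.
photos = [
    {"user": "alice", "make": "Acme", "model": "X1", "serial": "S42"},
    {"user": "bob",   "make": "Acme", "model": "X1", "serial": "S42"},
]

# Group uploaders by camera signature; users who share a signature
# are inferred to be connected in some way.
by_camera = defaultdict(set)
for p in photos:
    by_camera[camera_signature(p)].add(p["user"])

print([sorted(users) for users in by_camera.values() if len(users) > 1])
# Prints: [['alice', 'bob']]
```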
