Ever since the enactment of the Illinois Biometric Information Privacy Act (BIPA), we have been watching the development of laws around the collection, use, disclosure and retention of biometric information. In general, BIPA and the other biometric information privacy laws enacted in its wake require any company collecting biometric information, such as fingerprints, voiceprints, retinal scans or facial scans, to notify the individuals from whom the information is collected that it is being collected, the purpose for which it is collected and used, to whom it is disclosed, and how long it will be retained. These laws usually also require companies to put appropriate security measures in place to protect the biometric information.

Litigation under BIPA and other biometric information privacy laws is rampant. For instance, a fast food chain was recently sued for using voice recognition technology in its drive-through facilities without providing notice to consumers and obtaining their consent.

The reason for these laws is pretty clear: this information is highly sensitive and unique to each person, and if it is compromised, the consequences for the affected individuals can be significant or even catastrophic. As I often say, we have only one face, one set of fingerprints, a unique voice, and two irises. If a bad actor were to get hold of this unique information, they could use it for nefarious purposes, including stealing our identities in very significant ways.

These laws, like the California Consumer Privacy Act (CCPA), include a private right of action if a company fails to comply with the provisions of the law. This means that if a company does not provide notice of the collection, use, disclosure and retention of the information, or if the information is compromised, individual consumers can sue the company directly for the violation without showing actual harm, damages or other consequences. This can lead to costly litigation.

It is hard (but necessary) for a full-time privacy professional like me to keep up with these laws, let alone for businesses that are not focused on this area of law. Biometric laws are popping up at the state, county, city and municipal level, much as drone laws did back in the day. For instance, the City of New York has enacted a biometric law, effective next month, that applies to a “commercial establishment” in New York City, meaning “a place of entertainment, a retail store, or a food and drink establishment.” The law requires the business to place a “clear and conspicuous sign near all of the commercial establishment’s customer entrances notifying customers in plain, simple language…that customers’ biometric identifier information is being collected, retained, converted, stored or shared, as applicable.” The law further prohibits the sale of biometric information.

The New York City ordinance differs from BIPA and other state laws in that it (1) does not apply to employees of companies; (2) does not apply to financial institutions; and (3) does not apply to governmental entities. Like those statutes, however, it contains a private right of action for consumers. The New York City law states that an aggrieved person can sue the company for a violation of the law after first giving the company thirty days’ notice to cure the violation. The ordinance’s remedies, like the CCPA’s private right of action, do not require proof of actual harm: an individual may seek damages of $500 for each violation, up to $5,000 for each intentional or reckless violation, and may recover reasonable attorneys’ fees and costs, expert witness fees, litigation expenses and injunctive relief.

New York City establishments—take note. Other establishments—understand that this is a rapidly developing area of privacy law that is difficult to monitor and may be tricky to comply with on a national, state, and municipal level. If you are collecting any biometric data from employees or consumers, you may wish to consider implementing a biometric information compliance program.

In just the last two weeks, three of the world’s most prominent social networks have been linked to stories about data leaks. Troves of information on both Facebook and LinkedIn users – hundreds of millions of them – turned up for sale in marketplaces in the cyber underground. Then, earlier this week, a hacker forum published a database purporting to be information on users of the new Clubhouse social network. 


To hear Facebook, LinkedIn and Clubhouse tell it, however, nothing is amiss. All took pains to explain that they were not the victims of a hack, just the “scraping” of public data on their users by individuals. Facebook went so far as to insist that it would not notify the 530 million users whose names, phone numbers, birth dates and other information were scraped from its site.

So which is it? Is scraping the same as hacking or just an example of “zealous” use of a social media platform? And if it isn’t considered hacking…should it be? As more and more online platforms open their doors to API-based access, what restrictions and security should be attached to those APIs to prevent wanton abuse? 
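To make that last question concrete, consider one common guardrail: per-client rate limiting, which throttles the bulk enumeration that large-scale scraping depends on. Below is a minimal sketch of a token-bucket limiter in TypeScript; the capacity and refill values are illustrative assumptions, not any platform’s actual limits.

```typescript
// A minimal sketch of per-client API rate limiting, one common defense
// against bulk scraping. Thresholds here are illustrative assumptions.

interface Bucket {
  tokens: number;     // requests currently allowed
  lastRefill: number; // timestamp of the last refill, in ms
}

const CAPACITY = 100;     // max burst size per client (assumed)
const REFILL_PER_SEC = 1; // sustained requests per second (assumed)
const buckets = new Map<string, Bucket>();

// Returns true if the client identified by `clientId` (e.g., an API key)
// may make a request now; false if it should receive HTTP 429.
function allowRequest(clientId: string): boolean {
  const now = Date.now();
  const bucket = buckets.get(clientId) ?? { tokens: CAPACITY, lastRefill: now };
  // Refill tokens in proportion to elapsed time, capped at CAPACITY.
  const elapsedSec = (now - bucket.lastRefill) / 1000;
  bucket.tokens = Math.min(CAPACITY, bucket.tokens + elapsedSec * REFILL_PER_SEC);
  bucket.lastRefill = now;
  if (bucket.tokens < 1) {
    buckets.set(clientId, bucket);
    return false; // over the limit: likely bulk enumeration or scraping
  }
  bucket.tokens -= 1;
  buckets.set(clientId, bucket);
  return true;
}
```

A limiter like this does not decide the legal question, but it illustrates the kind of restriction platforms could attach to API access.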

To discuss these issues and more, we invited Andrew Sellers into the Security Ledger studios. Andrew is the Chief Technology Officer at the firm QOMPLX* where he oversees the technology, engineering, data science, and delivery aspects of QOMPLX’s next-generation operational risk management and situational awareness products. He is also an expert in data scraping with specific expertise in large-scale heterogeneous network design, deep-web data extraction, and data theory. 

While the recent incidents affecting LinkedIn, Facebook and Clubhouse may not technically qualify as “hacks,” Andrew told me, they do raise troubling questions about the data security and data management practices of large social media networks, and prompt the question of whether more needs to be done to regulate the storage and retention of data on these platforms.


(*) QOMPLX is a sponsor of The Security Ledger.

The California Attorney General recently approved modified regulations under the California Consumer Privacy Act (CCPA). One part of the modified regulations bans “dark patterns” on a website. What are dark patterns? Public comments to the proposed regulations describe dark patterns as deliberate attempts to subvert or impair a consumer’s choice to opt out on a website. Dark patterns can be used to confuse or distract a consumer into granting consent unknowingly rather than choosing the opt-out option.

The modified regulations therefore ban the use of dark patterns that:

  • Use an opt-out request process that requires more steps than the process for a consumer to opt back into the sale of personal information after previously opting out;
  • Use confusing language (e.g., double-negatives, “Don’t Not Sell My Personal Information”);
  • Require consumers to click through or listen to unnecessary reasons why they should not submit a request to opt-out before confirming their request;
  • Require a consumer to provide personal information that is unnecessary to implement an opt-out request; or
  • Require a consumer to search or scroll through the text of a website or privacy policy to submit the opt-out request after clicking the “Do Not Sell My Personal Information” link (but before actually choosing the option).

If your website uses any such dark patterns, you may wish to revise those mechanisms and implement clearer, more transparent methods for your website’s users to opt out.
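A compliant flow can be strikingly small. Below is a minimal sketch of a single-step opt-out endpoint, assuming a plain Node.js server; the route, cookie handling, and in-memory store are illustrative stand-ins for a real implementation, which would also need to persist the choice and propagate it to downstream data recipients.

```typescript
// A minimal sketch of a single-step "Do Not Sell" opt-out endpoint,
// using only Node's built-in http module. Route and storage are
// illustrative assumptions.
import { createServer } from "http";

const optedOut = new Set<string>(); // stand-in for a persistent store

createServer((req, res) => {
  if (req.method === "POST" && req.url === "/do-not-sell") {
    // Identify the visitor by an existing session cookie only; the
    // regulations bar demanding personal information that is not
    // necessary to implement the request.
    const sessionId = req.headers.cookie ?? "anonymous-session";
    optedOut.add(sessionId);
    res.writeHead(200, { "Content-Type": "text/plain" });
    res.end("Your opt-out of the sale of personal information is recorded.");
    return;
  }
  res.writeHead(404);
  res.end();
}).listen(8080);
```

The point of the sketch is the shape of the flow: one click, one request, no extra personal information, no intervening persuasion screens.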

Gardiner v. Walmart provided some guidance as to the specificity required to state a claim under the California Consumer Privacy Act (CCPA) and the types of damages that may be recoverable for breaches of California consumer data. On July 10, 2020, Lavarious Gardiner filed a proposed class action against Walmart, alleging that unauthorized individuals accessed his personal information through Walmart’s website. Although Walmart never disclosed the alleged breach or provided any formal notification to consumers (and maintains that no breach occurred), Gardiner claimed that he discovered his personal information on the dark web and was told by hackers that the information came from his Walmart online account. He also claims that by using cybersecurity scan software he discovered many vulnerabilities on Walmart’s website.

Gardiner claimed Walmart violated the CCPA and California’s Unfair Competition Law. In response, Walmart filed a motion to dismiss, which was granted on March 5, 2021 (of note – with leave to amend). While Gardiner has now amended his complaint, the court’s ruling on Walmart’s motion to dismiss addresses some important points related to data breach class actions, including:

  • The complaint MUST state when the alleged breach occurred. Gardiner had alleged only that his information was on the dark web, not when the breach actually occurred. The court also stated that for purposes of a CCPA claim, the relevant conduct is the actual data breach resulting from a “failure to implement and maintain reasonable security procedures and practices.” This means that the breach must have occurred on or after January 1, 2020, the effective date of the CCPA.
  • The complaint must sufficiently allege disclosure of personal information. Gardiner had only alleged that his credit card number was disclosed, but had not alleged that his 3-digit access code was affected.
  • Plaintiff’s damages arising from a data breach MUST not be speculative; this is common across courts that dismiss class action data breach suits. Here, Gardiner had not alleged that he incurred any fraudulent charges or suffered any identity theft or other harm.

The court also dismissed Gardiner’s unfair competition claims that were based on a benefit of the bargain theory.

The court also addressed the disclaimers in Walmart’s privacy policy. Walmart argued that Gardiner’s contract-based claims were barred by its website Terms of Use, which included a warranty disclaimer and a limitation of liability for data breaches. The court found that the limitation of liability was clear and emphasized with capitalization, which put Gardiner on notice of its contents. This is an important part of the decision for ANY company with an online presence: a company’s website Privacy Policy and Terms of Use could be the final line of defense.

Gardiner has since amended his complaint. Whether the amendments will survive another motion to dismiss is unknown. Still, this decision provides valuable insight for claims made under the CCPA and important lessons about website Privacy Policies and Terms of Use.

California Attorney General Xavier Becerra announced this week that the Office of Administrative Law approved additional California Consumer Privacy Act (CCPA) regulations, which became effective March 15, 2021.

The additional changes to the regulations primarily affect businesses that sell the personal information of California residents. The changes include a uniform Opt-Out Icon for the purpose of promoting consumer awareness of the right to opt-out of the sale of personal information, guidance to businesses regarding opt-out requests, including what not to do, and changes regarding the proof that a business may require for authorized agents and consumer verifications.

New sections of the regulations include a requirement that a business that sells personal information it collects from consumers offline shall also inform consumers by an offline method of their right to opt-out and provide instructions on how to submit a request to opt-out. The new regulations state that the Opt-Out Icon may be used in addition to posting the notice of the right to opt-out, but not in lieu of any requirement to post the notice of right to opt-out or a “Do Not Sell My Personal Information” link. (A link to download the Opt-Out Icon can be found here.)
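As an illustration, a business might render the icon alongside, never in place of, the required link. The sketch below is a hypothetical example; the icon file path and target URL are illustrative assumptions.

```typescript
// A minimal sketch of pairing the optional Opt-Out Icon with the
// required "Do Not Sell My Personal Information" link. The icon path
// and link URL are illustrative assumptions, not prescribed values.
function renderOptOutNotice(): string {
  return `<a href="/do-not-sell-my-personal-information">
  <img src="/assets/ccpa-opt-out-icon.svg" alt="Opt-out icon" height="16" />
  Do Not Sell My Personal Information
</a>`;
}
```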

With respect to authorized agents, a business may require that the consumer’s authorized agent provide proof that the consumer gave the agent signed permission to submit the request. The business may also require the consumer to do either of the following: (1) verify their own identity directly with the business, or (2) directly confirm with the business that they provided the authorized agent permission to submit the request.

Other new sections of the regulations state that a business’s methods for submitting requests to opt-out should be easy for consumers to execute and shall require minimal steps to allow the consumer to opt-out. Examples of methods that businesses should not use are specified in the regulations and include:

  • The process for opting out shall not require more steps than the business process for opting in to the sale of personal information;
  • The business should not use confusing language such as double negatives (Don’t Not Sell My Personal Information);
  • The business shall not require consumers to click through or to listen to reasons they should not submit a request to opt-out before confirming their request;
  • The process for submitting a request to opt-out shall not require the consumer to provide personal information that is not necessary to implement the request; and
  • Upon clicking the “Do Not Sell My Personal Information” link, the business shall not require the consumer to search or scroll through the text of a privacy policy or similar document or webpage to locate the mechanism for submitting a request to opt-out.

The bottom line for these additional changes to the CCPA regulations is that the overriding principles remain the same: inform consumers of their right to opt-out of the sale of their personal information and present this information to consumers in a way that is easy to read and understand.

California Governor Gavin Newsom, along with Attorney General Xavier Becerra, Senate President pro Tempore Toni G. Atkins (D-San Diego), and Assembly Speaker Anthony Rendon (D-Lakewood), announced the appointment of the five-member inaugural board for the California Privacy Protection Agency (CPPA) this week.

The Board was established by the California Privacy Rights Act (CPRA) and will oversee the rulemaking process for various topics relating to the CPRA, including privacy audits, consumer opt-out rights, and compliance relating to the protection of the privacy rights of consumers with regard to their personal information.

According to Attorney General Xavier Becerra, “The California Privacy Protection Agency marks a historic new chapter in data privacy by establishing the first agency in the country dedicated to protecting forty million Californians’ fundamental privacy rights. The CPPA Board will help California residents understand and control their data privacy while holding online businesses accountable.”

The Board members will select an Executive Director and may serve for no more than eight years.


Virginia Governor Ralph Northam signed the Consumer Data Protection Act (CDPA) on Tuesday, March 2, 2021. Virginia now joins California as the second state with a comprehensive data privacy law. The law takes effect on January 1, 2023, so businesses have some time to get ready. In our previous article on the proposed legislation, we described the new consumer rights, noted the lack of a private right of action, and detailed which businesses will have to comply with the new law. In addition to providing consumers with rights regarding their data, the CDPA requires transparent processing of personal data through a privacy notice, which must include the following (a short sketch modeling these elements appears after the list):

  • The categories of personal data collected by the controller;
  • The purposes for which the categories of personal data are used and disclosed to third parties, if any;
  • The rights that consumers may exercise via the new law;
  • The categories of personal data that the controller shares with third parties, if any; and
  • The categories of third parties, if any, with whom the controller shares personal data.
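For teams mapping these requirements onto their systems, here is a minimal sketch of a data structure capturing the notice elements above, in TypeScript; the interface and field names are illustrative assumptions, not statutory terms.

```typescript
// A minimal sketch modeling the CDPA privacy notice elements listed
// above. All names are illustrative assumptions.
interface CdpaPrivacyNotice {
  categoriesCollected: string[];            // categories of personal data collected
  processingPurposes: string[];             // purposes for use and third-party disclosure
  consumerRights: string[];                 // rights consumers may exercise under the law
  categoriesSharedWithThirdParties: string[];
  thirdPartyCategories: string[];           // categories of third parties receiving data
  // Required only if selling to data brokers or processing for targeted ads:
  saleOrTargetedAdvertisingDisclosure?: {
    description: string;                    // what processing occurs
    howToObject: string;                    // clear and conspicuous objection instructions
  };
}
```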

In addition, if a controller sells personal data to data brokers or processes personal data for targeted advertising, the controller must disclose such processing to consumers and inform them, in a clear and conspicuous manner, how they may exercise the right to object to such processing.

Finally, the new law requires controllers to conduct a risk assessment of each of their processing activities involving personal data and an additional risk assessment any time there is a change in processing that materially increases the risk to consumers.

This week, Consumer Reports published a Model State Privacy Act. The consumer advocacy organization proposed model legislation “to ensure that companies are required to honor consumers’ privacy.” The model legislation is similar to the California Consumer Privacy Act, but seeks to protect consumer privacy rights “by default.” Additional provisions of the model law include a broad prohibition on secondary data sharing, an opt-out of first-party advertising, and a private right of action in addition to enforcement by state Attorneys General.

While the introduction of a model privacy law is an interesting development, we also continue to track pending privacy legislation in multiple states. Connecticut, Massachusetts, Illinois, Minnesota, New York and Utah have all recently seen new consumer privacy bills introduced. As legislative sessions move forward into 2021, we expect even more states to follow suit.

Our list of pending state privacy legislation includes:

We will continue to provide updates as these bills move forward.


With the passage of the California Privacy Rights Act (CPRA), we are presenting several blog articles on different topics related to the new law. We previously wrote about key effective dates and the newly added definition of sensitive information. This week, we will focus on consumer opt-out rights and data profiling.

Consumer Opt-Out Rights

The CPRA created several new rights for consumers – one of which is the right to opt out of the sale or the sharing of their personal information. In order to understand this new opt-out right, we need to review the new definition of sharing personal information in the CPRA.

The CPRA differentiates between the sale of personal information and the sharing of personal information. Sharing personal information means disclosing it to third parties for “cross-context behavioral advertising, whether or not for monetary or other valuable consideration, including transactions between a business and a third party for cross-context behavioral advertising for the benefit of a business in which no money is exchanged.” Section 1798.140(ah)(1).

What is cross-context behavioral advertising? Think of advertising targeted to a consumer based on the consumer’s activity across other businesses’ websites, applications, or services. Contextual advertising, by contrast, might be an ad shown to a consumer for a product related to that consumer’s current internet search. If you are a California resident, the CPRA will give you the right to opt out of the sharing of your personal information in this way. How will a consumer exercise this right? The CPRA states that a consumer shall have the right, at any time, “to direct a business that sells or shares personal information about the consumer to third parties not to sell or share the consumer’s personal information.” Section 1798.120(a).
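One emerging way consumers send this kind of opt-out is through a browser-level signal such as the Global Privacy Control, which participating browsers transmit as a Sec-GPC request header. Below is a minimal sketch of detecting that signal server-side, assuming a plain Node.js handler; the recordOptOut helper is a hypothetical placeholder for whatever persistence a real system would use.

```typescript
// A minimal sketch of honoring a browser-level opt-out signal (the
// Global Privacy Control header) on the server side. GPC-enabled
// browsers send "Sec-GPC: 1" with each request; the storage step here
// is an illustrative placeholder.
import { IncomingMessage } from "http";

function wantsOptOut(req: IncomingMessage): boolean {
  return req.headers["sec-gpc"] === "1";
}

function handleRequest(req: IncomingMessage): void {
  if (wantsOptOut(req)) {
    // A real implementation would persist the choice and suppress
    // sale/sharing of this consumer's personal information.
    recordOptOut(req);
  }
}

// Hypothetical helper, not part of any library.
function recordOptOut(req: IncomingMessage): void {
  console.log("Opt-out of sale/sharing recorded for this session.");
}
```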

Data Profiling – What is it?

Another consumer right related to the consumer opt-out rights found in the CPRA pertains to data profiling. Profiling is defined in the CPRA as the automated processing of personal information “to evaluate certain personal aspects relating to a natural person, and in particular to analyze or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location or movements.” Section 1798.140(z). One bright note is that Section 1798.185(a)(16) states that regulations will need to be developed “governing access and opt-out rights with respect to businesses’ use of automated decision-making technology, including profiling and requiring businesses’ response to access requests to include meaningful information about the logic involved in such decision-making processes, as well as a description of the likely outcome of the process with respect to the consumer.”
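Until those regulations are issued, one way a business might prepare is to keep decision records that can support such access requests. Below is a minimal sketch of what such a record might contain, in TypeScript; every field name is an illustrative assumption, since the regulations have not yet defined the required content.

```typescript
// A minimal sketch of a record a business might keep to answer access
// requests with "meaningful information about the logic involved" in
// automated decision-making. All fields are illustrative assumptions.
interface ProfilingDecisionRecord {
  consumerId: string;
  decision: string;                 // e.g., "discount offer shown"
  madeAt: Date;
  inputsUsed: string[];             // categories of personal information consumed
  logicSummary: string;             // plain-language description of the logic
  likelyOutcomeForConsumer: string; // likely outcome of the process
}

const example: ProfilingDecisionRecord = {
  consumerId: "c-123",
  decision: "discount offer shown",
  madeAt: new Date(),
  inputsUsed: ["browsing history", "purchase history"],
  logicSummary:
    "Consumers with repeat purchases in a category are shown a discount for that category.",
  likelyOutcomeForConsumer: "May receive category-specific discount offers.",
};
```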

We will be following these opt-out rights closely – both from a consumer privacy standpoint and for businesses that use such targeted advertising technologies, including automated processing of personal information – to see how the regulations will address the logic involved in the decision-making process and its impact on consumers.