InfoBytes Blog

Financial Services Law Insights and Observations

  • Court certifies two classes in restaurant chain data breach

    Privacy, Cyber Risk & Data Security

    On April 15, the U.S. District Court for the Middle District of Florida certified a nationwide class and a California-only class of restaurant customers who claim the restaurant chain’s negligence led to a 2018 data breach that compromised their credit card information. The two classes of consumers include those who made credit or debit card purchases at affected restaurants in March and April 2018, when their data was accessed by cybercriminals, and who incurred reasonable expenses or time spent mitigating the consequences of the breach. The judge certified the classes only on the plaintiffs’ negligence and state Unfair Competition Law (California) claims, and deferred ruling on the class certification related to claims that the restaurants’ parent company breached an implied contract with customers by failing to have adequate cybersecurity protocols. Certifying that claim, the judge stated, could require applying 50 different state laws on the breach of implied contracts. 

    Privacy/Cyber Risk & Data Security Courts Data Breach Consumer Finance State Issues

  • Court addresses alternative theories of liability in BIPA class action

    Privacy, Cyber Risk & Data Security

    On January 28, the U.S. District Court for the Northern District of Illinois denied a motion to reconsider and a motion to certify questions for appeal and stay proceedings pending appeal in a matter concerning class claims that an auto leasing company and its parent company (collectively, “defendants”) violated the Illinois Biometric Information Privacy Act (BIPA) by unlawfully collecting biometric fingerprint data without first receiving informed consent. The court previously denied the defendants’ motion to dismiss after concluding the plaintiff stated a BIPA claim against both defendants. However, the auto leasing company argued, among other things, that the parent company should not be held liable because it was never the plaintiff’s employer, did not control her work environment, and had nothing to do with the fingerprint timekeeping system. The court disagreed, finding that under BIPA, the plaintiff’s allegations about the parent company were not “legal conclusions,” and that “control over employee timekeeping and privacy [] describes a relevant factual aspect of her personal experience working for defendants.” According to the court, “[t]his factual allegation raises the reasonable inference that [the parent company] administered the alleged fingerprint-scanning system, and in turn, plausibly suggests that [the parent company] collected, retained, and disseminated her fingerprints.” The parent company will have the opportunity to address alternative theories of liability when seeking summary judgment against the plaintiff or at trial, the court wrote.

    Privacy/Cyber Risk & Data Security Courts BIPA Class Action State Issues

  • Court dismisses data breach claims citing lack of compromised sensitive information

    Privacy, Cyber Risk & Data Security

    On January 12, the U.S. District Court for the Central District of California dismissed a data breach lawsuit brought against a hotel chain, ruling the plaintiff lacked standing. The plaintiff claimed class members were victims of a data breach when hotel employees at a franchise in Russia allegedly accessed personal information without authorization, including guests’ names, addresses, phone numbers, email addresses, genders, birth dates and loyalty account numbers. The plaintiff’s suit alleged, among other things, violations of the California Consumer Privacy Act and the state’s Unfair Competition Law. While the hotel disclosed the incident last March and admitted that class members’ personal information was compromised, the court determined that the plaintiff lacked standing to bring claims after the hotel’s investigation found that “no sensitive information, such as social security numbers, credit card information, or passwords, was compromised.” The court determined that the plaintiff failed to plausibly plead that any of the class members’ more sensitive data had fallen into the wrong hands, and that “[w]ithout a breach of this type of sensitive information, Plaintiff has not suffered an injury in fact and cannot meet the constitutional requirements of standing.”

    Privacy/Cyber Risk & Data Security Courts Data Breach CCPA State Issues

  • Irish Data Protection Commission fines U.S. social networking company for violating GDPR

    Privacy, Cyber Risk & Data Security

    On December 15, the Irish Data Protection Commission (Commission) announced that a final decision had been reached in a General Data Protection Regulation (GDPR) investigation into a U.S.-based social networking tech company’s actions related to a 2019 data breach that affected users across the European Union. The final decision, published by the European Data Protection Board (EDPB), imposes a €450,000 fine against the company, and resolves an investigation in which the Commission alleged the company violated Articles 33(1) and 33(5) of the GDPR by failing to provide notice about the breach within a 72-hour period and by neglecting to adequately document the breach. According to the Commission, this inquiry produced the first draft decision to go through the GDPR’s Article 65 “dispute resolution” process, and marks the first decision issued against a “big tech” company. According to the final decision, “a number of concerned supervisory authorities raised objections” to aspects of the draft decision, taking issue, among other things, with the size of the proposed fine, which was originally set between €135,000 and €275,000. The EDPB determined that the objections were “relevant and reasoned” and instructed the Commission to increase the fine to ensure “it fulfils its purpose as a corrective measure and meets the requirements of effectiveness, dissuasiveness and proportionality” established under the GDPR.

    Privacy/Cyber Risk & Data Security Of Interest to Non-US Persons GDPR EU Data Breach

  • NYDFS announces cybersecurity toolkit for small businesses

    Privacy, Cyber Risk & Data Security

    On November 17, NYDFS announced a partnership with a non-profit company to provide a free cybersecurity toolkit to small businesses, including those in the financial services sector. The toolkit is intended to help small businesses strengthen their cybersecurity and protect themselves and their customers from growing cyber threats. Operational tools and educational resources covered in the toolkit address “identifying hardware and software, updating defenses against cyber threats, strengthening passwords and multi-factor authentication, backing up and recovering data, and protecting email systems.” NYDFS’ partnership with the company also includes the development of a set of sample policies based on cybersecurity best practices to help small businesses establish the necessary governance and procedures. The sample policies include, among other things, a risk assessment and a sample third-party service provider policy. NYDFS advises small businesses to “review the tools and sample policies and to adapt them to their specific business risks and operations, including to comply with any applicable state and federal laws.”

    Privacy/Cyber Risk & Data Security State Issues State Regulators NYDFS

  • California voters approve expanded privacy rights

    Privacy, Cyber Risk & Data Security

    On November 3, California voters approved a ballot initiative, the California Privacy Rights Act of 2020 (CPRA), that expands on the California Consumer Privacy Act (CCPA). While there are a number of differences between the CPRA and the CCPA, some key provisions include:

    • Adding expanded consumer rights, including the right to correction and the right to limit sharing of personal information for cross-context behavioral advertising, whether or not for monetary or other valuable consideration.
    • Changing the definitions of various entities, including raising the threshold for qualifying as a business from 50,000 to 100,000 consumers or households, and removing devices from this threshold.
    • Adding the category of sensitive personal information that is subject to specific rights.
    • Creating a new privacy agency, the California Privacy Protection Agency, to administer, implement, and enforce the CPRA.

    It is important to note that the Gramm-Leach-Bliley Act and Fair Credit Reporting Act exemptions remain in the CPRA, and that the act extends the employee and business-to-business exemptions to January 1, 2023.

    Implementation deadlines

    The CPRA becomes effective January 1, 2023, with enforcement delayed until July 1, 2023. However, the CPRA contains a look-back provision (i.e., the CPRA will apply to personal information collected by a business on or after January 1, 2022). The new privacy agency is also required to begin drafting regulations on July 1, 2021, with final regulations to be completed one year later.

    Learn more

    Please refer to a Buckley article for further information on the differences between the CCPA and the CPRA: 6 Key Ways the California Privacy Rights Act of 2020 Would Revise the CCPA (Corporate Compliance Insights), as well as continuing InfoBytes coverage here.

    Privacy/Cyber Risk & Data Security CCPA CPRA California Consumer Protection Ballot Initiative

  • Health insurer to pay $48 million to resolve 2014 data breach

    Privacy, Cyber Risk & Data Security

    On September 30, a multistate settlement was reached between a health insurance company and a coalition of 42 state attorneys general and the District of Columbia to resolve a 2014 data breach that allegedly compromised the personal information of more than 78 million customers nationwide. According to the states, cyber attackers infiltrated the company’s systems using malware installed through a phishing email. The data breach resulted in the exposure of consumers’ social security numbers, birthdays, and other personal data. Under the terms of the settlement, the health insurer must pay $39.5 million in penalties and fees, and is required to (i) not misrepresent the extent of its privacy and security protections; (ii) implement a comprehensive information security program, including “regular security reporting to the Board of Directors and prompt notice of significant security events to the CEO”; (iii) implement specific security requirements, including “anti-virus maintenance, access controls and two-factor authentication, encryption, risk assessments, penetration testing, and employee training”; and (iv) schedule third-party assessments and audits for three years.

    Separately, the California AG reached an $8.69 million settlement, subject to court approval, in a parallel investigation, which requires the health insurer to, among other things, implement changes to its information security program and fix vulnerabilities to prevent future data breaches.

    Previously, in 2018, the health insurer reached a $115 million class action settlement, which provided for two years of credit monitoring, reimbursement of out-of-pocket costs related to the breach, and an alternative cash payment for credit monitoring services already obtained (covered by InfoBytes here).

    Privacy/Cyber Risk & Data Security Courts Settlement Data Breach State Issues State Attorney General

  • California AG enters into privacy settlement with fertility-tracking mobile app

    Privacy, Cyber Risk & Data Security

    On September 17, the California attorney general announced a settlement with a technology company that operates a fertility-tracking mobile app to resolve claims that security flaws put users’ sensitive personal and medical information at risk in violation of state consumer protection and privacy laws. According to the complaint filed in the Superior Court for the County of San Francisco, the company’s app allegedly failed to adequately safeguard and preserve the confidentiality of medical information by, among other things, (i) allowing access to user information without the user’s consent, by failing to “authenticate the legitimacy of the user to whom the medical information was shared”; (ii) allowing a password-change vulnerability to permit unauthorized access and disclosure of information stored in the app without the user’s consent; (iii) making misleading statements concerning implemented security measures and the app’s ability to protect consumers’ sensitive personal and medical information from unauthorized disclosure; and (iv) failing to implement and maintain reasonable security procedures and practices.

    Under the terms of the settlement, the company—which does not admit liability—is required to pay a $250,000 civil penalty and incorporate privacy and security design principles into its mobile apps. The company must also obtain affirmative authorization from users before sharing or disclosing sensitive personal and medical information, and must allow users to revoke previously granted consent. Additionally, the company is required to provide ongoing annual employee training concerning the proper handling and protection of sensitive personal and medical information, in addition to training on cyberstalking awareness and prevention. According to the AG’s press release, the settlement also includes “a first-ever injunctive term that requires [the company] to consider how privacy or security lapses may uniquely impact women.”

    Privacy/Cyber Risk & Data Security Courts Settlement Data Breach State Issues State Attorney General

  • New York AG settles data breach lawsuit with national coffee chain

    Privacy, Cyber Risk & Data Security

    On September 15, the New York attorney general announced a settlement with a national franchisor of a coffee retail chain to resolve allegations that the company violated New York’s data breach notification statute and several state consumer protection laws by failing to protect thousands of customer accounts from a series of cyberattacks. As previously covered by InfoBytes, the AG claimed that, beginning in 2015, customer accounts containing stored value cards that could be used to make purchases in stores and online were subject to repeated cyberattack attempts, resulting in more than 20,000 compromised accounts and “tens of thousands” of dollars stolen. Following the attacks, the AG alleged, the company failed to take steps to protect affected customers, conduct an investigation to determine the extent of the attacks, or implement appropriate safeguards to limit future attacks. The settlement, subject to court approval, would require the company to (i) notify affected customers, reset their passwords, and refund any stored value cards used without permission; (ii) pay $650,000 in penalties and costs; (iii) maintain safeguards to protect against similar attacks in the future; and (iv) develop and follow appropriate incident response procedures.

    Privacy/Cyber Risk & Data Security Courts Settlement Data Breach State Issues

  • District court preliminarily approves $650 million biometric privacy class action settlement

    Privacy, Cyber Risk & Data Security

    On August 19, the U.S. District Court for the Northern District of California granted preliminary approval of a $650 million biometric privacy settlement between a global social media company and a class of Illinois users. If granted final approval, the settlement would resolve consolidated class action claims that the social media company violated the Illinois Biometric Information Privacy Act (BIPA) by allegedly developing a face template that used facial-recognition technology without users’ consent. A lesser $550 million settlement deal filed in May (covered by InfoBytes here) was rejected by the court due to “concerns about an unduly steep discount on statutory damages under the BIPA, a conduct remedy that did not appear to require any meaningful changes by [the social media company], over-broad releases by the class, and the sufficiency of notice to class members.” The preliminarily approved settlement would also require the social media company to provide nonmonetary injunctive relief by setting all default face recognition user settings to “off” and by deleting all existing and stored face templates for class members unless class members provide their express consent after receiving a separate disclosure on how the face template will be used.

    Privacy/Cyber Risk & Data Security Courts BIPA Class Action Settlement
