
InfoBytes Blog

Financial Services Law Insights and Observations



  • Pennsylvania Attorney General settles with data collection company for failing to disclose data use

    Courts

    On February 22, Pennsylvania Attorney General Michelle A. Henry announced a settlement with a company for selling consumers’ data without clearly notifying those consumers, in violation of the Unfair Trade Practices and Consumer Protection Law and the Telemarketer Registration Act (TRA), and required the defendant to pay $25,000 in monetary relief. The defendant operated various websites that collected consumers’ personal information by offering free samples or payments for online surveys. The Pennsylvania AG alleged that the defendant failed to properly disclose to consumers that the purpose of collecting their data was lead generation, made misrepresentations regarding free samples and brand affiliations, and failed to obtain necessary consumer requests and agreements.

    As part of the settlement, the Pennsylvania AG required the defendant to provide certain disclosures, including that the collection of consumer data is for lead generation, that consumer information may be sold to third parties, and that the defendant functions as an aggregator of promotional offerings. The settlement further enjoined the defendant from making certain misrepresentations to consumers. It also included orders related to telemarketing practices and consumer usage data, including a requirement that the defendant not “use, sell, transfer or share any [c]onsumer [d]ata obtained from Pennsylvania consumers[.]”

    Courts Pennsylvania State Attorney General Data Collection / Aggregation Telemarketing

  • CFPB sues nonbank mortgage lender for alleged HMDA and CFPA violations

    Federal Issues

    On October 10, the CFPB filed a lawsuit against a Florida-based nonbank mortgage originator for allegedly failing to accurately report mortgage data in violation of the Home Mortgage Disclosure Act (HMDA). According to the complaint, in 2019 the Bureau found that the lender violated HMDA by intentionally misreporting data regarding applicants’ race, ethnicity, and gender from 2014 to 2017, which resulted in the lender paying a civil money penalty and taking corrective action. In this action, the Bureau alleges that, during its supervision process, it found that the lender’s HMDA data for 2020 contained “widespread errors across multiple data fields,” including 51 errors in 159 files, and that the lender violated a condition of the 2019 consent order requiring it to improve its data practices. The alleged errors include (i) mistakes in inputting data concerning subordinate lien loans and acquired loans; (ii) inclusion of loans in HMDA reporting that did not meet the HMDA criteria for reportable applications; (iii) incorrect characterization of purchaser type for tens of thousands of loans; (iv) erroneous rate spread calculations, leading to errors in interconnected fields; (v) inaccurate data related to lender credits; and (vi) incorrect categorization of specific loan applications as “approved but not accepted” when they were, in fact, withdrawn, resulting in discrepancies in associated fields. Along with the HMDA violations and the violations of the 2019 consent order, the CFPB also alleges violations of the CFPA and requests that the court permanently enjoin the lender from committing future violations of HMDA, order the lender to take corrective action to prevent further violations, grant other injunctive relief, and impose a civil money penalty.

    Federal Issues CFPB Enforcement Lending Mortgage Lenders Mortgages Consumer Finance HMDA CFPA Data Collection / Aggregation

  • Tech giant denied summary judgment in private browsing lawsuit

    Courts

    On August 7, the U.S. District Court for the Northern District of California entered an order denying a multinational technology company’s motion for summary judgment on claims that the company invaded consumers’ privacy by tracking their browsing history in the company’s private browsing mode. After reviewing the company’s disclosed general terms of service and privacy notices and disclosures, the court found that the company never explicitly told users that it would be collecting their data while they browsed in private mode. Without evidence that the company explicitly told users of this practice, the court concluded that it could not “find as a matter of law that users explicitly consented to the at-issue data collection,” and therefore could not grant the company’s motion for summary judgment.

    Plaintiffs, who are account holders (Class 1 for Incognito users and Class 2 for users of other private browsing modes), brought a class action suit against the company for the “surreptitious interception and collection of personal and sensitive user data” while the users were in a “private browsing mode.” Along with invasion of privacy, intrusion upon seclusion, and breach of contract, plaintiffs asserted violations of (i) the federal Wiretap Act; (ii) the California Invasion of Privacy Act; (iii) California’s Comprehensive Computer Data Access and Fraud Act; and (iv) California’s Unfair Competition Law.

    The court previously denied the defendant’s two motions to dismiss. 

    Courts Privacy, Cyber Risk & Data Security Consumer Protection CIPA Wiretap Act California Data Collection / Aggregation

  • District Court grants defendant’s motion for summary judgment in data collection suit

    Courts

    On December 12, the U.S. District Court for the Northern District of California granted a defendant’s motion for summary judgment in a suit alleging that it collected consumers’ data without first obtaining their consent. According to the opinion, the plaintiffs are users of the defendant’s browser who alleged that they chose not to sync their browsers with the defendant’s accounts while browsing the web from July 2016 to the present. The complaint further noted that the browser’s sync feature permits “users to store their personal information by logging into the browser with their [defendant’s] account.” The district court granted the defendant’s motion for summary judgment after determining that most of the issues are “browser agnostic” rather than specific to the browser. Furthermore, the district court determined that because those issues are not specific to the browser, the defendant’s general privacy policy “governs the collection of those categories of information identified by plaintiffs.” The district court also found that “a reasonable person viewing those disclosures would understand that [the defendant] maintains the practices of collecting its users' data when users use [the defendant’s] services or third-party sites that use [the defendant’s] services and that [the defendant] uses the data for advertising purposes.” The district court also noted that “a reasonable user reviewing these same disclosures would understand that [the defendant] combines and links this information across sites and services for targeted advertising purposes.”

    Courts Data Privacy Data Collection / Aggregation

  • District Court grants plaintiff’s injunction in data scraping suit

    Courts

    On September 30, the U.S. District Court for the Northern District of California certified a stipulation and proposed order regarding a permanent injunction, dismissing the remaining allegations against an Israel-based company and a Delaware company (collectively, defendants) related to their scraping of data from the parent company of large social media platforms (plaintiff). In 2020, the plaintiff alleged that the defendants developed and distributed internet browser extensions to illegally scrape data from the plaintiff’s platform and other platforms. The order noted that the court’s prior summary judgment decision concluded that the defendants collected data using “self-compromised” accounts of users who had downloaded the defendants’ browser extensions. The order further noted that the defendants stipulated that the plaintiff had established that it suffered “irreparable injury” and incurred a loss of at least $5,000 in a one-year period as a result of one of the companies’ unauthorized access. The order further noted that judgment has been established “based on [the Israel-based company’s] active data collection through legacy user products beginning October 2020, and based on [the Israel-based company’s] direct access to password-protected pages on [the plaintiff’s] platforms using fake or purchased user accounts.” Under the injunction, the defendants are immediately and permanently barred from accessing or using two of the plaintiff’s social media platforms without the plaintiff’s express written permission, regardless of whether the companies are using the platforms directly or via a third party. The defendants are also banned from collecting data, or assisting others in collecting data, without the plaintiff’s permission, and are required to delete any and all software, scripts, or code designed to access or interact with two of the plaintiff’s social media platforms. Additionally, the defendants are prohibited from using or selling any data that they previously collected from the plaintiff’s social media platforms.

    Courts Privacy, Cyber Risk & Data Security Data Scraping Social Media Data Collection / Aggregation

  • Dem chairs request info on agency data use

    Privacy, Cyber Risk & Data Security

    On August 16, Chairman of the Committee on the Judiciary Jerrold Nadler (D-NY) and Chairman of the Committee on Homeland Security Bennie Thompson (D-MS) sent a letter to multiple government agency leaders, requesting information on their purchases and use of personal data from data brokers. According to the chairmen, “[c]ompanies participating in the data market acquire user information for package and sale through social media, mobile applications, web hosts, and other sources,” and such products “can include precise details on individuals’ location history, internet activity, and utilities information, to name a few.” The letter further noted that “improper government acquisition of this data can thwart statutory and constitutional protections designed to protect Americans’ due process rights.” The letter also pointed out that the agencies receiving it “have contracts with numerous data brokers, who provide detailed information on millions of Americans.” The chairmen requested that, by the end of August, the agencies provide a briefing, along with documents and communications related to government contracts with data brokers, legal analyses on the use of personal data, and any parameters and limitations set on the use of the data.

    Privacy, Cyber Risk & Data Security Federal Issues Data Collection / Aggregation U.S. House Data Brokers

  • Court grants final approval of privacy class action settlement

    Courts

    On July 20, the U.S. District Court for the Northern District of California granted final approval of a class action settlement in a suit against a fintech company alleged to have accessed the personal banking data of users without first obtaining consent, in violation of California privacy, anti-phishing, and contract laws. As previously covered by InfoBytes, the district court granted preliminary approval of the $58 million settlement in November. In granting final approval of the settlement, the court determined it was adequate, and noted that the plaintiffs’ claim that the defendant’s practices breached California’s anti-phishing law was “relatively untested.” In addition to the $58 million settlement fund, the settlement provides for injunctive relief.

    Courts California Class Action Settlement Data Collection / Aggregation Privacy, Cyber Risk & Data Security

  • House committee advances comprehensive consumer privacy bill

    Privacy, Cyber Risk & Data Security

    On July 20, the U.S. House Committee on Energy and Commerce voted 53-2 to send H.R. 8152, the American Data Privacy and Protection Act, to the House floor. As previously covered by a Buckley Special Alert, a draft of the bill was released in June, which would, among other things, require companies to collect the least amount of data possible to provide services, implement special protections for minors, and allocate enforcement responsibilities to the FTC. The bill has been revised from its initial draft to allow consumers to bring lawsuits after notifying certain state and federal regulators beginning two years after the law takes effect, which is different from the four-year wait period proposed in the draft. Additionally, the current patchwork of five state privacy laws would be preempted, although under the revised bill California's new privacy agency would be allowed to enforce the federal law. The revised bill also includes a provision that narrows the scope of algorithmic impact assessments required of large data holders to focus on algorithms that pose a “consequential risk of harm.” Additionally, the revised bill includes a more expansive definition of “sensitive data” to include browsing history, race, ethnicity, religion and union membership. It also sets a tiered system of responsibility depending on the size of companies for data related to people under 17.

    Privacy, Cyber Risk & Data Security U.S. House Data Data Collection / Aggregation American Data Privacy and Protection Act Federal Legislation

  • Khan outlines FTC’s plans to enforce privacy, data security

    Privacy, Cyber Risk & Data Security

    On April 11, FTC Chair Lina Khan spoke at the Opening General Session of the IAPP Global Privacy Summit 2022, focusing on the Commission’s approach to privacy and data security enforcement strategy. In her remarks, Khan offered observations on “the new political economy” of how American consumers’ data is “tracked, gathered, and used,” and identified how the Commission is adjusting to address these “new market realities.” She also raised broad questions about the current framework for policing “the use and abuse of individuals’ data.” Khan observed that digital technology now allows firms to collect vast amounts of data on a “hyper-granular level,” tracking individuals as they carry out daily tasks. The information collected includes precise personal location, web browsing history, health records, and a complete picture of one’s social network of family and friends. This data, analyzed and aggregated at a huge scale, yields “stunningly detailed and comprehensive user profiles that can be used to target individuals with striking precision.” She acknowledged that this data can be put towards adding value for consumers, but noted that consumers are often unaware that companies are monetizing their personal data at huge profits, leading to business models that “incentivize endless tracking and vacuuming up of users’ data.” These incentives, she said, have rendered today’s digital economy, in the words of one scholar, “probably the most highly surveilled environment in the history of humanity.”

    Khan also outlined three key aspects of the FTC’s approach to addressing the above risks to consumers:

    • The FTC will focus on “dominant firms” causing “widespread harm.” This includes addressing conduct by the dominant firms themselves as well as “dominant middlemen” facilitating the conduct through unlawful data practices.
    • The FTC is taking an interdisciplinary approach by “assessing data practices through both a consumer protection and competition lens” because widescale commercial surveillance and data collection practices have the potential to violate both consumer protection and antitrust laws. The FTC will also increase reliance on technologists such as data scientists, engineers, user design experts, and AI researchers to augment the skills of their lawyers, economists, and investigators.
    • The FTC will focus on designing effective remedies “informed by the business strategies that specific markets favor and reward” and that are responsive to the new value that companies place on collected data. Such remedies may include bans from surveillance industries for companies and individuals, disgorgement, requiring updated security measures such as dual-factor authentication, and requiring the deletion of illegally collected data and any algorithms derived from the same.

    Khan further indicated that the FTC is considering initiating rulemaking to address commercial surveillance practices and inadequate data security. She concluded by suggesting a paradigmatic shift away from the current framework used to assess unlawful data gathering. Specifically, she stated that “market realities may render the ‘notice and consent’ paradigm outdated and insufficient” – noting that users find privacy policies overwhelming and have no real alternatives to accepting their terms given the increasingly critical reliance on digital tools to navigate daily life. Khan called for new legislation to address these concerns, saying, “[W]e should approach data privacy and security protections by considering substantive limits rather than just procedural protections, which tend to create process requirements while sidestepping more fundamental questions about whether certain types of data collection and processing should be permitted in the first place. The central role that digital tools will only continue to play invites us to consider whether we want to live in a society where firms can condition access to critical technologies and opportunities on users surrendering to commercial surveillance.”

    Privacy/Cyber Risk & Data Security Federal Issues FTC Data Collection / Aggregation Consumer Protection

  • District Court denies defendant’s motion to dismiss Illinois BIPA class action

    Courts

    On October 28, the U.S. District Court for the Northern District of Illinois denied a Delaware-based technology management service defendant’s motion to dismiss a putative class action that alleged it stored and collected biometric data from employees of companies that utilized the defendant’s timekeeping services. The court also granted the plaintiff’s motion to remand two of her three claims to state court because the plaintiff had not alleged an injury in fact sufficient to establish Article III standing in federal court for those claims.

    The plaintiff alleged that the defendant violated the Illinois Biometric Information Privacy Act (BIPA) by selling time and attendance solutions to Illinois employers, including biometric-enabled hardware such as fingerprint and facial recognition scanners that collected and stored employee biometric data. The plaintiff alleged that the defendant violated Section 15(a) of BIPA by failing to publish a retention schedule for the biometric data, violated Section 15(b) of BIPA by obtaining the plaintiff’s biometric data without first providing written disclosures and obtaining written consent, and violated Section 15(c) of BIPA by participating in the dissemination of her biometric data among servers. According to the district court, the plaintiff lacked standing on the Section 15(a) claim because the harm resulting from the defendant’s failure to publish a retention policy was not sufficiently particularized, and the plaintiff had not otherwise alleged a concrete injury resulting from the violation. The district court concluded that the plaintiff’s Section 15(c) claim also lacked standing because, though she alleged that the defendant profits off its biometric data collection practices by marketing its biometric time clocks as “superior options” and thereby “gains a competitive advantage,” the “complaint doesn't allege an injury in fact stemming from [the defendant’s] profiting off of [the plaintiff’s] biometric data.”

    With regard to the Section 15(b) claim, the district court rejected the defendant’s argument that the requirement to inform individuals of its biometric data collection and to receive written consent did not apply. The court noted that the defendant is right that Section 15(b) “doesn’t penalize mere possession of biometric information,” but that this does not help the defendant “because the complaint alleges that defendant did more than possess [the plaintiff’s] biometric information: it says that [the defendant] collected and obtained it.” Additionally, the district court rejected the defendant’s argument that it is not liable as a third-party vendor that lacks the power to obtain the required written releases from its clients’ employees. The district court stated that “while it’s probably true that [the defendant] wasn’t in a position to impose a condition of employment on its clients’ employees, the statutory definition of a written waiver doesn’t excuse vendors like [the defendant] from securing their own waivers before obtaining a person’s data.”

    Courts BIPA Illinois Data Collection / Aggregation Class Action Privacy/Cyber Risk & Data Security State Issues
