InfoBytes Blog

Financial Services Law Insights and Observations

  • 9th Circuit: Networking site cannot deny data scraping access to publicly available profiles

    Privacy, Cyber Risk & Data Security

    On April 18, on remand from the U.S. Supreme Court, the U.S. Court of Appeals for the Ninth Circuit affirmed a district court’s order preliminarily enjoining a professional networking site from denying a data analytics company access to publicly available member profiles. At issue are allegations brought by the networking site claiming the data analytics company used automated bots to extract user data from the networking site’s website (a process known as “scraping”) for the purposes of selling its analytics services to businesses. The networking site sent the data analytics company a cease-and-desist letter, asserting violations of state and federal law, including the Computer Fraud and Abuse Act (CFAA). The data analytics company responded that it had a right to access the public pages and later sought a preliminary injunction. In granting the preliminary injunction, the district court ordered the networking site to, among other things, “remove any existing technical barriers to [its] public profiles, and to refrain from putting in place any legal or technical measures” that would block access.

    The 9th Circuit previously affirmed the preliminary injunction, but was called to further consider whether the CFAA applies to the data analytics company’s data scraping after the U.S. Supreme Court vacated the appellate court’s judgment in light of its ruling in Van Buren v. United States.

    On remand, the appellate court reviewed whether the data analytics company accessed data “without authorization” in violation of the CFAA after it received the cease-and-desist letter. The 9th Circuit found that the ruling in Van Buren, in which the Supreme Court suggested that the CFAA applies only in cases where someone is accused of hacking into, or exceeding their authorized access to, a protected network, or in situations where the “gates are up,” narrowed the CFAA’s scope and most likely did not apply to cases involving data scraped in bulk by automated bots from public websites. “A defining feature of public websites is that their publicly available sections lack limitations on access; instead, those sections are open to anyone with a web browser,” the appellate court wrote. “In other words, applying the ‘gates’ analogy to a computer hosting publicly available webpages, that computer has erected no gates to lift or lower in the first place.” Therefore, the court held, the phrase “without authorization” does not apply to public websites.

    In determining that a preliminary injunction was appropriate, the appellate court held that the district court did not abuse its discretion in concluding that the data analytics company met the standard for preliminary relief: that it is likely to succeed on the merits, is likely to suffer irreparable harm without such relief, that the “balance of equities” tips in its favor, and that an injunction would be in the public interest. The court found that the data analytics company showed that it “currently has no viable way to remain in business other than using [the networking site’s] public profile data” for its analytics services and “demonstrated a likelihood of irreparable harm absent a preliminary injunction.” In considering the balance of hardships, the 9th Circuit agreed that the scales “tipped sharply” in favor of the data analytics company “when weighing the likelihood that [the data analytics company] would go out of business against [the networking site’s] assertion that an injunction threatened its members’ privacy” and therefore risked the goodwill it had developed with its members. Finally, the court rejected the networking site’s claim that the data analytics company violated the CFAA, which would have preempted the remaining state law claims.

    Privacy/Cyber Risk & Data Security Courts Appellate Ninth Circuit Cyber Risk & Data Security Computer Fraud and Abuse Act Data Scraping

  • Colorado seeks comments on privacy rulemaking; draft regulations to come this fall

    Privacy, Cyber Risk & Data Security

    Recently, the Colorado attorney general released pre-rulemaking considerations for the Colorado Privacy Act (CPA). The considerations seek informal public input on any area of the CPA, including those “that need clarification, consumer concerns, anticipated compliance challenges, impacts of the CPA on business or other operations, cost concerns, and any underlying or related research or analyses.” As covered by a Buckley Special Alert, the CPA was enacted last July to establish a framework for personal data privacy rights and provides consumers with numerous rights, including the right to access their personal data, opt out of certain uses of personal data, make corrections to personal data, request deletion of personal data, and obtain a copy of personal data in a portable format. The CPA is effective July 1, 2023, with certain opt-out provisions taking effect July 1, 2024. Under the CPA, the AG has enforcement authority for the law, which does not have a private right of action. The AG also has authority to promulgate rules to carry out the requirements of the CPA and to issue interpretive guidance and opinion letters. Finally, the AG has authority to develop technical specifications for at least one universal opt-out mechanism.

    The AG’s office stated that it plans to adopt a principle-based model for the state’s rulemaking approach rather than a prescriptive one, and outlined five principles intended to help implement the CPA:

    • rules should protect consumers and help consumers understand and exercise their rights;
    • rules should clarify ambiguities as necessary to promote compliance and minimize unnecessary disputes;
    • rules should facilitate efficient and expeditious compliance by ensuring processes are simple and straightforward for consumers, controllers and processors, and enforcement agencies;
    • rules should facilitate interoperability and allow the CPA to function alongside protections and obligations created by other state, national, and international frameworks; and
    • rules should not be unduly burdensome so as to prevent the development of adaptive solutions to address challenges presented by advances in technology.

    The pre-rulemaking considerations laid out several questions for input related to topics addressing universal opt-out mechanisms, consent for processing consumer data in specific circumstances, dark patterns, data protection assessments that screen for heightened risk of harm, the effects of profiling on consumers, opinion letters and interpretive guidance, offline and off-web data collection, and differences and similarities between the CPA and laws in other jurisdictions. A formal notice of rulemaking and accompanying draft regulations will be issued this fall. Comments may be submitted through the AG’s online portal.

    Privacy/Cyber Risk & Data Security State Issues State Attorney General Colorado Colorado Privacy Act Consumer Protection

  • Virginia enacts additional consumer data protections

    Privacy, Cyber Risk & Data Security

    On April 11, the Virginia governor signed legislation enacting additional amendments to the Virginia Consumer Data Protection Act (VCDPA). Both bills take effect July 1.

    HB 714 (identical bill SB 534) expands the definition of a nonprofit organization to include political and certain tax-exempt 501(c)(4) organizations, thus exempting them from the VCDPA’s provisions. The bill also abolishes the Consumer Privacy Fund and provides that all civil penalties, expenses, and attorney fees collected from enforcement of the VCDPA shall be deposited into the Regulatory, Consumer Advocacy, Litigation, and Enforcement Revolving Trust Fund. Under Section 59.1-584, the attorney general has exclusive authority to enforce the law and seek penalties of no more than $7,500 per violation should a controller or processor of consumer personal data continue to violate the VCDPA following a 30-day cure period, or breach an express written statement provided to the attorney general that the alleged violations have been cured.

    HB 381 amends VCDPA provisions related to consumers’ data deletion requests. Specifically, the amendment provides that a controller that has obtained a consumer’s personal data from a third party “shall be deemed in compliance with a consumer’s request to delete such data . . . by either (i) retaining a record of the deletion request and the minimum data necessary for the purpose of ensuring the consumer’s personal data remains deleted from the business’s records and not using such retained data for any other purpose . . . or (ii) opting the consumer out of the processing of such personal data for any purpose except for those exempted pursuant” to the VCDPA. 

    As previously covered by InfoBytes, the VCDPA was enacted last year to establish a framework for controlling and processing consumers’ personal data in the Commonwealth. The VCDPA, which explicitly prohibits a private right of action, allows consumers to access their personal data; make corrections; request deletion of their data; obtain a copy of their data in a portable format; and opt out of targeted advertising, sale of their data, or “profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer.” 

    Privacy/Cyber Risk & Data Security State Issues State Legislation Virginia Consumer Protection Act Virginia Consumer Protection VCDPA

  • Khan outlines FTC’s plans to enforce privacy, data security

    Privacy, Cyber Risk & Data Security

    On April 11, FTC Chair Lina Khan spoke at the Opening General Session of the IAPP Global Privacy Summit 2022, focusing on the Commission’s approach to privacy and data security enforcement strategy. In her remarks, Khan offered observations on “the new political economy” of how American consumers’ data is “tracked, gathered, and used,” and identified how the Commission is adjusting to address these “new market realities.” She also raised broad questions about the current framework for policing “the use and abuse of individuals’ data.” Khan observed that digital technology now allows firms to collect vast amounts of data on a “hyper-granular level,” tracking individuals as they carry out daily tasks. The information collected includes precise personal location, web browsing history, health records, and a complete picture of one’s social network of family and friends. This data, analyzed and aggregated at a huge scale, yields “stunningly detailed and comprehensive user profiles that can be used to target individuals with striking precision.” She acknowledged that this data can be put toward adding value for consumers, but that consumers are often unaware that companies are monetizing their personal data at huge profits, leading to business models that “incentivize endless tracking and vacuuming up of users’ data.” These incentives have rendered today’s digital economy, in the words of one scholar, “probably the most highly surveilled environment in the history of humanity.”

    Khan also outlined three key aspects of the FTC’s approach to addressing the above risks to consumers:

    • The FTC will focus on “dominant firms” causing “widespread harm.” This includes addressing conduct by the dominant firms themselves as well as “dominant middlemen” facilitating the conduct through unlawful data practices.
    • The FTC is taking an interdisciplinary approach by “assessing data practices through both a consumer protection and competition lens” because widescale commercial surveillance and data collection practices have the potential to violate both consumer protection and antitrust laws. The FTC will also increase reliance on technologists such as data scientists, engineers, user design experts, and AI researchers to augment the skills of their lawyers, economists, and investigators.
    • The FTC will focus on designing effective remedies “informed by the business strategies that specific markets favor and reward” and that are responsive to the new value that companies place on collected data. Such remedies may include bans from surveillance industries for companies and individuals, disgorgement, requiring updated security measures such as dual-factor authentication, and requiring the deletion of illegally collected data and any algorithms derived from the same.

    Khan further indicated that the FTC is considering initiating rulemaking to address commercial surveillance practices and inadequate data security. She concluded by suggesting a paradigmatic shift away from the current framework used to assess unlawful data gathering. Specifically, she stated that “market realities may render the ‘notice and consent’ paradigm outdated and insufficient” – noting that users find privacy policies overwhelming and have no real alternatives to accepting their terms given the increasingly critical reliance on digital tools to navigate daily life. Khan called for new legislation to address these concerns, saying, “[W]e should approach data privacy and security protections by considering substantive limits rather than just procedural protections, which tend to create process requirements while sidestepping more fundamental questions about whether certain types of data collection and processing should be permitted in the first place. The central role that digital tools will only continue to play invites us to consider whether we want to live in a society where firms can condition access to critical technologies and opportunities on users surrendering to commercial surveillance.”

    Privacy/Cyber Risk & Data Security Federal Issues FTC Data Collection / Aggregation Consumer Protection

  • District Court approves $90 million settlement in data tracking suit

    Courts

    On March 31, the U.S. District Court for the Northern District of California granted final approval to a $90 million class action settlement resolving claims that a social media platform unlawfully tracked consumers’ browsing data. According to the settlement agreement, the defendant obtained and collected data from approximately 124 million platform users in the U.S. who visited websites that displayed the defendant’s “Like” button between April 22, 2010 and September 26, 2011. According to the settlement, in addition to paying a $90 million settlement, the company must delete the data it had collected from users during the class period.

    Courts Privacy/Cyber Risk & Data Security Class Action California Settlement

  • Arizona amends data breach notification requirements

    Privacy, Cyber Risk & Data Security

    On March 29, the Arizona governor signed HB 2146, amending the Arizona Revised Statutes’ security breach notification requirements. Specifically, if a person conducting business in the state that “owns, maintains or licenses unencrypted and unredacted computerized personal information becomes aware of a security incident” involving more than 1,000 individuals, the person is required to notify the three largest national consumer reporting agencies, the state attorney general, and the director of the Arizona Department of Homeland Security within 45 days. The bill also makes various technical corrections and will take effect 90 days after the legislature adjourns.

    Privacy/Cyber Risk & Data Security State Legislation State Issues Arizona Data Breach

  • District Court refuses to enforce choice-of-law provision, allows individual state data privacy claims to proceed

    Privacy, Cyber Risk & Data Security

    On March 30, the U.S. District Court for the Northern District of Illinois denied a global tech company’s bid to dismiss class action Illinois Biometric Information Privacy Act (BIPA) claims. Plaintiffs (Illinois residents) sued the company alleging it violated BIPA by applying image recognition technology to photos uploaded to subscribers’ accounts without receiving informed written consent. Plaintiffs also claimed the company failed to establish a file retention schedule and deletion guidelines as required by state law. The company argued that the terms of use agreed to by the subscribers contain a choice-of-law provision stating that the laws of Washington State govern the conditions of use and any disputes. The court disagreed, stating that Washington’s biometric protection statute does not provide for a private cause of action and is therefore contrary to Illinois’ fundamental public policy. “The fact that BIPA creates a private cause of action underscores the importance Illinois places on an individual’s right to control their biometric information,” the court said. “Applying Washington law would rob plaintiffs of control over their individual biometric information, instead leaving it to Washington’s attorney general to bring suit.” The court also held that Illinois has a greater material interest in the dispute than Washington. The court allowed the plaintiffs’ claims regarding consent to proceed in federal court but remanded the other claims to the Cook County Circuit Court.

    Privacy/Cyber Risk & Data Security Courts State Issues Washington Illinois BIPA

  • EU and U.S. agree in principle on new Trans-Atlantic Data Privacy Framework

    Privacy, Cyber Risk & Data Security

    On March 25, the U.S. and the European Commission announced their agreement in principle on a new Trans-Atlantic Data Privacy Framework (Framework) to foster cross-border transfers of personal data from the EU to the U.S. (See also the accompanying White House and European Commission fact sheets.) Under the Framework, the U.S. has committed to implementing reforms and safeguards to “strengthen the privacy and civil liberties protections applicable to U.S. signals intelligence activities.” The announcement follows negotiations that began after the Court of Justice of the EU (CJEU) issued an opinion in the Schrems II case (Case C-311/18) in July 2020, holding that the EU-U.S. Privacy Shield did not satisfy EU legal requirements.

    As previously covered by InfoBytes, the CJEU’s ruling (which could not be appealed) concluded that the Standard Contractual Clauses issued by the European Commission for the transfer of personal data to data processors established outside of the EU are valid. However, the Court invalidated the EU-U.S. Privacy Shield. In annulling the EU-U.S. Privacy Shield, the CJEU determined that because the requirements of U.S. national security, public interest, and law enforcement have “primacy” over the data protection principles of the EU-U.S. Privacy Shield, the data transferred under the EU-U.S. Privacy Shield would not be subject to the same level of protections prescribed by the GDPR. Specifically, the CJEU held that the surveillance programs used by U.S. authorities are not proportionally equivalent to those allowed under the EU law because they are not “limited to what is strictly necessary,” nor, under certain surveillance programs, does the U.S. “grant data subjects actionable rights before the courts against the U.S. authorities.” 

    According to the factsheet released by the White House, the U.S. has made “unprecedented commitments” that build on the safeguards that were in place under the annulled EU-U.S. Privacy Shield with the goal of addressing issues identified in the Schrems II decision. These commitments include (i) strengthening the privacy and civil liberties safeguards governing U.S. signals intelligence activities through measures that would limit U.S. intelligence authorities’ data collection to what is necessary to advance legitimate national security objectives; (ii) establishing a new, multi-layered redress mechanism with independent and binding authority “consist[ing] of individuals chosen from outside the U.S. Government who would have full authority to adjudicate claims and direct remedial measures, as needed”; and (iii) enhancing the U.S.’s existing rigorous and layered oversight of signals intelligence activities, and requiring U.S. intelligence agencies to “adopt procedures to ensure effective oversight of new privacy and civil liberties standards.” The factsheet further stated that participating companies and organizations will continue to be required to adhere to the EU-U.S. Privacy Shield principles, including the requirement of self-certification through the U.S. Department of Commerce. EU individuals will also continue to have access to avenues of recourse to resolve complaints against businesses and organizations participating in the Framework, including through alternative dispute resolution and binding arbitration.

    The White House stated that President Biden will issue an executive order outlining the aforementioned commitments “that will form the basis of the Commission’s assessment in its future adequacy decision.” According to the announcement, the U.S. and European Commission “will now continue their cooperation with a view to translate this arrangement into legal documents that will need to be adopted on both sides to put in place this new Trans-Atlantic Data Privacy Framework.”

    Privacy/Cyber Risk & Data Security Consumer Protection EU EU-US Privacy Shield GDPR Of Interest to Non-US Persons

  • Social networking apps settle minors' data claims for $1.1 million

    Privacy, Cyber Risk & Data Security

    On March 25, the U.S. District Court for the Northern District of Illinois granted final approval to a $1.1 million class action settlement resolving claims that the operators of two video social networking apps (defendants) “‘surreptitiously tracked, collected, and disclosed the personally identifiable information and/or viewing data of children under the age of 13,’ ‘without parental consent’” in violation of federal and California privacy law. Specifically, plaintiffs asserted violations of the Video Privacy Protection Act (VPPA), the California constitutional right to privacy, the California Consumers Legal Remedies Act (CLRA), and the Illinois Consumer Fraud and Deceptive Businesses Practices Act. Defendants countered that plaintiffs’ state-law claims were preempted by the Children’s Online Privacy Protection Act, and that, furthermore, the “alleged conduct is not within the scope of VPPA or the cited state consumer protection laws” and “does not amount to a common law invasion of privacy or a violation of Plaintiffs’ rights under the California Constitution.” Moreover, defendants argued that plaintiffs could not recover actual damages. According to plaintiffs’ supplemental motion for final approval, following months-long negotiations, the parties agreed to settle the action on a class-wide basis.

    The settlement requires defendants to pay $1.1 million into a non-reversionary settlement fund, to be disbursed pro rata to class members (anyone in the U.S. who, prior to the settlement’s effective date and while under the age of 13, registered for or used the apps) who submit a valid claim, after the payment of settlement administration expenses, taxes, fees, and service awards. The court’s order, however, declined to award an objector’s counsel any attorneys’ fees for his efforts to negotiate modified relief because the agreement was negotiated in a separate proceeding in related multidistrict litigation. The court also denied plaintiffs’ motion for sanctions against the objector’s law firm.

    Privacy/Cyber Risk & Data Security Courts Settlement Class Action State Issues Illinois California COPPA

  • Insurers obligated to indemnify retailer’s payment card claims following data breach

    Privacy, Cyber Risk & Data Security

    On March 22, the U.S. District Court for the District of Minnesota ordered two insurance companies to cover a major retailer’s 2013 data breach settlement liability under commercial general liability policies. As previously covered by InfoBytes, in 2018 the retailer reached a $17 million class action settlement to resolve consumer claims related to a 2013 data breach, which resulted in the compromise of at least 40 million credit cards and theft of personal information of up to 110 million people. The banks that issued the payment cards compromised in the data breach sought compensation from the retailer for costs associated with the cancellation and replacement of the payment cards. The retailer settled the issuing banks’ claims and later sued the insurers in 2019 for refusing to cover the costs, arguing that under the general liability policies, the insurers are obligated to indemnify the retailer with respect to the settlements reached with the issuing banks. The retailer moved for partial summary judgment, seeking a declaration that the general liability policies (which “provide coverage for losses resulting from property damage, including ‘loss of use of tangible property that is not physically injured’”) covered the costs incurred by the retailer when settling the claims for replacing the payment cards. According to the retailer, the insurers’ “refusal to provide coverage for these claims lacked any basis in either the Policies’ language or Minnesota law.” The court reviewed whether the cancellation of the payment cards following the data breach counted as a “loss of use” under the general liability policies. Although the court had previously dismissed the retailer’s coverage claims, the court now determined that the “expense that [the retailer] incurred to settle claims brought by the [i]ssuing [b]anks for the costs of replacing the compromised payment cards was a cost incurred due to the loss of use of the payment cards” because being cancelled “rendered the payment cards inoperable.”

    Privacy/Cyber Risk & Data Security Courts Data Breach Indemnification Insurance
