Caveat Vendor

FTC Report on Big Data: Risks & Recommendations

January 12, 2016

Mary-Elizabeth M. Hadley

In a report released last week, the Federal Trade Commission (“FTC” or the “Commission”) highlights the benefits and risks associated with the use of “big data” and offers businesses some suggestions on how to minimize those risks, including through compliance with consumer protection laws.
The report builds on the Commission’s September 2014 workshop, which, as loyal blog readers will recall, explored the potential for big data to be used as a tool of exclusion, negatively impacting underserved and low-income consumers.
Benefits vs. Risks
The Commission acknowledged that many benefits can flow from companies’ collection of information regarding consumers’ choices, experiences and characteristics, including the ability to predict individuals’ preferences, help tailor services and shape personalized marketing.  In the context of healthcare, for example, big data can also lead to tailored treatment and more specialized care for underserved communities.
In contrast, big data also has the potential to “categorize consumers in ways that can result in exclusion of certain populations.”  For example, particular uses of big data may create or reinforce existing disparities, such as when it is used to target ads for financial services that otherwise eligible low-income consumers may never receive.  Big data may also expose characteristics individuals deem sensitive, including sexual orientation, ethnic origin and religion.
Considerations for Companies Using Big Data
Recognizing the ubiquity of big data, the FTC provided companies with guidance on potentially applicable privacy laws and policy considerations, including:

  1. Compliance with the Fair Credit Reporting Act (“FCRA”)

    • Background: The FCRA applies to companies, known as consumer reporting agencies (“CRAs”), that compile and sell consumer reports, which contain consumer information used or expected to be used for decisions about consumers’ eligibility for credit, employment, insurance, housing or similar benefits and transactions.

      • Traditional CRAs include credit bureaus, employment background screening companies and other organizations providing particularized services for making consumer eligibility decisions, such as screening tenants.

    • The report flags a current trend, however, in which companies purchase predictive analytics products to make eligibility determinations.  In contrast to traditional credit scoring models, which focus on factors such as past late payments, these products may rely on non-traditional characteristics – including a consumer’s zip code, social media usage or shopping history – to generate a report on the consumer’s creditworthiness.

    • When a company uses such a report to determine whether the individual is a good credit risk, it should be mindful of the potential applicability of the FCRA.

  2. Federal Trade Commission Act (“FTC Act”)

    • Background: Section 5 of the FTC Act prohibits unfair or deceptive acts or practices in or affecting commerce.  Under the FTC Act, an act or practice is deceptive if it involves a material statement or omission that is likely to mislead a consumer acting reasonably under the circumstances.  Similarly, a failure to disclose material information may violate Section 5.

      • In addition, an unfair practice is one that is likely to cause substantial consumer injury where (i) the injury is not reasonably avoidable by consumers, and (ii) benefits to consumers or competition do not outweigh the injury.

    • Recommendations:

      • Ensure that your company is not violating any material promises to consumers, including promises (i) to refrain from sharing data with third parties, (ii) to provide consumers with choices about sharing and (iii) to safeguard consumers’ personal information;

      • Take steps to reasonably secure consumers’ data; and

      • Do not sell big data analytics products to customers that you know, or have reason to know, will use the products for fraudulent purposes.

        • This fact-specific inquiry will focus on whether the company is offering or using big data analytics in a deceptive or unfair way.

  3. Policy Considerations

    Because big data raises the potential for new discriminatory harms, the FTC recommends businesses ask themselves a number of questions, including:

      • How representative is the data set?  If a data set is missing information regarding certain populations (e.g., individuals who are not very tech-savvy), a company should take steps to address this underrepresentation.

      • Does the data model account for biases?  For example, a company whose big data hiring algorithm considers only applicants from “top tier” schools may be incorporating previous biases in college admissions decisions.

For its part, the Commission promises to continue monitoring big data practices to ensure compliance with existing laws and to bring enforcement actions where it deems appropriate.

Caveat Vendor is Paul Hastings’ Consumer Issues blog. We welcome your feedback. Please contact our blog editor with any thoughts or suggestions. Subscribe to Caveat Vendor. You will receive an email when the blog has been updated.
