“We are in the era of big data. With a smartphone now in nearly every pocket, a computer in nearly every household, and an ever-increasing number of Internet-connected devices in the marketplace, the amount of consumer data flowing throughout the economy continues to increase rapidly.”
We couldn’t have said it better ourselves, so we didn’t try to. That quote, the very first words of the Federal Trade Commission’s new report, Big Data: A Tool for Inclusion or Exclusion?, encapsulates everything we and other Internet attorneys have been saying for years. This is the era of big data; this is the era of cybersecurity. It is vital that businesses and individuals alike understand how their data is being used, and how their data is protected.
The FTC’s new report aims to add to that understanding. In particular, the Commission sought to “discuss both the potential of big data to create opportunities for consumers and to exclude them from such opportunities.” After workshops, seminars, and studies, the FTC published this Report, which it describes as “address[ing] only the commercial use of big data consisting of consumer information and focus[ing] on the impact of big data on low-income and underserved populations.”
The FTC was concerned with the potential of big data to exclude particular people based on certain factors. For example, big data can make lenders far less likely to extend credit to individuals they deem “undeserving” or “high risk,” because lenders can quickly access and compile huge amounts of information about those people to produce a very precise risk analysis.
All this being said, the Report wasn’t completely cynical. It also highlighted areas where big data could be used to improve people’s lives and benefit historically and currently underserved segments of the population. For example, IBM created a sophisticated medical diagnosis process that can use big data gathered from textbooks and academic papers to help diagnose a real human being. This could help communities that lack doctors at least begin identifying potential health issues.
While the Report kicked back and forth various potential problems and benefits of big data, the meat of the Report was a discussion of the legal issues companies should be aware of so they do not violate the law and risk FTC enforcement. This, in turn, will help make the use of big data a more inclusive practice. The Report was particularly concerned with three legal areas, and how companies that use big data can comply with them: the Fair Credit Reporting Act (FCRA), various equal opportunity laws, and the Federal Trade Commission Act (FTCA).
Big Data and the FCRA
The FCRA applies only to “credit reporting agencies” — the agencies that furnish credit reports for consumer and commercial use. The FCRA mandates that these agencies implement certain procedures and safeguards to ensure that the sensitive financial data they use, store, and transmit is secure.
The Report was particularly concerned with the legality of “predictive analytics” tools, which credit reporting agencies are increasingly using in lieu of traditional analytics. Because these agencies furnish reports on the creditworthiness of individuals, they must use metrics that ensure accuracy while allowing them to analyze large numbers of consumers efficiently. Where these agencies once looked at reported metrics (such as a consumer’s last late payment), they are moving toward factors like social media activity, shopping habits, and zip codes (most likely as a proxy for wealth) to predict how creditworthy an individual may be.
The Report suggests that such analytics would still fall under the FCRA if the results of the analysis were used to determine an individual’s creditworthiness. A “consumer report” is partly defined as information “bearing on a consumer’s personal characteristics or mode of living,” which can include social media activity and purchasing history. The Report notes that if the results of the analysis were instead used for internal commercial purposes, such as marketing research, the FCRA would not apply.
The Report was also concerned with FCRA inclusion. It cited the Spokeo and Instant Checkmate settlement agreements as examples of non-traditional credit reporting agencies falling under the FCRA due to their practice of acquiring and analyzing big financial data. The Report makes an important note: in both of those cases, the companies had disclaimers stating that they were not credit reporting agencies. This, according to the Report, was not enough to escape FCRA jurisdiction, and both companies ultimately settled for hundreds of thousands of dollars over FCRA compliance failures.
The Report also noted that many companies are now so large and vertically integrated that they work within their own data sets, gathered through direct contact with their customers. This kind of first-party transfer and use of data does not fall under the FCRA, but if a third-party company were to purchase or analyze the data, the FCRA would then apply to all parties.
Big Data and Equal Opportunity
There are many federal laws that aim to “level the playing field” and prevent unequal treatment based on factors such as race, wealth, and religion. The Report was mainly concerned with the Equal Credit Opportunity Act (ECOA), which prohibits discriminatory lending practices. Discrimination in this context includes “disparate impact,” meaning a “facially neutral” practice may nonetheless have a discriminatory effect. An example would be a lender lending only to persons in certain zip codes.
The Report was most concerned with advertising, as this is what it calls a “grey area.” It advises that lenders should focus on Regulation B of the ECOA which “prohibits creditors from making oral or written statements, in advertising or otherwise, to applicants or prospective applicants that would discourage on a prohibited basis a reasonable person from making or pursuing an application” while also requiring lenders to “maintain records of the solicitations and the criteria used to select potential recipients.”
Where big data comes into play in this context is obvious: companies should not use big data to weed out certain “classes” of people they deem unworthy of business. Lenders and creditors, in particular, may violate the ECOA if they do. But there are many other anti-discrimination laws that could land other types of businesses in trouble if they were to use big data in a discriminatory manner.
Big Data and the FTCA
The Federal Trade Commission Act (FTCA) is the godfather of consumer protection regulation. It is the enabling act that gives the FTC the power and direction to go after “unfair or deceptive acts or practices in or affecting commerce.” And the FTC’s jurisdiction in this area is as broad as it sounds: the Wyndham case confirmed that it extends even to cybersecurity issues.
While violating the FTCA will most likely not land a company in court, the result may be even more damaging. The FTC has the power to bring enforcement actions, which almost always end in settlement agreements. Beyond any monetary penalty, enforcement actions can be highly damaging because they generally require a company to disclose its unfair or deceptive practices to its customers, and those practices often make headlines. This can lead to a severe loss of trust in the company and cripple it for years.
Questions for Legal Compliance
The Report laid out several key leading questions that companies should ask themselves to ensure legal compliance with respect to their big data:
- If you compile big data for a company that will use it for eligibility decisions (such as credit, employment, insurance, housing, government benefits, and the like), are you complying with the accuracy and privacy provisions of the FCRA?
- If you receive big data products from another entity that you will use for eligibility decisions, are you complying with the provisions applicable to users of consumer reports?
- If you are a creditor using big data analytics in a credit transaction, are you complying with the requirement to provide statements of specific reasons for adverse action under ECOA?
- Are you complying with ECOA requirements related to requests for information and record retention?
- If you use big data analytics in a way that might adversely affect people in their ability to obtain credit, housing, or employment:
  - Are you treating people differently based on a prohibited basis, such as race or national origin?
  - Do your policies, practices, or decisions have an adverse effect or impact on a member of a protected class, and if they do, are they justified by a legitimate business need that cannot reasonably be achieved by means that are less disparate in their impact?
- Are you honoring promises you make to consumers and providing consumers material information about your data practices?
- Are you maintaining reasonable security over consumer data?
- Are you undertaking reasonable measures to know the purposes for which your customers are using your data?
Clearly, there is much to consider if you are a business that uses, or plans to use, big data. The Report makes clear that the FTC is aware of, and concerned with, the potential of big data to lead to unfair trade practices that disproportionately affect already underserved swaths of the population. Because the FTC can initiate enforcement actions on its own, listening to what it has to say about legal compliance is essential.
If you have any questions about legal compliance in relation to big data, please give our expert Internet Attorneys at Revision Legal a call at 855-473-8474.