Information contained in this publication is intended for informational purposes only and does not constitute legal advice or opinion, nor is it a substitute for the professional judgment of an attorney.
The Federal Trade Commission (“FTC”) issued a report this month entitled “Big Data: A Tool for Inclusion or Exclusion.”1 The theme of the report is that a company’s use of big data—be it in identifying and providing services to consumers or in the hiring and management of its workforce—may have both benefits and risks. This article summarizes the report and provides recommendations for employers regarding the use of data science and predictive analytics in decision-making.
The 33-page report recaps a workshop held more than 15 months ago, on September 15, 2014. It describes the growing practice of using big data in marketing, fraud prevention, human resources, and other fields. It notes that data science provides “numerous opportunities for improvements in society” and identifies market-wide benefits, including increased efficiency in decision-making and the creation of opportunities for low-income and underserved communities. The report’s examples of data science improving lives include tailoring health care to patient characteristics, increasing equal access and diversity in employment, improving educational opportunities, and expanding access to credit by relying on more reliable and robust information about applicants. The report also identifies specific negative consequences associated with analytics done wrong. For instance, participants at the workshop expressed concern about the quality of the data relied upon: its accuracy, completeness, and representativeness.
Essentially, as described in greater detail in Littler’s August 2015 Report, The Big Move Toward Big Data in Employment, there are risks of bias being introduced and replicated if the data relied upon are the products of biased measures. One concern is that the appearance of unbiased data science could be used to obscure problems of discrimination and unequal treatment. The FTC’s report notes that “big data analytics may lead to wider propagation of the problem and make it more difficult for the company using such data to identify the source of discrimination effects and address it.”
The report identifies specific laws of which companies using big data in decision-making should be mindful: the Federal Trade Commission Act; the Fair Credit Reporting Act (“FCRA”); and equal opportunity laws, including the Equal Credit Opportunity Act (“ECOA”), Title VII of the Civil Rights Act of 1964, the Americans with Disabilities Act (“ADA”), the Age Discrimination in Employment Act (“ADEA”), the Fair Housing Act (“FHA”), and the Genetic Information Nondiscrimination Act (“GINA”). The FTC advises that companies “should review these laws and take steps to ensure their use of big data analytics complies with the discrimination prohibitions that may apply.”
Lastly, the report offers several suggested questions to consider in the collection and use of data in strategic business decision-making:
- How representative is your data set?
- Does your data model account for biases?
- How accurate are your predictions based on big data?
- Does your reliance on big data raise ethical or fairness concerns?
Care should be taken to maximize the value of data science by being mindful of the quality of the data, and the meaningfulness of their interpretation.
Following are three tips for employers interested in using analytics in decision-making:
- Be wary of proxies for information. Work to reduce the gap between the features being considered and the desired outcome of the selection process. For instance, if a company wants to predict whether applicants will be successful employees, resumes have traditionally served as proxies for traits like intellectual curiosity, intelligence, a strong work ethic, and overall achievement. However, resumes are a weak proxy for those traits. The weaker the proxy, the less tethered it is to the legitimate objective being measured or predicted. Instead, strive to replace weak proxy inputs with stronger data that more directly measure the outcome variables of interest. This will reduce lurking bias and lower the likelihood that legal risk materializes.
- Resist the urge to throw everything at the wall and see what sticks. Big data’s bigness is a double-edged sword, statistically speaking: when enough variables are screened against an outcome, some will appear predictive by chance alone. It is also important to resist the temptation to let artificial intelligence entirely replace intuition and business intelligence. A better approach is to incorporate analytics into decision-making to augment and inform existing processes.
- Consider legal risks before they materialize and evaluate the legal issues associated with data analytics in an attorney-client privileged setting whenever possible.
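The statistical hazard behind the second tip can be made concrete with a short, purely illustrative Python sketch (all data below are synthetic, and the 5% significance cutoff is a conventional rule of thumb, not anything prescribed by the FTC report): when hundreds of candidate variables are screened against an outcome they have no real relationship to, a handful will still clear the cutoff by chance.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_features = 100, 200

# Outcome: a coin flip that no feature can truly predict.
outcome = rng.integers(0, 2, n_samples)

# 200 candidate features of pure noise -- "throwing everything at the wall."
features = rng.normal(size=(n_samples, n_features))

# Correlation of each noise feature with the outcome.
corrs = np.array(
    [np.corrcoef(features[:, j], outcome)[0, 1] for j in range(n_features)]
)

# A rough 5% two-sided cutoff for |r| is about 2 / sqrt(n) at this sample size.
cutoff = 2 / np.sqrt(n_samples)
spurious = int(np.sum(np.abs(corrs) > cutoff))

print(f"{spurious} of {n_features} pure-noise features look 'predictive'")
```

Roughly five percent of the noise features will typically exceed the cutoff, which is why screened "predictors" should be validated on held-out data before being used in employment decisions.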