If you got past the title to this point, congratulations – we’re not sure that we would have. In exchange for your trust, we promise to try to address a complicated and fascinating subject in a relatively clear manner.
That’s pretty much what Commissioner Rebecca Slaughter of the Federal Trade Commission (FTC) did in a recent speech on this topic, and the many companies increasingly utilizing artificial intelligence (AI) should pay heed (and if you think your company is not using AI or algorithms, recheck – it probably is). Commissioner Slaughter emphatically rejects the idea that an outcome or practice cannot be unlawful or cannot be remedied because it is the result of AI. And what kind of harms could there be from rogue AI?
- Denial of a benefit – you don’t get the job or the loan.
- Higher cost of a benefit – you get the loan or other product but at a higher cost.
- Denial of opportunity – you aren’t even presented with the opportunity to apply for the job or loan or to purchase the home or other good.
And what can cause AI to go awry even when designed with the best of intentions? We still don’t know what happened to HAL in 2001: A Space Odyssey, but Commissioner Slaughter provides four examples of how algorithms can go bad.
- Faulty inputs (or garbage in, garbage out) – If the data used to “train” the algorithm is faulty, the algorithm itself is likely to be flawed. For example, if the resumes used to train a hiring algorithm are skewed male, then the resulting algorithm is likely to discriminate against women.
- Faulty conclusions – Most algorithms are designed to draw conclusions, but their ability to reach accurate conclusions is sometimes suspect and unsupported. For example, many companies, including well-known tech companies, sell “affect recognition” technology that is claimed to accurately detect an individual’s emotional state by analyzing characteristics such as facial expressions, tone of voice, eye movements and gait. While Commissioner Slaughter notes that there is increasing skepticism that such technology can accurately make such assessments, it is nonetheless advertised and used for critical, consumer-impacting decisions such as hiring and other life opportunities.
- Failure to test – As the FTC recognized in its settlement with Apple regarding kids’ in-app purchases, technology does not always roll out perfectly; the key is to test and monitor it and correct problems as they occur. Commissioner Slaughter notes several instances where algorithms clearly generated results that seem implausible or discriminatory. For example, searching LinkedIn for a female name similar to a male name (for example, Stephanie) prompted a suggestion for the male version of the name, whereas searching for the male name never prompted the female version. Commissioner Slaughter argues that some basic pretesting would detect many problems and that she will consider whether testing and monitoring were conducted in determining whether use of an algorithm violated any of the statutes enforced by the FTC.
- Proxy discrimination – Proxy discrimination occurs when a facially neutral characteristic is utilized but that characteristic correlates with a suspect class such as race or gender. For example, Facebook’s Lookalike Audience tool is the subject of a housing discrimination case. The tool uses the advertiser’s best existing customers and looks for others who are similar to these customers based on things such as “likes,” page views, app usage, etc. The U.S. Department of Housing and Urban Development alleges that the use of these shared characteristics resulted in discriminatory exclusion. Commissioner Slaughter notes that the same issue can arise with the use of algorithms designed to maximize advertising clicks. She cites a study in which researchers specified an identical audience for three different job postings – lumber industry, supermarket cashier and cab driver. However, the use of AI resulted in the audience for the lumber job being 72% white and 90% male, the supermarket job being 85% female, and the cab driver job being 75% African American.
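The proxy-discrimination mechanism described in the last bullet can be illustrated with a minimal, entirely synthetic sketch (the data, the “zip_score” feature and the cutoff are all invented for illustration – they do not come from the speech or any real case): a decision rule that never sees the protected class can still produce starkly different outcomes when a facially neutral input correlates with that class.

```python
import random

random.seed(0)

# Hypothetical synthetic population: "group" is a protected class the
# decision rule never looks at; "zip_score" is a facially neutral feature
# that happens to correlate with group membership.
people = []
for _ in range(10_000):
    group = random.choice(["A", "B"])
    # The proxy: group A tends to score higher on the neutral feature.
    zip_score = random.gauss(0.7 if group == "A" else 0.3, 0.1)
    people.append((group, zip_score))

# A "neutral" rule: approve anyone whose zip_score clears a fixed cutoff.
approved = [(g, z) for g, z in people if z > 0.5]

rate_a = sum(1 for g, _ in approved if g == "A") / sum(1 for g, _ in people if g == "A")
rate_b = sum(1 for g, _ in approved if g == "B") / sum(1 for g, _ in people if g == "B")
print(f"Approval rate, group A: {rate_a:.0%}")
print(f"Approval rate, group B: {rate_b:.0%}")
```

Even though the rule consults only the neutral feature, the approval rates for the two groups diverge sharply – the same pattern alleged in the Lookalike Audience and job-advertising examples above.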
At this point you might be saying, “This is all fascinating stuff, but what does this have to do with the FTC (or me)?” Commissioner Slaughter answers this question as well. First, the FTC has basic Section 5 authority over deceptive claims, so any claims about the attributes or performance of algorithm-based products must be substantiated (for example, that AI can detect the best job candidates). Second, Commissioner Slaughter suggests that the FTC’s unfairness authority could be used against companies that secretly collect data to feed an algorithm or use an algorithm that excludes a consumer from a benefit or an opportunity based on his or her status in a protected class. However, as Commissioner Slaughter notes, unfairness enforcement is not for the fainthearted, and difficult questions could be raised as to whether any harm was “reasonably avoidable” or whether the algorithm provided “countervailing” consumer benefits. Finally, Commissioner Slaughter notes that the FTC has enforcement authority against credit discrimination, some of which may result from the use of flawed algorithms.
Commissioner Slaughter advocates for potential rule-making and legislative solutions that would encourage greater transparency surrounding the use of algorithms and AI as well as mandate testing, monitoring and correction of any issues that may arise.
It will be fascinating to watch the progression of AI and algorithms, both with respect to the marketing and advertising of consumer products and services and in numerous other areas (anyone remember the movie Minority Report?). As the FTC has noted, more often than not new technology does not generate the need for new enforcement principles, but rather calls for the application of the same rules to new situations. That may well turn out to be largely the case here. We will keep an eye out to see how the FTC responds in this area and to the many other evolving issues surrounding the marketing and advertising of goods and services.