Written by ESR News Blog Editor Thomas Ahearn
On April 8, 2020, the Federal Trade Commission (FTC) – a United States government agency that is the nation’s primary privacy and data security enforcer – issued guidance to businesses on the use of Artificial Intelligence (AI), machine learning technology, and automated decision-making under federal laws, including the Fair Credit Reporting Act (FCRA), which regulates background checks for employment purposes.
In the guidance titled “Using Artificial Intelligence and Algorithms,” Andrew Smith, Director of the FTC’s Bureau of Consumer Protection, wrote: “The good news is that, while the sophistication of AI and machine learning technology is new, automated decision-making is not, and we at the FTC have long experience dealing with the challenges presented by the use of data and algorithms to make decisions about consumers.”
Director Smith also warned in the guidance: “Headlines tout rapid improvements in artificial intelligence technology. The use of AI technology – machines and algorithms – to make predictions, recommendations, or decisions has enormous potential to improve welfare and productivity. But it also presents risks, such as the potential for unfair or discriminatory outcomes or the perpetuation of existing socioeconomic disparities.”
Enacted in 1970, the FCRA addresses “automated decision-making, and financial services companies have been applying these laws to machine-based credit underwriting models for decades,” Smith explained. “The FTC’s law enforcement actions, studies, and guidance emphasize that the use of AI tools should be transparent, explainable, fair, and empirically sound, while fostering accountability.” With regard to the FCRA, Smith wrote:
- If you make automated decisions based on information from a third-party vendor, you may be required to provide the consumer with an “adverse action” notice. Under the FCRA, a vendor that assembles consumer information to automate decision-making about eligibility for credit, employment, insurance, housing, or similar benefits and transactions, may be a “consumer reporting agency.” That triggers duties for you, as the user of that information. Specifically, you must provide consumers with certain notices under the FCRA. Say you purchase a report or score from a background check company that uses AI tools to generate a score predicting whether a consumer will be a good tenant. The AI model uses a broad range of inputs about consumers, including public record information, criminal records, credit history, and maybe even data about social media usage, shopping history, or publicly-available photos and videos. If you use the report or score as a basis to deny someone an apartment, or charge them higher rent, you must provide that consumer with an adverse action notice. The adverse action notice tells the consumer about their right to see the information reported about them and to correct inaccurate information. (A simplified sketch of this adverse action workflow appears after this list.)
- Give consumers access and an opportunity to correct information used to make decisions about them. The FCRA regulates data used to make decisions about consumers – such as whether they get a job, get credit, get insurance, or can rent an apartment. Under the FCRA, consumers are entitled to obtain the information on file about them and dispute that information if they believe it to be inaccurate. Moreover, adverse action notices are required to be given to consumers when that information is used to make a decision adverse to the consumer’s interests. That notice must include the source of the information that was used to make the decision and must notify consumers of their access and dispute rights. If you are using data obtained from others – or even obtained directly from the consumer – to make important decisions about the consumer, you should consider providing a copy of that information to the consumer and allowing the consumer to dispute the accuracy of that information.
- If you provide data about consumers to others to make decisions about consumer access to credit, employment, insurance, housing, government benefits, check-cashing or similar transactions, you may be a consumer reporting agency that must comply with the FCRA, including ensuring that the data is accurate and up to date. You may be thinking: We do AI, not consumer reports, so the FCRA doesn’t apply to us. Well, think again. If you compile and sell consumer information that is used or expected to be used for credit, employment, insurance, housing, or other similar decisions about consumers’ eligibility for certain benefits and transactions, you may indeed be subject to the FCRA. What does that mean? Among other things, you have an obligation to implement reasonable procedures to ensure maximum possible accuracy of consumer reports and provide consumers with access to their own information, along with the ability to correct any errors. A company that deployed software tools to match housing applicants to criminal records in real time or near real time learned this the hard way. The company ended up paying a $3 million penalty for violating the FCRA by failing to take reasonable steps to ensure the accuracy of the information they provided to landlords and property managers.
- If you provide data about your customers to others for use in automated decision-making, you may have obligations to ensure that the data is accurate, even if you are not a consumer reporting agency. Companies that provide data about their customers to consumer reporting agencies are referred to as “furnishers” under the FCRA. They may not furnish data that they have reasonable cause to believe may not be accurate. In addition, they must have in place written policies and procedures to ensure that the data they furnish is accurate and has integrity. Furnishers also must investigate disputes from consumers, as well as disputes received from the consumer reporting agency. These requirements are important to ensure that the information used in AI models is as accurate and up to date as it can possibly be. And, the FTC has brought actions, and obtained big fines, against companies that furnished information to consumer reporting agencies but that failed to maintain the required written policies and procedures to ensure that the information that they report is accurate.
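The common thread running through these obligations is a simple decision rule: whenever a decision adverse to the consumer rests in whole or in part on a consumer report, the user of that report must send an adverse action notice identifying the source of the information and the consumer’s access and dispute rights. The following minimal Python sketch illustrates that rule only. It is not legal advice, not part of the FTC guidance, and every class, function, and threshold in it is a hypothetical placeholder.

```python
from dataclasses import dataclass


@dataclass
class ConsumerReport:
    """A report or score purchased from a consumer reporting agency (CRA)."""
    agency_name: str     # the CRA that assembled the information
    agency_contact: str  # contact information the notice must identify
    score: float         # e.g., an AI-generated tenant or hiring score


@dataclass
class Decision:
    approved: bool
    based_on_report: bool  # was the consumer report a basis for the outcome?


def adverse_action_notice(report: ConsumerReport) -> str:
    """Builds the disclosures the FCRA requires when a consumer report
    contributes to an adverse decision: the source of the information
    and the consumer's access and dispute rights."""
    return (
        f"This decision was based in whole or in part on information "
        f"from {report.agency_name} ({report.agency_contact}). "
        "You have the right to obtain a free copy of your report and "
        "to dispute any information you believe is inaccurate or incomplete."
    )


def evaluate_applicant(report: ConsumerReport, threshold: float = 0.5) -> Decision:
    # Hypothetical decision rule: a score below the threshold is adverse.
    return Decision(approved=report.score >= threshold, based_on_report=True)


def process_application(report: ConsumerReport) -> None:
    decision = evaluate_applicant(report)
    if not decision.approved and decision.based_on_report:
        # The obligation described above: notify the consumer.
        print(adverse_action_notice(report))
    else:
        print("Application approved.")


if __name__ == "__main__":
    process_application(
        ConsumerReport("Example Screening Co.", "1-800-555-0100", 0.32)
    )
```

In practice, of course, the notice content, delivery method, and timing are governed by the FCRA and related regulations rather than by any code path; the sketch only shows where in an automated pipeline the notice obligation attaches.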
The guidance is not the first time the FTC has addressed AI. In 2016, the FTC issued a report titled “Big Data: A Tool for Inclusion or Exclusion?” In 2018, the FTC held a hearing to explore AI, algorithms, and predictive analytics. The complete guidance from the FTC about using AI and algorithms is available at www.ftc.gov/news-events/blogs/business-blog/2020/04/using-artificial-intelligence-algorithms.
Employers using automated hiring platforms powered by AI in the belief that they are less biased and discriminatory than humans will discover that such machine learning technology in background screening will remain a work in progress in 2020, according to the “ESR Top Ten Background Check Trends” for 2020 compiled by leading global background check provider Employment Screening Resources® (ESR).
“AI has potential with screening but is unlikely to be used as quickly as predicted. Between the myriad of federal, state, and local laws regulating screening, as well as discrimination and privacy concerns, the reality is going to be much different than many people predict from a purely technological viewpoint,” explained Attorney Lester Rosen, founder and chief executive officer (CEO) of Employment Screening Resources® (ESR).
“Background checks impact the highly regulated area of employment that requires accuracy specific to the individual. Technology helps to some degree, but each individual is entitled to an ‘individualized assessment,’ and the law and court cases weigh heavily against the mass processing and categorizing of people for employment,” said Rosen, author of “The Safe Hiring Manual,” a comprehensive guide to background checks.
Employment Screening Resources® (ESR) – which Rosen founded in 1997 in the San Francisco, California area – offers the award-winning ESR Assured Compliance® system for real-time compliance that offers automated notices, disclosures, and consents. In 2019, ESR was named a top background screening firm for enterprise-sized organizations by HRO Today Magazine. To learn more about ESR, visit www.esrcheck.com.
NOTE: Employment Screening Resources® (ESR) does not provide or offer legal services or legal advice of any kind or nature. Any information on this website is for educational purposes only.
© 2020 Employment Screening Resources® (ESR) – Making copies of or using any part of the ESR News Blog or ESR website for any purpose other than your own personal use is prohibited unless written authorization is first obtained from ESR.