A Senate Banking Committee hearing on Tuesday focused on the structure and practices of the data broker industry and of technology companies such as large social media platforms, the gaps that exist in federal privacy law, and the changes to federal law, including the Fair Credit Reporting Act (FCRA), that should be considered to give individuals real control over their data.
Dr. Alicia Cackley, Director of Financial Markets and Community Investment at the Government Accountability Office (GAO), and Pam Dixon, Executive Director of the World Privacy Forum, answered the committee’s questions at the hearing, titled “Data Brokers and the Impact on Financial Data Privacy, Credit, Insurance, Employment, and Housing.”
Opening the proceedings, Sen. Mike Crapo, Chairman of the Senate Banking Committee, said that more personal information was available to companies than ever before “as a result of an increasingly digital economy.”
“In particular, data brokers and technology companies, including large social media platforms and search engines, play a central role in gathering vast amounts of personal information, and often without interacting with individuals, specifically in the case of data brokers,” Crapo said.
Giving an example of how fintech was impacting unregulated credit scores, Cackley said, “Fintech lenders offer a variety of loans such as consumer and small business loans and operate almost exclusively online. In our 2018 report, we noted that while these lenders may still assess borrowers’ creditworthiness with credit scores, they also may analyze large amounts of additional or alternative sources of data to determine creditworthiness.”
Additionally, she said that the report also found that some fintech firms collected more consumer data than traditional lenders. “For example, fintech lenders may have sensitive information such as consumers’ educational background or utility payment information, and according to certain stakeholders, these data may contain errors that cannot be disputed by consumers under FCRA,” Cackley told the committee.
Dixon offered four observations about this subject during her testimony:
- Credit scores and predictions are being sold that are not regulated by the FCRA
- The technology environment is facilitating more scores being used in more places in consumers’ lives, and not all uses are positive
- These scores are created without due process for consumers
- These scores can cause consumers exceptional harm
She also offered Congress two solutions to these challenges: expanding the Fair Credit Reporting Act to regulate the “currently unregulated financial scores that affect consumers,” and enacting a standards law that “will provide due process and fair standard setting in the area of privacy.”
Answering a question on whether all consumer scores were covered under the FCRA, so that a similar appeals process exists to resolve inaccuracies, Dixon said, “No, consumer credit scores that are currently unregulated are not covered under the FCRA. Unless it is a formal credit score that is articulated by the FCRA and used under an eligibility circumstance, it’s not covered.”
Answering a question by Sen. Crapo on how unregulated credit scores created for people and managed by artificial intelligence (AI) impacted consumer credit and financial decisions, Cackley said that while these scores may not be the official credit scores from the credit bureaus that were regulated by the FCRA, “they can be applied to decisions that companies make about the kind of products they offer to people, and the price those products are offered.”
Cackley added that the products were offered “based on a score that a consumer doesn’t necessarily see and can’t even tell if it is correct or can’t make any attempt to improve the score even if they know it exists.”
Speaking about how predatory lenders tend to take advantage of these scores, Dixon, answering a question by Ranking Member Sen. Sherrod Brown, said that her organization often got calls from people who had received advertisements for financial products and didn’t understand “that they could have gone out in the market and affirmatively looked for the best offer.”
“So these predatory marketing devices based on unregulated scores are very significant. Other significant scores are those that predict repayment of the debt,” she said. “For example, the poorest of consumers are targeted the most for debt repayment by companies that use [unregulated data] like the consumer lifetime value scores that impact how well you’re treated by businesses.”
Similarly, she added that companies and educational institutions also used scores called “neighborhood risk scores” that decide the way forward for a kid’s education.
“This is a modern way of redlining, because if we are going to be scored by where we live, how have we advanced, and how are all the laws that are meant to protect us from such things operating if such things are still happening?” Dixon asked.