Fintechs weigh in on the White House's executive order on AI

Developers of lending and other banking tools powered by artificial intelligence say their firms are well positioned to weather regulatory changes in the wake of the White House's executive order, but remain mindful of how government agencies might respond.
Graeme Sloan/Bloomberg

As experts across the fintech industry closely monitor the ripple effects of the White House's executive order on artificial intelligence, leaders remain confident in their firms' abilities to adapt to changes in an already highly regulated environment.

The scope of President Biden's order is wide. It requires creators of qualifying AI models, meaning those trained on broad datasets, self-supervised and performing at high levels on tasks "that pose a serious risk to security, national economic security [or] national public health or safety," to provide the government with records of training sessions and cybersecurity test results for the systems.

This level of scrutiny extends to buyers who acquire the large-scale computing clusters used to train such models, who are required to report the purchase as well as "the existence and location of these clusters and the amount of total computing power available in each cluster." Users will also need to adhere to new standards on cybersecurity, consumer data privacy, bias and discrimination.

Lawmakers remain divided on how to effectively govern the evolving landscape of generative AI and its novel applications, pushing those at the helm of tech-savvy institutions to test individual use cases and build governance frameworks of their own.

But fintech providers that develop these products, like the banks and credit unions integrating AI-powered tools into their operations, are no strangers to the growing number of compliance hurdles impeding adoption across the industry.

Yolanda McGill, vice president of policy and government affairs for Zest AI in Burbank, California, said conversations with legislators are often driven by fears that AI models could go off the rails, or that they could begin replacing human employees.

"In our [industry], there are concerns toward having a really good understanding about what an algorithm is actually doing. … I was concerned that [those fears] would mean we were not going to be able to have the conversations about practical use cases that are happening right now and are impacting people's lives every day for the good or for the ill," McGill said.

Regulators at the Consumer Financial Protection Bureau have increasingly complained about the lack of transparency in how "black-box" AI algorithms are constructed and how they reach their conclusions — including those used in underwriting models.

The CFPB's focus on rooting out biases ingrained during the developmental stages is shared by leaders of Zest, which uses adversarial debiasing alongside the firm's race-prediction tool to vet systems before they are put into use. In doing so, McGill said, Zest can comply with existing guidance while waiting for new mandates to take effect.
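
Adversarial debiasing pits a second "adversary" model against the scoring model: the adversary tries to recover a protected attribute from the scoring model's output, and the scoring model is penalized whenever the adversary succeeds. A minimal numpy sketch of that idea follows; the synthetic data, model forms and hyperparameters are illustrative assumptions, not Zest AI's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Synthetic data: features X, repayment label y, protected attribute z.
n, d = 1000, 5
X = rng.normal(size=(n, d))
z = (rng.random(n) < 0.5).astype(float)                      # protected attribute
y = (sigmoid(X[:, 0] + 0.5 * z) > rng.random(n)).astype(float)

w = np.zeros(d)        # predictor (scoring model) weights
u, b = 0.0, 0.0        # adversary weights: predicts z from the score
lr, lam = 0.1, 1.0     # learning rate and debiasing strength

for _ in range(200):
    p = sigmoid(X @ w)            # predicted repayment probability
    a = sigmoid(u * p + b)        # adversary's guess of z from p

    # Adversary step: learn to recover z from the model's output.
    u += lr * np.mean((z - a) * p)
    b += lr * np.mean(z - a)

    # Predictor step: fit y while *confusing* the adversary
    # (gradient of cross-entropy minus lam * adversary's loss gradient).
    grad_ce = X.T @ (p - y) / n
    grad_adv = X.T @ ((a - z) * u * p * (1 - p)) / n
    w -= lr * (grad_ce - lam * grad_adv)
```

In a real pipeline both models would be richer (e.g. gradient-boosted trees or neural networks), but the alternating minimax structure is the core of the technique.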

"The financial services industry has been grappling with these questions for a long time," McGill said. "Algorithms are not new to financial services, AI is not new to financial services and companies have been innovating within the guardrails of mandatory consumer disclosures, mandatory versions of explainability, prohibitions against discrimination and other requirements for some time now."

Beyond the requirements for reporting training and cybersecurity standards to the government, some fintech experts are unsure if federal agencies are capable of discerning productive models from those that could threaten national security.

The order "immediately posed the question of how do you identify these models without some sort of self-regulation," said Amitay Kalmar, co-founder and chief executive of the AI-powered auto lending fintech Lendbuzz in Boston. 

"From the outside, it's going to be hard for the U.S. government to identify it and I think eventually there is going to be some self-regulation within companies that are working on very powerful foundation models," Kalmar said. "I think it gives a direction for the regulatory bodies to focus on it, but specific areas need to be well defined."

Lendbuzz's software-as-a-service platform deploys machine learning algorithms to analyze consumer financial data such as bank transaction history and establish a credit score for qualifying borrowers. The firm then handles the underwriting of the loan at the point of sale, which is backed by funds provided through Lendbuzz's bank partners.
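
Lendbuzz's models are proprietary, but the final step of a pipeline like the one described, turning a model's estimated good:bad odds into a familiar credit-score scale, is commonly done with a log-odds scorecard. A minimal sketch, where the anchor values (base score, base odds, points-to-double-the-odds) are generic industry conventions, not Lendbuzz's actual scale:

```python
import math

def score_from_odds(good_odds, base_score=600, base_odds=30.0, pdo=20.0):
    """Map a model's good:bad odds to a scorecard score.

    The scale is anchored so that base_odds maps to base_score, and
    every doubling of the odds adds pdo ("points to double the odds").
    Anchor values here are illustrative.
    """
    factor = pdo / math.log(2)
    offset = base_score - factor * math.log(base_odds)
    return offset + factor * math.log(good_odds)
```

Under these anchors, odds of 30:1 score 600, and 60:1 scores exactly 20 points higher; the linear-in-log-odds form is what lets a single model probability drive a stable, explainable score.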

Kalmar is working to keep Lendbuzz ahead of the executive order's guidance by strengthening cybersecurity and improving transparency in the building of the models for stakeholders and regulators alike.

"The price a company or financial institution could pay for breaches and failure there is very significant, and I think AI will present new risks in that area that we have not been exposed to in the past," he said.

The popularity of products powered by generative AI has continued to rise in recent months, but friction points have led many organizations to deprioritize adoption.

A survey of 179 experts across the financial services and insurance industries conducted this year by Arizent, the parent company of American Banker, found that 40% of financial institution respondents said a lack of resources was the No. 1 obstacle to innovation. Other factors included legacy systems, regulatory burden and competing priorities.

A significant portion of the order's recommendations for increased regulatory oversight emphasizes data gathering during the development phases. Agencies in Europe and the United Kingdom have approached the industry from a different angle, focusing instead on individual use cases.

Ed Maslaveckas, co-founder and CEO of the London-based open-banking and data intelligence firm Bud Financial, said agencies in the U.K. are better positioned to create guidelines for the use of AI in financial services by examining real-life applications.

"I'm glad that people are taking this really seriously, and that we've seen all the activity happening in the U.S., U.K. and Europe as the main concern was we would let the world run rampant for years, and then when something bad happened, we'd draft regulation. … I think we're moving in a positive direction, but outcomes" produced by the models in question are the No. 1 thing, Maslaveckas said.

Bud Financial, which expanded into the U.S. earlier this year, participated in the Financial Conduct Authority's first-ever regulatory sandbox in 2015 shortly after the company was founded. Through working with the FCA, which oversees the financial services industry in the U.K., the fintech better understood regulatory expectations.

"I like the way the U.K. is thinking about oversight being driven by use cases, and because it's use case driven, it's regulated by the subject matter experts," who have the technological understanding necessary to lead campaigns for change, Maslaveckas said.

Fintech leaders are closely monitoring the next steps from regulators and lawmakers as they formulate plans for future innovation.

Scienaptic AI in New York, which offers loan decision software that uses machine learning to analyze large amounts of data, vets models for fairness and accuracy long before they are put into production. This process, which has been in place for some time, helps the fintech stay ahead of regulatory shifts, the company said.

"There is a significant need to enhance financial inclusion in the U.S.," said Vinay Bhaskar, chief operating officer and head of AI and compliance initiatives at Scienaptic AI, who added that his company tries to push that agenda through its software. "The CFPB and Federal Trade Commission continue to do a solid job in highlighting the broad objectives and principles around lending, and this executive order underscores the value of those objectives as it proposes a risk-based approach to manage and implement AI."

Updated 11/09/23: This story was updated to provide added context into Scienaptic AI's operations.
