SEC Chairman Critiques AI: Compares Faulty AI to "Hallucinogenic Mushrooms" and Predicts Regulation and Oversight

Federal regulators continue to target artificial intelligence (AI), using colorful rhetoric to signal skepticism and to justify regulatory oversight. In recent remarks on the use of AI in financial markets, SEC Chairman Gary Gensler compared investment advisers and broker-dealers who rely on flawed AI to people taking psychedelic drugs. Framing the issue as protecting investors and markets from unpredictable, untested uses of AI, Chairman Gensler focused on two problematic scenarios: first, investment advisers and broker-dealers who use faulty AI, which he treated as a risk and disclosure issue; and second, the use of AI to defraud or mislead the public, which he framed as a conflict of interest issue.

On the first issue, Chairman Gensler quipped during remarks at Yale Law School, and later on CNBC, that “you don’t want your broker or adviser recommending investments they hallucinated while on mushrooms.” He warned that if a company is using AI in a material way “and that program has a tendency to hallucinate, they have to consider … [whether using AI] … presents a material risk to investors.” Before deploying AI, he said, investment advisers and broker-dealers should test their AI models to determine whether they “take into account” existing legal protections for investors and the market, and should have a governance plan in place to manage regulatory and other updates.

On the conflict of interest issue, Gensler stressed the importance of applying “guardrails” to protect against AI recommendations that put a broker-dealer’s or investment adviser’s interests ahead of the investor’s. The SEC proposed rules in this area in July 2023 that would require broker-dealers and investment advisers to address potential conflicts of interest associated with AI and other forms of predictive analytics. Gensler also warned firms against misleading investors by claiming to use an AI model when they are not, or by inaccurately describing their use of AI, a practice he labeled “AI washing.” During his recent appearances, Gensler did not specifically address the July 2023 proposal, which is clearly controversial given the significant number of comments filed. With the 2024 elections looming, final rules issued later than this summer may be vulnerable to invalidation under the Congressional Review Act.

Gensler also addressed competition in AI, tying perceived market concentration to systemic risk. He compared the dominance of a few tech platforms (e.g., in search, retail, and cloud) to what could happen with too few dominant AI providers, where one could see “[t]housands of financial entities … looking to build downstream applications relying on what is likely to be but a handful of base models upstream.” If those base models rest on flawed data, Gensler said, the resulting interconnectedness could produce systemic risk or a market crisis. These comments are consistent with this Administration’s broader interest in tech-sector competition and in promoting regulation of AI, and numerous agencies have taken action. Just last month, FTC Chair Lina Khan addressed the future of AI through the lens of competition, framing “basic questions of power and governance. Will this be a moment of opening up markets to fair and free competition, unleashing the full potential of emerging technologies? Or will a handful of dominant firms concentrate control over these key tools, locking us into a future of their choosing?”

The SEC is just one of many agencies tackling AI, but its activities can have direct and meaningful effects across the economy. Public reporting suggests SEC interest is neither academic nor passing. As the Wall Street Journal reported, the SEC’s “examinations division has sent requests for information on AI-related topics to several investment advisers, part of a process known as a sweep.”

We will continue to track the SEC’s next steps on AI. Stay tuned for further insights on AI in All Things AI, Wiley’s artificial intelligence hub, and from our Privacy, Cyber & Data Governance team, or reach out for more information to: Megan Brown, Lyn Brown, and Sydney White.


Wiley Connect
