
Virginia legislators are considering imposing a complex compliance regime on nearly everyone who touches artificial intelligence for a wide range of business uses. House Bill 2094 would affect not only well-funded firms such as OpenAI and Google DeepMind or small AI startups, but nearly every Virginia business. In some contexts, it would even affect individuals.

It is, in other words, a staggeringly broad law that would push Virginia in the direction of the heavily regulated European Union.

No one should sugarcoat the fact that AI poses major challenges for society, even if you believe the benefits will greatly outweigh the risks. It may meaningfully enable large-scale cyberattacks and other catastrophic harms. It could upset the balance of power in our republic in ways that require intervention of some kind. And it could eventually result in rapid reductions in human employment in some industries, necessitating at least short-term relief for workers. However, HB 2094 addresses none of these challenges head-on.


What the bill does target is “algorithmic discrimination,” or the notion that an autonomous system of some kind might discriminate against people based on age, religion, gender, race, or other civil rights-protected traits. It does so primarily with paperwork burdens and legal liability when an AI system is used to make a “consequential decision” that could have a “material effect” on a consumer’s access to things like insurance, financial services, employment and education, or the terms of such agreements.

These requirements apply to AI developers, as you might expect. They also apply to “integrators” who combine AI with other software systems, and to cloud computing companies that distribute AI systems. Finally, they apply to “deployers,” a category that includes anyone who uses an AI system. And whether or not you are a fan of AI, its use will only become more prevalent, and more necessary for making a living, in the years ahead.

Imagine being the owner of a flower shop who needs to hire a new employee, or even a parent hiring a nanny. You post the job online and get a few dozen applications. Lacking time, you simply ask ChatGPT to pick out the applicants who have previous, relevant experience.

In most of America, that’s a normal and acceptable use of AI. However, in Virginia, that would make you a “deployer” of a “high-risk artificial intelligence system” used as a “substantial factor” in making a “consequential decision.”

As a result, you’d need a “risk management plan” and an “algorithmic impact assessment.” The latter would need to be updated each time OpenAI makes a “substantial modification” to ChatGPT, which occurs roughly once per month. If you did not write these documents (or, more likely, pay a consultant to write them for you), you’d face a fine of up to $1,000.

And if your system is found to have had a disparate impact on any protected class, regardless of whether you — someone who merely used a publicly available platform — intended to discriminate, you’d be subject to lawsuits.

This is merely one example; just as ChatGPT and other frontier language models have countless productive uses, the range of ways this law could be applied boggles the mind. Given how large AI’s presence in our lives is likely to become, the burdens this law imposes will only grow with time.

This is because AI is a general-purpose technology, much like electricity, the integrated circuit, or computers themselves. The difference is that we do not make businesses or individuals fill out paperwork to use electricity.

What’s more, the kind of discrimination HB 2094 envisions is already illegal under both Virginia and federal law. If a business discriminates against a particular group in its hiring decisions, that is unlawful regardless of whether pen and paper, a computer, or an AI system was used. What HB 2094 really adds to the equation, then, is a mountain of compliance paperwork and a boon for consultants and lawyers.

Laws governing the use of a general-purpose technology should be intuitive for the average user, whether an individual or a corporation. Members of the state legislature should be thinking about whether this bill is intuitive enough, and whether imposing so many complicated mandates on their constituents will weaken Virginia’s economic competitiveness. They should also consider whether this move would push America’s digital economy toward the same kind of regulatory morass that, in the physical world, has made it so difficult to build new infrastructure.

The recent election showed us that Americans desire a simpler regulatory regime, not a more complicated one. Our state lawmakers should take this to heart, though it is no small task. Simplicity may be far harder to achieve than complexity, but it’s worth the effort.

Dean W. Ball is a research fellow with the Mercatus Center at George Mason University in Arlington. Contact Ball at dball@mercatus.gmu.edu.
