In March, Hawaii state Sen. Chris Lee introduced legislation urging Congress to examine the benefits and risks of artificial intelligence. The twist: he didn't write it himself. Lee asked ChatGPT, an AI system, to draft a piece of legislation covering the pros and cons of AI, and the tool produced a resolution on the spot. Lee copied and pasted it without changing a word. The resolution was adopted in April with bipartisan support. In other words, AI is now helping write the laws.
ChatGPT is only one example of artificial intelligence. The term can refer to machine learning, in which companies use algorithms that learn and carry out tasks much as humans do, or to automated decision-making. It may also conjure images of robots and science fiction. The trouble is that no one agrees on a single definition of AI, and that ambiguity is making life difficult for lawmakers trying to regulate the technology.
According to the National Conference of State Legislatures, many states have already passed laws to study or regulate AI. In 2023 alone, lawmakers in more than 24 states and the District of Columbia introduced AI-related bills, and at least 14 states adopted resolutions or enacted legislation. Some states, such as Texas and North Dakota, created groups to study AI, while others, including Arizona and Connecticut, addressed the use of AI within government entities. But every state defines AI differently, leaving a patchwork of inconsistent definitions.
Rhode Island state Rep. Jennifer Stewart isn't deterred by the unknown. She favors regulating and controlling AI, arguing that lawmakers shouldn't be nervous or scared and should instead wade into these murky waters. She may have a point: the technology can't be put back in the box, so legislators have little choice but to understand it.
Yet even as everyone wrestles with defining AI, some experts argue that a definition isn't necessary to regulate it. Alex Engler of the Brookings Institution contends that rather than targeting specific AI systems, lawmakers should adopt rules that apply to any program using automated systems, updating civil society and consumer protections for the algorithmic era.
The potential harms of AI should not be overlooked. Some AI tools can amplify human biases and favor certain groups over others, which is a major reason more people are calling for regulation. AI is powerful, and ensuring it benefits everyone without perpetuating harm is a difficult balance that lawmakers will have to strike.