AIcohol
The ethics¹ of LLMs is a contentious topic. I think this is mainly because of the technology's current reach (it's everywhere), the hype (people claim it can do anything), and its success (it's hard to ignore). Unfortunately, a lot of things get mixed into discussions around the ethics of LLMs, and so I've struggled with figuring out what I think.
Here's my personal take on the ethics of LLMs, as of today.
LLMs are like alcohol
Alcohol is poison. I don't need to list out all of the bad things that happen in which alcohol is involved, but here are a few of them: you can mess up your own life through addiction; you can destroy your family through alcohol-induced abuse or violence; you can kill random people by drunk driving; you can put an increased load on society through reduced general health.
Still, alcohol is a part of my life. I enjoy beer, wine, liquor, and mixed drinks, and I do feel conflicted about that. If I buy a glass of wine at a restaurant, am I contributing to people being killed in traffic by drunk drivers? I think I am, if only by an epsilon amount. I don't, however, feel responsible for drunk drivers, abusive drunk parents, or addicts who drink themselves to death.
Here are some reasons why LLM ethics is difficult:
- Power usage. LLMs require a lot of power, and we need less fossil energy.
- Copyright. The situation with LLMs and copyright is not clear, and until it is resolved, Big Tech is stomping on Starving Artist.
- Slop. Pretending LLM slop is humanly created can be a breach of social contract.
- Deceit. LLMs can enable deceit at a large scale, for instance with deepfakes or other impersonation, and do it automatically.
There are probably a lot more. I think all of these are valid concerns, and I do feel conflicted about my limited usage of LLMs. At the same time, I don't feel responsible for LLM-induced power price spikes or coal emissions, copyright infringement, bad summaries or software bugs, or deceit, because I used an LLM to do something else entirely. These are shitty things to be happening, and we need to work to reduce or stop them, but I don't think auto-completing code a couple of times a day using Claude makes me responsible for the slopification.
LLMs deceive, alcohol destroys. I'm still excited to visit a new wine bar, and I'll continue to cautiously use LLMs to write code in the hope that someday it'll actually save me time.
Thanks for reading.
Footnotes
1. ... or lack thereof, am I right??!
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License