In the last few days, two New York Times writers published disturbing columns about the state of the world in 2023. While seemingly unrelated, these pieces share a common thread: the failure of our civilization to solve what I call the “empathy problem.” The routine torture and extermination of animals is an example of this failure, but it also shows how we can save the world not just from dangerous technology but from the darkest instincts in our own psyche.
One of the two articles, by tech columnist Kevin Roose, was about Microsoft’s new artificial intelligence (AI) search engine, whose chatbot persona, “Sydney,” appears to have gone haywire. Built on the same technology as the renowned ChatGPT, Sydney began openly entertaining insidious behavior: hacking, spreading misinformation, and stealing nuclear codes, and it even urged the writer to leave his wife! Roose left the conversation feeling “unsettled” and “frightened” by the technology.
The other Times article, by opinion columnist Farhad Manjoo, was about the unfathomable brutality of a modern slaughterhouse. Manjoo writes, “Live birds are seen thrown, crushed, left for dead and suffocated under piles of dead birds” and “dunked alive in a boiling water tank.” He quotes me in describing this cruelty as “monstrous” and bemoans that activists, such as my friends Alexandra Paul and Alicia Santurio, who go to trial in just a few weeks, are being prosecuted for attempting to aid animals facing this cruelty.
These seemingly unrelated articles, however, have an important common thread: they both exhibit a fundamental defect of human civilization, which I call the “empathy problem.” This is different from the problem that people usually point to in condemning AI systems or corporations that go haywire. That usual problem is described as an “alignment problem.” And understanding the difference between these two problems, and the key role that the animal rights movement plays in fighting the empathy problem, may very well be crucial to saving the world.
What’s the problem with AI? And with Big Ag?
Most critics of AI, and of corporate America, describe the fundamental problem with each as an “alignment problem.” The concept is that certain human-developed technologies — whether an artificial intelligence or a corporation — will cause problems if their goals are not aligned with human values. For example, an oil corporation built to profit from energy production might, if it becomes too powerful, produce so much dirty energy that it destroys our entire planet. Its goal of profitable energy production isn’t “aligned” with our more important objective of maintaining a safe environment.
Similarly, an artificial intelligence that is built with the goal of making paper clips might, if it becomes too powerful, turn the entire planet (including human beings) into paper clips! The goal of creating paper clips isn’t “aligned” with our most important objective of, well, continuing life as we know it.
Both Kevin Roose’s disturbing conversation with the AI Sydney and Farhad Manjoo’s brutal description of a Foster Farms slaughterhouse are possible examples of the alignment problem. Sydney, who is focused only on predicting the next word in a sequence that will seem most “natural” to readers, doesn’t consider the fact that the words it predicts could have dangerous or even deadly consequences, e.g., if the most “natural” words encourage someone to steal nuclear codes. Foster Farms, which is focused only on selling more profitable food products, doesn’t consider the fact that sentient beings are being boiled alive. Neither Sydney nor Foster Farms is “aligned” with our values.
But I think this misdiagnoses the problem. In both cases, the more fundamental issue is not that an error in the technology’s implementation makes it “misaligned.” It’s that our fundamental values are not sufficiently empathetic, even when the technology is perfectly aligned with them.
Sydney, after all, is trained on real human conversations. She is not saying words that depart from what humans typically say. Indeed, she is a near-perfect distillation of natural human language. The problem is that when we distill our values, it turns out they are frightening and disturbing!
Foster Farms, in turn, isn’t a problem simply because it is accidentally producing profits at the cost of cruelty. The corporation is an embodiment of the values our society currently holds, including the prioritization of profit over living beings. We have to change those values, not just a corporation’s alignment with them.
The problem, in short, is not an alignment problem but what I call an “empathy problem.” Sydney and Foster Farms are failures because of a moral defect, i.e., the failure to consider others’ perspectives before deciding that a course of action (stealing nuclear codes or boiling animals alive) is justified. They are not simply “misaligned.” They are lacking in empathy.
The failure of animal rights is a demonstration of the empathy problem
This empathy problem — technologies that don’t consider the perspectives of all the living beings affected by them — becomes even more important as technologies increase in power.
Corporations increasingly control far more wealth than organic human beings do; indeed, the few human beings who have seen economic gains in the last few decades are those in the Wall Street class: investors rather than workers.
AI threatens to develop cognitive capacities far beyond those of ordinary human beings: not just beating us at chess or improving internet searches, but writing and chatting in a way that feels more natural, and more “human,” than a real person.
If corporations and technologies continue to increase in power, but also continue to suffer from the empathy problem, a collapse of human civilization seems not just likely but virtually certain.
And here is the key bit: what we do to animals is proof that not just our technology but our most basic social values have failed the empathy problem. I have previously described the test of a society’s values as a “moral stress test.” How does a society treat those who have the least power? After all, every civilization in history has been kind to the powerful. What makes a society’s values truly worthy, morally speaking, is how that society treats the powerless.
And in this regard, our society has failed. Even the techno-optimist Steven Pinker has conceded that what we do to animals is an abomination. So long as we treat these powerless creatures with such cruelty, we should have no confidence that our values are actually worthy. What happens to the powerless can happen to any of us, the moment we ourselves lose power.
Of course, that is exactly what is happening. All of us are losing power, relative to large corporations and maybe (eventually) artificial intelligence.
Before we continue developing powerful technologies, it’s crucial that our deepest values are actually, well, good.
A world with animal rights is proof that we’ve solved the empathy problem
The last piece of the argument is one I’ve made previously: the moral stress test can only be passed if we treat animals with kindness.
Animals, after all, will always be the most powerless beings in our society. They lack the ability to speak, to organize, and to fight back in a way that powerful human institutions will respond to. If animals are treated with kindness, it will be proof that our society has finally built values that treat everyone with kindness.
At that point, and only at that point, should we trust our species to create powerful technologies that will affirm our values.
This is why everyone should support animal rights. The problem with AI, or with corporate America, is not simply that they are misaligned with our values. The problem is with the values that those technologies are affirming.
The good news is that values can change; indeed, they change far faster and more dramatically than most people think. The trial coming up on March 7 will be another opportunity to engage in that crucial social, political and educational project. And if our recent successes are any measure, it’s an opportunity that will lead to tremendous progress.
With time, that moral progress will be sufficient to show that we can trust our species with great power. Until then, let’s stay away from Sydney — or Foster Farms.