In Defense of Sam Bankman-Fried
What we can all learn from the collapse of the vegan billionaire's empire.
It seems everyone in the world is attacking FTX and its vegan founder, Sam Bankman-Fried. But what if the collapse of the FTX crypto exchange is not the result of Sam’s personal flaws, but of deeper collective failures? That’s the argument I want to make in this blog: three social forces — tribalism, favoring ideology over experience/evidence, and favoring theoretical impact over intrinsic meaning — were at the root of FTX’s collapse. And these forces are not Sam’s failures. They are all of ours. The good news is that if we can all learn from what happened, we can resist those forces and help build a world where catastrophes like the FTX collapse — and other, even graver injustices — don’t happen.
In June 2014, I got an unexpected Facebook message from an acquaintance named Sam Bankman-Fried, or, as he is now known, “SBF.” At the time, Sam was not yet SBF, the vegan crypto billionaire. He was just Sam, a recent MIT grad who was deeply committed to animal protection. I had conversed with him, off and on, for many years in various forums for Effective Altruism, or “EA”, and he was reaching out to me, surprisingly, for advice about animal activism.
It was surprising because I was an unwelcome person in EA communities. While I had many of the credentials people respect within EA — graduate study in economics at the University of Chicago and MIT; a former faculty position at Northwestern Law — my reputation within the EA community took a major hit when I founded the animal rights network Direct Action Everywhere (DxE). DxE, which focused on building a grassroots movement of nonviolent protest, was derided by many of the most powerful figures within EA. This included the single most prominent animal EA, Nick Cooney, who regularly told EAs that my work was hurting the cause of animal protection. (I gave a talk at a conference in 2013, titled The Science of Social Change, responding to some of Nick’s critiques. You can watch it here.)
While personal animosity with Nick was probably part of the reason for this backlash — as a lawyer, I had briefly represented, on a pro bono basis, a woman who had accused Nick of sexual assault back in 2011 — there was also a bad cultural fit between DxE and EA. EAs were generally soft-spoken, hyper-rational, and almost Spockian in their lack of emotional expression. DxE, in contrast, was inspiring people to be loud and proud, to wear their emotions on their sleeves — and most EAs found DxE (and therefore me) very disconcerting. I was the subject of many weird rumors, whispered by one EA to the next, and it quickly turned into a social cascade. By 2014, I had been effectively ousted from the EA movement, with many of my posts prohibited in EA Facebook groups, and one prominent EA even publicly accusing me of being a likely mole for the meat industry. “Why else would he support such ridiculous tactics?” the person wrote.
Sam’s message to me, then, was a little like a member of a church reaching out to someone who had been excommunicated from the faith. And, honestly, it impressed me.
That willingness to talk to someone with a poor reputation, given Sam’s high standing in the EA community, was not the only thing that impressed me about him. The fact that he was primarily interested in animals — hardly a popular cause in the mainstream at MIT or in EA — showed me that this was someone trying to do the right thing, even when the right thing was unpopular. Sam was also humble. In a wide-ranging text conversation that covered everything from cultural norms in finance to the power of political protest, Sam struck me as earnest, well-meaning, and (above all) open to learning. He was a kid — very bright, and very motivated — who just wanted to help animals, however he could. Indeed, he was going into finance to do exactly that. He had previously been advised by the philosopher and EA leader Will MacAskill that the best thing to do was to “earn to give”: go into a high-paying profession and earn money to give to effective charities. I gave him the opposite advice: consider working directly for animals, I said, both because the movement was in need of highly committed people, and because the culture of high-earning professions could be corrupting.
But Sam didn’t take my advice.
The rest, as they say, is history. Sam went into high finance, then jumped into cryptocurrency and built a fortune of billions. By all accounts, he gave a massive amount of this money away, and was widely applauded for doing so. But the company on which those billions were built, FTX, collapsed earlier this month and triggered a nationwide spasm of finger-pointing. There are few stories as irresistible as a fallen hero. Even former friends and mentors of Sam, including MacAskill himself, have publicly denounced him.
What has been missing from most of these conversations, however, is clear evidence — screenshots, documents, etc. — that Sam did something unlawful or even morally wrong. It’s clear he made bad calls, at least with the benefit of hindsight.1 For example, the lack of so-called corporate controls (i.e., policies to ensure responsible decision-making as an organization grows) is one likely culprit in FTX’s collapse. The left hand lit a match (lending out customer funds for "safe" investment), not realizing that the right hand was filling the room with gas (promising to return customer funds on a moment’s notice). The lack of such controls was certainly Sam’s responsibility, as CEO of FTX, but it is a mistake I have seen in literally every startup I have ever advised (which numbers a couple dozen, as I was a venture lawyer from 2012-2017). And there is a big difference between intentional misconduct — where a perpetrator is seeking selfish benefit at the cost of harming others — and a mistake made with good intent.
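To make the match-and-gas problem concrete, here is a minimal sketch in Python, with entirely hypothetical numbers, of the liquidity mismatch: deposits redeemable on demand, backed in part by assets that cannot be recalled on demand.

```python
# Illustrative sketch (all numbers hypothetical): why lending out customer
# deposits is incompatible with promising instant withdrawals.

customer_deposits = 100   # $100M owed back to customers on demand
lent_to_affiliate = 80    # $80M lent out for "safe" investment
liquid_reserves = customer_deposits - lent_to_affiliate  # $20M on hand

withdrawal_requests = 35  # $35M requested during a panic

shortfall = max(0, withdrawal_requests - liquid_reserves)
print(f"Liquid reserves: ${liquid_reserves}M")
print(f"Shortfall when customers ask for ${withdrawal_requests}M: ${shortfall}M")
# Even if the $80M loan is perfectly sound, it cannot be recalled fast
# enough; the liquidity mismatch, not theft, creates the crisis.
```

The point of the sketch is that insolvency-by-illiquidity requires no misappropriation at all, only a promise of instant redemption sitting on top of assets that are slow to unwind.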
I would be surprised if any such evidence of misconduct by Sam comes out at this point, given the massive incentive for anyone who had such evidence to publicly reveal it. (Moreover, recent reports suggest Sam was trying to save the company — and his customers’ money — even when everyone else concluded that everything had been lost. That is not the sign of someone committing fraud but of someone trying to correct a mistake.)
But the bigger point I want to make has nothing to do with the evidence of misconduct but rather with root causes. And regardless of whether Sam simply made mistakes, or committed fraud, the scapegoating of Sam Bankman-Fried allows us to ignore the fact that all of us have contributed to the dynamics that caused FTX’s collapse.
The first force is tribalism, i.e., favoring people or organizations simply because they are part of “our group.” This favored group could be a political party. It could be a race or religion. It could even be veganism or animal rights. But regardless of the nature of the group identity, it is a very human tendency. And what seems unquestionable, in all the analysis of FTX’s collapse, is that it was caused in part by Sam surrounding himself only with people in his tribe, i.e., fellow EAs. His executive team, employees, and social circle were composed entirely of adherents of Effective Altruism. The problem is that, when your group is caught up in a view that is factually or even morally wrong, it is very hard to see the error, because you are all part of the same tribe. That is likely what happened at FTX. They were all part of the same group and therefore couldn’t see the blind spots in their strategies. Yet our society generally, and EA specifically, are pushing us even further toward tribalism.
We are told that those who disagree with us are not just interlocutors or adversaries; they are enemies to be defeated. Efforts to bridge gaps between warring groups are often seen as, not just ineffective, but actively traitorous. And the result is that each tribe can stay isolated from others — and stuck in its own flawed beliefs, even if these beliefs will lead to disastrous outcomes. We need to resist this.
The EA movement’s focus on “selectivity” and “cultural fit” taught Sam the opposite lesson: people who are different are stupid or even dangerous; you should only trust the members of your tribe. When a group of animal advocates applied to present at EA Global 2018 on the subject of the California fur ban — plausibly one of the most effective interventions to help animals in the last decade — I was shocked that our conversation with the conference administrators had almost nothing to do with the evidence of impact, or even our way of thinking about effective advocacy, and everything to do with where we “fit” in the community of EA.
“What groups do you work with?” we were asked. “And are they committed to EA?”
I responded that the groups involved in the fur ban campaign were not formally “EA groups” but that many of the campaign’s leaders, including me, were deeply committed to EA principles.
The administrators denied us a presence at the conference.
This was the culture that was formative for Sam, and which apparently led him to surround himself with only people who identified as EA. This is unhelpful if we want to make good choices. If he had a co-worker or good friend who was a crypto skeptic, an avowed anti-capitalist, or heck, even someone who was just not an EA, perhaps he would not have made the decisions he made. We should all strive to be more like Sam in 2014, when he reached out to an EA apostate like me, and less like Sam in 2022, who lived in an EA bubble. To test whether you are doing that, ask yourself: do I regularly converse with people whose identities are strange to me, and maybe even a little jarring? If the answer is no, go out there and try it! It’s not as scary as you might think.
The second force is favoring ideology over experience or evidence. Ezra Klein, among many others, has written about how ideology doesn’t just motivate us to defend our positions; it actively distorts the way we see the world. Yet politics, policy, and even individual decision-making are increasingly motivated by ideology rather than experience or evidence. This happens very often in the movements I am a part of, such as veganism. Whenever evidence comes out about the environmental benefits of veganism, we all cheer. But when evidence comes out about the health benefits of a high protein diet, we dismiss it out of hand. And I think this tendency — to see the world through an ideological rather than evidentiary lens — is part of the reason for FTX’s collapse.
You see, much of FTX’s profit, and the profit of its sister company Alameda, was based on an ideological commitment to certain financial ideas — the liquidity of free markets, the promise of so-called risk-free arbitrage — that were not well grounded in evidence. Specifically, Sam saw a theoretical trading opportunity — e.g., a small discrepancy between the price of Bitcoin in Japan and in the United States — and thought he could make a near-limitless amount of money if he could just borrow enough to capitalize on that and similar trades, indefinitely.
The idea, however, was grounded in a particular ideology and not in the lessons of history — or in the experiences of wiser hands who have seen how such supposedly risk-free trades can cause catastrophic collapse, when you borrow enormous amounts of money to fund them. In particular, if the people you are borrowing from all suddenly ask for their money to be returned, you will quickly go bankrupt. Sam was almost certainly aware of this risk, but his ideological commitment to the strategy — and perhaps to the deeper philosophy of earning to give — may have blinded him to its risks.
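To illustrate the mechanics, here is a minimal sketch in Python, with entirely hypothetical numbers: the arbitrage itself is profitable, and leverage multiplies the return, but if lenders recall their money before the position unwinds, a forced fire sale can wipe out the equity.

```python
# Illustrative sketch (all numbers hypothetical): a "risk-free" arbitrage
# funded with borrowed money, and what happens when lenders recall it.

def arbitrage_profit(spread, capital):
    """Profit from buying on the cheap exchange and selling on the
    expensive one, given the price gap (spread) and capital deployed."""
    return capital * spread

own_capital = 10          # $10M of the firm's own money
borrowed = 90             # $90M borrowed from lenders
spread = 0.01             # 1% price gap between exchanges

# Leverage multiplies the return on the firm's own capital...
profit = arbitrage_profit(spread, own_capital + borrowed)
print(f"Profit: ${profit:.2f}M on ${own_capital}M of equity "
      f"({profit / own_capital:.0%} return)")

# ...but the position is illiquid while the trade settles. If lenders
# recall their $90M before the position can be unwound, assets must be
# sold at a discount (a "fire sale"), and the equity can be wiped out.
fire_sale_haircut = 0.15  # forced to sell at 15% below fair value
assets = own_capital + borrowed + profit
recovered = assets * (1 - fire_sale_haircut)
equity_left = recovered - borrowed
print(f"Equity left after a forced fire sale: ${equity_left:.1f}M")
```

Under these (made-up) numbers, a trade that earns a 10% return on equity leaves the firm with negative equity after a recall, which is the sense in which "risk-free" arbitrage plus leverage is anything but risk-free.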
The EA movement, while nominally focused on evidence-based approaches, has gone through a similar shift, from the empirical to the theoretical. Its original focus on global poverty — a well-documented problem — has transitioned rapidly to speculative, theoretical, and ideological threats, such as stopping a future hostile artificial intelligence. Fighting evil AI is fun and cool and puts us in a space where there’s an awful lot of money, given the millions that AI engineers are paid. But it’s also a tech bro’s fantasy; there’s no real evidence, yet, that AI or something like AI is a threat to human well-being. And the overwhelming focus on AI is a demonstration that EA has become deeply anti-empirical.
I’ve seen this firsthand in the realm of animal advocacy. Theoretically sound interventions, such as cage-free eggs, have received massive investments — and distorted the entire landscape of animal rights work as a result — in part because most funders hadn’t really bothered to look at things in the real world.2 In particular, no one checked whether the theory behind cage-free eggs was actually being lived out in reality, e.g., by actually visiting a cage-free factory farm.
Sam was born of this culture. So it’s not surprising to me that his company focused on theoretical strategies for doing good — exploiting supposedly risk-free trades — rather than evidence-based ones. Nor is it surprising that his team lacked anyone with experience from the 2008 financial crisis who could warn him that risk-free investment strategies are rarely actually risk-free.
But, again, Sam was not unique in falling victim to these forces. All of us, at times, are probably doing the same thing. So we should ask ourselves: when I say I know something is right — something that aligns with my ideology or identity — how often do I actually do my homework and look at the evidence? And how often do I ask someone with actual experience to assess whether I’m right?
The third and most important force behind the collapse of FTX, however, is favoring theoretical impact over intrinsic meaning. Having been surrounded by high-performing professionals for most of my life, I’m stunned by how few people spend time working on things that actually matter — even to them. We have all been taught that our lives are basically transactional.
“There are things you have to do — work, go to school, make some money — to achieve the things that actually matter,” we’re told.
The problem with this transactional mentality is that the transaction (when repeated day after day, week after week) changes us, to the point that we lose sight of why we were engaging in the transaction in the first place. Transactional thinking turns our entire life into a transaction, and we — and the world — suffer as a result.
Sam was an example of this. He was advised by Will MacAskill in 2012 that he should do the high-status, high-wealth thing — go work for a bank — so he could have a greater impact on the world. It was a transactional mentality toward life, and it led him down the path of seeing everything else as transactional too, including his customers’ funds (and the devastating risk posed by using those funds for trades).
But imagine Sam had been given different advice, similar to the advice that I received when I was a young, mathematically inclined student 10 years earlier. I had an offer from McKinsey, a prestigious consulting firm, and felt confident I could get similar offers from Goldman Sachs and other well-paying financial institutions. While earning to give was not yet a thing, I rationalized my pursuit of such opportunities by saying to myself, “You’ll have a greater impact. You’ll have more money and influence at McKinsey than by doing charitable work.”
But my mentor in college, a young economics professor named Mark Duggan, told me I had it all wrong.
“Don’t chase money or prestige. Do something that matters,” Mark told me. “That’s what defines people who truly have a positive impact.”
Mark lived this in his own life. He worked at McKinsey but left it to study how to improve health care for the poor. And his charitable work was not purely academic. I could see it in the way he interacted with the people in his life. He knew the names of the janitors at the University of Chicago, and he always thanked them for their work.
Of course, not all of us will find our work intrinsically meaningful on a moment-to-moment basis. But we can all find work that aligns with our values, even if our specific task isn’t always the most exciting or meaningful. In this way, even a janitor can feel inspired by their work, if it’s for an organization with good values. We can all, as my friend Steve Roggenbuck has hilariously put it, make something beautiful before we die. It’s not just good for the world. It’s good for our personal growth, cultivating our empathy, kindness, and hope.
Sam didn’t listen to this advice. And in reading some of his recent interviews, he seems a very different person than the earnest kid I spoke to nearly 10 years ago. He talks about winning rather than being kind; he attacks the people around him as adversaries, rather than fellow sentient beings in pursuit of the same simple needs; and he has lost trust in the very notion of ethical work. That was the result of his choice to live a transactional life. It transformed him for the worse.
But it’s transforming us, too. We all have one life to live. So ask yourself: are you working for things that are good and meaningful, or are you living your life as a transaction? Are you making something beautiful before you die?
Some notes and updates:
I’m changing my blogging schedule, and will now be publishing on Tuesdays and Thursdays rather than Tuesdays and Fridays. The Thursday blog will usually be short — just a description and a few key points from the latest podcast. (This week, I’m publishing a conversation with green crypto entrepreneur Helena Merk.) I still want to get your feedback on what to write about, though, so please take this survey and let me know what you’d like me to write about next.
Upcoming Events. This Saturday, December 3, I’ll be running the Open Rescue Experience in San Francisco. Space is limited, though, and by the time you read this email the slots may have already been filled. But apply to attend if you’re interested, and even if you don’t get in this time, we’ll try to include you on the roster for the next event. Next week, we have two great events. First, we’re organizing a meetup for vegans in tech on the evening of Saturday, Dec 10, with sundaes for everyone! Second, on the morning of Sunday, Dec 11, I’m giving a breakfast talk titled, How Factory Farming Endangers the Future. Food will be served! Hope to see you at some or all of these events!
Rethinking the podcast. I’m taking on some massive new projects in the coming months, and I’m debating what I can cut from my schedule to make sure those projects go well. The podcast has been lots of fun, but it reaches far fewer people than the blog (or social media videos). So we are thinking about whether to continue it. If you’re a huge fan of the podcast, and it’s had an impact on you, let us know! It’ll help us assess whether it should continue.
It’s worth pointing out that many decisions are good decisions, even if they look bad in hindsight. This is the very nature of risk.
I want to shout out the Open Philanthropy Project for being one of the few EA organizations that actually did a deeper dive into cage-free eggs before committing funds. I still believe their decision was incorrect, and their ultimate conclusion probably motivated by confirmation bias. But I respect that they engaged with the evidence.