
Sam Altman and the cult of effective altruism

How did such a mad, apocalyptic ideology gain so much influence?

Andrew Orlowski


The effective altruism (EA) movement has emerged as the loser from its first high-profile brush with reality this week.

OpenAI, the quasi-NGO behind ChatGPT, was founded on the principles of EA and its board consists of EA enthusiasts. Last Friday, OpenAI’s board abruptly dismissed its superstar CEO and co-founder, Sam Altman, without much explanation. Owing to the bizarre structure of OpenAI, the board had the power to fire the CEO and even to nuke the firm completely, should it ever stray from EA’s principles.

However, Altman’s removal triggered a revolt among staff. By Monday night, almost all of OpenAI’s 700-odd employees had signed a pledge to quit unless the board resigned and Altman was brought back. A few days later, he was duly reinstated and most of the old, mutinous board members were removed.

After the release of ChatGPT, OpenAI had become the hottest ticket in town and Altman a superstar. With staff, investors and the CEO on one side, and a board committed only to serving the EA cult on the other, Altman’s reinstatement was more or less inevitable.

The attempted coup at OpenAI was a rare moment in the spotlight for the EA movement. Although it failed to oust Altman, this strange intellectual cult does hold a great deal of power. Thanks to the sheer gravitational pull of the money behind it, it has managed to bend technology policy across the West to its own bizarre preoccupations. EA has been able to fund dozens of political advisers in the US Senate, UK government and the UK Labour Party.

So what is EA? Its philosophy is grimly utilitarian and consequentialist. EA supporters claim to want to ‘find the best ways to help others, and put them into practice’. But that isn’t really the unique thing about it.


Instead, imagine a cross between QAnon, the conspiracy cult, and Mensa, the social society for ‘bright people’, as it bills itself today.

On the surface, EA is a high-minded intellectual and charitable endeavour. But it’s also a fiercely competitive status-signalling exercise. EA supporters want to outbid each other to be seen doing the most ‘good’. That might sound anodyne, until you dig into the details of what these cultists believe.

Both the QAnon and EA cults are obsessed with the apocalypse. Effective altruists compete to posit wildly imaginative doomsday scenarios, invariably involving the arrival of a Terminator-style artificial intelligence, which they ludicrously claim could be about to wipe out humanity. For their part, QAnon followers are confident that mass arrests of paedophile politicians are imminent, followed by their incarceration or execution. In each case, the apostles hope to be vindicated for both their faith and foresight. And the more work you put in, and the more confident you come across in your mad predictions, the higher your status among your peers.

It really is no overstatement to call EA a cult. As the Wall Street Journal notes, some of its followers live together or have started businesses together. Holden Karnofsky, co-founder of the biggest EA grant-making operation, Open Philanthropy, was married in an EA-themed wedding to Daniela Amodei, an OpenAI employee who later decamped to co-found a rival organisation, Anthropic.

Culty groups like this soon reach the limits of sanity and sail confidently beyond them. But where QAnon is widely understood to be deranged, the implausible apocalyptic fantasies of effective altruists are taken very seriously by policymakers. Politicians, civil servants and think-tanks are dancing to their mad tunes.

Effective altruism would have remained deservedly obscure but for its billionaire promoters. Crypto billionaire Sam Bankman-Fried donated over $190 million to projects, institutions and individuals, including the University of Oxford’s Future of Humanity Institute. He is now awaiting sentencing for one of the largest financial frauds in US history. After his arrest, other billionaires were around to take up the slack, chiefly Facebook co-founder Dustin Moskovitz. As Vox notes, EA luminary Ben Todd estimates that EA-aligned donors have $46 billion at their disposal, of which one per cent is given to EA projects every year. That’s an extraordinary amount.

Causes championed by effective altruists include veganism and even welfare rights for insects. British users of X/Twitter may be familiar with the daft concept of ‘street votes’ – an EA-approved policy pushed by right-wing think-tankers with the aim of making suburban streets denser. But the EA cult’s favourite cause is undoubtedly preventing the rise of Terminator AI.

As Laurie Clarke at Politico recently reported, effective altruists were able to hijack the UK-hosted Artificial Intelligence Safety Summit. ‘Key government advisers sympathetic to the movement’s concerns, combined with [UK prime minister Rishi Sunak’s] close contact with leaders of the AI labs – which have longstanding ties to the movement – have helped push “existential risk” right up the UK’s policy agenda’, Clarke noted. In truth, the likelihood of a dysfunctional product like ChatGPT posing an existential risk to humanity is nil.

Living in an online, self-created fantasy world has not only encouraged EAs to lose their grip on reality. They have also taken flight from humanist values and basic morality. EA co-founder William MacAskill’s book, What We Owe the Future, advanced the idea that the welfare of future, unborn humans must be considered in every utilitarian calculation. This has become a governing principle for the cultists, one that often translates into an astonishing indifference to human suffering in the here and now. ‘Catastrophes that have historically caused human suffering are the mere ripples on the surface of the sea of life compared to existential risks like AI’, argues Nick Bostrom, founder of the SBF-funded Future of Humanity Institute.

‘You can use this philosophy to justify almost any terrible act’, writes Ted Gioia, a music critic who gave up studying at Oxford University to escape the EA cult. ‘Ends not only justify the means nowadays, but obliterate them.’

Terminator AI is a science-fiction scenario. It is astonishing that our political and administrative class did not dismiss it as such. Instead, they just fell into line with the EA fanboys. Just how inattentive and vacuous must they be, to have followed a cult all the way to its fantasy armageddon?

Andrew Orlowski is a weekly columnist at the Telegraph. Visit his website here. Follow him on Twitter: @AndrewOrlowski.

Picture by: Getty.
