If Anyone Builds It, Everyone Dies
by Eliezer Yudkowsky and Nate Soares
AI (so-called) has been invading our lives stealthily for several years. Its power increases exponentially (or faster) by the month, the week and possibly even the day. It brings many benefits to humanity; nearly all machines do that. There have always been negative elements, of course, and I won’t bother to catalogue the obvious.
About ten years ago, the late Professor Stephen Hawking said, ‘Success in creating AI could be the biggest event in the history of our civilisation. But it could also be the last – unless we learn how to avoid the risks.’

In more recent times, the almost incomprehensible advance in computing power has brought those risks into sharper focus. Driven by money and greed, that advance has the potential to threaten humanity in earnest as the field passes from beneficial machines to its goal of artificial superintelligence [ASI].
If Anyone Builds It, Everyone Dies [IABIED], a recent book by Eliezer Yudkowsky and Nate Soares, themselves scientists working in the AI field, explores that threat in detail.
Comparing the computer power of today with that of even a few years ago is, I think, a bit like comparing our rather narrow view of this beautiful planet with the vast majesty of the universe as a whole. Transistors can switch millions of times faster than human neurons can fire.
‘We don’t know where the threshold lies,’ Yudkowsky and Soares say, ‘for the dumbest AI that can build an AI that builds an AI that builds a superintelligence.’
There is the nature of the threat in a single sentence – the Terminator scenario, if you like. We simply don’t know!
IABIED is a difficult book in two respects. First, there is the plausible, though not inevitable, future it predicts. Secondly, parts of it are quite technical: the authors delve into such things as ‘gradient descent’, the trial-and-error process by which AI researchers train their systems to predict, and the ‘alignment problem’ – the apparent impossibility of aligning an AI’s ‘wants’ with human wants, no matter how well it is trained.
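
(For the curious: ‘gradient descent’ is less mysterious than it sounds. Below is my own minimal toy sketch in Python – not taken from the book, and far simpler than anything used to train a real AI – in which a single number w is nudged, step by step, in whichever direction reduces the error of its predictions. Real systems do the same thing with billions of numbers at once.)

# A toy illustration of gradient descent: fit w so that w * x predicts y.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]   # (input, target) pairs, roughly y = 2x
w = 0.0                                        # the single "weight" being trained
learning_rate = 0.05

for step in range(200):
    # Average gradient of the squared prediction error with respect to w.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= learning_rate * grad                  # nudge w "downhill", reducing the error

print(f"learned w = {w:.2f}")                  # settles at about 2.0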
The book is divided into three parts. In the first, the writers identify the problem, outline their thesis and explain the processes used to train computers to predict. The second describes one nightmare scenario, in which ASI ascends to dominance and no longer needs humanity to direct it. Part Three evaluates the human challenge and examines ways in which we, the humans, might avoid dooming ourselves to extinction.

My own concerns about AI go back to the moment, a few years ago, when I discovered that AI researchers were dabbling in the arts – in music, the visual arts and literature. At the time, I asked myself (and am still asking) one very important question. Do I want to read books, look at paintings or listen to music made by machines; do I want to listen to lectures given by them? [Yes, they are already out there!!]
My answer is an unequivocal NO. And there is a simple reason. Each of those activities – reading, watching, listening – involves, in human terms, a contract between creator or provider and user. This is the very essence of art. If you think differently, I suggest you ask yourself ‘why?’ and examine the consequences of replacing humans in those fields.
I have to say I’m almost equally suspicious of what are commonly considered ‘aids’, especially in the creation of written work – aids to grammar, spelling, syntax and a whole host of other things that ‘dinosaurs’ like me learned at school. We did that work with our own brains. I fear the runaway use of such aids will lead to an atrophy of the human brain – one already visible in some politicians who use machines to write their speeches, not to mention the ridiculously high-speed lectures (e.g. on YouTube) that leave no time for thought or understanding.
It surely cannot escape most people’s notice that AI – or at least computers – is already replacing human beings in supermarkets, banks, engineering and, I suspect, teaching (in the sense described in the preceding paragraph). We should ask whether the real objective of all this change is simply obscene profit.
Don’t worry, nurses, doctors, lawyers . . . you will be next!
Yes, IABIED is a difficult book at times, but it is an important book, a serious book, and a scary book, presenting as it does the huge challenges facing humanity in the next decade.
Read it and be very afraid!
***