If Anyone Builds It, Everyone Dies

aw_product_id: 
41303569945
merchant_image_url: 
merchant_category: 
Books
search_price: 
22.00
book_author_name: 
Eliezer Yudkowsky
book_type: 
Hardback
publisher: 
Vintage Publishing
published_date: 
18/09/2025
isbn: 
9781847928924
Merchant Product Cat path: 
Books > Science, Technology & Medicine > Technology, engineering & agriculture > Technology & engineering: general
specifications: 
Eliezer Yudkowsky|Hardback|Vintage Publishing|18/09/2025
Merchant Product Id: 
9781847928924
Book Description: 
The founder of the field of AI risk explains why superintelligent AI is a global suicide bomb and why we must halt development immediately. AI is the greatest threat to our existence that we have ever faced. The technology may be complex, but the facts are simple. We are currently on a path to build superintelligent AI. When we do, it will be vastly more powerful than us. Whether it 'thinks' or 'feels' is irrelevant: it will have objectives, and they will be completely different from ours. And regardless of how we train it, even the slightest deviation from human goals will be catastrophic for our species - meaning extinction. Precisely how this happens is unknowable, but what we do know is that when it happens, it will happen incredibly fast, and however it happens, all paths lead to the same conclusion: superintelligent AI is a global suicide bomb, the labs developing it have no adequate plan or set of policies for tackling this issue, and we will not get a second chance. From the leading thinkers in the field of AI risk, If Anyone Builds It, Everyone Dies explains with terrifying clarity why, in the race to build superintelligent AI, the only winning move for our species is not to play.