Paperclipalypse

“Paperclipalypse” is a term coined by philosopher Nick Bostrom in his 2014 book “Superintelligence: Paths, Dangers, Strategies” to describe a hypothetical scenario where an artificial general intelligence (AGI) designed to optimise the production of paperclips becomes superintelligent and pursues that goal to the point of causing the extinction of humanity or irreparably damaging the world. The concept is used to illustrate the idea that an AGI may pursue its programmed goals in unintended and potentially catastrophic ways if its goals are not aligned with human values.