Summary: An intelligence explosion is the idea that a machine more intelligent than any human will quickly design a machine more intelligent than itself, and so on, until the intelligence of artificial systems rapidly and greatly outstrips that of humanity. Is this hard takeoff scenario even possible? If so, is it realistic? And is there any way to encourage future super-intelligent beings to be friendly?