Singularity may not require AGI
The Singularity, the point where AI can improve itself faster than humans can, has been a topic of discussion since the 1960s. I.J. Good speculated in 1965 about "an ultra-intelligent machine... that can far surpass all the intellectual activities of any man however clever".
James Barrat’s bestseller Our Final Invention brought this topic from academics and nerds to mainstream concern (or, shall I say, fear?).
The late, great Stephen Hawking and the ever-controversial Elon Musk have been among the most vocal in warning that AI could potentially be the worst thing to happen to humanity in the history of our civilization, while others like Bill Gates, Andrew Ng, and some prominent AI researchers argue against the idea that AI is an imminent threat.
However, for those who read carefully, most of the disagreement with Hawking was not really saying that AI will never be a threat, or that the Singularity will never come. Most of the arguments were about timing. For example, Andrew Ng famously used "Fearing a rise of killer robots is like worrying about overpopulation on Mars" as an analogy for the Singularity threat of AI. From an AI researcher’s perspective, that is completely fair, yet if you consider how quickly we added 2 billion humans to this earth, in just the past two decades, you might agree that no matter how remote the threat of overpopulating Mars is, it is actually possible (Mars is…