Singularity may not require AGI

Alan Tan
Towards Data Science
10 min read · Oct 13, 2020

Singularity, the point where AI can improve itself faster than humans can, has been a topic since the 1960s. I.J. Good speculated in 1965 about "an ultra-intelligent machine... that can far surpass all the intellectual activities of any man however clever".

James Barrat’s bestseller Our Final Invention brought this topic from academics and nerds to mainstream concern (or, shall I say, fear?).

Image by Author

The late, great Stephen Hawking and the ever-controversial Elon Musk have been among the most vocal in warning that AI could be the worst thing to happen to humanity in the history of our civilization. Others, like Bill Gates, Andrew Ng, and some prominent AI researchers, argue against AI being an imminent threat.

However, for those who read carefully, most of the disagreement with Hawking was not really saying that AI will never be a threat, or that Singularity will never come. Most of the arguments were about timing. For example, Andrew Ng famously used "Fearing a rise of killer robots is like worrying about overpopulation on Mars" as an analogy to AI's singularity threat. From an AI researcher's perspective, that is completely correct. But if you consider how little time it took us to add 2 billion humans to this earth — just the past two decades — you might agree that no matter how remote the threat of overpopulating Mars is, it is actually possible (Mars is…


