Is "the singularity" crazy?
As with hardware above, I would expect these "shit hits the fan" moments to happen before fully human-level AI.
Some technology pioneers, innovators, developers, business and policy leaders, researchers, and activists answered this question in a summer canvassing of experts. When Stuart Russell, author of the standard AI textbook, mentioned this during his Puerto Rico talk, the audience laughed loudly.
Companies and militaries aren't stupid: they would invest massively in an AI with almost-human intelligence. So why do most people, including many of society's elites, ignore strong AI as a serious issue?
Why is the subject suddenly in the headlines?
In my opinion, the hard part of AGI, or at least the part we haven't made as much progress on, is how to hook together various narrow-AI modules and abilities into a more generally intelligent agent: one that can figure out what abilities to deploy in various contexts in pursuit of higher-level goals.
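To make the "hooking together" problem concrete, here is a toy sketch, not a real AGI design. The module names, the task format, and the dispatch rule are all hypothetical illustrations; the point is that the dispatch decision is hard-coded here, whereas choosing which ability to deploy for a novel goal is exactly the part we don't know how to build.

```python
# Toy illustration: an "agent" that routes sub-tasks to narrow,
# special-purpose modules. Everything here is a made-up example.

def arithmetic_module(task):
    # Narrow skill: evaluate a simple two-operand expression.
    a, op, b = task["args"]
    return a + b if op == "+" else a * b

def translation_module(task):
    # Narrow skill: a stand-in "translator" with a tiny lookup table.
    lexicon = {"hola": "hello", "mundo": "world"}
    return " ".join(lexicon.get(word, word) for word in task["args"])

MODULES = {
    "arithmetic": arithmetic_module,
    "translation": translation_module,
}

def general_agent(tasks):
    """Dispatch each sub-task to the narrow module suited to it.

    The hard, unsolved step is deciding *which* ability a novel
    goal calls for; here that decision is trivially pre-labeled
    via the "kind" tag, which real open-ended tasks don't come with.
    """
    return [MODULES[task["kind"]](task) for task in tasks]

results = general_agent([
    {"kind": "arithmetic", "args": (2, "+", 3)},
    {"kind": "translation", "args": ["hola", "mundo"]},
])
print(results)  # → [5, 'hello world']
```

The sketch works only because each sub-task arrives pre-tagged with the module that should handle it; remove that tag and the dispatcher has no way to choose, which is a small-scale version of the gap the text describes.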
On the other hand, one argument in favor of modeling growth with differential equations is that the economy has followed fairly consistent exponential trends since humans evolved, though the exponential growth rate of today's economy remains small relative to what we typically imagine from an "intelligence explosion".
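The differential-equation view can be made concrete with a minimal sketch. The equation dY/dt = rY has the closed-form solution Y(t) = Y0·e^(rt), and its doubling time is ln(2)/r. The two growth rates below are illustrative assumptions only (a roughly economy-like ~3%/yr versus a hypothetical explosion-like 100%/yr), not data from the text.

```python
import math

def grow(y0, r, t):
    """Exponential growth after t years at continuous rate r:
    the solution Y(t) = Y0 * exp(r * t) of dY/dt = r * Y."""
    return y0 * math.exp(r * t)

def doubling_time(r):
    """Time for Y to double under dY/dt = r * Y: ln(2) / r."""
    return math.log(2) / r

# Illustrative rates (assumptions, not measurements):
print(round(doubling_time(0.03), 1))  # ~3%/yr economy-like: ≈ 23.1 years
print(round(doubling_time(1.0), 2))   # 100%/yr explosion-like: ≈ 0.69 years
```

The same functional form covers both regimes; what separates "business as usual" from an "intelligence explosion" in this picture is just how large r is, and whether r itself starts growing.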