History, loosely speaking, is the written record of what and how people did things in past times—times that have slipped out of our personal memories.

After the already-here 90%, another 9% of the future a decade hence used to be easily predictable. You look at trends dictated by physical limits, such as Moore's Law, and you look at Intel's road map, and you use a bit of creative extrapolation, and you won't go too far wrong.
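To make that kind of "creative extrapolation" concrete, here is a minimal sketch of a Moore's-Law-style projection: compound a fixed doubling period forward a decade. The baseline year, the baseline transistor count, and the two-year doubling cadence are illustrative assumptions, not figures from the text.

    # Minimal sketch of trend extrapolation from a doubling law.
    # All constants are illustrative assumptions, not sourced data.
    BASELINE_YEAR = 2017
    BASELINE_TRANSISTORS = 2e10   # assumed: ~20 billion transistors on a flagship die
    DOUBLING_PERIOD_YEARS = 2.0   # classic Moore's Law cadence (assumption)

    def extrapolate(year: int) -> float:
        """Project the transistor count for `year` by compounding the doubling trend."""
        elapsed = year - BASELINE_YEAR
        return BASELINE_TRANSISTORS * 2 ** (elapsed / DOUBLING_PERIOD_YEARS)

    for year in (2017, 2022, 2027):
        print(f"{year}: ~{extrapolate(year):.1e} transistors per chip")

Under these assumptions a decade of doubling turns 2e10 transistors into roughly 6.4e11—the point being that, with a stable physical trend, the arithmetic of prediction is almost mechanical.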
Then something happened, and the future began to change, increasingly rapidly, until we get to the present day, when things are moving so fast that it's barely possible to anticipate trends from month to month.
Ruling out the singularity

Some of you might assume that, as the author of books like "Singularity Sky" and "Accelerando", I attribute this to an impending technological singularity, to our development of self-improving artificial intelligence and mind uploading and the whole wish-list of transhumanist aspirations promoted by the likes of Ray Kurzweil. Unfortunately, this isn't the case. I think transhumanism is a warmed-over Christian heresy.
While its adherents tend to be vehement atheists, they can't quite escape from the history that gave rise to our current western civilization.
What we're getting, instead, is self-optimizing tools that defy human comprehension but are not, in fact, any more like our kind of intelligence than a Boeing 737 is like a seagull.
So I'm going to wash my hands of the singularity as an explanatory model without further ado—I'm one of those vehement atheists too—and try to come up with a better model for what's happening to us.
As an eminent computer scientist once remarked, computer science is no more about computers than astronomy is about building telescopes.