What if the singularity does not happen?

Panoptic


Given the title of my talk, I should define and briefly discuss what I mean by the Technological Singularity:

It seems plausible that with technology we can, in the fairly near future, create (or become) creatures who surpass humans in every intellectual and creative dimension. Events beyond this event - call it the Technological Singularity - are as unimaginable to us as opera is to a flatworm.

The preceding sentence, almost by definition, makes long-term thinking impractical in a Singularity future.

However, maybe the Singularity won't happen, in which case planning beyond the next fifty years could have great practical importance. In any case, a good science-fiction writer (or a good scenario planner) should always be considering alternative outcomes.

I should add that the alternatives I discuss tonight also assume that faster-than-light space travel is never invented! [An important note for those surfing this talk out of context: I still regard the Singularity as the most likely non-catastrophic outcome for our near future.]

There are many plausible catastrophic scenarios (see Martin Rees's Our Final Hour), but tonight I'll try to look at non-singular futures that might still be survivable.

The idea of grumpy old nerds complaining about the lack of a singularity tickles my funny bone. But there are some solid thoughts in the transcript.

On 'IC Talk': Seyit Karga, Ultimate

Character Profile

DivineWrath

I think it is a perfectly valid point. We don't actually know when the technological singularity will happen. Humanity has a terrible track record at predicting technological progress. To be practical, humanity should make plans for the case where it does not happen in the foreseeable future (just as it should make plans for the case where it does not happen in our favor). As for why it might not happen: maybe climate change will hit the Earth hard in the near future, halting technological progress for the next few centuries. Or World War III could happen.

MDFification

Good find with this paper! I find it an interesting challenge to explain away the singularity, although my ideas are probably less practical than his and usually have something to do with the nature of intelligence. I don't think it's necessary for technological progress to stop - maybe we just can't quite make it over the hump of increasing intelligence beyond a certain point. Maybe more complex minds are less stable - there's plenty of evidence that a whole swathe of personality and mental disorders disproportionately affects the more intelligent. Maybe there's a point in the singularity where minds fail faster than they evolve, at which point we've well and truly plateaued.

In the future, would my job be called anthropology, transanthropology or memetic research?

Chrontius
Slow Singularities

It's more a question of: what if the singularity is an era, like the Space Age, rather than a brief, violent event?

I'm personally in favor of a slow singularity, to be entirely honest. In hindsight it will be considered something of a golden age, and along the way we won't be quite so tempted to blow ourselves to radioactive ash.

The question is only one of scope and schedule: existing technology already demonstrates that we can improve humanity, so the only questions that remain are how, how much, and how fast.

ringringlingling
An even more disturbing thought...

What if stupidity is a survival trait?

ringringlingling
The Beast

Everything we take for granted about being feral becomes a survival trait in a crisis. I wonder just how soft people are going to be after the singularity. We are already nearly sessile. How much longer before technology robs us of our motion completely?

At some point the ability to pick up a club trumps being able to microwave tacos with your mind.

Chrontius

I, for one, would prefer a very different body, given the choice. Meaner, stronger, and more mobile.

Also, more reliable.

I imagine that after the singularity, a lot of people will go soft, and a shockingly large minority will be complete beasts, even if they're also soft. Once given the necessary motivation, they'll sharpen up nicely...