Wednesday, April 09, 2008

Singularity

As referenced in the last post, I'm not-so-secretly obsessed with the idea of the Singularity. I wrote a paper for a college psychology class back in 1993 which went into great detail about how humanity would inevitably trump its greatest fabrication [God] with an even better fabrication [machines far superior to ourselves], which would end the human race's relevance.

I got a B-. My prof didn't agree. I hadn't heard of Singularity theory at the time, and apparently neither had she. The point was really about creativity and the inherent human drive to create at all costs.

How else could this all possibly end? Both a random cosmic catastrophe and some stupid war-related apocalypse would be hugely disappointing. Beating God at his own game is the only satisfying way to end humanity, and I honestly hope I get to experience it. [Projections put A.I. surpassing human intelligence at around 2028, which is not that far off.]

Some of science fiction's most mainstream films have dealt with this idea, from Star Trek: The Motion Picture to The Terminator to The Matrix. It just makes so much sense - if you believe real consciousness is something we as humans are capable of creating. I personally think it's our destiny as a species.

There's an interesting article in WIRED this month about futurist and inventor Raymond Kurzweil, who is going to great lengths to keep Death away from his door, with the primary intention of being around when The Singularity happens. Kurzweil, 60, takes between 180 and 220 supplements a day and spends one day a week at a medical center taking intravenous longevity treatments. His take differs from the dystopian scenarios presented in science fiction: Kurzweil argues that while artificial intelligence will render biological humans obsolete, it will not make human consciousness irrelevant. He believes the first A.I.s will be human add-ons to improve our minds and bodies, and that The Singularity won't destroy us, but immortalize us. [More V'ger than Terminator.]

I'm way too Catholic and Goth to buy that sunny notion. I'm all about the machine race we give birth to wiping us out or enslaving us... which I'm totally okay with! Either way it's proof that humans are more ingenious than what most religions understand as God, and that's what's important. Bring on the machines.

addendum: Never mind - enough ramble jamble for one post already, prolly.

3 comments:

Anonymous said...

About a year ago, someone asked me the question:
What is the most fundamental question facing mankind, in your opinion?

I answered:
As we reach this apogee in human technological achievement, will we capitalize on the moment by instigating our own evolutionary leap, or will we squander it and cause our own demise?

-----

It's easy to see how civilizations could rise and fall several times on a planet without passing this threshold. We're either going to reach the singularity soon or... with great power comes great responsibility... we could misuse our technology and wipe ourselves out as a species. If the machines wanted the world to themselves, I don't think they'd need to come after us. They could just sit back in their lawn chairs and wait for us to wipe ourselves out before they have the locks changed, so to speak. I have much more fear of ourselves than I do of technology.

I am currently reading Ray Kurzweil's The Singularity Is Near.

I'm not convinced that AI is going to take the leap to "consciousness" so promptly. For my entire lifetime, I've been reading that human-like AI is just 10 years off... it's always 10 years off. I'm skeptical.

I'm also a reader of Kevin Kelly, Howard Bloom, and Jeff Hawkins, and I think we're on the cusp of something profound for humankind, for all life, on earth and otherwise. I don't know that machine life is going to replace biological life. I think it will enhance it. I'm expecting computational power to be applied to re-engineering humans, turning us into a race of super-beings and cyborgs. Life is stubborn. It flourishes in spite of circumstances. It finds a way. (I also note that, while life carries on, some life-forms do not.)

The next step is adding the longevity to human life that will make long-term, long-distance space travel practical. The thing that made America great, and the great American experiment possible, was the frontier. All inhabitable earth-bound frontier is now firmly under the control of some governing entity. For humanity to again bloom like wildflowers, we need a new frontier. Space is all that is left. And we are here to go.

Unknown said...

Thanks for the great, thoughtful comment! (Please post here as much as you want.)

There was a good deal of hyperbole for the sake of provocation in what I said, and I hope Kurzweil is right. I am anxious for a major paradigm shift, but The Singularity does smack of religious fanaticism on some levels. Who knows, though? He's been right more frequently than Nostradamus...

Unknown said...

I'm designing an exhibit about innovation for a science museum right now. One of the featured personalities is Aubrey de Grey: http://en.wikipedia.org/wiki/Aubrey_de_Grey

As someone who thinks that over-population is the root cause of most of the world's problems, I question the basic wisdom of extending life-spans further, let alone immortality.

His answer to that is pretty simple and gets back to your last point: we need to get off this limited planet and start colonizing other worlds.

(I'm also in the school of thought that thinks breeding should be a privilege and not a right: if you suck, you shouldn't be allowed to breed.)