Friday, September 12, 2008

Who's afraid of the big bad robot? (personal)

"Hard" technological determinism (hereafter referred to as "TD" to help delay the onset of carpal tunnel...) conjures up stark images in the deepest recesses of the technophobes's mind. A Terminator-like cyborg running amok and laying waste to civil society. If it was put in human form, it would be the dude from Grand Theft Auto but with a flat Austrian accent.

Or, for those of you who are more mathematical in your thinking, you could also think of this phenomenon in these terms:

Advanced Technology + Paranoia + Hollywood CGI = The Governator

Was there really any other way these three variables could add up to anything else?

Haley Joel Osment, the cute robot boy from AI who was ubiquitous years ago*, couldn't bring in the box office dollars that the frightening tale of technology gone awry in the Terminator series did. And that's despite having the help of Jude Law playing a randy robot. There's just something eminently more compelling to the popular consciousness in an apocalyptic tale of futuristic dystopia than in Pinocchio 2.0 striving for technological self-actualization through his mother.

Funny business aside, I honestly get the heebie-jeebies when I entertain the concept of hard TD. Just the idea of technology having agency and effecting change in an autonomous way scares me.

This of course leads to the $64,000 question.

Why?

Recognizing that I am old-school in this sensibility, I posed the following hypothetical scenario in class.

I wonder if I would be more comfortable with hard TD if I were born X years from now, when artificial intelligence will be a bigger part of everyday life.

I initially answered yes: I would be more comfortable with this phenomenon if I were born in the future. With the presumably almost imperceptible long-term integration of increasingly smart machines across a variety of products, I would imagine it would only be that much easier to accept the reality of hard TD.

But the existentialism-pondering, Descartes-reading philosopher in me thinks perhaps that would not be the case. I don't know if it would ever feel completely "natural" for us, as human beings, to have our fates inextricably and inescapably linked to the whims of technology, whether it be something as malevolent as a Terminator-like machine, the initially benevolent HAL in 2001: A Space Odyssey, or maybe, one day, Apple's iPresident.

But human ego aside, given the increasing complexity of the world, with cultural, economic, political, and social factors ever more intensely intermingled, wouldn't technology, even if it permeated every pore of post-industrial life, be only one wave in an ocean full of currents?

Who knows?

See you in fifty years...



* For a two-year period it appeared that everybody's favorite scamp, Haley Joel Osment, was in every major Hollywood production - The Sixth Sense, Pay It Forward, AI, Diff'rent Strokes.

Oh sorry. The last one was Gary Coleman.

But Mr. Osment has succumbed to one of Hollywood's most fickle fates - the "awkward teen stage". Now he is too old to play a child, but too young to morph into a romantic lead.

At least I hope he had the sixth sense not to pay forward that trust fund money...

Ha. Ha ha ha.

1 comment:

Unknown said...

HI DINO!
I think AI (and hard determinism) is scary on a Darwinian level. Humans are the dominant species on Earth. Assuming all of our AI dreams come true, we will have robots to help make human life better. However, that ideal may be far from what will actually happen. The possibility of AI surpassing human intelligence is a scary thing. Very scary. I think your anxiety is warranted.

The question of generations and their anxiety about AI seems similar to the integration of any other technology. The more children use computers, video games, etc. in their early years, the more easily they will accept advances in these technologies. That's just a thought.

Interesting read. Cheers!