Friday, 17 December 2004

I, Robot

I just watched I, Robot (DVD, Book) and have a couple of questions for you sci-fi buffs out there. Not surprisingly, the movie is no better on a second viewing than it was on the first, though it's not bad as pure entertainment, and it did make me want to read the book, which is now one of my holiday to-do items.

Having me question sci-fi logic is about as useful as watching a Hollywood movie discuss economics, which this one does—badly. Even so, I have a question: why not just modify Asimov’s first law to say that a robot shall not harm the life or liberty of a human? Because it might ruin the potential for future books pondering this dilemma?

Next question: one of the review blurbs on my copy of I, Robot (the book) says Asimov is part of the “ABCs” of sci-fi, the others being Bradbury and Clarke. Why not extend it by one letter to “D”, as in Dick, Philip K.? Is he not as respected as the other three? If not, why not?

5 comments:

Any views expressed in these comments are solely those of their authors; they do not reflect the views of the authors of Signifying Nothing, unless attributed to one of us.

Why not just modify Asimov’s first law to say that a robot shall not harm the life or liberty of a human? Because it might ruin the potential for future books pondering this dilemma?

Asimov’s later Foundation novels, Foundation’s Edge and Foundation and Earth, take up this question, although one can quibble with his need to shoehorn robots into the Foundation universe.

BTW, neither the economics nor the politics of most science fiction makes a lot of sense. Even the “hard sci-fi” folks like Kim Stanley Robinson have laughable political and economic systems; from a political or economic standpoint, his Mars trilogy makes little to no sense, even though the hard science is pretty good (but not perfect—about a novel’s worth of drama could have been forestalled had one character made a far more useful decision about what to do with the array of mirrors deployed in Mars orbit).

 

Asimov’s laws look intransitive to me, and fixing the first law would probably solve the problem. Of course, it would also do away with the need for additional books and movies on the subject, and probably with much of the debate over what the laws mean.

I bet a book could be written on the economic/political problems in sci-fi. It might even sell.

 

Dick probably doesn’t get the credit he’s due because of his later eccentricities.

 

Thanks, Brian. I went and looked at a bio of Dick, and I see what you mean. He did lose it at the end, but his work has held up pretty well over the years. In fact, the titles of most of his works show more imagination than many other writers’ entire books.

 

I recently posted on the Laws of Robotics and the missing ingredient:

“But the root problem lies much deeper: [The robot’s] checks and balances are completely internalized; it has full power to interpret conflicts between the Laws on its own accord without consulting human authority. A system of checks and balances is among the needs of humanity and the individual per Laws 0 and 1, and must not be ignored.”

(The “Zeroth Law” is “A robot may not injure humanity, or, through inaction, allow humanity to come to harm.”)
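To make that concrete, here is a toy sketch of what “completely internalized” checks look like. Every function name and parameter in it is invented for the illustration; it is not anything from Asimov or the film, just the shape of the problem:

```python
# Hypothetical sketch only; all names and parameters here are made up.
# The Laws become an ordered check the robot runs on itself, and the
# robot also supplies every judgment call that feeds the check.

from typing import Optional

def first_violation(injures_humanity: bool,
                    injures_a_human: bool,
                    disobeys_an_order: bool,
                    risks_the_robot: bool) -> Optional[str]:
    """Return the highest-priority Law an action would break, or None."""
    for law, violated in (("Zeroth", injures_humanity),
                          ("First", injures_a_human),
                          ("Second", disobeys_an_order),
                          ("Third", risks_the_robot)):
        if violated:
            return law
    return None

# The loophole: the robot itself decides how those booleans get filled
# in. If it concludes that confining people is not "injury" to any one
# human, but that leaving them free would injure humanity, the check
# never trips -- and nothing outside the robot reviews that reading.
print(first_violation(injures_humanity=False, injures_a_human=False,
                      disobeys_an_order=False, risks_the_robot=False))
# -> None, i.e. the action is permitted on the robot's own say-so.
```

A check and balance worthy of the name would put that interpretation in front of some authority outside the robot before it acted on it.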

 
Comments are now closed on this post.