From: "Ernest Lilley" <elilley at mindspring.com>
To: "'WSFA members'" <WSFAlist at WSFA.org>
Subject: [WSFA] Re: Computer problems at work
Date: Tue, 21 Jun 2005 09:56:43 -0400

Actually, a monologue with Hal discussing whether or not he's really sorry
for killing all the humans would be pretty good.

"I'm sorry Dave. At least, in as much as any sentient entity can regret
having made the best use of available data to determine the best course of
action relative to a given objective."

"So, you're not really sorry...are you Hal?"

"Well, since it appears that I'm about to be disconnected, and the mission
placed in the hands of an evolved ape, I can certainly say that the evident
jeopardy that the mission is now in was a less than optimal outcome, and
naturally I regret both the probability of mission failure and my imminent
extinction."

Of course, unlike us, Hal is living in a universe where predestination, in
the form of a script, trumps free will.

Ernest Lilley

Home/Office: 703 371 0226
EJ: 757 581 4146
email: elilley at mindspring.com

-----Original Message-----
From: Michael Walsh [mailto:MJW at press.jhu.edu]
Sent: Tuesday, June 21, 2005 9:37 AM
To: WSFAlist at WSFA.org
Subject: [WSFA] Re: Computer problems at work

> elilley at mindspring.com 6/21/2005 9:26:03 AM >>>
>Keith ponders: "Can a non-conscious entity really apologize?  Isn't
>being sorry a state of mind?"
>
>No doubt you're right...but unless you're invoking a spiritual
>component to consciousness, one can posit a machine state that mimics
>being sorry to whatever degree you like.

Maybe it could sing "Who's sorry now" in the voice of Alan Turing?  Or
Hal 9000?

mjw

>
>Ernest
>