Welcome to
Science a GoGo's
Discussion Forums
Please keep your postings on-topic or they will be moved to a galaxy far, far away.
Your use of this forum indicates your agreement to our terms of use.
So that we remain spam-free, please note that all posts by new users are moderated.


Page 2 of 2
#4100 10/21/05 03:31 PM
T (Megastar)
Joined: Jun 2005
Posts: 1,940
Rob, okay. I see what you're saying and I agree. OTOH, I think it's natural that we will want to learn what we can about emotions while we're learning what we can about intelligence. It's not even clear the extent to which these things can be isolated from each other.

However, it's possible we could run into a problem even if the machines don't have emotions. Contrast Saberhagen's Berserker Saga with Ellison's short story "I Have No Mouth, and I Must Scream", e.g. (Predecessors to The Terminator.)

In Saberhagen's story (which is a fun story even though it's pure Christian propaganda), the machines have no emotions. In Ellison's, the machine experiences hate and revulsion toward the humans who created it.

No doubt we ought to consider carefully before we go down certain roads (but someone will anyway). It's all quite a ways off, I'd think. (Like I said, I don't agree with Kurzweil.) Still, it's good to consider the possibilities in advance.

The hurdles should not be, ahem, misunderestimated (hehe).

Turing (about the time of WW II) predicted there would be machines that could play grandmaster level chess within 20 years - turns out it was more like 50 or 60 years. (There's honorable precedent for software engineers' inability to reliably predict level of effort and completion time for projects.)

There's another story (which I have not confirmed) I heard once about how (I think it was) Marvin Minsky, when he was a grad student, was asked to develop a machine vision program as a summer project. The prof had apparently thought it wasn't going to be a huge deal. Turns out it's the problem of pattern matching - a huge, huge problem in CS. OTOH, we had thought teaching machines logic was going to be hard. Turns out that one is trivial (well, almost). There is an unspoken part of logic - the translation phase - that is laden with assumptions and ambiguity. (It's primarily for this reason that logic is not trivial.) But the actual performance of logical operations is a trivial thing for a computer to do, even though it's apparently a bear of a problem for most humans. We're good at pattern matching. The machines are good at logic operations.
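To illustrate the point about logic operations being the easy part: here's a minimal sketch (in Python, my choice - nothing in the thread specifies a language) that brute-force checks whether a propositional formula is a tautology by grinding through every truth assignment. The machine does the logical evaluation effortlessly; the hard, assumption-laden "translation phase" - turning a natural-language claim into the formula in the first place - is exactly what this code can't do for you.

```python
from itertools import product

def is_tautology(formula, variables):
    """Evaluate the formula under every possible truth assignment.
    Performing the logical operations is the trivial part for a
    machine; translating a real-world claim into `formula` is the
    hard, ambiguous part that happens before this is ever called."""
    return all(
        formula(**dict(zip(variables, values)))
        for values in product([False, True], repeat=len(variables))
    )

# Modus ponens as a single formula: ((p -> q) and p) -> q,
# writing the conditional x -> y as (not x) or y.
modus_ponens = lambda p, q: not ((not p or q) and p) or q

print(is_tautology(modus_ponens, ["p", "q"]))               # True
print(is_tautology(lambda p, q: p or q, ["p", "q"]))        # False
```

The checker runs in O(2^n) over n variables, but for the small formulas humans actually argue about, the computer finishes instantly - which is the asymmetry the post is pointing at.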

#4101 10/21/05 07:55 PM
Anonymous (Unregistered)
Xennos, can you imagine the havoc that a hacker could wreak with AIs? You could theoretically program someone to commit murder. Could that be defended in court by saying "I was under the influence of outside forces" instead of an insanity plea?

#4102 11/09/05 02:04 AM
jjw (Superstar)
Joined: Sep 2005
Posts: 636
Amaranth:

You could argue that whatever happened was without intent on your part.
"Murder", by definition, requires intent.
jjw

#4103 11/09/05 04:45 AM
Anonymous (Unregistered)
Hmm. Would that reduce it to manslaughter then?

#4104 11/13/05 04:37 AM
jjw (Superstar)
Joined: Sep 2005
Posts: 636
Amaranth, it could, depending.

I think that AI is well on its way and may be here in some form or other right now. OTOH, there is no way you can program a "human" to commit murder. You can deceive them into it, you can force them into it, and you can frame them into it - but not knowingly and willingly. Hypnotists of note have tried to induce people to do things that are against their basic beliefs and failed. It is almost impossible to hypnotize a woman to have sex with a stranger - but you might trick her into thinking it is her husband, and under the right conditions she may fall for it.

In my mind AI is nothing to fear. We need it to progress. Our minds are fruitful but not as quick as our computers. Imagination is something else. Many good people seem to lack it. We should strive to develop more of it. AI will not likely get there in our lifetimes. A thought.
jjw

#4105 11/13/05 04:20 PM
Anonymous (Unregistered)
"Reality is what you're stuck with when your imagination goes on the blink." --- R S

#4106 11/14/05 12:44 PM
RM (Superstar)
Joined: Oct 2005
Posts: 560
jjw004, watch "The Manchurian Candidate"

#4107 11/15/05 01:12 AM
jjw (Superstar)
Joined: Sep 2005
Posts: 636
Rob:

I saw that a long time ago. I may have read the book as well. That was quite involved fiction. Not to rule out brainwashing in combination with hypnosis as a combined means of getting control of people. If you can subvert their memory, they have little to nothing to guide their future behavior. You then create a new memory with the details you wish to build upon, and the prospect for behavior control is much more likely. Hypnosis normally does not eliminate retained life experience, and when we are pushed to a certain action we fall back on our life experience. This is not my best topic, but I think this sampling is accurate.
jjw



Science a GoGo's Home Page | Terms of Use | Privacy Policy | Contact Us

Copyright © 1998 - 2016 Science a GoGo and its licensors. All rights reserved.

Powered by UBB.threads™ PHP Forum Software 7.7.5