Re: AI in FT (was Re: Be gentle...)
From: Samuel Penn <sam@b...>
Date: Thu, 17 Jul 1997 13:37:05 -0400
Subject: Re: AI in FT (was Re: Be gentle...)
In message <199707171616.MAA27081@sparczilla.East.Sun.COM>
Joachim Heck - SunSoft <jheck@East.Sun.COM> wrote:
> Chris McCurry writes:
>
> @:) this is what a true AI is, right?
> @:)
> @:) living, feeling, thinking, worrying, growing intelligence..
> @:) (warring is included)
>
> But why? We don't know of any fundamental law that suggests that
> any of these traits (living, feeling, worrying) are related to
> thinking. They are in us but why should they be in something else?
> And if they're not required, I imagine we won't spend a lot of time
> and money building them in.
First, it depends on how well we understand human minds
when we come to build the first AIs. It may well turn
out to be easier simply to simulate a complete human
mind, rather than to build one up bit by bit with full
knowledge of what each part does.
With current-day neural networks, for instance, it is
known that they work, but not always _how_ they work.
It's not obvious which part of the network is doing
what, or what effect cutting out a neuron or changing
a weighting factor will have.
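To make that concrete (a toy sketch in modern terms,
nothing from any real project), take a tiny network
trained on XOR. We can "cut out" each hidden neuron by
zeroing its weights, and it isn't obvious in advance
which cut will break which behaviour:

  # Toy sketch: train a tiny net on XOR, then ablate
  # hidden neurons one at a time and watch the output.
  import numpy as np

  rng = np.random.default_rng(0)

  X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
  y = np.array([[0], [1], [1], [0]], dtype=float)

  W1 = rng.normal(size=(2, 4))   # input -> 4 hidden neurons
  W2 = rng.normal(size=(4, 1))   # hidden -> output

  def sigmoid(z):
      return 1.0 / (1.0 + np.exp(-z))

  def forward(W1, W2):
      h = sigmoid(X @ W1)
      return h, sigmoid(h @ W2)

  for _ in range(20000):         # plain gradient descent
      h, out = forward(W1, W2)
      d_out = (out - y) * out * (1 - out)
      d_h = (d_out @ W2.T) * h * (1 - h)
      W2 -= h.T @ d_out
      W1 -= X.T @ d_h

  _, out = forward(W1, W2)
  print("trained:", out.ravel().round(2))

  for i in range(4):             # cut out each neuron in turn
      W1c, W2c = W1.copy(), W2.copy()
      W1c[:, i] = 0.0            # neuron i gets no input...
      W2c[i, :] = 0.0            # ...and feeds nothing onward
      _, outc = forward(W1c, W2c)
      print("without neuron", i, ":", outc.ravel().round(2))

Each ablation distorts the outputs in a different way,
and nothing in the weight matrices tells you beforehand
which neuron mattered for what.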
Building an AI that thinks as well as we do might well
run into the same problem - the thing works, but its
creators don't know which part controls love, which
part hate, and which part recognises a good military
tactic when it sees one.
But, let's suppose we do need to fully understand all
the parts of a mind before building one. Do we need
emotions at all? Given that what we're really after
is a machine mind capable of independent and original
thought, it needs a sense of which ideas are good and
which are bad. It needs priorities, and a desire to
find better solutions (a simple algorithm that did this
purely logically would tend to rule out original thought).
These are emotions of a sort, and it's difficult to know
whether it would evolve stronger ones (a desire to
protect its allies could grow into loyalty or love,
a desire to hurt the enemy into hate).
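To put it crudely (a made-up sketch, not a claim about
how real minds work), such priorities could be modelled
as weighted drives that score candidate plans, with a
weight growing each time its drive pays off:

  # Hypothetical: priorities as weighted drives. A drive
  # that keeps paying off gets stronger, which is how a
  # proto-emotion might deepen over time.
  drives = {"protect_allies": 1.0, "hurt_enemy": 1.0,
            "save_fuel": 0.5}

  def score(plan):
      # plan: how much each drive is satisfied, 0..1
      return sum(w * plan.get(d, 0.0)
                 for d, w in drives.items())

  def reinforce(drive, payoff, rate=0.1):
      # a drive that pays off grows stronger
      drives[drive] += rate * payoff

  plans = {
      "escort convoy":    {"protect_allies": 0.9,
                           "save_fuel": 0.2},
      "raid supply line": {"hurt_enemy": 0.8,
                           "save_fuel": 0.6},
  }
  best = max(plans, key=lambda name: score(plans[name]))
  print("chosen plan:", best)

  for _ in range(10):   # repeated success defending allies
      reinforce("protect_allies", payoff=1.0)
  print("drives now:", drives)

The weights start out as bare priorities, but the ones
reinforced often enough come to dominate the machine's
choices - which is about as close to loyalty or hate as
a toy model gets.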
Lastly, if it's possible to put emotions into a machine
mind, it will be done. There will be someone with the
knowledge, resources and will to do it.
--
Be seeing you,
Sam.