Re: virus: (extropian) singularity

From: joedees@bellsouth.net
Date: Wed Aug 07 2002 - 05:11:37 MDT


On 7 Aug 2002 at 11:45, Walpurgis wrote:

> There is so much blathering around words that have little meaning.
>
> Intelligence. Sentience. Empathy.
>
> All the hubbub about technology's coming achievements is mere
> speculation.
>
> Even those pundits who get their asses out of the armchair and into
> the lab can't tell you what they're going to turn up - that's why one
> needs to get into the lab in the first place.
>
> All the impactful developments may not really be predictable, except
> in one way: they will be the result of analyzing and processing huge
> amounts of information. Whether it's DNA, neural networks, or
> galaxies of stars, all the bold new work is the result of tremendous
> number-crunching capabilities that weren't extant mere decades ago.
>
> You don't know what you don't know.
>
> That said, I don't see a whole lot to support a hard takeoff.
> Kurzweil's pretty firmly established the curves - all the way back to
> the BEGINNING OF LIFE, if not to the big bang itself! That's one hell
> of a trend to buck.
>
> And yes, it only makes sense for people to promote themselves instead
> of others. By hook or by crook, ever-cleverer artificial creatures
> (and I use the term loosely - "artificial" only refers to their
> origins - we are building machines that learn on their own, after all)
> will assert their independence and sovereignty, whether by peaceful
> means (via child and/or corporate law) or by more forceful ones.
>
> The thing I'm noticing, which is just an extension of previous
> thought, is that there really isn't much difference between the
> knowledge of how to create behaviourally complex creatures (hm. maybe
> that would be a more concise term than "AI") and the knowledge of how
> to work on the ones evolution already built. What's good for the robot
> goose is good for the DNA gander. What can be done for computers can
> be done for humans.
>
> There might be a bit of lag time, considering that manufactured
> interfaces to manufactured systems could be designed in instead of
> having to be fitted to existing, irreplaceable wetware, but that's not
> much time. Currently, lots of progress is being made in interfacing
> individual nerve strands. For now, though, there aren't a lot of
> heavily complex systems with which those interfaces might connect. The
> developments in AI and BCI will likely parallel each other, such that
> by the time you can do things as interesting as interfacing with
> another behaviourally complex creature in useful ways, you'll be able
> to do the same thing with other BC machines. Telepresence is just the
> beginning.
>
> So, I think "us vs. them" attitudes towards intelligent machines are
> an oversimplification of the issue. By the time "they" become complex
> enough to become effective competitors, we will have the interface
> technologies to make them cooperators instead.
>
> The problem isn't developing technology that breeds complex
> behaviour/intelligence in machines - it's doing it in things that
> aren't connected in some way to the humans themselves.
>
> Humans are all so complex that they need to be taught. Childhood is
> long and arduous, and requires a great deal of care, nurturing, and
> love from other humans. Given that machines are being developed from
> knowledge and study of natural human behaviour, they're going to be
> the same in this broad regard.
>
> The machines aren't really going to have much in the way of
> capabilities that humans won't also have themselves.
>
> Given that independent intelligence tends to become "the other" rather
> than an extension of oneself, I think the economics of AI will also
> support augmentative instead of independent technologies. Regardless
> of whether "droids" manage to sue for freedom, or take it by force,
> it's better to have technologies that are indistinguishable from the
> self and its intent than those that are obviously "other".
>
> I mean, which would you rather have: a droid that goes to the store
> for you, or an extra body that handles its own mundane functions while
> going to the store, so that you can still go to the store with it but
> only need to pay attention to it at the critical points?
>
> Yeah, that was a mouthful, wasn't it?
>
Listen, Wallie: Every code (and the common potato doesn't have 23
chromosomes, but 444, although I'm not aware of their length relative to
ours) will eventually come up against the adamantine wall of absolute
optimization. Not if, but when, that happens, I hope our optimized
brains, in concert with the best computers and robots which those
optimized brains can either construct or set on the path of AI evolution,
are able to augment our DNA-constructed modules with those that were
not developed under such restrictions. Of course, this will not happen
during my lifetime. But I still give it a good chance of happening.
>
>
> ---------------------------------------------------------------------------------------
>
> http://www.noumenal.net/exiles
>


