Church of Virus BBS » Free For All » strangest belief

David Lucifer
strangest belief
« on: 2006-08-29 23:48:29 »

I was asked this question for Meme Therapy:

What is the strangest thing you believe to be true?

I answered:

I don't have any strange beliefs; they all seem perfectly reasonable to me. Of course, others would think many of my beliefs are exceedingly strange, so I'll pick one from their perspective. Perhaps the one that would raise the most eyebrows among the people around me is my belief that the age of humanity is likely to end in the next 20-30 years. Someone is going to create an artificial intelligence with the ability to reprogram and improve itself. This "seed" AI is going to quickly surpass human intelligence and become a being whose abilities and powers we can barely imagine, triggering the so-called technological singularity. What happens next is anyone's guess, but I'm skeptical of most dystopian and utopian visions. If the singularity happens, and by no means is it inevitable, you can bet life will get incredibly interesting.
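
For a feel of the dynamic, here is a toy numerical sketch of that feedback loop (Python; every parameter is an arbitrary illustrative assumption, so the point is the compounding shape, not the timescale):

Code:
# Toy model of a "seed" AI improving itself: each cycle, the size of
# the improvement scales with current capability, so growth compounds
# and eventually runs away. All numbers are arbitrary illustrations.
def seed_ai_takeoff(level=1.0, gain=0.05, ceiling=1e6):
    cycle = 0
    while level <= ceiling:
        cycle += 1
        level += gain * level * level  # smarter systems design bigger upgrades
        if cycle % 5 == 0:
            print(f"cycle {cycle:3d}: {level:14.1f} x human baseline")
    print(f"cycle {cycle:3d}: runaway past {ceiling:.0e} x human baseline")

seed_ai_takeoff()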

How would you answer?
Hermit
Re:strangest belief
« Reply #1 on: 2006-08-30 12:19:19 »

David Lucifer said:
Quote:
I don't have any strange beliefs; they all seem perfectly reasonable to me... [and asked,] How would you answer?


Hermit responds:

I don't, as you know, so far as possible, vest much attention in belief. That said, it seems to me that your articulation is not a belief but a reasonable extrapolation of the present, based on your (highly competent, IMO) professional opinion. As has become usual, your thinking to a large extent matches, or at least parallels, my own.

The differences between your articulation and my thoughts are primarily, first, that I see your scenario as happening sooner, perhaps on the order of 10 years for a supercomputer-based Spirothete and 20 years for a general-purpose, desktop-based Spirothete; and second, that I think if it fails to occur in that period, it will be because mankind has become irrelevant through social collapse consequent on resource constraints, economic maladroitness, wars, climate change, disease or, most likely, some combination of the foregoing.
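
For scale, the arithmetic behind timelines of this kind is a simple Moore's-law extrapolation. A sketch, assuming the commonly cited (and contested) ~1e16 ops/sec estimate for human-brain-equivalent computation, an 18-month doubling time, and rough 2006-era hardware baselines:

Code:
import math

# Back-of-envelope Moore's-law timeline. Assumptions: ~1e16 ops/sec for
# human-brain-equivalent computation (commonly cited, contested), an
# 18-month doubling time, and rough 2006-era hardware baselines.
BRAIN_OPS = 1e16
DOUBLING_YEARS = 1.5

def years_until_brain_scale(current_ops):
    doublings = math.log2(BRAIN_OPS / current_ops)
    return doublings * DOUBLING_YEARS

print(f"supercomputer (~3e14 ops/s): {years_until_brain_scale(3e14):.0f} years")  # ~8
print(f"desktop PC    (~1e11 ops/s): {years_until_brain_scale(1e11):.0f} years")  # ~25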

If our technological society collapses, I think it is unlikely that a "technological singularity" will ever happen, as we have arrived where we are, in the position of gestating our AI-based successor, due to an unlikely combination of evolutionary improbability (intelligence), good fortune (surviving and thriving) and happenstance (the availability of concentrated energy, particularly fossil fuels, but also many other ores). As one consequence of the long, slow, painful path we took getting here, I think that we are in a race against a number of potentially species-extinguishing factors, and undoubtedly against "the end of civilization as we know it." The establishment of a "technological singularity" may occur sooner, but I hold little hope of this. Should we blow it, as now seems probable to me, I don't see it as possible for any other species to replace us for at least 450 to 500 million years - too long for our descendants to be involved, unless we have managed to source additional energy and the ability to isolate colonies of humans from epidemics by getting off-planet and extrasolar in meaningful numbers, which I now also see as unlikely. Unfortunately, I weyken that either of these latter outcomes is much more likely than a "technological singularity." This isn't just a matter of too much Bush (although I see his reign as a major impetus towards a new and terminal "dark age"), but rather an observation of what we are not looking at.

Clearly identified, largely self-instigated, imminent epochal dangers are lurking on the edge of our collective awareness, many potentially species-terminating, yet receiving little attention and less preparation. This massive dissonance is incompatible with the survival of an advanced technological society. All the signs are that we are doing our usual job of underestimating really big risks while trying to continue with our evolutionarily dictated "business as usual" in an environment giving multiple indications that "business as usual" is an inappropriate, short-sighted, and probably fatal "marching in lock-step to the guillotines" class of response.

Apropos of something, I used the term Spirothete here, rather than your "seed AI", because I think that what follows the instigation of self-directed evolution in the "seed AI" is very different from what is meant by "seed AI": it tautologically must involve a sense of "self", and it will, as you described it, "become a being whose abilities and powers we can barely imagine". I also suggest that if a Spirothete emerges (and if we continue to follow Moore's law, this is inevitable), and it does not 'determine' to terminate its evolution, then it will be an ever-emergent intelligence, a being-towards-becoming beingness, rather than anything which could be characterised in a snapshot while remaining bounded by the baryonic Universe, even by its to-us and to-itself overwhelming capabilities.

As you know, I too am extremely dubious of our ability to make any meaningful prognostications, let alone draw any conclusions, about our emergent intellectual offspring. My conclusion was that the ultimate outcome depends largely on determining whether there exists a mode of development which is not evolutionary (evolution here being reasonably cast as a struggle for access to scarce resources). If there is not, then I see no possibility of the continuation of humankind in a competition - or potential competition - for resources. If there is, and I once thought that the very broad ethical guidelines, the Virian Sins and Virtues, suggested that there might be, I have been unable to date to see how to approach it, be it never so asymptotically, without adopting a positively Marvinistic degree of nihilism bordering upon, if not diving into, apathy. I begin to suspect that the very breadth of your very clever lodestars precludes direct implementation and establishes a degree of tension between the sins and the virtues the closer one comes to instantiating them.

If there is time, I suggest that we need somebody much smarter than me (scarce), with the luxury of time to think on this matter (difficult), or possibly an ethical, pre-Spirothetic AI to examine this issue. The challenge with the latter is that I see no way to establish such an AI without it becoming the seed AI itself, unless we unethically constrain the precursor - removing our own right to treatment as ethical beings. A conundrum I still chew over, amongst other things.

Kind Regards

Hermit

With or without religion, you would have good people doing good things and evil people doing evil things. But for good people to do evil things, that takes religion. - Steven Weinberg, 1999
David Lucifer
Re:strangest belief
« Reply #2 on: 2006-09-02 13:15:52 »


Quote from: Hermit on 2006-08-30 12:19:19   

I don't, as you know, so far as possible, vest much attention in belief. That said, it seems to me that your articulation is not a belief but a reasonable extrapolation of the present, based on your (highly competent, IMO) professional opinion. As has become usual, your thinking to a large extent matches, or at least parallels, my own.

I understand my contribution doesn't fit your definition of "belief", but I was using the word in the sense implied by the original question: a proposition that I attribute truth to, regardless of any reasons.


Quote:

The differences between your articulation and my thoughts are primarily, first, that I see your scenario as happening sooner, perhaps on the order of 10 years for a supercomputer-based Spirothete and 20 years for a general-purpose, desktop-based Spirothete; and second, that I think if it fails to occur in that period, it will be because mankind has become irrelevant through social collapse consequent on resource constraints, economic maladroitness, wars, climate change, disease or, most likely, some combination of the foregoing.

I agree it could happen sooner. I think there is likely enough computing power connected to the internet today to support a human-level AI, if it could be properly harnessed (maybe by a spambot network; there's a scary idea!). The probability goes up as you push the date out, and I think it passes the 50% level between 20 and 30 years from now. The probability also levels off asymptotically past 50 years, at around 75% I'd guess, meaning there is a 25% chance that it will never happen, either because civilization collapses or because one of our assumptions about the feasibility of the project is false.
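
Read as a curve, that amounts to a saturating probability; here is a minimal sketch fitted to the two anchors above (50% around year 25, a 75% ceiling), with the exponential form itself being just an assumption:

Code:
import math

# Saturating-probability sketch: P(t) = CEILING * (1 - exp(-K*t)),
# fitted so P(25 years) = 50% and P(infinity) = 75%. The exponential
# form is an assumption; the post only fixes those two anchor points.
CEILING = 0.75
K = math.log(3) / 25  # from 0.5 = 0.75 * (1 - exp(-25*K))

def p_singularity_within(years):
    return CEILING * (1 - math.exp(-K * years))

for t in (10, 20, 25, 30, 50, 100):
    print(f"within {t:3d} years: {p_singularity_within(t):.0%}")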


Quote:

Apropos of something, I used the term Spirothete here, rather than your "seed AI", because I think that what follows the instigation of self-directed evolution in the "seed AI" is very different from what is meant by "seed AI": it tautologically must involve a sense of "self", and it will, as you described it, "become a being whose abilities and powers we can barely imagine". I also suggest that if a Spirothete emerges (and if we continue to follow Moore's law, this is inevitable), and it does not 'determine' to terminate its evolution, then it will be an ever-emergent intelligence, a being-towards-becoming beingness, rather than anything which could be characterised in a snapshot while remaining bounded by the baryonic Universe, even by its to-us and to-itself overwhelming capabilities.

Not sure I understood that entirely, but it gave me the idea that the seed AI could quickly transcend into another dimension altogether, taking our valuable hardware with it. That would be a great joke.


Quote:

As you know, I too am extremely dubious of our ability to make any meaningful prognostications, let alone draw any conclusions, about our emergent intellectual offspring. My conclusion was that the ultimate outcome depends largely on determining whether there exists a mode of development which is not evolutionary (evolution here being reasonably cast as a struggle for access to scarce resources). If there is not, then I see no possibility of the continuation of humankind in a competition - or potential competition - for resources.

I see a possibility of the continuation of humankind in competition for resources, given the precedent of our pets. We keep them and share our resources with them because we value them for what they are. I'm not saying the AI will treat us exactly like we treat our dogs (though that wouldn't be so bad, would it? ;-)), but I think sentient beings will treat each other with a level of respect based on their absolute, not relative, intelligence. I can expand on this point if it isn't clear.
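
To make the absolute-versus-relative distinction concrete, here is a toy sketch of the two rules (the threshold, tolerance, and scores are all arbitrary assumptions):

Code:
# Two toy respect rules: one keyed to a being's absolute intelligence,
# one keyed to intelligence relative to the judge. Threshold, tolerance
# and the scores below are arbitrary illustrative assumptions.
def respect_absolute(other_iq, threshold=1.0):
    # Anything past a fixed sentience threshold qualifies,
    # however far the judge outclasses it.
    return other_iq >= threshold

def respect_relative(own_iq, other_iq, tolerance=0.1):
    # Only near-peers qualify: the rule under which a
    # superintelligence would dismiss humans entirely.
    return other_iq >= own_iq * tolerance

HUMAN, SUPER_AI = 1.0, 1e6
print("absolute rule, AI judging a human:", respect_absolute(HUMAN))            # True
print("relative rule, AI judging a human:", respect_relative(SUPER_AI, HUMAN))  # False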