[ExI] extropy-chat Digest, Vol 93, Issue 30

Kevin Haskell kgh1kgh2 at gmail.com
Tue Jun 21 19:10:05 UTC 2011


First post here, and I look forward to more.  For now, I just wanted to drop a
quick note for anyone here on Facebook who considers themselves a libertarian,
anarchist, or generally small-government type interested in H+ and AGI.
The group can be found at: singulibertarians at groups.facebook.com
If you are interested, just let me know, and I will add you to the page.

Also, I used to call myself an "Extropian," stepped away from the discussions
for a while, and when I got back into them, everyone was now called a
"Transhumanist."  I don't know why the terminology changed, but I just found
it odd. Any ideas as to why that happened? Not a big deal, just curious.



On Tue, Jun 21, 2011 at 8:00 AM, <extropy-chat-request at lists.extropy.org> wrote:

> Send extropy-chat mailing list submissions to
>        extropy-chat at lists.extropy.org
>
> To subscribe or unsubscribe via the World Wide Web, visit
>        http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
> or, via email, send a message with subject or body 'help' to
>        extropy-chat-request at lists.extropy.org
>
> You can reach the person managing the list at
>        extropy-chat-owner at lists.extropy.org
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of extropy-chat digest..."
>
>
> Today's Topics:
>
>   1. Re: Isn't Bostrom seriously bordering on the reactionary?
>      (Stefano Vaj)
>   2. The Japanese K Computer (john clark)
>   3. Re: Isn't Bostrom seriously bordering on the reactionary?
>      (Keith Henson)
>   4. META - Subject line (was RE: Isn't Bostrom seriously
>      bordering on the reactionary?) (Natasha Vita-More)
>   5. subject line discipline: ...seriously bordering on the
>      reactionary? (spike)
>   6. Re: scale of the universe (Kelly Anderson)
>   7. working memory implant! wow! (Will Steinberg)
>   8. Re: AGI Motivation revisited [WAS Re: Isn't Bostrom seriously
>      ...] (Stefano Vaj)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Mon, 20 Jun 2011 15:17:14 +0200
> From: Stefano Vaj <stefano.vaj at gmail.com>
> To: ExI chat list <extropy-chat at lists.extropy.org>
> Subject: Re: [ExI] Isn't Bostrom seriously bordering on the
>        reactionary?
> Message-ID: <BANLkTinc-H_pKjiGdZhuECWOOBTTpXoaEg at mail.gmail.com>
> Content-Type: text/plain; charset="iso-8859-1"
>
> On 17 June 2011 16:50, Keith Henson <hkeithhenson at gmail.com> wrote:
>
> > 2011/6/17 Stefano Vaj <stefano.vaj at gmail.com>:
> > So true.  There is also the problem of identifying what is a risk.
> >
>
> Exactly my point.
>
>
> > One of the classic curses is "may you get what you ask for."  I could
> > elaborate a long time on this, but Charles Stross has already done so.
> >
>
> Certainly there is something tragic, in the Greek sense, in the adventure
> of humankind and of life in general.
>
> But the big split has forever been between those who celebrate it ("amor
> fati") and those who consider it a curse.
>
> Speaking of transhumanism, at the bottom of it there is, in my view,
> explicitly or implicitly, the Nietzschean idea that what we are worth
> consists in our potential to overcome ourselves: "The 'conservation of the
> species' is only a consequence of the growth of the species, that is of a
> *victory on the species*, in the path towards a stronger species. [...]
> It is exactly with respect to every living being that it could be best
> shown that it does everything that it can not to protect itself, but *to
> become more than what it is*." (Will to Power) "And it is the great
> noontide, when man is in the middle of his course between animal and
> Superman, *and celebrateth his advance to the evening as his highest
> hope*: for it is the advance to a new morning. At such time will the
> down-goer bless himself, that he should be an *over-goer*; and the sun of
> his knowledge will be at noontide." (Thus Spake Zarathustra).
>
> At some point, some transhumanists seem to have decided that, after all,
> eternal becoming and the transition(s) to posthumanity are no longer the
> only things that can give meaning to our presence in the world; on the
> contrary, they are something to be feared and shunned, since they would
> obviously imply that we would not "exist" anymore the way we currently
> do, as in "x-risk". See not only Bostrom, but, e.g., the last part of
> Stross's Accelerando.
>
> Such a POV is eminently respectable, not to mention largely predominant in
> our societies, along the lines of the famous "anti trans-simianist"
> satire, but I wonder why it would require additional advocates.
>
> --
> Stefano Vaj
>
> ------------------------------
>
> Message: 2
> Date: Mon, 20 Jun 2011 06:59:08 -0700 (PDT)
> From: john clark <jonkc at bellsouth.net>
> To: extropy-chat at lists.extropy.org
> Subject: [ExI] The Japanese K Computer
> Message-ID: <652029.11217.qm at web82904.mail.mud.yahoo.com>
> Content-Type: text/plain; charset="iso-8859-1"
>
> A Japanese computer is now the fastest in the world. Called the "K
> Computer," it is three times as fast as the previous champion, a Chinese
> machine crowned just six months ago.
>
> http://www.nytimes.com/2011/06/20/technology/20computer.html
>
> John K Clark
>
> ------------------------------
>
> Message: 3
> Date: Mon, 20 Jun 2011 09:03:01 -0700
> From: Keith Henson <hkeithhenson at gmail.com>
> To: ExI chat list <extropy-chat at lists.extropy.org>
> Subject: Re: [ExI] Isn't Bostrom seriously bordering on the
>        reactionary?
> Message-ID: <BANLkTikp5xTNydV2eam4i55avvvKz7gr9Q at mail.gmail.com>
> Content-Type: text/plain; charset=ISO-8859-1
>
> 2011/6/20 Stefano Vaj <stefano.vaj at gmail.com>:
> > On 17 June 2011 16:50, Keith Henson <hkeithhenson at gmail.com> wrote:
> >>
> >> 2011/6/17 Stefano Vaj <stefano.vaj at gmail.com>:
> >> So true.  There is also the problem of identifying what is a risk.
> >
> > Exactly my point.
> >
> >>
> >> One of the classic curses is "may you get what you ask for."  I could
> >> elaborate a long time on this, but Charles Stross has already done so.
> >
> > Certainly there is something tragic, in the Greek sense, in the
> > adventure of humankind and of life in general.
>
> True.  A ten-km asteroid ruined the day for the dinosaurs.
>
> > But the big split has forever been between those who celebrate it ("amor
> > fati") and those who consider it a curse.
>
> We might not have a lot of control over our destinies, but trying to
> do something positive seems like a good idea to me.
>
> > Speaking of transhumanism, at the bottom of it there is, in my view,
> > explicitly or implicitly, the Nietzschean idea that what we are worth
> > consists in our potential to overcome ourselves: "The 'conservation of
> > the species' is only a consequence of the growth of the species, that
> > is of a victory on the species, in the path towards a stronger species.
> > [...] It is exactly with respect to every living being that it could be
> > best shown that it does everything that it can not to protect itself,
> > but to become more than what it is." (Will to Power) "And it is the
> > great noontide, when man is in the middle of his course between animal
> > and Superman, and celebrateth his advance to the evening as his highest
> > hope: for it is the advance to a new morning. At such time will the
> > down-goer bless himself, that he should be an over-goer; and the sun of
> > his knowledge will be at noontide." (Thus Spake Zarathustra).
>
> Nietzsche was not faced with the current prospects, where we could, in
> a generation, change as much as or more than the distance between mice
> and humans.  It's easy for us to imagine improvements: long (perhaps
> open-ended), disease-free lives, physical attractiveness, even
> modification so that we have no blind spot.  However, once people are on
> this slippery slope, where will they stop?
>
> I wonder what Nietzsche would say if he were up on the prospects of
> AI/nanotechnology?
>
> Would he embrace it or go catatonic?
>
> > At some point, some transhumanists seem to have decided that, after
> > all, eternal becoming and the transition(s) to posthumanity are no
> > longer the only things that can give meaning to our presence in the
> > world; on the contrary, they are something to be feared and shunned,
> > since they would obviously imply that we would not "exist" anymore the
> > way we currently do, as in "x-risk". See not only Bostrom, but, e.g.,
> > the last part of Stross's Accelerando.
>
> It's a side effect of playing with the gods, even in your imagination.
> There are too many unknowns, like the Fermi question, and that makes it
> scary.  Besides, you can't identify with gods, so they don't make good
> characters for a story.
>
> I suspect the best outcome we can work toward is that the things we
> become will remember being us.
>
> Actually, I am not sure we would want to inflict that on them.  It
> might be like your mom putting video on the net of you playing in the
> mud as a little kid.
>
> > Such a POV is eminently respectable, not to mention largely predominant
> > in our societies, along the lines of the famous "anti trans-simianist"
> > satire, but I wonder why it would require additional advocates.
>
> I don't think "predominant" is the right word unless you are talking
> about the tiny transhumanist society.  The larger society remains
> unaware.
>
> Keith
>
>
>
> ------------------------------
>
> Message: 4
> Date: Mon, 20 Jun 2011 12:05:41 -0500
> From: "Natasha Vita-More" <natasha at natasha.cc>
> To: "'ExI chat list'" <extropy-chat at lists.extropy.org>
> Subject: [ExI] META - Subject line (was RE: Isn't Bostrom seriously
>        bordering on the reactionary?)
> Message-ID: <5A0C63B0EDEB48E292CD03C67FF2F742 at DFC68LF1>
> Content-Type: text/plain;       charset="us-ascii"
>
> Hi everyone -
>
> Please change the subject line of posts that have clearly strayed from
> the original topic or that introduce new ideas!
>
> Thank you!
>
> Natasha
>
> Natasha Vita-More
>
> Chair, Humanity+
> PhD Researcher, Univ. of Plymouth, UK
>
>
>
> ------------------------------
>
> Message: 5
> Date: Mon, 20 Jun 2011 14:04:33 -0700
> From: "spike" <spike66 at att.net>
> To: "'ExI chat list'" <extropy-chat at lists.extropy.org>
> Subject: [ExI] subject line discipline: ...seriously bordering on the
>        reactionary?
> Message-ID: <00a301cc2f8d$a841f1d0$f8c5d570$@att.net>
> Content-Type: text/plain;       charset="us-ascii"
>
>
> Do observe when a thread has wandered far from the original topic and feel
> free to rename the subject.  Not specifically forbidden but often
> discouraged is putting an actual person's name in the subject line, for the
> reason that topics always wander.  Then we have a bunch of junk about a
> specific person that doesn't apply.  Thanks.
>
> spike
>
>
>
>
>
> ------------------------------
>
> Message: 6
> Date: Mon, 20 Jun 2011 15:50:15 -0600
> From: Kelly Anderson <kellycoinguy at gmail.com>
> To: ExI chat list <extropy-chat at lists.extropy.org>
> Subject: Re: [ExI] scale of the universe
> Message-ID: <BANLkTinyY-dxtzg0x45HCg3rtGY7=orQ0g at mail.gmail.com>
> Content-Type: text/plain; charset=ISO-8859-1
>
> Now, what would be interesting would be to put dollars and cents in
> this picture... :-)
>
> $1 = 1 meter...
>
> The cost of a single byte of hard drive storage ($80 for a 2TB drive)
> is close to the size of a helium atom.
> A penny ($0.01) would be where the grain of rice is...
> A dollar would be where the meter is...
> Half Dome would be around a week's salary... $420
> A marathon would be a year's salary for a typical day laborer, $26,000
> The diameter of the moon would represent $3,500,000, about what the
> government spends in an hour and a half.
> The largest lottery jackpot of all time $390,000,000 is about twice
> the size of Jupiter.
> The cost of the wars in Afghanistan and Iraq together,
> $1,200,000,000,000, is nearly the size of the sun.
> The 2012 budget of $3,700,000,000,000 is larger than the largest known
> star, but smaller than the orbit of Pluto.
> The Kuiper Belt is the current national debt... $15,000,000,000,000
> The current gross world product, $30 trillion per year, is just a
> third larger than the Homunculus Nebula.
> The total unfunded liability of the US Government,
> $144,000,000,000,000, is half the size of the Stingray Nebula!
> http://en.wikipedia.org/wiki/Value_of_Earth
> According to Wikipedia, the replacement value of the entire Earth,
> $195,000,000,000,000,000, is the size of the Great Orion Nebula.
> And here is a number to blow your mind... If you value electricity at
> 10 cents per kilowatt-hour, the energy output of our sun in one second
> is worth $38,000,000,000,000,000,000,000, which on this scale is the
> size of our Local Group of galaxies.
> The value of all the power of the sun for an entire year would be
> approximately the estimated size of the universe...
> $3,283,200,000,000,000,000,000,000,000
>
> If nothing else, that gives you quite a bit of respect for the power
> of the sun! Hopefully, I haven't messed up the math too badly; it's
> hard to get these big numbers exactly right. YMMV on some of the
> numbers, but the order of magnitude is right... :-)
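>
> For anyone who wants to check or play with the mapping, here is a rough
> Python sketch of the $1 = 1 meter scaling. The reference lengths below
> are only approximate, and the helper names (dollars_to_meters,
> nearest_reference) are just an illustration, not taken from the
> animation or from the figures above:
>
> import math
>
> # Rough reference lengths in meters, for comparison only.
> REFERENCE_LENGTHS_M = {
>     "grain of rice (~1 cm)": 1e-2,
>     "diameter of the Moon": 3.47e6,
>     "diameter of the Sun": 1.39e9,
>     "diameter of the Kuiper Belt (~100 AU)": 1.5e13,
> }
>
> def dollars_to_meters(dollars):
>     # Under the $1 = 1 meter mapping, a dollar amount is simply a length.
>     return float(dollars)
>
> def nearest_reference(dollars):
>     # Find the reference object closest in size, on a log scale.
>     meters = dollars_to_meters(dollars)
>     return min(REFERENCE_LENGTHS_M.items(),
>                key=lambda kv: abs(math.log10(meters) - math.log10(kv[1])))
>
> for label, amount in [("a penny", 0.01),
>                       ("diameter-of-the-Moon money", 3.5e6),
>                       ("the US national debt", 1.5e13)]:
>     name, size = nearest_reference(amount)
>     print("%s ($%s) is about %.3g m, closest to the %s (%.3g m)"
>           % (label, format(amount, ","), dollars_to_meters(amount),
>              name, size))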
>
> -Kelly
>
> On Mon, Jun 6, 2011 at 11:17 PM, Anders Sandberg <anders at aleph.se> wrote:
> > I have some problems with the animation. It includes preons, which are
> > completely hypothetical and have no real support. It gives a size for
> > the neutrino, which I have a hard time understanding - they are
> > believed to be pointlike, and if that is the spread of a wave packet it
> > is too short. And the center of the universe thing is just a mistake.
> >
> > Still, it remains one of the better animations in this genre despite the
> > simple graphics. There are more nice-looking ones out there, but this one
> > has a nice sense of presence by being fairly densely filled in.
> >
> >
> > Mike Dougherty wrote:
> >>
> >> I discussed with a coworker the nearly unimaginable bigness of space.
> >> His comment sums it up nicely:  "I have enough difficulty judging what
> >> size Tupperware ideally fits dinner leftovers, I'm not prepared to
> >> imagine the volume of space-time"
> >>
> >
> > Isn't that one of the saddest things? Worse, most people think the
> > tupperware is more important.
> >
> >
> > --
> > Anders Sandberg,
> > Future of Humanity Institute
> > Philosophy Faculty of Oxford University
> > _______________________________________________
> > extropy-chat mailing list
> > extropy-chat at lists.extropy.org
> > http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
> >
>
>
>
> ------------------------------
>
> Message: 7
> Date: Tue, 21 Jun 2011 02:14:59 -0400
> From: Will Steinberg <steinberg.will at gmail.com>
> To: ExI chat list <extropy-chat at lists.extropy.org>
> Subject: [ExI] working memory implant! wow!
> Message-ID: <BANLkTin=yc=B5G=7zoAJrDz+eFhV6nbkeA at mail.gmail.com>
> Content-Type: text/plain; charset="iso-8859-1"
>
> http://iopscience.iop.org/1741-2552/8/4/046017
>
> ------------------------------
>
> Message: 8
> Date: Tue, 21 Jun 2011 13:10:40 +0200
> From: Stefano Vaj <stefano.vaj at gmail.com>
> To: ExI chat list <extropy-chat at lists.extropy.org>
> Subject: Re: [ExI] AGI Motivation revisited [WAS Re: Isn't Bostrom
>        seriously       ...]
> Message-ID: <BANLkTikQNOYckLdXg5qk41bwvmu2fhgy4g at mail.gmail.com>
> Content-Type: text/plain; charset="iso-8859-1"
>
> On 17 June 2011 17:28, Richard Loosemore <rpwl at lightlink.com> wrote:
>
> > If I had time I would extend this argument: the basic conclusion is
> > that in order to get a really smart AGI you will need the alternate
> > type of motivation system I alluded to above, and in that case the
> > easiest thing to do is to create a system that is empathic to the human
> > race .... you would have to go to immense trouble, over an extended
> > period of time, with many people working on the project, to build
> > something that was psychotic and smart, and I find that scenario quite
> > implausible.
> >
>
> It is not entirely clear to me what you think of the motivations of
> contemporary PCs, but I think you can have arbitrarily powerful and
> intelligent computers with exactly the same motivations. According to the
> Principle of Computational Equivalence, beyond a very low threshold of
> complexity, there is nothing more to "intrinsic" intelligence than
> performance.
>
> As to Turing-passing beings, that is, beings which may or may not be
> performant at a task but can behaviourally emulate specific or generic
> human beings, you may have a point: either they do emulate, and as a
> consequence can be neither better nor worse than what they emulate, or
> they do not (and in that event will not be recognisable as "intelligent"
> in any anthropomorphic sense).
>
> As to empathy for the "human race" (!), I personally do not really feel
> anything like that, but I do not consider myself more psychotic than
> average, so I am not inclined to take any such rhetoric seriously.
>
> Sure, you may well hard-code into a computer behaviours aimed at
> protecting such a dubious entity, and if that computer is put to work
> operating the power grid, you will end up without electricity the first
> time you have to perform an abortion. Do we really need that?
>
> --
> Stefano Vaj
>
> ------------------------------
>
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>
>
> End of extropy-chat Digest, Vol 93, Issue 30
> ********************************************
>