[ExI] [Extropolis] Old and new futurisms in Silicon Valley

Giulio Prisco giulio at gmail.com
Sun Jan 21 05:53:00 UTC 2024


John, we'll just have to agree to disagree on which one (Trump or
"woke") is the greater evil.

But I think we can agree that both are far from good (please correct
me if I'm wrong). Therefore, while there isn't much we can do to avoid
having to make this choice in the next elections (not only in the U.S.
- these are global trends with different local names), it is important
to promote third-way alternatives for the longer term.

What should this third-way alternative be? If I had a precise answer, I
would be a politician. But my rough answer is that the third way
should protect both individual liberty and social justice. These are,
and will remain, conflicting goals, so the devil will always be in the
details and negotiation will always be needed. Another important point
is that our Western culture (and its political aspects) must recover
its strength and stop treating weakness and despair as virtues.

On Sat, Jan 20, 2024 at 2:21 PM John Clark <johnkclark at gmail.com> wrote:
>
> On Sat, Jan 20, 2024 at 12:00 AM Giulio Prisco <giulio at gmail.com> wrote:
>
> Hi Giulio, I respectfully disagree:
>
>> > "if the only choice is between "woke" and Trump, I choose Trump."
>
>
> It's telling that I never used the word "Trump", but from my description (an anti-science, anti-free market, wannabe dictator with the emotional and mental makeup of an overly pampered nine-year-old brat) you knew exactly who I was talking about.
>
>
>>> > <Torres keeps complaining that too many transhumanists are western white males...>
>>
>>
>> > "And this is exactly the kind of "liberal left" bullshit that pushes people to Trump. With enemies like these, Trump doesn't need friends."
>
>
>
> Yes, that sort of woke statement is maddening and total bullshit, but it's simply not comparable to the action, not just a statement, of attempting a coup d'état to overturn a 250-year-old democracy that has the most powerful military in the world. Wokeism is stupid and irritating, no doubt about that, but it's no more an existential threat than Drag Queen Story Time or unisex restrooms. Giving the keys to a fleet of nuclear submarines to a man as ignorant, amoral, and intellectually lazy as Donald Trump right in the middle of the Singularity, the most critical time in the entire existence of Homo sapiens, would be a Chicxulub-level extinction event for the human race. Even without Donald Trump, the chances that you or I will make it through the Singularity meat grinder in one piece are pretty low, but given the choice between a low chance and no chance, I choose the low chance.
>
>   John K Clark    See what's on my new list at  Extropolis
>
>
>
>>
>>
>> On Fri, Jan 19, 2024 at 8:04 PM John Clark <johnkclark at gmail.com> wrote:
>> >
>> > I watched the video at https://www.youtube.com/watch?v=sdjMoykqxys, and I strongly agree with everything Max More said, with one exception: his skepticism of the Singularity. I think a strong case, though not a proof, can be made for the Singularity, and I will try to make it now. We know for a fact that the human genome is only 750 MB long (it contains 3 billion base pairs, there are 4 bases, so each base can represent 2 bits, and there are 8 bits per byte), we know for a fact that it contains a vast amount of redundancy and gibberish (for example, many thousands of repetitions of ACGACGACGACG), and we know it contains the recipe for an entire human body, not just the brain; so the technique the human mind uses to extract information from the environment must be pretty simple, specified by VASTLY less than 750 MB. I'm not saying an AI must use the exact same algorithm that humans use - an even simpler one may exist - but it does tell us that such a simple thing must exist: 750 MB is just an upper bound, and the true number must be much, much smaller. So even though this seed algorithm would fit in a file smaller than a medium-quality JPEG, its human version enabled Albert Einstein to go from understanding precisely nothing in 1879 to being the first man to understand General Relativity in 1915. And once a machine discovers such an algorithm, then, like it or not, the world will start to change at an exponential rate.
>> >
>> > So we can be as certain as we can be of anything that it should be possible to build a seed AI that can grow from knowing nothing to being super-intelligent, and the recipe for building such a thing must be less than 750 MB, a LOT less. For this reason I never thought a major scientific breakthrough was necessary to achieve AI, just improved engineering, though I didn't know how much improvement would be needed; however, about a year ago a computer was able to easily pass the Turing test, so today I think I do. That's why I say a strong case can be made that the Singularity is not only likely to happen but likely to happen sometime within the next five years, and that's why I'm so terrified of the possibility that during this hyper-critical time for the human species the most powerful human being on the face of the planet will be an anti-science, anti-free market, wannabe dictator with the emotional and mental makeup of an overly pampered nine-year-old brat who probably can't even spell AI.
>> >
>> > John K Clark
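
As a quick sanity check on the 750 MB figure quoted above, here is a
minimal back-of-envelope sketch in Python of the same arithmetic (the
variable names are just illustrative):

    # Upper bound on the information content of the human genome,
    # following the arithmetic quoted above.
    base_pairs = 3_000_000_000      # ~3 billion base pairs
    bits_per_base = 2               # 4 possible bases -> 2 bits each
    total_bytes = base_pairs * bits_per_base // 8
    print(total_bytes / 1_000_000)  # -> 750.0 (MB), an upper bound only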
>
>


