sen.otaku at gmail.com
Wed Mar 11 22:28:20 UTC 2020
I wasn’t doing anything useful, lol.
I spent the other half of the day cutting and recombining a PDF textbook into 4 smaller books: the instructions, the review, the exercises, and the answer key.
It’s not the cutting that’s the issue, it’s making sure I cut it the right way.
One day AI can do it for me! Until then...
> On Mar 11, 2020, at 5:12 PM, William Flynn Wallace via extropy-chat <extropy-chat at lists.extropy.org> wrote:
> Well, did not want to take your day off, so I hope that you are doing something for yourself and not just for me. Obviously you have done a lot of soul-searching the past couple of years. No hurry!
> bill w
>> On Wed, Mar 11, 2020 at 4:58 PM SR Ballard via extropy-chat <extropy-chat at lists.extropy.org> wrote:
>> I started to reply to this, but it turns out the reply was a lot longer than I expected — it’ll be a hot minute but I’ll get back to you.
>> I’ve literally been working on it for 4 hours. But I think it’s important to make it as clear as possible.
>> There goes my day off!
>> SR Ballard
>>> On Mar 11, 2020, at 11:01 AM, William Flynn Wallace via extropy-chat <extropy-chat at lists.extropy.org> wrote:
>>> I am highly interested in your changes. As I posted earlier, attitude change is difficult at best, and changing to something nearly opposite is quite rare. So - what motivated your changes? Were your earlier feelings mostly emotional? Mostly factual and rational? How much of it was due to your religious attitudes changing? Are you comfortable now with your attitudes in the sense of arriving at a permanent thing?
>>> Well, that's enough. As objectively as possible, please, and thank you in advance.
>>> bill w
>>>> On Wed, Mar 11, 2020 at 10:54 AM SR Ballard via extropy-chat <extropy-chat at lists.extropy.org> wrote:
>>>> > Quoting SR Ballard:
>>>> >> If we’re talking about overall perception of the world, I actually used to hold an alt-right position along the lines of “the personal and professional freedom of modern women, and general modern egalitarianism, will cause/is causing the downfall of western civilization”.
>>>> > At the time did you rationalize this position or simply feel it?
>>>> > Stuart LaForge
>>>> I ABSOLUTELY rationalized this position. There’s a whole sea of videos and facts out there to bolster my opinion.
>>>> It’s a difficult position to be in, when one considers oneself an accursed destroyer of modern civilization.
>>>> I was also more religious at the time (and in a different way than I currently am) as well as more economically insecure.
>>>> I think for a lot of stupid opinions (such as the one I had above) there is probably a high correlation with an insecure lifestyle and general religiosity in that period of life.
>>>> Most people who hold these opinions (flat earther, antivaxx, space isn’t real, young earth, no evolution) are actually victims of cults hiding in plain sight. It doesn’t really seem that way initially, because these movements are generally only tangentially religious and have ill-defined leadership.
>>>> However, I think it’s impossible to look at the followings of some icons and not see it. Take personalities such as Sargon of Akkad: I used to be an avid watcher, and had spent hours and hours of my life transcribing his videos and seeking out evidence for them, so I could add citations.
>>>> There are other personalities as well, such as Jordan Peterson, who initially seems somewhat harmless. But it’s “hero worship” out of control, a cult of personality, if you will.
>>>> Now, the primary audience of these channels is men aged 15 to 30. I would say that the majority of the content I watch has that same primary audience, but it poses some issues: a lot of opinions about women get shared, and there aren’t many women around to say, “hey, wait a minute!” In things like gaming or discussing sports anime that’s usually fine, but in the political realm, with a subscriber base that is in good part self-proclaimed “incel”, the idea that women are destroying western civilization by not being baby factories, or that women should be assigned to men by some sort of sex-slavery lotto, or should be partially stripped of their rights (mainly financially, but also socially) so that they are unable to leave their husbands... these ideas can be swallowed easily when there aren’t many women around.
>>>> In the “Intellectual Dark Web”, much like the alt-right, there is a facade of logic, objective knowledge, and science. However, on a general level, it has degenerated into an anti-SJW cesspool.
>>>> It’s all fine and well to think this is an isolated phenomenon, but it was actually partially driven by AI: YouTube’s recommendation algorithm was changed specifically because people were getting sucked into this niche without ever having sought it out. After watching one mild video, they’d be presented with two that were a bit more intense, then four that were medium, and so on, until they were hardcore decrying the women’s rights movement. They were literally groomed for extremism by YouTube’s AI.
>>>> I’m not sure if members of this list can really relate to the process I’m talking about, or the mindsets involved. But honestly, I think the “Intellectual Dark Web” will be the source of a really big push against Transhumanist interests in the future, just as hyper-religiosity has been in the past.
>>>> I’m willing to answer any kinds of questions.
>>>> SR Ballard
>>>> extropy-chat mailing list
>>>> extropy-chat at lists.extropy.org