[ExI] A science-religious experience

Jason Resch jasonresch at gmail.com
Fri Feb 21 12:37:13 UTC 2025


On Thu, Feb 20, 2025, 4:30 PM Brent Allsop via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

>
>
> On Thu, Feb 20, 2025 at 1:11 PM Jason Resch via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>>
>>
>> On Thu, Feb 20, 2025, 1:15 PM Brent Allsop via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>>
>>>
>>> On Wed, Feb 19, 2025 at 10:51 AM Jason Resch via extropy-chat <
>>> extropy-chat at lists.extropy.org> wrote:
>>>
>>>>  for the type 2 gods to find the beings to save, they must still
>>>> simulate the universes where bad things happen.
>>>>
>>>
>>> Hi Jason,
>>> Lots of very interesting thoughts...
>>> But I don't buy this particular theodicy or justification for evil.
>>> There are at least two types of computation/simulation, as illustrated in
>>> this image:
>>> [image: The-Strawberry-is-Red-0480-0310.jpg]
>>>
>>> Future gods could simulate everything with Abstract R type simulators
>>> which aren't like anything, so no suffering.
>>>
>>> Bottom line: any super-being running a phenomenal simulation full of
>>> evils like we experienced in WW II, while hiding from the phenomenally
>>> suffering beings, would be a devil we should fight against and overcome,
>>> showing them better abstract ways to do simulation searches for
>>> phenomenally suffering beings.
>>>
>>
>>
>> If it is possible to simulate conscious minds in full detail without
>> invoking their consciousness, then I agree.
>>
>> But if philosophical zombies are not logically possible, then this is a
>> feat no god can do.
>>
>> Jason
>>
>
> I believe that philosophical zombies are not logically possible, as you
> can't be physically identical without qualia, since qualia are physical
> facts.
>

But as a functionalist, I believe in a stronger form of the impossibility
of zombies. Under functionalism, zombies need not be physically identical,
only identical in their brain's behavior.

Since minds are, generally speaking, chaotic systems, their behavior can't
be predicted without simulating them.
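As an illustration (my sketch, not from the original post), the logistic map is a standard toy chaotic system: two trajectories started a hair apart quickly decorrelate, so the only way to learn the later state is to actually run the iteration step by step.

```python
# Sensitive dependence on initial conditions in the logistic map
# x -> r * x * (1 - x), which is chaotic at r = 4. A perturbation of
# one part in ten billion grows until the trajectories bear no
# resemblance; there is no shortcut to the final state.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-10)  # perturbed starting point

# Early on the two runs agree; by the end they have fully decorrelated.
early_gap = abs(a[5] - b[5])
late_gap = max(abs(x - y) for x, y in zip(a[-10:], b[-10:]))
print(early_gap, late_gap)
```

The point of the sketch: the late-time gap cannot be obtained from the initial gap by any closed-form estimate; computing the state just is running the system, which parallels the claim that predicting a mind requires instantiating it.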

This implies a type of free will: neither the universe nor God can know
what you will do without invoking you, your mind, and your consciousness in
the process. Accordingly, your actual behavior cannot be "predicted" by
anyone; it can only be "watched."

Jason



> R is not a philosophical zombie.  It just computes in a very different
> way.  R is designed to use discrete logic gates for the computational
> binding, and any abstract, substrate-independent (dictionary-requiring)
> representation of information.
>
> There is the additional issue that there must be something it is like to
> experience something like redness, so you would need to add additional
> abstract logic to act as if it were really like redness, even though it
> wasn't.  But all of this is simply good moral engineering, like all
> engineering used to alleviate suffering and solve problems.
>
>
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: The-Strawberry-is-Red-0480-0310.jpg
Type: image/jpeg
Size: 76053 bytes
Desc: not available
URL: <http://lists.extropy.org/pipermail/extropy-chat/attachments/20250221/6111b84d/attachment-0001.jpg>