[ExI] Why stop at glutamate?
Brent Allsop
brent.allsop at gmail.com
Thu Apr 13 19:13:17 UTC 2023
You guys are talking about lots of interesting complex stuff. Recursion,
turtles all the way down, sophisticated algorithms, isomorphism, complex
neural networks, abstractions, and on and on. At least nobody here is
throwing in quantum uncertainty... yet. ;)
But does any of that have anything to do with a single, simple, standalone
elemental quality of conscious experience like redness? How does any of
what you guys are talking about provide any insight into what a redness
quality is like, or into how to communicate the nature of redness? If it is
"perceptions of interpretations of perceptions" all the way down, possibly
recursively so, spread across a complex network, and so on, what is it that
would enable one person to be engineered to be inverted, so that they could
have an experience of a single pixel of red that is like your greenness?
It seems to me that if someone thinks there is a complex answer to that
simple question, or, even worse, thinks it is so complex that humanity may
never be able to comprehend it, they are thinking of things in entirely the
wrong way. In my opinion, all we need to know, we already learned in
elementary school. We just need to apply that simplicity to our knowledge,
in our heads, instead of to the stuff out there. That's all there is to
it. As our elementary school teachers taught us, it's as simple as: "THAT
is redness."
On Thu, Apr 13, 2023 at 6:52 AM Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
>
>
> On Thu, Apr 13, 2023, 5:29 AM Ben Zaiboc via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>>
>> On 13/04/2023 04:49, Giovanni Santostasi wrote:
>> > I want to make a correction to Ben's statement that it is turtles all
>> > the way down. The turtles go deep, but not all the way down. It stops
>> > in a place similar to the way we derive set theory from the null set.
>>
>> Hm, that's interesting. I was thinking about information (in the brain
>> specifically, and other similar machines), and that the informational
>> turtles go round in big complex loops, where every piece of information
>> is relative to other pieces of information, so in that sense there'd be
>> no end.
>>
>> In the wider world, though, I'm sure you're right. I tried to read about
>> information theory, conservation of information, etc., but it just
>> bamboozles me. The idea of the total energy (and presumably,
>> information) in the universe being zero does make sense, though (erm,
>> provided there can be such a thing as 'anti-information'?).
>>
>
> Perhaps that's entropy (uncertainty)? A coin flip, for example, has an
> entropy of 1 bit (see the worked sketch after this quote).
>
> The Heisenberg uncertainty principle shows us that the more information we
> learn about some properties of a system, the more we must unlearn (make
> uncertain) about other aspects of that system.
>
> Information is sometimes described by physicists as negative entropy. QM
> shows that learning information (acquiring negative entropy) requires an
> equal creation of more uncertainty (entropy). So in a way the conservation
> of information might be the deeper principle behind the second law of
> thermodynamics and the conservation of energy.
>
> Jason