[ExI] Meaningless Symbols

Ben Zaiboc bbenzai at yahoo.com
Mon Jan 11 16:21:30 UTC 2010


Stathis Papaioannou <stathisp at gmail.com> wrote:

2010/1/11 Ben Zaiboc <bbenzai at yahoo.com>:

>> Whether this information is produced by a 'real body' in the 'real world' or a virtual body in a virtual world makes absolutely no difference (after all, we may well be simulations in a simulated world ourselves. Some people think this is highly likely). I imagine it would lead to a pretty precise meaning for whatever internal signal, state or symbol is used for "two metres to the left".

>
> If we find an intelligent robot as sole survivor of a civilisation
> completely destroyed when their sun went nova, we can eventually work
> out what its internal symbols mean by interacting with it. If instead
> we find a computer that implements a virtual environment with
> conscious observers, but has no I/O devices, then it is impossible
> even in principle for us to work out what's going on. And this doesn't
> just apply to computers: the same would be true if we found a
> biological brain without sensors or effectors, but still dreaming away
> in its locked in state. The point is, there is no way to step outside
> of syntactical relationships between symbols and ascribe absolute
> meaning. It's syntax all the way down.

No argument here about it being syntax all the way down, as long as you apply this to 'real-world' systems as well as simulations.

In your example, you may be right that we couldn't understand what's going on, because we don't inhabit that level of reality (the alien sim) and have no knowledge of how it works.  But so what? The fact that we don't understand it doesn't mean that the virtual mind doesn't have meaningful experiences.

Presumably the sim would map well onto the original aliens' 'real reality', though. That mapping might baffle us initially, but it would be a solvable problem, which means the sim would also be solvable in principle (unless you think we can never decipher Linear A).

In a human-created sim, of course, we decide what represents what.  Having written the sim, we can understand it, and relate to the mind in there. 

In this case, there's no difference (in terms of meaning and experience) between a pizza being devoured on level 1 or on level 2, as long as the pizza belongs to the same reality level as the devourer, and one level is well-mapped to the other.

(I am, of course, talking about proper simulations, with dynamic behaviour and enough richness and complexity to reproduce all the desired features at the necessary resolution, rather than a 'Swobian simulation' such as a photograph or a cartoon.)

Ben Zaiboc




