[extropy-chat] Precognition on TV

Eliezer S. Yudkowsky sentience at pobox.com
Fri Mar 16 05:18:14 UTC 2007


Damien Broderick wrote:
> 
> Well, to be other than sardonic for a moment: why not? The groundwork 
> for a lot of academic work is begun in the corridor, the cafeteria, 
> after the colloquium, or in emails. If you feel strongly that 
> considerations of conditional independence (or whatever; you were 
> cryptic) necessarily undermine *in advance* both Cramer's current 
> physics experiment and Utts's statistical analyses, I can't see why 
> you shouldn't mention to them the gaping hole in their apparatuses. 
> If not via personal communication, then perhaps in the form of a 
> brief letter to a scientific or mathematical journal?

People looking for crazy things don't always find what they're looking 
for, but they often find all sorts of other interesting things; that's 
why I cheer on the people looking for quantum effects in neurons.

The reason my head would explode if someone sent a message into the past 
is that it would violate the principle that points in spacetime 
influence only neighboring points in spacetime - that's what I mean by 
the Markov principle.  This principle seems to be built into the 
structure of reality in an absolute and fundamental sense.
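(A toy way to see what that locality principle buys you: in a 1-D 
cellular automaton where each cell is updated only from its immediate 
neighbors, a perturbation at one site can spread at most one cell per 
time step - a discrete "light cone."  This is a hypothetical 
illustration of the Markov/locality idea, not a physics simulation; the 
XOR update rule and all the parameters here are arbitrary choices.)

```python
def step(state):
    """Update each cell from its immediate neighbors only (XOR rule).

    Cells outside the array are treated as 0 (fixed boundary), so no
    update ever reads anything farther than one cell away.
    """
    n = len(state)
    return [(state[i - 1] if i > 0 else 0) ^ (state[i + 1] if i < n - 1 else 0)
            for i in range(n)]


def evolve(state, steps):
    for _ in range(steps):
        state = step(state)
    return state


n = 101
a = [0] * n
b = a[:]
b[50] = 1          # perturb a single site in the middle

steps = 20
fa, fb = evolve(a, steps), evolve(b, steps)

# Because each update is strictly local, the perturbation at site 50 can
# have influenced at most the cells within `steps` sites of it.
affected = {i for i in range(n) if fa[i] != fb[i]}
assert all(abs(i - 50) <= steps for i in affected)
print("perturbation confined to light cone:", sorted(affected))
```

A message from the future would be exactly a difference showing up 
*outside* that cone - which is why the assertion above, translated into 
physics, is the thing I'd bet my head on.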

But that's not a physical calculation, just a belief about the character 
of physical law.  And just because my head would explode if it failed 
doesn't mean that it shouldn't be tested as hard as anyone can.  If I 
wrote a letter like that, it would be an explanation of how important it 
was to test the theory - not of why the test should not be performed.

A novel and interesting test whose negative answer you are absolutely 
sure of is a good experiment to perform.  I wouldn't want to see some 
idiot peer reviewer cancelling their grant because they thought only 
experiments with expected positive results should be performed.

Nonetheless, Damien, I'm pretty damned sure that neurons can't predict 
the future.

-- 
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


