On 4/7/06, ben <benboc@lineone.net> wrote:
> Now - can somebody satisfactorily define 'Consciousness'? :-P

This is problematic since we normally use it in the sense of "conscious" and "unconscious". Using that distinction is how I came around to the perspective of consciousness as an emergent property of seeking to satisfy needs and/or goals. (I expect there is a huge body of literature in the AI community regarding consciousness, and presumably a discussion of that belongs on the SL4 list and not the ExI list.) To my way of thinking, consciousness involves free will (freedom to select from choices) and acting upon that free will. Extend it a bit further and it involves the potential to have such free will and actions -- but then of course one is rapidly wading into the swamp, as the printer next to my desk, if sufficiently "enhanced", could perform an analysis of whether or not it wanted to print my document. ("I" don't print when my ink is low and I certainly don't print on Sundays.)
However, if we assume that for the most part copies are still "me", varying perhaps to the degree that I myself may vary when I wake up on Monday vs. Friday (or a sunny day vs. a rainy day, etc.), then we still have to wrestle with what "rights" should be granted or recognized by society for "me". Now one interesting aspect revolves around the extent to which one has a "collective" consciousness (i.e. the quantity and quality of information being shared between what may be physically separate entities). This is interesting in light of what happens in individuals with surgically separated brain hemispheres, where the right and left halves may be generally unaware of (and have little control over) the actions taken by the "other" half.
If one's right brain murders someone, does one's left brain have to endure the punishment for that action? I think I can construct scenarios where the left brain could not have prevented the actions of the right brain. Now of course with copies one can get into very interesting scenarios -- copies 1, 3 & 4 knew copy #2 was considering killing someone, but cut off communication from copy #2 shortly before and during the act, then rejoined the collective consciousness after the action was taken. Who is responsible? Who gets punished for murdering someone and who gets punished for conspiracy to commit murder?
And of course, if one says that the solution to this is to punish the actor (i.e. society chooses to shut down copy #2), when #2 is of course only a roughly equal sub-part of the entire consciousness, then one gets into the question of when society should have the right to reach into your mind and delete "offending" thoughts. This extends recent discussions of being arrested for *thinking* about doing something "wrong", and things like AT&T being in the news for forwarding a significant fraction of its data traffic to the machines scanning it for the NSA.
E.g. "society" gets to monitor all of your "thoughts" and gets to eliminate the neurons, hard drives, flash chips, etc. responsible for producing them. Of course the downside to this is perhaps a reality with significantly less creativity, because presumably it is thinking outside of the box (heavier-than-air flight, walking on the moon, etc.) which has given rise to some of the things we consider to be great achievements by humanity [1]. I don't know how one would go about constraining a random thought generator to produce only "good" thoughts and never "bad" thoughts.
Robert

1. And then of course, heaven forbid one would even *think* about using nuclear weapons to take out Mecca. They should of course only be considered (as the news is reporting today) for the purpose of eliminating underground uranium enrichment plants in Iran.