[ExI] Censorship
William Flynn Wallace
foozler83 at gmail.com
Sat May 28 14:35:58 UTC 2016
I actually think this kind of quibbling is hamstringing the conversation
about censorship. Legally, we may want clear definitions (but in countries
where censorship is a big problem the law is often part of the problem).
But when trying to come up with solutions and improvements things often
descend into a morass of semantics.   - Anders
My example still fits the dictionary definition, but apparently it does not
fit yours. Yes, it is all semantics - every discussion depends on a shared
definition of terms.
I see nothing derogatory about calling something semantics.
If my example is not censorship, then just what is it? It is a given that
your definition will not agree with the dictionary one, so who is confusing
the discussion, you or me? Over to you, sir.
bill w
On Sat, May 28, 2016 at 3:30 AM, Anders Sandberg <anders at aleph.se> wrote:
> I actually think this kind of quibbling is hamstringing the conversation
> about censorship. Legally, we may want clear definitions (but in countries
> where censorship is a big problem the law is often part of the problem).
> But when trying to come up with solutions and improvements things often
> descend into a morass of semantics.
>
> The essence of censorship is that group A prevents group B from
> communicating something to society, based on it originating in group B
> (suppressing their power) or containing something A doesn't like. One can
> construct legitimate cases, both in the sense that group A is legitimately
> appointed and that the suppression is for a legitimate reason. The problem
> is that a lot of cases are not legitimate, either formally - nobody
> appointed A as the moral guardians - or from a moral standpoint - the
> reasons for suppression are not valid.
>
> There are lots of intermediate levels. Nobody appointed parents, yet they
> might have a legitimate say in how conversations are held in their family.
> The publisher that refuses to print a book is legitimate in their decision,
> yet they might have a morally bad reason (maybe they didn't like the race
> of the author). Most of these cases can be dealt with by the various local
> rules we have about families, companies and the like.
>
> The key ones, the ones I think we *need* to get right, are the ones that
> have society-wide reach. If the censorship affects everybody, then it is
> everybody's problem. In particular, it interferes with the key functioning
> of an open society: that anything is open for criticism, and if the members
> think the criticism is valid, the thing can be changed through collective
> decisions. If certain things cannot be critiqued or if it is not possible
> to have a debate about whether they should be changed, then society is not
> open. Hence censorship by powers that can affect all of society is deeply
> problematic, and legitimate censorship needs to be kept on a very tight
> leash.
>
> One interesting issue is how to handle the emergence of new, globalised
> platforms of power. In the past this rarely happened and most thinking
> about how to handle censorship was based on states. However, Facebook,
> Apple and Google certainly perform censorship within their domains, yet
> their domains are often so wide that they can be said to exert society-wide
> effects. Does that mean we need to have a global oversight over their
> activity? Things get even trickier since the global realm includes non-open
> societies.
>
>
> On 2016-05-27 21:19, William Flynn Wallace wrote:
>
> Well, Dan, I hate to tell you this, but we have censorship now in TV,
> movies, books, and maybe more. I read recently that about 40k books are
> published every month, and someone has the say-so about whether each goes
> on sale somewhere (the where might be determined by its rating).
>
>
> No, that is not censorship. If you as a publisher tell me that you will
> not publish my book because it is crap/politically incorrect/will not
> sell/it is Friday, that is your prerogative. There is no right to have
> stuff published. Censorship occurs when a centralized power can decide
> to prevent publication because of content. (There are some iffy definitions
> covering post-publication action, but the core is prepublication approval.)
> - Anders
>
> Censorship is the suppression of speech, public communication or other
> information which may be considered objectionable, harmful, sensitive,
> politically incorrect or inconvenient as determined by governments, media
> outlets, authorities or other groups or institutions. (dictionary)
>
> I don't want to quibble about words, but what I wrote is well within the
> definition above. Certainly the type Anders mentioned is far more
> dangerous and threatening. This has nothing to do with free speech. Of
> course Anders
> is right that no one has the right to have his stuff published anywhere.
> College newspaper editors found that out for sure a few years ago in a
> court case. Not letting a college writer put his stuff in a campus
> newspaper is not a violation of free speech, but it is censorship.
>
> bill w
>
>
>
> On Fri, May 27, 2016 at 3:10 AM, Anders <anders at aleph.se> wrote:
>
>> On 2016-05-26 21:49, William Flynn Wallace wrote:
>>
>> Why would it be ethical to have censorship in the first place? It's like
>> saying "Put an AI in charge of slavery..."
>>
>> Dan
>>
>> Well, Dan, I hate to tell you this, but we have censorship now in TV,
>> movies, books, and maybe more. I read recently that about 40k books are
>> published every month, and someone has the say-so about whether each goes
>> on sale somewhere (the where might be determined by its rating).
>>
>>
>> No, that is not censorship. If you as a publisher tell me that you will
>> not publish my book because it is crap/politically incorrect/will not
>> sell/it is Friday, that is your prerogative. There is no right to have stuff
>> published. Censorship occurs when a centralized power can decide to
>> prevent publication because of content. (There are some iffy definitions
>> covering post-publication action, but the core is prepublication approval.)
>>
>> I can easily see an AI being used for some of the labor of digesting all
>> this material. I also think an AI would never be in charge of actual
>> censorship, but the AI could kick out books or movies that fudge certain
>> guidelines so that a human, a committee, or the Supreme Court could
>> decide what to do with them.
>>
>>
>> In a sense this is happening with YouTube, where copyright-infringing
>> material is blocked - officially after a human has looked at what the
>> algorithm found, but obviously often without any human oversight. For
>> various sad, hilarious or rage-inducing examples, just search Boing Boing's
>> or Slashdot's archives.
>>
>>
>> Now whether there should BE any kind of censorship is an entirely
>> different question, one that could be debated in this group if it hasn't
>> before (not likely).
>>
>>
>> As I have mentioned, I am starting to study information hazards (
>> http://www.nickbostrom.com/information-hazards.pdf ). Some of these may
>> actually be serious enough that we rationally should want some form of
>> censorship or control.
>>
>> Others are not serious enough, but we may want to have systems that
>> discourage them (libel law, boycotts, whatever).
>>
>> But we have to be careful with that (e.g.
>> http://blog.practicalethics.ox.ac.uk/2014/04/the-automated-boycott/ ). I
>> recently enjoyed reading a series of case studies showing how information
>> concealment played an important role in many big disasters (
>> http://aleph.se/andart2/risk/the-hazard-of-concealing-risk/ ).
>> Generally, limiting information cuts out the good with the bad, and we are
>> not very skilled at distinguishing them a priori. Plus, management requires
>> information: if the problem is an underlying structure or something
>> concrete rather than bad information per se, then the agencies that manage
>> - whether institutional or the open society - need to get that information
>> to do something. Far too often censorship just looks for surface detail.
>>
>>
>>
>>
>> _______________________________________________
>> extropy-chat mailing list
>> extropy-chat at lists.extropy.org
>> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>>
>>
>
>
>
>
>
> --
> Dr Anders Sandberg
> Future of Humanity Institute
> Oxford Martin School
> Oxford University
>
>
>
>