[ExI] Censorship

William Flynn Wallace foozler83 at gmail.com
Fri May 27 20:19:10 UTC 2016


Well Dan I hate to tell you this, but we have censorship now in TV, movies,
books and maybe more.  I read recently that about 40k books are published
every month, and someone has the say-so about whether each one goes on sale
somewhere (where may be determined by its rating).


No, that is not censorship. If you as a publisher tell me that you will not
publish my book because it is crap/politically incorrect/will not sell/it
is Friday, that is your prerogative. There is no right to have stuff
published. Censorship occurs when a centralized power can decide to
prevent publication because of content. (There are some iffy definitions for
post-publication action, but the core is prepublication approval.)
Anders

Censorship is the suppression of speech, public communication or other
information which may be considered objectionable, harmful, sensitive,
politically incorrect or inconvenient as determined by governments, media
outlets, authorities or other groups or institutions.  dictionary

I don't want to quibble about words, but what I wrote is well within the
definition above.  Certainly the type Anders mentioned is far more
dangerous and threatening.  This has nothing to do with free speech.  Of
course Anders is right that no one has the right to have his stuff
published anywhere.
College newspaper editors found that out for sure a few years ago in a
court case.  Not letting a college writer put his stuff in a campus
newspaper is not a violation of free speech, but it is censorship.

bill w


On Fri, May 27, 2016 at 3:10 AM, Anders <anders at aleph.se> wrote:

> On 2016-05-26 21:49, William Flynn Wallace wrote:
>
> Why would it be ethical to have censorship in the first place? It's like
> saying "Put an AI in charge of slavery..."
>
> Dan
>
> Well Dan I hate to tell you this, but we have censorship now in TV,
> movies, books and maybe more.  I read recently that about 40k books are
> published every month, and someone has the say-so about whether each one
> goes on sale somewhere (where may be determined by its rating).
>
>
> No, that is not censorship. If you as a publisher tell me that you will
> not publish my book because it is crap/politically incorrect/will not
> sell/it is Friday, that is your prerogative. There is no right to have stuff
> published. Censorship occurs when a centralized power can decide to
> prevent publication because of content. (There are some iffy definitions for
> post-publication action, but the core is prepublication approval.)
>
> I can easily see an AI being used for some of the labor of digesting all
> this material.  I also think an AI would never be in charge of actual
> censorship, but the AI could kick out books and movies that fudge certain
> guidelines so that a human, a committee, or the Supreme Court could
> decide what to do with them.
>
>
> In a sense this is happening with YouTube, where copyright infringing
> material is blocked - officially after a human has looked at what the
> algorithm found, but obviously often without any human oversight. For
> various sad, hilarious or rage-inducing examples, just search Boing Boing
> or Slashdot's archives.
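[The flag-then-review pattern described above can be sketched in a few lines. This is purely an illustrative toy, not YouTube's actual system; every name in it (algorithmic_flag, moderate, human_review) is hypothetical:]

```python
def algorithmic_flag(video):
    """Stand-in for an automated matcher (e.g. audio/video fingerprinting)."""
    return video.get("fingerprint") in {"known_song_123", "movie_clip_456"}

def moderate(videos, human_review=None):
    """Split videos into (blocked, published) lists.

    Officially a human confirms each flag before blocking; if
    human_review is None, flagged items are blocked automatically --
    the "without any human oversight" failure mode described above.
    """
    blocked, published = [], []
    for v in videos:
        if algorithmic_flag(v):
            if human_review is not None and not human_review(v):
                # A reviewer can rescue a false positive (e.g. fair use).
                published.append(v)
            else:
                blocked.append(v)
        else:
            published.append(v)
    return blocked, published
```

[With no reviewer supplied, everything the matcher flags is blocked unseen, which is where the sad or rage-inducing examples tend to come from.]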
>
>
> Now whether there should BE any kind of censorship is an entirely
> different question, one that could be debated in this group if it hasn't
> before (not likely).
>
>
> As I have mentioned, I am starting to study information hazards
> (http://www.nickbostrom.com/information-hazards.pdf). Some of these may
> actually be serious enough that we rationally should want some form of
> censorship or control.
>
> Others are not serious enough, but we may want to have systems that
> discourage them (libel law, boycotts, whatever).
>
> But we have to be careful with that (e.g.
> http://blog.practicalethics.ox.ac.uk/2014/04/the-automated-boycott/ ). I
> recently enjoyed reading a series of case studies showing how information
> concealment played an important role in many big disasters (
> http://aleph.se/andart2/risk/the-hazard-of-concealing-risk/ ). Generally,
> limiting information cuts out the good with the bad, and we are not very
> skilled at distinguishing them a priori. Plus, management requires
> information: if the problem is an underlying structure or something
> concrete rather than bad information per se, then the agencies that manage
> - whether institutional or the open society - need to get that information
> to do something. Far too often censorship just looks for surface detail.
>
>
>
>
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>
>