<div dir="ltr">Historically, "enforcing truth" has not been a good look.</div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Fri, Oct 28, 2022 at 10:22 AM Adrian Tymes via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">> If software can drive a car, is there any reason to think it can’t moderate content?<br><br><div>Recognizing what words mean is a different sort of task, and rather more difficult, than recognizing safe physical conditions on the road. In theory it could someday be done effectively, but not without a lot more work than has gone into Tesla - and once it was, we would be much closer to "true" AI.</div><div><br></div><div>> That has nothing to do with politics.</div><div><br></div><div>The political angle is, as has been famously put, "Reality has a liberal bias." Or to put it more accurately, many so-called "conservative" politicians rely on outright lies, far more on average than so-called "liberal" politicians these days, to the point that anything enforcing truth in politics is going to be seen as having an anti-conservative-politician bias. Twitter's moderation tries to veer toward verified truth.</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Fri, Oct 28, 2022 at 6:32 AM spike jones via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org" target="_blank">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div><div lang="EN-US"><div><p class="MsoNormal">OK so Elon owns Twitter now. 
If you and I bought that outfit, we would do the same thing Elon is doing: working towards automating the content moderation. Isn’t that a perfectly obvious thing to do? If software can drive a car, is there any reason to think it can’t moderate content? Couldn’t you have something like the way car automation works, where a human still has to sit behind the wheel to keep watch over it? You could have humans (far fewer of them, probably) just supervise the software. You could make the software filters public.</p><p class="MsoNormal"> </p><p class="MsoNormal">Wouldn’t you do the same if you owned Twitter? Humans are expensive, software is cheap. That has nothing to do with politics. Nothing personal, just business.</p><p class="MsoNormal"> </p><p class="MsoNormal">spike</p></div></div>_______________________________________________<br>
extropy-chat mailing list<br>
<a href="mailto:extropy-chat@lists.extropy.org" target="_blank">extropy-chat@lists.extropy.org</a><br>
<a href="http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat" rel="noreferrer" target="_blank">http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat</a><br>
</div></blockquote></div>
</blockquote></div>