[ExI] bots replacing humans...

William Flynn Wallace foozler83 at gmail.com
Thu Jun 1 15:37:34 UTC 2023


Replace humans with bots, the bots mess up, and no one is liable? Sounds
like an idea that managers might like - get rid of people.
Somebody has to be liable if bots cause injury.  The bot people, as I
understand it, do not know how to make the bots stop giving wrong answers.
Do we have a Catch-22 here?   bill w

On Thu, Jun 1, 2023 at 10:17 AM spike jones via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

>
> ...> On Behalf Of MB via extropy-chat
> Subject: [ExI] bots replacing humans...
>
> Maybe the suits shouldn't jump so fast?  Thorough testing of such things is
> necessary, no? ;)... Regards,
> MB
>
>
> MB, the advice the chatbot dispensed is good advice for some people, but
> not for all.  So can we trust humans to do better than the chatbots?  We
> don't know, not necessarily.  But if an eating disorder hotline hires
> humans who give out bad advice, the organizer of the hotline is legally
> liable.  If it is set up with a chatbot and a legal disclaimer stated up
> front, the hotline provider is not legally liable.
>
> Same with a suicide prevention hotline.  We can set up software to provide
> mechanized generic empathy.  (Now there's three words that fight each
> other: mechanized generic empathy.)  We can still imagine better empathy
> provided by life forms theoretically capable of actual emotion, but with a
> suicide prevention hotline, there is liability for what the humans say, not
> for what the software would suggest.
>
> spike
>
>
>
>

