[ExI] bots replacing humans...

Keith Henson hkeithhenson at gmail.com
Thu Jun 1 18:46:05 UTC 2023


On Thu, Jun 1, 2023 at 10:17 AM William Flynn Wallace via extropy-chat
<extropy-chat at lists.extropy.org> wrote:
>
> Spike, I am told that bots run municipal water systems.

That's not exactly the case.  I noted that the controls for water
systems are on the Internet.  There was a case where someone broke
into one and increased the fluoride to the maximum.  It was not enough
to hurt anyone, but they could have shut off the pumps.

Keith

> Who is liable if they mess up?  Somebody has to be liable.  "Act of
> God" just doesn't do it for me.  Some kind of warranty must exist.
> And, how do people assess the dangers of bots if they aren't tech
> people?  bill w
>
> On Thu, Jun 1, 2023 at 10:59 AM spike jones via extropy-chat <extropy-chat at lists.extropy.org> wrote:
>>
>> From: extropy-chat <extropy-chat-bounces at lists.extropy.org> On Behalf Of William Flynn Wallace via extropy-chat
>>
>> Subject: Re: [ExI] bots replacing humans...
>>
>> >…Replace humans with bots and the bots mess up and no one is liable? Sounds like an idea that managers might like - get rid of people.
>>
>> Somebody has to be liable if bots cause injury.  The bot people, as I understand it, do not know how to make the bots stop giving wrong answers.  Do we have a Catch-22 here?  bill w
>>
>> Billw, I was in a movie theatre a bunch of years ago.  A scene showed a public bus swerving to avoid hitting the local shepherd’s flock and crashing into a tree.  The passengers staggered off and wandered away with various injuries.  The movie audience burst out laughing at the unintentionally funny scene, because Americans have grown accustomed to someone being liable for anything bad that happens.  In the country where the film was set, no one is.  If there is an accident, too bad for you, best wishes on a full recovery, etc.
>>
>> This is a big problem for national parks.  They are not liable if tourists try to pet the bison, which are wild animals and are not separated from the boardwalks full of city proles accustomed to having everything made safe.  Everything isn’t safe.  In Yellowstone, they are reintroducing wolves.  If some Rufus McGoofus goes hiking in the back country and is devoured by wolves, the national park system is not liable for that.  Shock.  But city people, and especially Americans, grow up accustomed to everything being made safe and someone being responsible for everything, so they don’t exercise a lick of sense or take any reasonable precautions.
>>
>> An eating disorder hotline can be set up with a recording that says: the staff unionized and went on strike, and we haven’t seen them since, but you may log on to talk to a chatbot, for which no one is liable; use at your own risk, and sign up for an account at PhatGPT.  If that works for you, we also have a suicide prevention hotline, likewise staffed by AI; sign up for a free account at ShotGPT.  Use both at your own risk, good luck, there’s no one left working here, for we couldn’t afford the liability.
>>
>> With the proper disclaimers, I see no reason why a suicide prevention bot and a diet bot would create any liability.  You get free generic diet advice and free mechanized generic empathy, worth every cent you pay.
>>
>> spike