[ExI] ai in education
John Clark
johnkclark at gmail.com
Sun Mar 8 11:41:42 UTC 2026
On Sat, Mar 7, 2026 <spike at rainier66.com> wrote:
>>> A soldier does not want the guy in the foxhole next to him pondering
>>> values and making nuanced decisions on whether or not to defend him. He
>>> doesn't want his own weapons doing that either.

>> OK I can understand why the military doesn't like that, but you're
>> not in the military so why do you dislike it? John K Clark

> The same reasons the military distrusts Anthropic would cause me to
> distrust it:
But do you distrust Anthropic more than you distrust the US military? I
don't, not when the commander in chief of that military is He Who Must Not
Be Named.
And, as is your custom, unless a question is repeated constantly you don't
answer it if it might cause you to doubt your worldview, so I will repeat
this one now for the third time:
"Who do you believe has a history of telling fewer lies: the scientist
Dario Amodei, who is the head of Anthropic, or the most famous twice
divorced TV game show host in America?"
And now I'll ask another question, which I'll probably have to repeat
many times before I get an answer:
I understand why the military might not want to purchase Anthropic
products, but they have done much more than just that: they have designated
the company a supply chain risk, something that has never happened before
to a US company. So do you really believe, as POTUS does, that Anthropic
deserves to be punished this severely because it places too much emphasis
on AI safety?
> How do we prioritize which targets to protect, which to sacrifice, which
> missiles to fire, which to hold back? Something like that doesn't need all
> the capability of the software we think of as AI. It needs more
> specific training for a more specific task.
I'm sure the military has something like that, but it's old technology that
has been around for decades. At one time it may have been called AI, but
it's not AI in the modern sense of the term. It might be useful for VERY
SPECIFIC tactical situations such as the one you described, but it will be
useless more generally: useless at strategy, at managing logistics, at
intelligence analysis, and at weapons development.
> John, you assure us with complete confidence that such a system doesn't
> already exist
The primitive system you described certainly does exist, but who cares?
> and that anyone who makes it to the top of the military is stupid.
I would maintain that there is empirical evidence that the person at the
very top of the US military is not only very stupid and showing clear signs
of Alzheimer's, he is also evil. And no, I am not afraid to use that word.
> We are buying AI. We need complete control of it before we can trust it
> with our defenses,
If the US military demands complete control and certainty about how an AI
will behave before they use it, then the US military will NEVER be able to
use AI. And the Chinese military will beat the US military into a bloody
pulp.
John K Clark