[ExI] Chatgpt is replacing therapists

spike at rainier66.com
Wed May 24 18:06:11 UTC 2023



-----Original Message-----
From: extropy-chat <extropy-chat-bounces at lists.extropy.org> On Behalf Of BillK via extropy-chat
...
>...(There is a new profession appearing—A ChatGPT prompt advisor)! BillK

This article teaches how to ask the right questions.
<https://joyninja.com/chatgpt-self-therapy-personal-growth/>



Sure is, but when you think of it, that is round three.  The first round was when I was teaching my young friends at the chess club how to use the software to work on specific chess skills.  The second round was in my volunteer lectures at the high school, where I offered advice on how to use Google effectively, back in the days when teachers were still struggling with the notion that education would never be the same, because students could just Google the subject and find a website somewhere where someone had done the assignment, or something close enough to it.  We still needed to teach students how to use Google right: how to evaluate the cited sources.  That's how we do what we do now, ja?

Now, round three: teaching students how to verify what GPT is saying.  It is really a meta-skill of round two.  If one practices with GPT, one finds a bunch of ways to get to the right answers.  A good way to practice is to query GPT on a subject you already know well, something you can personally verify or refute.  That exercise will send you away with a whole other attitude.  GPT doesn't know all the answers.  It acts like it does, and speculates with complete self-assurance and perfect grammar.  But it isn't always right, and it doesn't know it isn't right.

GPT doesn't seem very good at indicating when it is speculating, the way non-arrogant humans do.

spike



