[ExI] GPT-3 Improving accuracy

BillK pharos at gmail.com
Thu Jan 27 19:48:07 UTC 2022

On Thu, 27 Jan 2022 at 19:19, spike jones via extropy-chat
<extropy-chat at lists.extropy.org> wrote:
> Ja this is worrisome BillK.  It is easy enough to foresee that these kinds of tools will be used to write most news stories, allowing the news agencies that use them to produce their stories with fewer staff, costing less.  However... someone somewhere must identify what is misinformation.  The notion that covid was created in a lab and leaked into the public was once considered misinformation, and this was very recent.
> Now... that scenario is plausible.  Very credible sources admit it is the most likely scenario.  In the meantime, those actual scientists who figured it out early have been identified as sources of misinformation.  It isn't clear how to undo that damage, if it is even possible.
> Regardless of that, if the resource exists, news agencies will use them, for they must compete if they wish to stay in business.  The big mainstream news sources are making their living selling ad space.  With this chatbot, one guy can have it read their stories, generate new ones and offer the same info with no ad space.
> spike
> _______________________________________________

The mainstream news will reduce costs by requiring fewer writers.
GPT-3 is already writing articles, with a human just checking them
afterward.  If an article is too bad, GPT-3 simply writes another.

Re disinformation, one comment I saw suggested that GPT-3 could be
used to write *better* disinformation articles, and even to run a
whole disinformation campaign.

It won't be long before we are questioning everything we read on the
internet, and questioning every image we see as well, as deepfakes get
better all the time.


More information about the extropy-chat mailing list