[ExI] text singularity
atymes at gmail.com
Thu Jan 12 04:19:53 UTC 2023
To have a singularity as described, it is necessary to completely remove
human input from the loop, since humans are not exponentially
accelerating (or at least not doing so nearly fast enough to shrink
generation times to the point of achieving a singularity any time soon).
The author notes that this cannot currently be done.
The author does propose AI examples from the real world that try, but these
examples fall laughably short in practice. And not in the sense of "they're
trying and they are likely to improve," either: years of work have not made
the recommendation engines of Amazon, Google, and the like noticeably
better, despite trying exactly the approach that should theoretically work.
Perhaps some day, AI will be able to actually learn and realistically
measure human values and tastes, so as to be able to effectively judge new
works in this light. That day does not appear to be today, and it will
take some new invention - not just the refinement of existing AIs along
their currently implemented paths - to pull that off.
On Wed, Jan 11, 2023 at 3:48 PM spike jones via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> Reason has an interesting take on Vinge’s singularity, by focusing on a
> particular aspect. We think of a general AI singularity, but this writer
> Michael Munger looks at only how ChatGPT and its cousins will impact
> writing. He makes some really good points here:
> The Future of GPT4 Chatbot Text Production (reason.com)
> I am tempted to steal some of them.