[ExI] chatgpt plus google translate

Jason Resch jasonresch at gmail.com
Mon Apr 10 19:45:39 UTC 2023


On Sun, Apr 9, 2023 at 10:16 PM Giovanni Santostasi via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> Incredible. I used a different instance of GPT-4 to translate the
> symbolic language back to English.
> I asked an AI to make up a symbolic language and write a short essay on a
> particular topic. I will show the sequence of symbols and I want you to try
> to translate them as best you can, or at least guess the topic of the essay
> expressed by these symbols. [sequence of symbols]
>
> Based on the sequence of symbols you've provided, it appears that the
> AI-generated symbolic language might be related to the following topic:
>
> The interconnectedness and collaboration of people around the world in
> various domains such as environmental conservation, education, art, and
> culture.
>
> Here's a possible translation of the symbols:
>
> - Earth, representing the global context
> - Interconnectedness or collaboration
> - People, indicating a focus on human relationships
> - Cooperation, emphasizing collaborative efforts
> - Environmental conservation or sustainable growth
> - Excellence or achievements
> - Global community or international relations
> - Diversity or a range of perspectives
> - Integration or blending of ideas
> - Communication or dialogue
> - Education or knowledge sharing
> - Art and creative expression
> - Precious or valuable contributions
> - Culture, representing various traditions and customs
> - Music or harmony, possibly symbolizing unity
>
> From this interpretation, the short essay might be discussing the
> importance of global collaboration and interconnectedness in addressing
> environmental issues, promoting education, and fostering cultural exchange
> and artistic expression.
>
>
At this point I think it's safe to say GPT-4 isn't just human-level AGI,
nor merely "sparks of AGI", but is already in the domain of ASI (artificial
superhuman intelligence).

While highly trained humans may still do better than GPT-4 within their own
narrow specialties, GPT-4 can outdo nearly every human at nearly every task,
across a range of tasks that would take dozens or hundreds of lifetimes to
master.

It has an advantage a bit like Tesla's Autopilot, which has millions of
years of driving experience to learn from, whereas any human gathers at most
about 70 years of it. A human is well above average if they read 3,000 books
in a lifetime (roughly a book a week), representing about 3 GB of text.
GPT-4 was trained on "CommonCrawl", which is about 570 GB (
https://lifearchitect.ai/whats-in-my-ai-paper/ ). That is roughly 190 human
lifetimes of reading material.
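
For concreteness, here is that back-of-envelope arithmetic as a tiny Python
snippet (the ~1 MB-per-book figure is my own rough assumption; the 570 GB
figure is from the link above):

    # Rough estimate: how many "reading lifetimes" of text did GPT-4 see?
    BYTES_PER_BOOK = 1_000_000            # assume ~1 MB of text per book
    BOOKS_PER_LIFETIME = 3_000            # roughly a book a week for ~58 years
    TRAINING_TEXT_BYTES = 570 * 10**9     # ~570 GB (lifearchitect.ai estimate)

    lifetime_bytes = BYTES_PER_BOOK * BOOKS_PER_LIFETIME      # ~3 GB
    lifetimes = TRAINING_TEXT_BYTES / lifetime_bytes          # ~190
    print(f"{lifetime_bytes / 1e9:.0f} GB per lifetime, "
          f"about {lifetimes:.0f} lifetimes of reading")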

GPT-4 has read all 6 million English Wikipedia articles, the code of
open-source projects on GitHub, the research papers on arXiv, and so on. No
one in the world has accomplished even 1% of this. Moreover, it can process
tasks probably two to three orders of magnitude faster than any human:
compare how long it takes a person to digest a long series of instructions
and then write source code, or an essay. When one puts GPT-4 into a loop and
gives it goals (see Auto-GPT, https://github.com/Torantulino/Auto-GPT ), its
capabilities can scale to match those of almost any person. Some have
likened a single invocation of GPT-4 to a single instruction executed by a
CPU: on its own it can only do so much. But, like a CPU, if we layer a
memory, a context, an ALU, and some recursion on top of it, it can do almost
anything. I think we already have all the pieces; it just takes a little
glue to cobble them together, and many people are doing exactly that right
now.
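
To make the "loop plus glue" idea concrete, here is a minimal sketch of such
an agent loop in Python. This is my own illustration, not how Auto-GPT is
actually implemented, and llm() is a stand-in for whichever completion API
you would call:

    # Minimal agent loop: give the model a goal, feed its own outputs back
    # in as memory, and stop when it says it is done. Purely illustrative.

    def llm(prompt: str) -> str:
        """Stand-in for a call to a language-model completion API."""
        raise NotImplementedError("plug in your model call here")

    def run_agent(goal: str, max_steps: int = 10) -> list[str]:
        memory: list[str] = []                  # accumulated context
        for _ in range(max_steps):
            prompt = (f"Goal: {goal}\n"
                      "Previous steps:\n" + "\n".join(memory) +
                      "\nNext step (or DONE if finished):")
            step = llm(prompt)                  # one "CPU instruction"
            memory.append(step)                 # the layered-on memory
            if step.strip().upper().startswith("DONE"):
                break
        return memory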

Jason
