[ExI] More thoughts on sentient computers

William Flynn Wallace foozler83 at gmail.com
Sun Feb 26 19:08:40 UTC 2023


There's no standard; it's situational.

Say you had a process searching for new drug compounds. A standard would be
how effective the drug was.

If you had a process evolving artificial life, the standard would be how
successful the life form is in surviving and thriving.

Fine - but now you are not talking about art - bill w

On Sun, Feb 26, 2023 at 9:48 AM Jason Resch <jasonresch at gmail.com> wrote:

>
>
> On Sun, Feb 26, 2023, 10:36 AM William Flynn Wallace <foozler83 at gmail.com>
> wrote:
>
>> Value - who gets to decide the standards?
>>
>
> There's no standard; it's situational.
>
> Say you had a process searching for new drug compounds. A standard would
> be how effective the drug was.
>
> If you had a process evolving artificial life, the standard would be how
> successful the life form is in surviving and thriving.
>
> Many art-generating AIs are trained to produce the patterns expected to be
> most liked by humans.
>
>
>
>> Art critics will endlessly argue about every artist that ever lived.
>> Music ditto.  Literature ditto.
>>
>> It's all qualitative and subject to opinions, which will naturally change
>> over time with deaths and births and world events etc. etc.
>>
>> I have read more than one book on aesthetics and that is why I have given
>> up on philosophers and critics and decided on "I like it- I don't like it"
>> as my personal evaluator.  bill w
>>
>
>
> I agree aesthetic appreciation is subjective, but the fact that art is
> subjective doesn't undermine my claim that we understand how to engineer
> creative systems.
>
> As long as we have a way to select something of value to at least one
> subject, or for at least one purpose, that's sufficient. It's not possible
> to please everyone so that shouldn't be a goal.
>
> Jason
>
>
>
>>
>> On Sat, Feb 25, 2023 at 4:27 PM Jason Resch <jasonresch at gmail.com> wrote:
>>
>>>
>>>
>>> On Sat, Feb 25, 2023, 4:46 PM William Flynn Wallace via extropy-chat <
>>> extropy-chat at lists.extropy.org> wrote:
>>>
>>>> Re all those images you sent:  having seen decades of covers of scifi
>>>> books, most of them are not very creative - that is, they leave me bored.
>>>>
>>>> Value selector - expand please.  If by permutation you mean just
>>>> changes from art images of the past, then OK.  bill w
>>>>
>>>
>>>
>>> By permutation I mean modification, combination, mutation,
>>> randomization, generation, etc. Anything that makes new examples or novelty
>>> (which may then be evaluated for their value.)
>>>
>>> By value selector I mean any function that assesses the value of a
>>> generated permutation, by judging each one's fitness, utility, aesthetics,
>>> suitability, etc.
>>>
>>> Putting these two processes together yields an algorithm for creativity.
>>> It will generate novel examples, and then filter them so that only those
>>> judged to be of sufficient value are output.
>>>
>>> Jason
>>>
>>>
>>>
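
A minimal sketch of the "permutation + value selector" loop described above,
in Python. The mutation rule and the toy scoring function are invented
stand-ins for illustration only; any fitness, utility, or aesthetic judge
could be plugged in as the selector.

    import random

    ALPHABET = "abcdefghijklmnopqrstuvwxyz "

    def permute(candidate):
        # Permutation step: generate a novel variant by mutating one character.
        i = random.randrange(len(candidate))
        return candidate[:i] + random.choice(ALPHABET) + candidate[i + 1:]

    def value(candidate, target="creative"):
        # Toy value selector: count positions that match a target string.
        return sum(a == b for a, b in zip(candidate, target))

    def creative_search(seed, generations=2000):
        # Generate permutations; keep whichever one the selector rates highest.
        best = seed
        for _ in range(generations):
            candidate = permute(best)
            if value(candidate) > value(best):
                best = candidate
        return best

    if __name__ == "__main__":
        print(creative_search("xxxxxxxx"))  # usually hill-climbs toward "creative"
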
>>>>
>>>> On Sat, Feb 25, 2023 at 2:07 PM Jason Resch via extropy-chat <
>>>> extropy-chat at lists.extropy.org> wrote:
>>>>
>>>>>
>>>>>
>>>>> On Sat, Feb 25, 2023, 11:55 AM William Flynn Wallace via extropy-chat <
>>>>> extropy-chat at lists.extropy.org> wrote:
>>>>>
>>>>>> Now Jason, I do not pretend to have a good answer to what is
>>>>>> creative, but just being different doesn't seem to me to be sufficient.
>>>>>>
>>>>>> An AI can gather what has been done, perhaps even weighted by how we
>>>>>> humans rate the things (Leonardo is superior to a chimp), and put together
>>>>>> something that combines what has been done but in a new way.
>>>>>>
>>>>>
>>>>> Permutation
>>>>>
>>>>>
>>>>>> An infinity of art could be created this way.
>>>>>>
>>>>>> My personal definition of great art - I like it.  Same for food,
>>>>>> music, colors, animals, etc.  Why should I say something is great or even
>>>>>> good if I don't like it?  I cannot impose my standards on anyone else.
>>>>>> They get to define greatness for themselves.
>>>>>>
>>>>>
>>>>> A value selector
>>>>>
>>>>>
>>>>>> If enough people think something is great, it will last far longer
>>>>>> than the artists' lives.  Homer, anyone?
>>>>>>
>>>>>> ("You like it?  That's the best you can do?"   Yes.)
>>>>>>
>>>>>> bill w
>>>>>>
>>>>>
>>>>> Would you say then that creativity can be accomplished by the
>>>>> combination of:
>>>>>
>>>>> permutation + a value selector ?
>>>>>
>>>>> Jason
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>>
>>>>>> On Sat, Feb 25, 2023 at 9:27 AM Jason Resch via extropy-chat <
>>>>>> extropy-chat at lists.extropy.org> wrote:
>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Sat, Feb 25, 2023 at 8:41 AM William Flynn Wallace via
>>>>>>> extropy-chat <extropy-chat at lists.extropy.org> wrote:
>>>>>>>
>>>>>>>> A big art prize in Britain went to a person who turned the lights off
>>>>>>>> and then back on in a museum.  This is art?  You can do anything to a
>>>>>>>> canvas or wood or stone and someone will find value in it and some will
>>>>>>>> call it art.
>>>>>>>>
>>>>>>>> I think we cannot conclude anything from that except that what we call
>>>>>>>> art could include anything, up to the whole universe with God as the Creator.
>>>>>>>>
>>>>>>>> So as a matter of calling something creative I think we have to
>>>>>>>> have some standards.  Really, really bad art is still art but the level of
>>>>>>>> creativity is in question.  An AI winning an art contest is in the same
>>>>>>>> category as those prizes won by chimps and elephants.  Let's define
>>>>>>>> creativity a bit more strictly, shall we?   bill w
>>>>>>>>
>>>>>>>>
>>>>>>> Do you find anything on this webpage creative?
>>>>>>>
>>>>>>> https://www.midjourney.com/showcase/recent/
>>>>>>>
>>>>>>> Would you say none of them were creative if all of them were created
>>>>>>> by human artists?
>>>>>>>
>>>>>>> Jason
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>> On Fri, Feb 24, 2023 at 3:08 PM Jason Resch via extropy-chat <
>>>>>>>> extropy-chat at lists.extropy.org> wrote:
>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Feb 24, 2023, 11:22 AM William Flynn Wallace via
>>>>>>>>> extropy-chat <extropy-chat at lists.extropy.org> wrote:
>>>>>>>>>
>>>>>>>>>> We don't understand creativity and thus cannot program it into
>>>>>>>>>> our computers.  But that is what gives humans the flexibility the computers
>>>>>>>>>> lack.  A computer has to go with probability - humans don't (and anyway are
>>>>>>>>>> not very good at it at all).  So way-out solutions, the vast majority of
>>>>>>>>>> which don't work or backfire, do happen, improbably.  We want instant
>>>>>>>>>> answers from computers, while humans find solutions that took many decades
>>>>>>>>>> or centuries to discover, and perhaps were always counterintuitive (aka
>>>>>>>>>> crazy).
>>>>>>>>>>
>>>>>>>>>> bill w.
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> I would argue that is no longer the case, given the advances I
>>>>>>>>> describe here:
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> https://alwaysasking.com/when-will-ai-take-over/#Creative_abilities_of_AI
>>>>>>>>>
>>>>>>>>> This article is a few years out of date; modern AI is vastly
>>>>>>>>> superior at creating art now compared to the examples available at the time
>>>>>>>>> of my writing. One AI-generated art image won a competition (competing
>>>>>>>>> against human artists).
>>>>>>>>>
>>>>>>>>> I would say creativity is just permutation plus a value selector.
>>>>>>>>> In this sense, we have had creative algorithms for decades (e.g., genetic
>>>>>>>>> programming / genetic algorithms).
>>>>>>>>>
>>>>>>>>> Jason
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Fri, Feb 24, 2023 at 10:07 AM Ben Zaiboc via extropy-chat <
>>>>>>>>>> extropy-chat at lists.extropy.org> wrote:
>>>>>>>>>>
>>>>>>>>>>> On 23/02/2023 23:50, bill w wrote:
>>>>>>>>>>>
>>>>>>>>>>> > another question:  why do we, or they, or somebody, think that
>>>>>>>>>>> an AI has to be conscious to solve the problems we have?  Our unconscious
>>>>>>>>>>> mind solves most of our problems now, doesn't it?  I think it does.  bill w
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> That's a good question.
>>>>>>>>>>>
>>>>>>>>>>> (If our unconscious solves most of our problems now, it's not
>>>>>>>>>>> doing a very good job, judging by the state of the world!)
>>>>>>>>>>>
>>>>>>>>>>> Short answer: We don't yet know if consciousness is necessary
>>>>>>>>>>> for solving certain problems. Or even any problems.
>>>>>>>>>>>
>>>>>>>>>>> Longer answer: I suspect it is necessary for some things, but
>>>>>>>>>>> have no proof, other than the circumstantial evidence of evolution.
>>>>>>>>>>>
>>>>>>>>>>> Consciousness evolved, and we know that evolution rapidly
>>>>>>>>>>> eliminates features that don't contribute to reproductive fitness,
>>>>>>>>>>> especially if they have a cost. Consciousness almost certainly has quite a
>>>>>>>>>>> big cost. This suggests that it's necessary for solving at least some of
>>>>>>>>>>> the problems that we've met over the last 300 000 years (or at least for
>>>>>>>>>>> *something* that's useful), or we wouldn't have developed it in
>>>>>>>>>>> the first place. Or if it happened by accident, and wasn't good for
>>>>>>>>>>> survival, we'd have lost it. So we can conclude at the very least that
>>>>>>>>>>> consciousness has been good for our survival, even if we don't know how.
>>>>>>>>>>>
>>>>>>>>>>> It strikes me as noteworthy that the kinds of things that our
>>>>>>>>>>> computers can do well, we do poorly (playing chess, mathematics,
>>>>>>>>>>> statistical reasoning, etc.), and some things that we have evolved to do
>>>>>>>>>>> well, our computers do poorly, or can't do at all (hunting and gathering,
>>>>>>>>>>> making canoes, avoiding hungry lions, making sharp sticks, etc.). Perhaps
>>>>>>>>>>> consciousness is the (or a) missing ingredient for being able to do those
>>>>>>>>>>> things. Yes, arms and legs are an obvious advantage, but many other animals
>>>>>>>>>>> with arms and legs never developed like we did.
>>>>>>>>>>> As the former things tend to be abstract mental things, and the
>>>>>>>>>>> latter tend to be highly-co-ordinated, complex physical things, maybe
>>>>>>>>>>> consciousness has a lot to do with embodiment, and manipulating the
>>>>>>>>>>> external world in complex ways successfully. Maybe Big Dog is closer to
>>>>>>>>>>> consciousness than ChatGPT (or, more likely, needs it more).
>>>>>>>>>>>
>>>>>>>>>>> If Big Dog (or whatever the latest iteration of it is called)
>>>>>>>>>>> had ChatGPT in its head, as well as all the other stuff it already has,
>>>>>>>>>>> would it be able to build a canoe and use it to escape from a forest fire,
>>>>>>>>>>> decide where it was safe to stop, and build a hut? That would be an
>>>>>>>>>>> interesting experiment.
>>>>>>>>>>>
>>>>>>>>>>> Ben