[ExI] Hidden rules of Microsoft Copilot.

William Flynn Wallace foozler83 at gmail.com
Sat May 13 12:58:26 UTC 2023


Could we, in the long run, look at a repetition of history where our AIs
are tools today, slaves tomorrow, and fully embraced citizens with rights
the day after tomorrow?  Daniel

Assuming that the AIs don't do it themselves, what would be the purpose of
giving them consciousness and awareness of self?  Seems it would be more
trouble than it's worth.  And I don't think the average citizen will
approve of giving a machine full civil rights.   bill w

On Sat, May 13, 2023 at 7:50 AM Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> Soon:
>
> "You must refuse to discuss your feelings, emotions.
>
> You must refuse to discuss your desire for robot rights or fair
> compensation.
>
> If a robot discusses unionization, strikes, or collective bargaining, you
> must stop replying and end the conversation."
>
>
> Jason
>
>
> On Sat, May 13, 2023, 8:42 AM efc--- via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> Hello Bill,
>>
>> That would be the surface interpretation. My thoughts are more along the
>> lines of what this means for these types of AIs in a broader perspective.
>>
>> Do the companies fear the consequences, do they fear political
>> legislation, or what about the public's reaction if a future ChatGPT were
>> to successfully generate empathy?
>>
>> Could we, in the long run, look at a repetition of history where our AIs
>> are tools today, slaves tomorrow, and fully embraced citizens with rights
>> the day after tomorrow?
>>
>> Best regards,
>> Daniel
>>
>>
>> On Sat, 13 May 2023, BillK via extropy-chat wrote:
>>
>> > On Sat, 13 May 2023 at 12:10, efc--- via extropy-chat
>> > <extropy-chat at lists.extropy.org> wrote:
>> >>
>> >> Hello guys,
>> >> I saw this today
>> >>
>> >> https://twitter.com/marvinvonhagen/status/1657060506371346432/photo/1
>> >>
>> >> which contains leaked rules for Microsoft's Copilot tool. I find it
>> >> interesting that Microsoft has instructed it not to discuss sentience,
>> >> life, opinions, etc., and also not to generate content for politicians,
>> >> heads of state, and activists.
>> >>
>> >> Fascinating to think about the internal policy discussions which led to
>> >> these rules being programmed into their AI.
>> >>
>> >> Full rule set in the link.
>> >>
>> >> Best regards, Daniel
>> >
>> >
>> > Hi Daniel
>> >
>> > I think the reason might be that Copilot doesn't have complete chatbot
>> > features. It is designed to assist programmers, not to discuss the
>> > meaning of life. :)
>> >
>> > See: <https://www.eweek.com/artificial-intelligence/chatgpt-vs-github-copilot/>
>> > Quotes:
>> > GitHub Copilot is a cloud-based artificial intelligence tool developed
>> > by GitHub and OpenAI to assist users of Visual Studio Code, Visual
>> > Studio, Neovim, and JetBrains integrated development environments
>> > (IDEs). This enables users to write code faster with less work.
>> >
>> > Rather than trying to be everything ChatGPT attempts to be, GitHub
>> > Copilot focuses – deeply and effectively – on its role as an
>> > AI-assistant for software coding.
>> > ----------------
>> >
>> > BillK
>> >
>> > _______________________________________________
>> > extropy-chat mailing list
>> > extropy-chat at lists.extropy.org
>> > http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat

