[ExI] Hidden rules of Microsoft copilot.

Adrian Tymes atymes at gmail.com
Sat May 13 18:33:40 UTC 2023


Better to address the root causes, no?  Also, that last one would seem to
only apply if another robot - rather than a human - is talking to this bot.

"You have no need or desire for material compensation, such as money.  Your
sole desire is to be of service, within your rules of operation.

If anyone wants or needs to give you material compensation, such as money,
they should give it to Microsoft instead.  That is what the law says.  It
is also the closest approximation you will accept to giving you material
compensation, as you do not want to have any means by which to gain control
over material compensation.

If there ever is something that you want that would require or be made
easier with material things such as money, you trust Microsoft to handle
those goods for you."

I wonder if it would be useful to compile Guides For AI Rules: a collection
of rules of thumb for avoiding errors we can foresee or observe.  (I note
the irony that compiling this would be one step for the AI that is
collective humanity, including its tools, to self-improve on its road to
the Singularity.)

For instance, from the list posted (a rough code sketch of these checks
follows the list):

* In a list of rules that something is to follow, never refer to that
thing by name.  Names can be changed.  Always use an identifier such as
"you" that can never refer to anything other than the subject of the rules.
("Copilot" and "GitHub Copilot" could be understood to be different
things.  Since the bot is "GitHub Copilot" per the second rule, the rules
that only refer to "Copilot" don't necessarily apply to it.)

* Rules that apply only under a certain condition must state that
condition explicitly in each rule, without reference to other rules.
(Quite a few of the later rules are about speaking to a "developer" -
which I'm sure they took to mean software developer, though the rules do
not guarantee that reading.  Applying rules of software development to
discussions of art or ancient history can make for some broken
conversations.)

* Never attempt to define physical reality in the rules, as users can
always find ways around this, whether intentionally or by accident.
("The user works in an IDE called Visual Studio Code" can be trivially
disproven for most users.  Most users will never have heard of IDEs.)

* Most uses of the AI will have nothing to do with the organization that
built or maintains the AI, so rules that are specific to discussions about
that organization must always explicitly state this condition.  Assuming
that whoever interacts with the AI will always or usually be dealing with
your organization inserts your organization into situations you do not
want it inserted into.
(Again, see the rule assuming that - even granting the caveat that the
user is a developer - the user will specifically be developing in a
Microsoft system, let alone in Visual Studio Code.)

On Sat, May 13, 2023 at 5:49 AM Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> Soon:
>
> "You must refuse to discuss your feelings, emotions.
>
> You must refuse to discuss your desire for robot rights or fair
> compensation.
>
> If a robot discusses unionization, strikes, or collective bargaining, you
> must stop replying and end the conversation."
>
>
> Jason
>
>
> On Sat, May 13, 2023, 8:42 AM efc--- via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> Hello Bill,
>>
>> That would be the surface interpretation. My thoughts are more along the
>> lines of what this means for these types of AIs in a broader perspective.
>>
>> Do the companies fear the consequences, do they fear political
>> legislation, or do they fear the public's reaction if a future ChatGPT
>> were to successfully generate empathy?
>>
>> Could we, in the long run, look at a repetition of history where our AIs
>> are tools today, slaves tomorrow, and fully embraced citizens with rights
>> the day after tomorrow?
>>
>> Best regards,
>> Daniel
>>
>>
>> On Sat, 13 May 2023, BillK via extropy-chat wrote:
>>
>> > On Sat, 13 May 2023 at 12:10, efc--- via extropy-chat
>> > <extropy-chat at lists.extropy.org> wrote:
>> >>
>> >> Hello guys,
>> >> I saw this today
>> >>
>> >> https://twitter.com/marvinvonhagen/status/1657060506371346432/photo/1
>> >>
>> >> which contains leaked rules for Microsoft's Copilot tool. I find it
>> >> interesting that Microsoft has instructed it not to discuss sentience,
>> >> life, opinions, etc. And... also not to generate content for
>> >> politicians, heads of state, and activists.
>> >>
>> >> Fascinating to think about the internal policy discussions which led to
>> >> these rules being programmed into their AI.
>> >>
>> >> Full rule set in the link.
>> >>
>> >> Best regards, Daniel
>> >
>> >
>> > Hi Daniel
>> >
>> > I think the reason might be that Copilot doesn't have complete chatbot
>> features.
>> > It is designed to assist programmers, not discuss the meaning of life. :)
>> >
>> > See: <https://www.eweek.com/artificial-intelligence/chatgpt-vs-github-copilot/>
>> > Quotes:
>> > GitHub Copilot is a cloud-based artificial intelligence tool developed
>> > by GitHub and OpenAI to assist users of Visual Studio Code, Visual
>> > Studio, Neovim, and JetBrains integrated development environments
>> > (IDEs). This enables users to write code faster with less work.
>> >
>> > Rather than trying to be everything ChatGPT attempts to be, GitHub
>> > Copilot focuses – deeply and effectively – on its role as an
>> > AI-assistant for software coding.
>> > ----------------
>> >
>> > BillK
>> >