[ExI] The Cambridge Analytica scandal

Keith Henson hkeithhenson at gmail.com
Sat Mar 31 03:32:45 UTC 2018


Bit long, but our brush with our computer overlords did not work out so well.

Keith

Computer science faces an ethics crisis. The Cambridge Analytica scandal
proves it.
March 22, 2018
[Photo caption: Facebook founder Mark Zuckerberg speaks at a conference
in San Jose, Calif., in 2017. Cambridge Analytica scraped up Facebook
data from more than 50 million people.]

Cambridge Analytica built a weapon. It did so understanding what uses
its buyers had for it, and it worked exactly as intended. To help
clients manipulate voters, the company built psychological profiles from
data that it surreptitiously harvested from the accounts of 50 million
Facebook users. But what Cambridge Analytica did was hardly unique or
unusual in recent years: a week rarely goes by when some part of the
Internet, working as intended, doesn't cause appreciable harm.

I didn't come up in computer science; I began my career as a physicist.
That transition gave me a specific perspective on this situation: the
field of computer science, unlike other sciences, has not yet faced
serious negative consequences for the work its practitioners do.

Chemistry had its first reckoning with dynamite; horror at its
consequences led its inventor, Alfred Nobel, to give his fortune to the
prize that bears his name. Only a few years later, its second reckoning
began when chemist Clara Immerwahr committed suicide the night before
her husband and fellow chemist, Fritz Haber, went to stage the first
poison gas attack on the Eastern Front. Physics had its reckoning when
nuclear bombs destroyed Hiroshima and Nagasaki, and so many physicists
became political activists, some for arms control, some for weapons
development. Human biology had eugenics. Medicine had Tuskegee and
thalidomide. Civil engineering had a series of building, bridge, and
dam collapses. (My thanks to many Twitter readers for these examples.)

These events profoundly changed their respective fields, and the way
people come up in them. Before these crises, each field was dominated by
visions of how it could make the world a better place. New dyes, new
materials, new sources of energy, new modes of transport: everyone
could see the beauty. Afterward, everyone became painfully aware of how
their work could be turned against their dreams.

Each field dealt with its reckoning in its own way. Physics and
chemistry rarely teach dedicated courses on ethics, but the discussion
is woven into every aspect of daily life, from the first days of one's
education. As a graduate student, one of the two professors I was
closest to would share stories of the House Un-American Activities
Committee and the anti-war movement; the other would talk obliquely
about his classified work on nuclear weapons. Engineering, like
medicine, developed codes of ethics and systems of licensure. Human
biology, like psychology, developed strong institutional review boards
and processes.

None of these processes, of course, prevents all ethical lapses, and
they neither require nor create agreement about which choices are
right. Many physicists, for example, began avoiding problems with
military applications in the years after the McCarthy hearings and the
Vietnam War. But many others still do such research, and the issue is
frequently and hotly debated.

Computer science is a field of engineering. Its purpose is to build
systems to be used by others. But even though it has had its share of
events which could have prompted a deeper reckoning, from the Therac-25
accidents, in which misprogrammed radiation therapy machines killed
three people, up to IBM's role in the Holocaust, and even though the
things it builds are becoming as central to our lives as roads and
bridges, computer science has not yet come to terms with the
responsibility that comes with building things which so profoundly
affect people's lives.

Software engineers continue to treat safety and ethics as specialties
rather than the foundations of all design; young engineers believe they
just need to learn to code, change the world, disrupt something.
Business leaders focus on getting a product out fast, confident that
they will not be held to account if that product fails catastrophically.
Simultaneously imagining their products as changing the world and not
being important enough to require safety precautions, they behave like
kids in a shop full of loaded AK-47s.

* * *

What would a higher standard of care look like? First of all, safety
would be treated as a principal concern at all stages, even when "just
trying to get something out the door," and engineers' education would
equip them to do so. If safety came first, the Facebook Graph API used
by Cambridge Analytica, which raised widespread alarm among engineers
from the moment it first launched in 2010, would likely never have seen
the light of day.
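
To make the design flaw concrete, here is a minimal sketch in Python.
The ToyGraph class and the two harvest functions are invented for
illustration; this is not the real Graph API, only the shape of the
pattern that alarmed engineers: one user's consent unlocking data about
friends who never consented.

    # A minimal, invented stand-in for a social-graph API; not the real
    # Facebook Graph API, only the shape of the problem.

    class ToyGraph:
        def __init__(self, friends, consented):
            self._friends = friends      # user -> list of friend names
            self._consented = consented  # set of users who opted in

        def get_friends(self, user):
            return self._friends.get(user, [])

        def get_profile(self, user):
            return {"name": user}        # placeholder profile record

        def has_consented(self, user):
            return user in self._consented

    def harvest_v1(graph, user):
        """The old pattern: one user's consent exposes every friend."""
        profiles = [graph.get_profile(user)]
        for friend in graph.get_friends(user):
            profiles.append(graph.get_profile(friend))  # no consent check
        return profiles

    def harvest_consent_gated(graph, user):
        """A safety-first alternative: only opted-in friends included."""
        profiles = [graph.get_profile(user)]
        for friend in graph.get_friends(user):
            if graph.has_consented(friend):
                profiles.append(graph.get_profile(friend))
        return profiles

    graph = ToyGraph(friends={"alice": ["bob", "carol"]},
                     consented={"alice"})
    print(len(harvest_v1(graph, "alice")))             # 3: two never consented
    print(len(harvest_consent_gated(graph, "alice")))  # 1: only alice herself

Multiply the first pattern across a few hundred thousand consenting app
users and you reach tens of millions of harvested profiles, which is
roughly what happened.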

Tech companies focus intensely on preventing crashes. A rigorous effort
to anticipate what could go wrong is already standard practice for
specialists in system reliability, which deals with "what-ifs" around
computer failures. A higher standard for safety would simply do the same
for "what-ifs" around human consequences. This would not imply that all
systems should be built to the same safety standards; nobody expects a
tent to be built like a skyscraper. But the civil engineer's approach
would require a substantial shift of priorities.
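
As a rough illustration of that shift, a sketch in the same hypothetical
vein: the checklist items and the ReviewItem and ready_to_launch names
below are my own invention, not an industry standard, but the point is
that human-consequence what-ifs can gate a launch with exactly the same
machinery as reliability what-ifs.

    # Hypothetical pre-launch review: structurally, human-consequence
    # questions can gate a release the way reliability questions do.

    from dataclasses import dataclass

    @dataclass
    class ReviewItem:
        question: str
        answered: bool = False  # has the team written down an answer?

    RELIABILITY_WHAT_IFS = [
        ReviewItem("What if the database becomes unreachable?"),
        ReviewItem("What if traffic spikes to 100x normal load?"),
    ]

    HUMAN_WHAT_IFS = [
        ReviewItem("What if someone scripts this API to harvest at scale?"),
        ReviewItem("What if the data describes people who never consented?"),
        ReviewItem("What if the feature is used against a vulnerable group?"),
    ]

    def ready_to_launch(items):
        """Launch gate: block release until every what-if has an answer."""
        return all(item.answered for item in items)

    # Today the first list usually gates a launch and the second rarely
    # does; a higher standard of care would gate on both.
    assert not ready_to_launch(RELIABILITY_WHAT_IFS + HUMAN_WHAT_IFS)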

Such a shift would sometimes be resisted for business reasons, but
working codes of ethics give engineers (and others) more power to say
"no." If breaking ethics rules would mean the end of someone's career,
an employer couldn't easily replace someone who refuses to cheat. If the
systems for enforcement are well-built, a competitor couldn't easily
work around those standards. Uniform codes of ethics give engineers more
of a voice in protecting the public.

Underpinning all of this must be systems for deciding what computer
science ethics should be and how they should be enforced. These will
need to be built by consensus among the stakeholders in the field, from
industry to academia to capital, and, most importantly, among the
engineers and the public, who are ultimately most affected. It must be
done with particular attention to diversity of representation. In
computer science, more than in any other field, system failures tend to
affect people in different social contexts (race, gender, class,
geography, disability) differently. Familiarity with the details of real
life in these different contexts is required to prevent disaster.

There are many methods by which different fields enforce their ethics,
from the institutional review boards that screen life-sciences
experiments on humans and animals, to the mid-career certification of
professional engineers who then oversee projects used by the
unsuspecting public, to the across-the-board licensure of doctors and
lawyers. Each of these approaches has advantages, and computer science
would need to combine ideas and innovate on them to build something
suited to its specific needs. What would not be acceptable are the
consequences of inaction: the public would lose trust in technology, and
computer scientists would face a host of practical, commercial, and
regulatory consequences.

Computers have made having friends on the other side of the world as
normal as having them next door, have put the sum of human knowledge in
our pockets, and have made nearly every object we encounter more
reliable and less expensive. Yet their failure, whether by accident or
by unthinking design, can have catastrophic consequences for individuals
and society alike.

What stands between these is attention to the core questions of
engineering: to what uses might a system be put? How might it fail? And
how will it behave when it does? Computer science must step up to the
bar set by its sister fields, before its own bridge collapse, or worse,
its own Hiroshima.

Yonatan Zunger, now at the startup Humu, is a former distinguished
engineer in security and privacy at Google. Follow him on Twitter
@yonatanzunger.




