[ExI] The Battle for Power on the Internet

Eugen Leitl eugen at leitl.org
Mon Oct 28 16:03:12 UTC 2013


The Battle for Power on the Internet

Distributed citizen groups and nimble hackers once had the edge. Now
governments and corporations are catching up. Who will dominate in the
decades ahead?

BRUCE SCHNEIER OCT 24 2013, 7:07 AM ET

We’re in the middle of an epic battle for power in cyberspace. On one side
are the traditional, organized, institutional powers such as governments and
large multinational corporations. On the other are the distributed and
nimble: grassroots movements, dissident groups, hackers, and criminals.
Initially, the Internet empowered the second side. It gave them a place to
coordinate and communicate efficiently, and made them seem unbeatable. But
now, the more traditional institutional powers are winning, and winning big.
How these two sides fare in the long term, and the fate of the rest of us who
don’t fall into either group, is an open question—and one vitally important
to the future of the Internet.

In the Internet’s early days, there was a lot of talk about its “natural
laws”—how it would upend traditional power blocks, empower the masses, and
spread freedom throughout the world. The international nature of the Internet
circumvented national laws. Anonymity was easy. Censorship was
impossible. Police were clueless about cybercrime. And bigger changes seemed
inevitable. Digital cash would undermine national sovereignty. Citizen
journalism would topple traditional media, corporate PR, and political
parties. Easy digital copying would destroy the traditional movie and music
industries. Web marketing would allow even the smallest companies to compete
against corporate giants. It really would be a new world order.

This was a utopian vision, but some of it did come to pass. Internet
marketing has transformed commerce. The entertainment industries have been
transformed by things like MySpace and YouTube, and are now more open to
outsiders. Mass media has changed dramatically, and some of the most
influential people in the media have come from the blogging world. There are
new ways to organize politically and run elections. Crowdfunding has made
tens of thousands of projects possible to finance, and crowdsourcing has made
more types of projects possible. Facebook and Twitter really did help topple
governments.

But that is just one side of the Internet’s disruptive character. The
Internet has emboldened traditional power as well.

On the corporate side, power is consolidating, a result of two current trends
in computing. First, the rise of cloud computing means that we no longer have
control of our data. Our e-mail, photos, calendars, address books, messages,
and documents are on servers belonging to Google, Apple, Microsoft, Facebook,
and so on. And second, we are increasingly accessing our data using devices
that we have much less control over: iPhones, iPads, Android phones, Kindles,
ChromeBooks, and so on. Unlike traditional operating systems, those devices
are controlled much more tightly by the vendors, who limit what software can
run, what they can do, how they’re updated, and so on. Even Windows 8 and
Apple’s Mountain Lion operating system are heading in the direction of more
vendor control.

I have previously characterized this model of computing as “feudal.” Users
pledge their allegiance to more powerful companies who, in turn, promise to
protect them from both sysadmin duties and security threats. It’s a metaphor
that’s rich in history and in fiction, and a model that’s increasingly
permeating computing today.

Medieval feudalism was a hierarchical political system, with obligations in
both directions. Lords offered protection, and vassals offered service. The
lord-peasant relationship was similar, with a much greater power
differential. It was a response to a dangerous world.

Feudal security consolidates power in the hands of the few. Internet
companies, like lords before them, act in their own self-interest. They use
their relationship with us to increase their profits, sometimes at our
expense. They act arbitrarily. They make mistakes. They’re deliberately—and
incidentally—changing social norms. Medieval feudalism gave the lords vast
powers over the landless peasants; we’re seeing the same thing on the
Internet.

It’s not all bad, of course. We, especially those of us who are not
technical, like the convenience, redundancy, portability, automation, and
shareability of vendor-managed devices. We like cloud backup. We like
automatic updates. We like not having to deal with security ourselves. We
like that Facebook just works—from any device, anywhere.

Government power is also increasing on the Internet. There is more government
surveillance than ever before. There is more government censorship than ever
before. There is more government propaganda, and an increasing number of
governments are controlling what their users can and cannot do on the
Internet. Totalitarian governments are embracing a growing “cyber
sovereignty” movement to further consolidate their power. And the cyberwar
arms race is on, pumping an enormous amount of money into cyber-weapons and
consolidated cyber-defenses, further increasing government power.

In many cases, the interests of corporate and government powers are aligning.
Both corporations and governments benefit from ubiquitous surveillance, and
the NSA is using Google, Facebook, Verizon, and others to get access to data
it couldn’t otherwise. The entertainment industry is looking to governments
to enforce its antiquated business models. Commercial security equipment from
companies like Blue Coat and Sophos is being used by oppressive governments to
surveil and censor their citizens. The same facial recognition technology
that Disney uses in its theme parks can also identify protesters in China and
Occupy Wall Street activists in New York. Think of it as a public/private
surveillance partnership.

What happened? How, in those early Internet years, did we get the future so
wrong?

The truth is that technology magnifies power in general, but rates of
adoption are different. The unorganized, the distributed, the marginal, the
dissidents, the powerless, the criminal: They can make use of new
technologies very quickly. And when those groups discovered the Internet,
suddenly they had power. But later, when the already-powerful big
institutions finally figured out how to harness the Internet, they had more
power to magnify. That’s the difference: The distributed were more nimble and
were faster to make use of their new power, while the institutional were
slower but were able to use their power more effectively.

So while the Syrian dissidents used Facebook to organize, the Syrian
government used Facebook to identify dissidents to arrest.

All isn’t lost for distributed power, though. For institutional power, the
Internet is a change in degree, but for distributed power it’s a qualitative
one. The Internet gives decentralized groups—for the first time—the ability
to coordinate. This can have incredible ramifications, as we saw in the
SOPA/PIPA debate, Gezi, Brazil, and the rising use of crowdfunding. It can
invert power dynamics, even in the presence of surveillance, censorship, and
use control. But aside from political coordination, the Internet allows for
social coordination as well, uniting, for example, ethnic diasporas, gender
minorities, sufferers of rare diseases, and people with obscure interests.

This isn’t static: Technological advances continue to provide advantage to
the nimble. I discussed this trend in my book Liars and Outliers. If you
think of security as an arms race between attackers and defenders, any
technological advance gives one side or the other a temporary advantage. But
most of the time, a new technology benefits the nimble first. They are not
hindered by bureaucracy—and sometimes not by laws or ethics either. They can
evolve faster.

We saw it with the Internet. As soon as the Internet started being used for
commerce, a new breed of cybercriminal emerged, immediately able to take
advantage of the new technology. It took police a decade to catch up. And we
saw it on social media, as political dissidents made use of its
organizational powers before totalitarian regimes did.

This delay is what I call a “security gap.” It’s greater when there’s more
technology, and in times of rapid technological change. Basically, if there
are more innovations to exploit, there will be more damage resulting from
society's inability to keep up with exploiters of all of them. And since our
world is one in which there’s more technology than ever before, and a faster
rate of technological change than ever before, we should expect to see a
greater security gap than ever before. In other words, there will be an
increasing time period during which nimble distributed powers can make use of
new technologies before slow institutional powers can make better use of
those technologies.

This is the battle: quick vs. strong. To return to medieval metaphors, you
can think of a nimble distributed power—whether marginal, dissident, or
criminal—as Robin Hood; and ponderous institutional powers—both government
and corporate—as the feudal lords.

So who wins? Which type of power dominates in the coming decades?

Right now, it looks like traditional power. Ubiquitous surveillance means
that it’s easier for the government to identify dissidents than it is for the
dissidents to remain anonymous. Data monitoring means it’s easier for the Great
Firewall of China to block data than it is for people to circumvent it. The
way we all use the Internet makes it much easier for the NSA to spy on
everyone than it is for anyone to maintain privacy. And even though it is
easy to circumvent digital copy protection, most users still can’t do it.

The problem is that leveraging Internet power requires technical expertise.
Those with sufficient ability will be able to stay ahead of institutional
powers. Whether it’s setting up your own e-mail server, effectively using
encryption and anonymity tools, or breaking copy protection, there will
always be technologies that can evade institutional powers. This is why
cybercrime is still pervasive, even as police savvy increases; why
technically capable whistleblowers can do so much damage; and why
organizations like Anonymous are still a viable social and political force.
Assuming technology continues to advance—and there’s no reason to believe it
won’t—there will always be a security gap in which technically advanced Robin
Hoods can operate.
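
The tools this paragraph gestures at are within reach of individuals, not
just institutions. As a rough illustration of what “using encryption” can
look like in practice, here is a minimal Python sketch using the third-party
cryptography package (my choice of library, not something the article
specifies); it shows symmetric encryption only, not key exchange or
anonymity:

    # Minimal sketch of end-user encryption with the third-party
    # "cryptography" package (pip install cryptography). The message is
    # made up; real use also needs secure key distribution and storage.
    from cryptography.fernet import Fernet

    # Generate a secret key; keeping it secret is the hard part in practice.
    key = Fernet.generate_key()
    cipher = Fernet(key)

    # The ciphertext is safe to store or send; only the key holder can read it.
    token = cipher.encrypt(b"meet at the usual place")
    print(cipher.decrypt(token))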

Most people, though, are stuck in the middle. These are people who don’t have
the technical ability to evade the large governments and corporations, avoid
the criminal and hacker groups who prey on us, or join any resistance or
dissident movements. These are the people who accept
default configuration options, arbitrary terms of service, NSA-installed back
doors, and the occasional complete loss of their data. These are the people
who get increasingly isolated as government and corporate power align. In the
feudal world, these are the hapless peasants. And it’s even worse when the
feudal lords—or any powers—fight each other. As anyone watching Game of
Thrones knows, peasants get trampled when powers fight: when Facebook,
Google, Apple, and Amazon fight it out in the market; when the U.S., EU,
China, and Russia fight it out in geopolitics; or when it’s the U.S. vs. “the
terrorists” or China vs. its dissidents.

The abuse will only get worse as technology continues to advance. In the
battle between institutional power and distributed power, more technology
means more damage. We’ve already seen this: Cybercriminals can rob more
people more quickly than criminals who have to physically visit everyone they
rob. Digital pirates can make more copies of more things much more quickly
than their analog forebears. And we’ll see it in the future: 3D printers mean
that the computer restriction debate will soon involve guns, not movies. Big
data will mean that more companies will be able to identify and track you
more easily. It’s the same problem as the “weapons of mass destruction” fear:
terrorists with nuclear or biological weapons can do a lot more damage than
terrorists with conventional explosives. And by the same token, terrorists
with large-scale cyberweapons can potentially do more damage than terrorists
with those same bombs.

It’s a numbers game. Very broadly, because of the way humans behave as a
species and as a society,
every society is going to have a certain amount of crime. And there’s a
particular crime rate society is willing to tolerate. With historically
inefficient criminals, we were willing to live with some percentage of
criminals in our society. As technology makes each individual criminal more
powerful, the percentage we can tolerate decreases. Again, remember the
“weapons of mass destruction” debate: As the amount of damage each individual
terrorist can do increases, we need to do increasingly more to prevent even a
single terrorist from succeeding.
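
To make the numbers-game arithmetic concrete (the figures below are invented
for illustration and do not come from the article): if the total damage a
society is willing to absorb stays roughly fixed, the number of bad actors it
can tolerate falls in direct proportion as the damage each one can inflict
rises.

    # Illustrative only: invented figures showing tolerance shrinking as
    # per-actor damage grows, holding total acceptable damage fixed.
    TOTAL_ACCEPTABLE_DAMAGE = 1_000_000  # arbitrary units per year

    for damage_per_actor in (10, 1_000, 100_000):
        tolerable = TOTAL_ACCEPTABLE_DAMAGE / damage_per_actor
        print(f"damage per actor {damage_per_actor:>7} -> "
              f"tolerable actors ~{tolerable:,.0f}")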

The more destabilizing the technologies, the greater the rhetoric of fear,
and the stronger institutional powers will get. This means increasingly
repressive security measures, even if the security gap means that such
measures become increasingly ineffective. And it will squeeze the peasants in
the middle even more.

Without the protection of his own feudal lord, the peasant was subject to
abuse both by criminals and other feudal lords. But both corporations and the
government—and often the two in cahoots—are using their power to their own
advantage, trampling on our rights in the process. And without the technical
savvy to become Robin Hoods ourselves, we have no recourse but to submit to
whatever the ruling institutional power wants.

So what happens as technology increases? Is a police state the only effective
way to control distributed power and keep our society safe? Or do the fringe
elements inevitably destroy society as technology increases their power?
Probably neither doomsday scenario will come to pass, but figuring out a
stable middle ground is hard. These questions are complicated, and dependent
on future technological advances that we cannot predict. But they are
primarily political questions, and any solutions will be political.

In the short term, we need more transparency and oversight. The more we know
of what institutional powers are doing, the more we can trust that they are
not abusing their authority. We have long known this to be true in
government, but we have increasingly ignored it in our fear of terrorism and
other modern threats. This is also true for corporate power. Unfortunately,
market dynamics will not necessarily force corporations to be transparent; we
need laws to do that. The same is true for decentralized power; transparency
is how we’ll differentiate political dissidents from criminal organizations.

Oversight is also critically important, and is another long-understood
mechanism for checking power. This can be a combination of things: courts
that act as third-party advocates for the rule of law rather than
rubber-stamp organizations, legislatures that understand the technologies and
how they affect power balances, and vibrant public-sector press and watchdog
groups that analyze and debate the actions of those wielding power.

Transparency and oversight give us the confidence to trust institutional
powers to fight the bad side of distributed power, while still allowing the
good side to flourish. For if we’re going to entrust our security to
institutional powers, we need to know they will act in our interests and not
abuse that power. Otherwise, democracy fails.

In the longer term, we need to work to reduce power differences. The key to
all of this is access to data. On the Internet, data is power. To the extent
the powerless have access to it, they gain in power. To the extent that the
already powerful have access to it, they further consolidate their power. As
we look to reducing power imbalances, we have to look at data: data privacy
for individuals, mandatory disclosure laws for corporations, and open
government laws.

Medieval feudalism evolved into a more balanced relationship in which lords
had responsibilities as well as rights. Today’s Internet feudalism is both
ad-hoc and one-sided. Those in power have a lot of rights, but increasingly
few responsibilities or limits. We need to rebalance this relationship. In
medieval Europe, the rise of the centralized state and the rule of law
provided the stability that feudalism lacked. The Magna Carta first forced
responsibilities on governments and put humans on the long road toward
government by the people and for the people. In addition to re-reining in
government power, we need similar restrictions on corporate power: a new
Magna Carta focused on the institutions that abuse power in the 21st century.

Today’s Internet is a fortuitous accident: a combination of an initial lack
of commercial interests, government benign neglect, military requirements for
survivability and resilience, and computer engineers building open systems
that worked simply and easily. Corporations have turned the Internet into an
enormous revenue generator, and they’re not going to back down easily.
Neither will governments, which have harnessed the Internet for political
control.

We’re at the beginning of some critical debates about the future of the
Internet: the proper role of law enforcement, the character of ubiquitous
surveillance, the collection and retention of our entire life’s history, how
automatic algorithms should judge us, government control over the Internet,
cyberwar rules of engagement, national sovereignty on the Internet,
limitations on the power of corporations over our data, the ramifications of
information consumerism, and so on.

Data is the pollution problem of the information age. All computer processes
produce it. It stays around. How we deal with it—how we reuse and recycle it,
who has access to it, how we dispose of it, and what laws regulate it—is
central to how the information age functions. And I believe that just as we
look back at the early decades of the industrial age and wonder how society
could ignore pollution in its rush to build an industrial world, our
grandchildren will look back at us during these early decades of the
information age and judge us on how we dealt with the rebalancing of power
resulting from all this new data.

This won’t be an easy period for us as we try to work these issues out.
Historically, no shift in power has ever been easy. Corporations have turned
our personal data into an enormous revenue generator, and they’re not going
to back down. Neither will governments, who have harnessed that same data for
their own purposes. But we have a duty to tackle this problem.

I can’t tell you what the result will be. These are all complicated issues,
and require meaningful debate, international cooperation, and innovative
solutions. We need to decide on the proper balance between institutional and
decentralized power, and how to build tools that amplify what is good in each
while suppressing the bad.


