Fixing the internet


Science  23 Nov 2018:
Vol. 362, Issue 6417, pp. 871
DOI: 10.1126/science.aaw0798

Data breaches at Facebook and Google—and, along with Amazon, those firms' online dominance—crest a growing wave of anxiety about the internet's evolving structure and its impact on humanity. Three keys to the decades-long global expansion of the internet and the World Wide Web are breaking down.

The first key is the “procrastination principle,” a propensity to “set it and forget it” without attempting to predict and avert every imaginable problem. The networks' framers established a set of simple and freely available protocols for communicating over the internet, then stepped back to let competitive markets and cooperative pursuits work their magic.

The second key is the networks' layered architecture. For the internet, this meant that people could concern themselves with, say, writing applications to read and send email without having to know anything about what happens “below,” such as how bits find their way from sender to recipient. By the same token, those rolling out physical infrastructure didn't need to know or predict anything about how it would be used by the applications “above.”
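The separation of concerns described above can be sketched in a few lines of code. This is a toy model, not real networking: each invented class stands in for one layer, and each layer exposes a narrow interface while hiding its internals from the layers above and below.

```python
# Toy sketch of layered architecture (illustrative names, not real protocols).

class PhysicalLayer:
    """Moves raw bytes; knows nothing about what they mean."""
    def transmit(self, payload: bytes) -> bytes:
        return payload  # in reality: signals on a wire or radio

class TransportLayer:
    """Gets data from sender to recipient; knows nothing about applications."""
    def __init__(self, physical: PhysicalLayer):
        self.physical = physical

    def send(self, data: bytes) -> bytes:
        # Framing is a transport detail the application never sees.
        framed = len(data).to_bytes(4, "big") + data
        return self.physical.transmit(framed)

class EmailApp:
    """Application layer: composes mail without knowing how bits travel."""
    def __init__(self, transport: TransportLayer):
        self.transport = transport

    def send_mail(self, message: str) -> bytes:
        return self.transport.send(message.encode("utf-8"))

app = EmailApp(TransportLayer(PhysicalLayer()))
wire_bytes = app.send_mail("hello")
```

The email author touches only `send_mail`; someone replacing the physical layer (copper with fiber, say) touches only `transmit`. Neither needs to know the other exists, which is exactly what let applications and infrastructure evolve independently.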



The third key flows from the first two: decentralization. The internet and the web were designed not to create new gatekeepers, in part because regulatory bodies had little awareness of these protocols, let alone a hand in structuring them. A website hosted in Romania would still be just a click away for a user in Canada, without authorization by some centralized party.

Today, the principles of layers and decentralization are badly fraying, which risks transforming the principle of procrastination into one of abdication.

First, the issue of centralization. Surfing the web can now mean simply jumping among Amazon Web Services' hosting servers. If such a major network of servers—or one of the top domain name resolution providers—were to stop working, whole swaths of the internet would go down with it.

Second, formerly separate layers of the internet's architecture are blurring. The runaway success of a few startups has created new, proprietized one-stop platforms. Many people are not really using the web at all, but rather flitting among a small handful of totalizing apps like Facebook and Google. And those application-layer providers have dabbled in providing physical-layer internet access. Facebook's Free Basics program has been one of several experiments that use broadband data cap exceptions to promote some sites and services over others.

What to do? Columbia University law professor Tim Wu has called upon regulators to break up giants like Facebook, but more subtle interventions should be tried first. Web inventor Tim Berners-Lee's Contract for the Web offers a set of principles for governments, companies, and individuals, focusing on internet accessibility, user privacy, and a form of “re-decentralization” to revitalize one key to the network's success. On the technical side, he has launched Solid, a “relayerizing” separation of data from application: Users can maintain their own data (whether in a server in their living room or in the hands of a trusted proxy), and application providers would have to negotiate access rather than hoard the data themselves. And as Yale University law professor Jack Balkin and I have argued, those firms that do leverage users' data should be “information fiduciaries,” obliged to use what they learn in ways that reflect a loyalty to users' interests. These interventions represent meaningful action, while procrastinating a bit longer on the stronger medicine of forced corporate breakup.
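The data-application separation behind Solid can be sketched as follows. This is a hedged illustration of the idea only; the class and method names are invented for this example and are not Solid's actual API.

```python
# Illustrative sketch of user-held data with negotiated access
# (names are hypothetical, not Solid's real interfaces).

class PersonalDataPod:
    """User-controlled store: a server at home, or a trusted proxy."""
    def __init__(self):
        self._data = {}
        self._grants = set()  # (app_id, key) pairs the user has approved

    def put(self, key, value):
        self._data[key] = value

    def grant(self, app_id, key):
        """The user, not the application, decides who may read what."""
        self._grants.add((app_id, key))

    def read(self, app_id, key):
        if (app_id, key) not in self._grants:
            raise PermissionError(f"{app_id} has no grant for {key}")
        return self._data[key]

pod = PersonalDataPod()
pod.put("contacts", ["alice", "bob"])
pod.grant("photo-app", "contacts")
```

Here the application holds a revocable grant rather than a copy of the data, so switching or shutting out a provider does not mean surrendering what it learned about you.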

The internet was designed to be resilient and flexible, without need for drastic intervention. But its trends toward centralization and the exploitation of its users call for action.
