Technology × Freedom

Is Cybersecurity a Negative Externality?

February 21, 2017 invisiblehand

Bruce Schneier is afraid. He's afraid of the internet-of-things "world-wide robot" that we're creating, and of how vulnerable we all are to cyberattack now that more and more of our belongings are IoT-enabled. And he doesn't believe the free market has a solution:

The market can’t fix this because neither the buyer nor the seller cares. The owners of the webcams and DVRs used in the denial-of-service attacks don’t care. Their devices were cheap to buy, they still work, and they don’t know any of the victims of the attacks. The sellers of those devices don’t care: They’re now selling newer and better models, and the original buyers only cared about price and features. There is no market solution, because the insecurity is what economists call an externality: It’s an effect of the purchasing decision that affects other people. Think of it kind of like invisible pollution.

The specter of "negative externalities" has been raised countless times as an argument against a totally free market. These externalities supposedly arise when producing or selling a product imposes a cost on third parties who were not part of the transaction, a cost that neither the buyer nor the seller bears directly. (Hence his comparison to pollution.)

Schneier's solution is a government regulatory agency with strong, centralized power:

We need government to ensure companies follow good security practices: testing, patching, secure defaults — and we need to be able to hold companies liable when they fail to do these things. We need government to mandate strong personal data protections, and limitations on data collection and use. We need to ensure that responsible security research is legal and well-funded.

One of the roadblocks to establishing this regulatory agency, according to Schneier, is the "historical divide between Washington and Silicon Valley — the mistrust of governments by tech companies and the mistrust of tech companies by governments," which he claims "is dangerous." Apparently Schneier has been hiding under a rock for the past decade, since it is overwhelmingly clear that Washington and Silicon Valley actually have deep ties, and this chummy relationship is the far greater danger to ordinary citizens.

Schneier's implicit trust in government institutions is obvious. He says that "We can imagine wanting to give police the ability to remotely and safely disable a moving car; that would make high-speed chases a thing of the past. But we definitely don’t want hackers to be able to do that." Yet the police and intelligence community's disregard for citizens' rights is on display daily. Schneier wants to protect us from one enemy by handing unprecedented powers to another, far more politicized enemy. Hmm.

A recurring gap in Schneier's logic is his failure to distinguish between society and government, "we" the people vs. "we" the state. For instance, he launches into his call for government regulation by first stating "We have to fix this." Later he writes optimistically, "Our society has tackled bigger problems than this one." So which is it? Will the government fix this problem, or will society? Because, newsflash, they are not the same thing. In fact, their interests are almost always in opposition.

As for the whole negative externality argument, Andrea O'Sullivan at the Mercatus Center addresses it deftly:

Like many who make "market failure" arguments, Schneier believes that the government alone can intervene to fix the problem…. But behind every suspected market failure is usually an existing government failure. Schneier himself says as much when he discusses the many laws that inhibit security research and contribute to smart-device insecurity. In particular, laws like the Digital Millennium Copyright Act (DMCA) and Computer Fraud and Abuse Act (CFAA) penalize computer scientists who try to test or report certain software vulnerabilities. These laws should be amended before we do anything else.

What's more:

Entrepreneurs and researchers are hard at work on new IOT security solutions, because after all, where there is a great social need, there is a great profit opportunity. But if a "Department of Technology Policy" preemptively blocks such research, or requires companies to dedicate resources elsewhere by mandate, these solutions may never be discovered.

Put it this way: if the market isn't producing a solution to the problem, then consumers must not think it's a big problem. If it really is a big problem, consumers will realize it sooner or later, and when they do, the market will fix it. Schneier should have spent more of his energy convincing readers of the problem and less of it trying to sell a "solution."