
The Wall Street Journal, Net Neutrality, and the Devil in the Details

I was a bit stunned to read this article in the Wall Street Journal today about the defenders of Net Neutrality backing down. The story was picked up and parroted by multiple other sources (CIO Insight, and InformationWeek, to name just two) without critical examination.

Some companies, like Microsoft, have openly dropped their active support of Net Neutrality, though the details in the WSJ article don't point to any open violation of it, or activism against it.

But Google? The "Don't Be Evil" company? Not hardly. If you wish to read Google's prompt denial of the claims in the WSJ, check here.

The Journal also spears Lawrence Lessig, Obama advisor and Net Neutrality advocate, saying:

Stanford's Mr. Lessig, for one, has softened his opposition to variable service tiers. At a conference, he argued that carriers won't become kingmakers so long as the faster service at a higher price is available to anyone willing to pay it.

Mr. Lessig also responded quickly, on his blog.

I will let those two speak for themselves, but I am troubled that a news source as valued as the Wall Street Journal can get the details so wrong. In the case of Net Neutrality, the technical details are crucially important.

The Journal article at one point states:

Advocates of network neutrality also claimed that dismantling the rule would be the first step toward distributors gaining control over content, since they could dictate traffic according to fees charged to content providers. The fortunes of a certain Web site, in other words, might depend on how much it could pay network providers, rather than on its popularity.

But the fortunes of websites have *always* depended on how much they pay network providers *for their bandwidth*. This is not a problem for Net Neutrality. Net Neutrality is not intended to bring about equality of outcome. The Tinker's Mind doesn't deserve the same bandwidth as YouTube, and I'm not going to pay for the same bandwidth. 

What Net Neutrality is meant to protect is the end-to-end delivery of content on a best-effort, per-packet basis. So if my tiny website pays for some minor bandwidth (say, a 1.5 Mbps T1 connection), then that's the speed of my on-ramp. If my website suddenly explodes in popularity, that T1 will be horribly burdened trying vainly to deliver its precious content to the information-hungry masses. I get what I pay for. (This is happening on both ends of the pipe--chances are you can pay more for faster residential service through your ISP if you want to.)

At the other end of the spectrum, with something like Google's OpenEdge proposal, according to the Journal:

Google's proposed arrangement with network providers, internally called OpenEdge, would place Google servers directly within the network of the service providers, according to documents reviewed by the Journal. The setup would accelerate Google's service for users.

It certainly would accelerate Google's service for users. From the standpoint of Net Neutrality, this is not a problem, however. The fact that Google can afford it, and I can not, is also not a problem.

What Net Neutrality will prevent is the following scenario: When you request a webpage (be it from Google, or from my tiny website), the packets being delivered to your desktop are switched along all the intermediate pathways (by AT&T, Comcast, or whomever) *without being molested*. Every packet on the network, from end to end, queues up and shoots down the line at its fastest possible speed. Comcast wants the right to hold up your packets in transit to make way for traffic they deem more important. This is a violation of Net Neutrality. When intermediate carriers and providers can decide what types of applications, or packets from certain sources, are given priority at the switch level, they can decide which sites perform better on your desktop. Not based on the bandwidth that you pay for... Not based on the bandwidth the website pays for... But on which content is in the *best interest of the ISP*.
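To make the distinction concrete, here is a minimal sketch (all names and packet shapes are hypothetical, purely for illustration) contrasting a neutral switch, which forwards packets strictly in arrival order, with a non-neutral one, which lets packets from "preferred" sources jump the queue:

```python
from collections import deque

def neutral_forward(packets):
    """Best-effort forwarding: packets leave in the order they arrived."""
    queue = deque(packets)
    return [queue.popleft() for _ in range(len(queue))]

def non_neutral_forward(packets, preferred_sources):
    """Packets from preferred sources jump ahead; everyone else waits."""
    fast = [p for p in packets if p["src"] in preferred_sources]
    slow = [p for p in packets if p["src"] not in preferred_sources]
    return fast + slow

# Hypothetical arrival order at an intermediate switch:
arrivals = [
    {"src": "tinkersmind.example", "data": "page"},
    {"src": "partner.example", "data": "video"},
    {"src": "tinkersmind.example", "data": "image"},
]

print(neutral_forward(arrivals))      # arrival order preserved
print(non_neutral_forward(arrivals, {"partner.example"}))  # partner cuts in line
```

Note that in neither case does the small site's *purchased* bandwidth change; what changes in the second case is the treatment of its packets in transit, which is precisely the behavior at issue.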

This means that, should my little website decide to beef up its bandwidth to better match its demand, it may not matter. If Brighthouse network is your ISP, they may decide that packets from my website don't need to reach you as fast as the packets from their fellow AOL/Time/Warner brethren. This is what Net Neutrality is meant to protect against. (Brighthouse is no longer wholly owned by AOL/Time/Warner, for the record).

So back to Google's OpenEdge technology, which would place frequently accessed content closer to the end user to improve performance. This doesn't violate Net Neutrality in the least. This just means that end-to-end, the data you asked for is closer to you and reaches you faster. There is no hint of technology that would hold up other end-to-end traffic along the way, based on its source, or its content. Google may be able to pay a premium to the ISPs for locating their data close to you, but non-Google data won't be squelched in order to make room for it. In fact, the opposite happens. Instead of traveling halfway across the globe, occupying the lines of transmission along the way, Google traffic to you will only occupy a local spur of the information lanes, leaving more room for data that can't afford to cache itself locally to travel freely around the world.
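The caching idea above can be sketched in a few lines. This is not Google's actual OpenEdge design (the Journal's documents describe only servers placed inside ISP networks); it is a generic edge-cache illustration with assumed hop counts, showing that the speedup comes from shortening the path for cached content, not from delaying anyone else's packets:

```python
# Assumed, illustrative hop counts: a distant origin server vs. a cache
# sitting inside the ISP's own network.
ORIGIN_HOPS = 12
EDGE_HOPS = 2

edge_cache = {}

def fetch(url, origin):
    """Serve from the local edge cache if present; otherwise fetch from
    the origin and store a copy at the edge for the next request."""
    if url in edge_cache:
        return edge_cache[url], EDGE_HOPS
    content = origin[url]
    edge_cache[url] = content
    return content, ORIGIN_HOPS

origin_server = {"/video": "popular video bytes"}

_, hops_first = fetch("/video", origin_server)   # first request travels far
_, hops_second = fetch("/video", origin_server)  # repeat is served locally
print(hops_first, hops_second)  # 12 2
```

After the first request, repeat traffic for the cached content never traverses the long-haul links at all, which is why locally cached traffic frees capacity for everyone else rather than consuming it.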

Like so many other facets of life, equality of opportunity is not equality of outcome. Know which one you are fighting for.

