There are a lot of things we take for granted when it comes to the Internet. When you log on to your computer and boot up your browser, you don’t think twice about whether you will be able to access any site you want, for any service you want, at the speed you expect. Even with the proliferation of mobile devices such as smartphones and tablets, we are able to browse, tweet, read, watch and download just as easily as if we were on a desktop. How is this possible?
Network Neutrality is the guiding principle that preserves the free and open Internet – it holds that no bit of information should be prioritized over another. It is the difference between offering a useful, efficient network to the public and offering targeted network access to specific groups. Net Neutrality means Internet service providers may not discriminate between different kinds of content and applications online, and it is the reason every website we visit works with the same speed and quality. An open Internet promotes competition and enables investment and innovation, making it possible for anyone, anywhere to easily launch innovative applications and services.
Without Net Neutrality, the result would be a tiered Internet. Network providers could choose to discriminate, deciding how fast data is transmitted and at what quality, and giving certain content providers an advantage over others.
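To make the idea of tiering concrete, here is a minimal sketch of strict priority queuing, the basic mechanism a tiered network would rely on. Everything in it – the tier numbers, the packet names, the "premium partner" label – is a hypothetical illustration, not anything described in the FCC ruling:

```python
import heapq

def drain(queue):
    """Serve queued packets strictly by tier first, then by arrival order."""
    served = []
    while queue:
        _tier, _order, name = heapq.heappop(queue)
        served.append(name)
    return served

# Hypothetical traffic: tier 0 is a paid "fast lane", tier 1 is everyone else.
packets = []
arrivals = [
    ("small-startup-video", 1),    # arrives first, but non-premium
    ("premium-partner-video", 0),  # arrives second, premium tier
    ("blog-page", 1),
]
for order, (name, tier) in enumerate(arrivals):
    heapq.heappush(packets, (tier, order, name))

print(drain(packets))
# → ['premium-partner-video', 'small-startup-video', 'blog-page']
```

Even though the premium packet arrived second, it is served first; under load, lower-tier traffic simply waits. That reordering is the "advantage over others" a tiered Internet would create.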
In September 2011, the FCC issued an official ruling on Net Neutrality, “Preserving the Open Internet.” Network management plays a huge role in Net Neutrality, as it is essentially how service providers run their business. In this upcoming series, we’ll explore the relationship between network management and Net Neutrality, and how the ruling has fared more than a year later.
Network Management and Service Providers
Network management allows service providers to monitor bandwidth utilization, traffic analysis and key performance indicators (KPIs) at every layer of the network, and to obtain end-to-end performance data relevant to the user experience: knowing how a network reacts under pressure through capacity testing, resolving issues before they turn into service interruptions, and fixing problems rather than merely diagnosing them.
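The kind of KPI work described above can be sketched in a few lines. This is an illustrative example only – the metric names, sample values and thresholds are assumptions for the sake of the sketch, not anything prescribed by the FCC or by any particular vendor's tools:

```python
def percentile(samples, pct):
    """Nearest-rank percentile of a list of measurements."""
    ordered = sorted(samples)
    rank = max(0, int(round(pct / 100.0 * len(ordered))) - 1)
    return ordered[rank]

def link_utilization(bits_transferred, seconds, capacity_bps):
    """Fraction of link capacity consumed over a measurement interval."""
    return (bits_transferred / seconds) / capacity_bps

# Hypothetical round-trip latency samples (milliseconds) and link counters.
latencies_ms = [12, 15, 11, 240, 14, 13, 16, 12, 15, 300]
p95 = percentile(latencies_ms, 95)
util = link_utilization(bits_transferred=45e9, seconds=60,
                        capacity_bps=1e9)  # assumed 1 Gbps link

# Flag the link when tail latency or utilization crosses a threshold,
# so a problem can be caught before it becomes a service interruption.
congested = p95 > 100 or util > 0.8
print(p95, util, congested)
# → 300 0.75 True
```

The point of tracking tail percentiles rather than averages is exactly the "network under pressure" concern: a link can look fine on average while a slice of users sees multi-hundred-millisecond delays.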
“In some corners, current discussions surrounding the issue of Net Neutrality have alarming overtones, with people concerned about the possibility of the government stepping in to regulate the Internet,” explained Eric Wegner, business development manager at Zoho Corporation, in a blog post. “This can be seen as a free market versus regulation issue that could hinder the ability of service providers to tunnel content to users and tier services for the pay to play. In short, many people think that regulation will cause the Internet more harm than good. Part of the problem is that terms haven’t really been fully defined, leading to general confusion on many points. Or should it be defined at all?”
Defining Reasonable Network Management
In the ruling, the FCC makes its rules subject to “reasonable network management.” What defines “reasonable”? According to the FCC, a network management practice is reasonable if it is appropriate and tailored to achieving a legitimate network management purpose, taking into account the particular network architecture and technology of the broadband Internet access service.
According to the NPRM (Notice of Proposed Rulemaking):
Reasonable network management consists of: (a) reasonable practices employed by a provider of broadband Internet access to (i) reduce or mitigate the effects of congestion on its network or to address quality-of-service concerns; (ii) address traffic that is unwanted by users or harmful; (iii) prevent the transfer of unlawful content; or (iv) prevent the unlawful transfer of content; and (b) other reasonable network management practices.
Ed Felten, a professor of computer science and public affairs at Princeton University, helped dig a little deeper into what that means.
“’Reasonable’ is hard to define because in real life every ‘network management’ measure will have tradeoffs,” Felten said in a blog post. “Of course, declaring a vague standard rather than a bright-line rule can sometimes be good policy, especially where the facts on the ground are changing rapidly and it’s hard to predict what kind of details might turn out to be important in a dispute. Still, by choosing a case-by-case approach, the FCC is leaving us mostly in the dark about where it will draw the line between ‘reasonable’ and ‘unreasonable’.”
Want to learn more about the latest in communications and technology? Then be sure to attend ITEXPO Miami 2013, Jan. 29 to Feb. 1 in Miami, Florida. Stay in touch with everything happening at ITEXPO. Follow us on Twitter.
Edited by Jamie Epstein