Network Neutrality, Victory or Disappointment? | Part 1

A January 14th ruling from the United States Federal Court of Appeals has stirred the pot once again

Although the net neutrality debate has been going on for years, a January 14th ruling from the United States Federal Court of Appeals has stirred the pot once again. The court's decision has created a renewed upsurge in comments, opinions and future-gazing, with the debate landing squarely in two very different camps. And, as is to be expected, there is actually very little neutrality.

One is left to ask if it is in fact possible to look at this topic objectively, without taking sides from the outset. Perhaps the passage of time has helped to put the topic in perspective. It may be that the Internet itself, which plays such a central role in our daily lives, has achieved a sort of self-defining momentum that will in due course make some of the net neutrality debate academic.

In this blog and follow-up posts, I'll try to keep the discussion of the recent court decision short and to the point. The actual decision document is here for you to read if you have a couple of hours to spare, and it's worth reading closely to get the true sense of what this decision is all about. Strangely enough, it's not really about net neutrality at all.

The Appeals Court decision is really all about the FCC's Open Internet Order. Essentially, it tears down the Open Internet Order's rules that prohibit Internet service providers (ISPs) from site blocking and from providing preferential service to chosen edge providers. It does not overrule the transparency requirement, which says that ISPs must disclose their traffic management policies. If this federal court decision stands, it essentially means ISPs are allowed to block sites and provide preferential service to some edge providers, but if they do so, they must tell us all what they are doing.

The basis for this decision is important. The judges did not closely examine the pros and cons of Internet openness because that was not what the case was about. The complaint against the FCC is that it overstepped its jurisdiction and that it was not in fact legally entitled to make these rulings. The judges for the most part agreed with the complaint and struck down two significant rules that, in the eyes of the FCC, sought to preserve the "continued freedom and openness of the Internet."

The Wall Street Journal proclaims this decision as a "Victory for the Unfettered Internet." The New York Times, in contrast, describes this as a "Disappointing Internet Decision" on the grounds that it "could undermine the open nature of the Internet." Most vocal opinions are divided along these lines. They are all reading the same decision, but one group believes this will make the Internet more unfettered and open while the other believes the opposite.

Advocates on each side assert that they uphold the principle of an open and unfettered Internet, but their interpretations of what "open and unfettered" means in practice lead (or drive) them to conflicting conclusions. Since different takes on this concept help drive the debate, let's look at those perspectives to see what light they cast on the outcome.

Some people regard "open and unfettered" as meaning that governments should leave the Internet alone. That means no government censorship, no blocking of sites and no monitoring of user activity. The traffic must flow unimpeded. Most participants in the U.S. debate would agree on this, so perhaps some meeting of the minds is possible? Not likely, because there is also an opinion that "open and unfettered" means no government regulation either. That means no control of pricing, no rules that specify in any way how ISPs deliver their parts of this immense global cooperative enterprise, and certainly no treating Internet access like a phone service.

To others, "open and unfettered" means that the corporations that provide Internet services should themselves play by these rules. In other words, they too, just like governments, should refrain from censorship, blocking and tracking what users do (at least not without each user's consent). If those companies do not allow traffic to flow unimpeded, then the Internet is in reality not open and unfettered.

Let's be clear that not everybody views a completely open and unfettered Internet as a good idea. Various governments around the world limit Internet access with various forms of site blocking, censorship, user tracking and traffic interception. Presumably the officials and politicians responsible for this believe that their individual varieties of fettering are a good thing, overall.

We also know that some Internet service providers engage in, or have engaged in, site blocking, port blocking and scrutiny of user activity, again presumably because decision makers in those companies and organizations see benefits to doing so.

Where do the various parties fall in the spectrum as a result of the recent ruling? And how will it impact the existing system? I'll get into that in Part 2. In the meantime, check out our other thoughts on the latest technology trends for the coming year.

More Stories By Esmeralda Swartz

Esmeralda Swartz is VP, Marketing Enterprise and Cloud, BUSS. She has spent 15 years as a marketing, product management, and business development technology executive bringing disruptive technologies and companies to market. Esmeralda was CMO of MetraTech, now part of Ericsson. At MetraTech, Esmeralda was responsible for go-to-market strategy and execution for enterprise and SaaS products, product management, business development and partner programs. Prior to MetraTech, Esmeralda was co-founder, Vice President of Marketing and Business Development at Lightwolf Technologies, a big data management startup. She was previously co-founder and Senior Vice President of Marketing and Business Development of Soapstone Networks, a developer of resource and service control software, now part of Extreme Networks.
