
Network Neutrality, Victory or Disappointment? | Part 2

This decision looks likely to set the pattern for ISP regulation in the U.S. for some time to come

In my last blog, I discussed the debate surrounding the true definition of an open and unfettered Internet, and the different interpretations various groups and individuals bring to the matter. In this post, we continue the discussion.

The U.S. government has not enacted legislation that actually defines and requires Internet openness, or that specifies what level of non-openness is acceptable, if any. Openness is therefore a concept without legal definition or backing, which means opinions vary on what constitutes fettering and what doesn't. The FCC's attempt to fill this gap with its Open Internet Order is what triggered this court action.

The debate, as currently framed, offers two alternatives: Do ISPs have the right to manage Internet traffic preferentially, thus by most definitions fettering that traffic? Or do governments have the right to prevent ISPs from managing Internet traffic preferentially, thus clearly fettering the Internet by regulating its players?

The FCC in the past has held the view that regulating the ISPs to prevent fettering is a lesser evil than allowing the ISPs to manage their traffic.

The recent U.S. federal court decision agreed with the ISPs that the FCC should not regulate ISP activity. Note that this decision is not based on the pros and cons of openness, but on the limits to the authority of the FCC. The court stated, "Our task as a reviewing court is not to assess the wisdom of the Open Internet Order regulations, but rather to determine whether the Commission has demonstrated that the regulations fall within the scope of its statutory grant of authority."

Whatever the reasoning, it seems that ISPs are now free to offer preferential quality of service to edge providers, and edge providers are free to pay for the privilege.

What can happen next? Taking this to the U.S. Supreme Court is something that pro-neutrality advocates must be considering, but since the issue is not about neutrality, but rather FCC jurisdiction, the chances of success are likely to be slim.

What about a move to change the scope of FCC jurisdiction? This could be achieved by legislation that specifically defines Internet openness, makes it a legal requirement and empowers the FCC to oversee it. Not much chance of that happening anytime soon.

Alternatively, the FCC could make the case that Internet access should be classified as a common carrier service, which would make it subject to the same kind of oversight as traditional phone services. Judge Silberman, in his partially dissenting opinion, fears this possible consequence, while the New York Times encourages the FCC to go for it. Why the difference? Apparently Silberman dislikes government regulation more than he dislikes corporate manipulation of Internet traffic, while the New York Times takes the contrary view. The divisions continue, and it seems that extending FCC jurisdiction in this way is not much more likely than new legislation.

Altogether, this decision looks likely to set the pattern for ISP regulation in the U.S. for some time to come, and we should start thinking about the implications it will have on the future. If we look closely, we may find that for the winners in this case, it may turn out not to be such a big deal after all, and those who view this as an unmitigated disaster may be relieved that it's not as bad as they feared.

If this decision stands, as seems likely, there are a number of possible repercussions to consider. In what ways will the ISP competitive landscape be transformed? How will edge providers respond? How does the role of applications change in a world in which fettering is normal, and how will that impact the software companies? Will there be a knock-on effect on the venerable institution of inter-ISP peering charges? And what will be the impact on the billing needs of all these players in a world made much more complex?

Leave a comment to let us know where you think these new developments will lead us.

More Stories By Esmeralda Swartz

Esmeralda Swartz is VP, Marketing Enterprise and Cloud, BUSS. She has spent 15 years as a marketing, product management, and business development technology executive bringing disruptive technologies and companies to market. Esmeralda was CMO of MetraTech, now part of Ericsson. At MetraTech, Esmeralda was responsible for go-to-market strategy and execution for enterprise and SaaS products, product management, business development and partner programs. Prior to MetraTech, Esmeralda was co-founder, Vice President of Marketing and Business Development at Lightwolf Technologies, a big data management startup. She was previously co-founder and Senior Vice President of Marketing and Business Development of Soapstone Networks, a developer of resource and service control software, now part of Extreme Networks.
