Scripting Is Automation, But Automation Is Not Scripting

Last week Greg Ferro (@etherealmind) wrote this article about his experience with scripting as a method for network automation, with the ultimate conclusion that scripting does not scale.

Early in my career I managed a small network that grew to be an IP over X.25 hub of Europe for a few years, providing many countries with their first Internet connectivity. Scripts were everywhere: small ones to grab stats and create pretty graphs, others that continuously checked the status of links and sent emails when things went wrong.

While it is hard to argue with Greg’s complaints per se, I believe the key point is missing. And it has nothing to do with scripting. In a reply, Ivan’s last comment touches on the real issue.

We have been scripting our networks against CLIs forever, and I will bet you most folks would consider it successful, even though it may be a pain. The article lists the pains, but not the reasons why. As a network industry, we have never ever considered the interaction with our network devices to be an API. Not in the true software engineering sense of an API.

There are many extremely complex clustered applications that rely entirely on exchanging information through APIs that are well documented, well versioned, well abstracted and properly promoted or deprecated. Creating and maintaining APIs is a real software engineering effort, a skill that requires true architecture, engineering and discipline. And we have not given our users anything close to it.

If we (the collective network industry) had truly considered our CLI an API, we would (and should) have been pushed aside a long time ago. The CLI is and always has been a simple interface for a human to tell a device what to do. It was not designed to be automated. It is not structured enough to be automated. Even large vendors have multiple flavors that are all industry standard, but all slightly different. And nowhere would you find a formal, full and complete dictionary of that CLI with all inputs, outputs, versions and options. The closest the network industry has had to a true API is SNMP, and that is indeed a very sad statement.
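To make that concrete, here is a minimal sketch, in Python, of what scripting against a CLI typically looks like: capture the human-oriented text a device prints and pick it apart with regular expressions. The command name, output and column layout below are made up for illustration; the point is that nothing in the exchange is structured or versioned, so the script breaks the moment a vendor reorders a column or rewords a status.

```python
import re

# Hypothetical output of a "show interface brief" style command, exactly as a
# human would see it on the terminal. There is no schema behind this text.
cli_output = """\
Interface   Status   Protocol   Description
Eth1/1      up       up         uplink to core
Eth1/2      down     down       spare
"""

# The "parser" is a regular expression tied to this exact column layout.
# A firmware upgrade that widens a column or renames a field silently
# breaks it -- that is the difference between screen scraping and an API.
LINE_RE = re.compile(r"^(?P<name>\S+)\s+(?P<status>\S+)\s+(?P<protocol>\S+)\s+(?P<descr>.*)$")

interfaces = []
for line in cli_output.splitlines()[1:]:   # skip the header row
    match = LINE_RE.match(line)
    if match:
        interfaces.append(match.groupdict())

for intf in interfaces:
    print(intf["name"], intf["status"])
```

Multiply that by every command, every vendor flavor and every software release, and the pains Greg lists follow naturally.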

I think we have mentioned before that the networking industry is a bit slow to get to modern software engineering methods and practices, but the tide is changing. And whether you want to call it SDN or something else, the sheer volume and complexity of interaction with the network is pushing us to provide truly automated access to our devices and our networks.

And creating and maintaining APIs is far more than the technology used to access them. It does not matter whether it's XML, JSON, REST, NETCONF or anything else. Those are definitions of how information is carried to and from the device and network. I can build a wonderful REST API that takes a CLI command as an argument and spits back the output from that CLI command in some format. I am sure that sounds familiar to some, but this is not an API. Not in a truly meaningful way that would elevate our automation abilities.
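For illustration, here is a sketch of exactly that anti-pattern, written with Flask (the endpoint, names and canned device response are hypothetical). The transport is REST and the payload is JSON, yet the service still just shuttles free-form CLI text back and forth; none of the structure, abstraction or versioning that makes an API useful is present.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def send_cli(device, command):
    # Stand-in for an SSH session to the device; in real life this would
    # return whatever unstructured text the CLI happens to print.
    return f"{device}# {command}\n<raw, human-formatted output here>"

@app.route("/api/v1/cli", methods=["POST"])
def run_cli():
    body = request.get_json()
    # Accept an arbitrary CLI command and hand back its raw output.
    # REST and JSON on the wire, but semantically still screen scraping.
    output = send_cli(body["device"], body["command"])
    return jsonify({"device": body["device"], "output": output})

if __name__ == "__main__":
    app.run()
```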

Designing and implementing APIs is not trivial. Believe me, as an entirely API-driven solution, we spend a tremendous amount of time discussing our APIs and abstractions to make sure they strike that fine balance between granularity, functionality, abstraction, scaling and a few other relevant qualifiers. But the key is that they are part of any feature design from day one; they are part of the overarching architecture, not bolted on at the end. Our APIs are not perfect, there is no such thing, but they are modeled after the workflow of you, the user, doing the work required to keep the network running and thriving.

So when you need to configure MLAG on a set of Plexxi switches, we do not have a series of API calls to bundle ports together on a switch, give them a unique ID, then tie the switches together as an MLAG pair that shares that unique ID. Oh, and create an MLAG control channel between them, and make each of the switch-local LAGs have the same set of VLANs on them. Our API will simply take a list of port objects from any number of switches in a Plexxi network and turn them into an MLAG. And then you can simply take that entire entity and stick a VLAN on top; we will make sure the participating switches get the pieces they need. That is abstraction, that is workflow encapsulation, that is what APIs are supposed to give you. That is how simple LAG is supposed to be.
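As a purely hypothetical sketch (the names below are illustrative, not Plexxi's actual API), a workflow-level interface for that example might look something like this: the caller hands over port objects and intent, and the controller works out the per-switch details.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Port:
    switch: str
    name: str

@dataclass
class Mlag:
    ports: list
    vlans: set = field(default_factory=set)

    def add_vlan(self, vlan_id: int):
        # The controller, not the caller, would push the VLAN to every
        # switch-local LAG that participates in this MLAG.
        self.vlans.add(vlan_id)

def create_mlag(ports):
    # One call expresses the intent; pairing the switches, allocating IDs
    # and building the control channel are the controller's problem.
    return Mlag(ports=list(ports))

mlag = create_mlag([Port("switch-a", "eth1/1"), Port("switch-b", "eth1/1")])
mlag.add_vlan(100)
print(mlag)
```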

We have a long way to go as an industry to get to full APIs the way real software folks think about APIs. The CLI is not it. Scripting against a CLI (or a CLI hidden behind a layer of official-sounding API terms) is a useful tool, but one that should be mostly retired to get to true programmable networks that are controlled by real controllers (in the broadest definition of the word) using real APIs. Automation is not scripting.

[Today's fun fact: to make sure you do not think I am anti-scripting, I once wrote a large chunk of a 10,000-line Perl 4 system. It functioned very nicely for years as the RIPE database for IP address allocations back in the mid-90s. Thankfully it has since been tackled by real software engineers.]

More Stories By Marten Terpstra

Marten Terpstra is a Product Management Director at Plexxi Inc. Marten has extensive knowledge of the architecture, design, deployment and management of enterprise and carrier networks.
