By Marten Terpstra
July 14, 2014 01:45 PM EDT
Last week Greg Ferro (@etherealmind) wrote this article about his experience with scripting as a method for network automation, with the ultimate conclusion that scripting does not scale.
Early in my career I managed a small network that grew to be an IP-over-X.25 hub of Europe for a few years, providing many countries with their first Internet connectivity. Scripts were everywhere: small ones to grab stats and create pretty graphs, others that continuously checked the status of links and sent email when things went wrong.
While it is hard to argue with Greg’s complaints per se, I believe the key point is missing. And it has nothing to do with scripting. In a reply, Ivan’s last comment touches on the real issue.
We have been scripting our networks against CLIs forever, and I will bet most folks consider it successful, even though it may be a pain. The article lists the pains, but not the reasons behind them. As a network industry, we have never considered the interaction with our network devices an API. Not in the true software engineering sense of an API.
There are many extremely complex clustered applications that rely entirely on exchanging information through APIs that are well documented, well versioned, well abstracted and properly promoted or deprecated. Creating and maintaining APIs is a real software engineering effort, a skill that requires true architecture, engineering and discipline. And we have not given our users anything close to it.
If we (the collective network industry) had truly considered our CLI an API, we would (and should) have been pushed aside a long time ago. The CLI is and always has been a simple interface for a human to tell a device what to do. It was not designed to be automated. It is not structured enough to be automated. Even individual large vendors ship multiple CLI flavors, each considered "industry standard," yet all slightly different. And nowhere will you find a formal, full and complete dictionary of that CLI with all inputs, outputs, versions and options. The closest the network industry has had to a true API is SNMP, and that is indeed a very sad statement.
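To make the "not structured enough" point concrete, here is a minimal sketch of what scripting against a CLI actually means in practice. The `show interface` text and the regex are hypothetical, modeled on a common CLI style; the point is that the parser encodes one vendor's exact wording, with nothing versioning or validating that implicit contract.

```python
import re

# Hypothetical "show interface" output from one CLI flavor; another vendor's
# (or even another firmware release's) format will differ slightly.
cli_output = """\
GigabitEthernet0/1 is up, line protocol is up
  5 minute input rate 2000 bits/sec, 4 packets/sec
GigabitEthernet0/2 is administratively down, line protocol is down
  5 minute input rate 0 bits/sec, 0 packets/sec
"""

# Screen-scraping: a brittle regex that hard-codes the exact banner wording.
IFACE_RE = re.compile(r"^(\S+) is (.+?), line protocol is (\S+)$")

def parse_interfaces(text):
    """Extract (name, admin_state, protocol_state) tuples from raw CLI text."""
    results = []
    for line in text.splitlines():
        m = IFACE_RE.match(line)
        if m:
            results.append(m.groups())
    return results

# A reworded banner, an extra field, or a pagination prompt in the next
# release silently breaks this parser. That is the cost of treating a
# human interface as a machine interface.
interfaces = parse_interfaces(cli_output)
```

This is the kind of script that "works" for years, then fails quietly the day an upgrade changes one word of output.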
I think we have mentioned before that the networking industry is a bit slow to get to modern software engineering methods and practices, but the tide is changing. And whether you want to call it SDN or something else, the sheer volume and complexity of interaction with the network is pushing us to provide truly automated access to our devices and our networks.
And creating and maintaining APIs is far more than the technology used to access them. It does not matter whether it is XML, JSON, REST, NETCONF or anything else. Those are definitions of how information is carried to and from the device and network. I can build a wonderful REST API that takes a CLI command as an argument and spits back the output from that CLI command in some format. I am sure that sounds familiar to some, but this is not an API. Not in a truly meaningful way that would elevate our automation abilities.
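The difference is easy to sketch. Both functions below stand in for hypothetical REST endpoint bodies (the endpoint paths, the fake device data, and the function names are all invented for illustration); the first is the CLI-passthrough anti-pattern the paragraph describes, the second is what a resource-oriented API looks like.

```python
# Anti-pattern: a "REST API" that is just a CLI passthrough. The caller
# still has to know the vendor CLI syntax and still has to screen-scrape
# the raw text that comes back.
def run_cli(command: str) -> str:
    """Hypothetical body of: POST /api/v1/cli {"command": ...}"""
    # Stand-in for shelling out to the device and returning raw text.
    fake_device_output = {"show vlan 10": "VLAN 10 active ports Gi0/1,Gi0/2"}
    return fake_device_output.get(command, "% Invalid input")

# A real API: a named resource with typed fields. No CLI strings in,
# no free-form text out; the contract can be documented and versioned.
def get_vlan(vlan_id: int) -> dict:
    """Hypothetical body of: GET /api/v1/vlans/{vlan_id}"""
    return {"id": vlan_id, "state": "active", "ports": ["Gi0/1", "Gi0/2"]}

raw = run_cli("show vlan 10")      # raw text the caller must still parse
ports = get_vlan(10)["ports"]      # structured data, usable directly
```

Wrapping the first in JSON and HTTP changes the transport, not the abstraction.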
Designing and implementing APIs is not trivial. Believe me, as an entirely API-driven solution, we spend a tremendous amount of time discussing our APIs and abstractions to make sure they strike the right balance between granularity, functionality, abstraction, scaling and a few other relevant qualifiers. But the key is that they are part of any feature design from day one; they are part of the overarching architecture, not bolted on at the end. Our APIs are not perfect, there is no such thing, but they are modeled after the workflow of you the user doing the work required to keep the network running and thriving.
So when you need to configure MLAG on a set of Plexxi switches, we do not give you a series of API calls to bundle ports together on each switch, assign them a unique ID, tie the switches together as an MLAG pair that shares that unique ID, create an MLAG control channel between them, and make each of the switch-local LAGs carry the same set of VLANs. Our API will simply take a list of port objects from any number of switches in a Plexxi network and turn them into an MLAG. And then you can simply take that entire entity and stick a VLAN on top; we will make sure the participating switches get the pieces they need. That is abstraction, that is workflow encapsulation, that is what APIs are supposed to give you. That is how simple LAG is supposed to be.
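As a rough sketch of that workflow-level abstraction, consider the shape of such an API. Everything here is hypothetical, illustrating the idea the post describes rather than Plexxi's actual API: the caller names the intent (these ports, one MLAG, this VLAN), and the fan-out to individual switches is the controller's job.

```python
from dataclasses import dataclass, field

# Hypothetical objects and calls -- not Plexxi's real API.
@dataclass(frozen=True)
class Port:
    switch: str
    name: str

@dataclass
class Mlag:
    ports: tuple
    vlans: set = field(default_factory=set)

    def add_vlan(self, vlan_id: int) -> "Mlag":
        # In a real controller, this one call would fan out: per-switch
        # LAG membership, shared MLAG ID, control channel, VLAN config.
        self.vlans.add(vlan_id)
        return self

def create_mlag(ports) -> Mlag:
    """One call: ports from any number of switches become one MLAG."""
    return Mlag(ports=tuple(ports))

# The caller states intent; the imperative per-switch steps disappear.
mlag = create_mlag([Port("sw1", "eth1"), Port("sw1", "eth2"),
                    Port("sw2", "eth1"), Port("sw2", "eth2")])
mlag.add_vlan(100)
```

The design choice is that the API surface matches the operator's unit of work (an MLAG, a VLAN), not the device's unit of configuration.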
We have a long way to go as an industry to get to full APIs the way real software folks think about APIs. The CLI is not it. Scripting against a CLI (or a CLI hidden behind a layer of official-sounding API terms) is a useful tool, but one that should be mostly retired to get to truly programmable networks that are controlled by real controllers (in the broadest definition of the word) using real APIs. Automation is not scripting.
[Today's fun fact: to make sure you do not think I am anti-scripting, I once wrote a large chunk of a 10,000-line Perl 4 system. It functioned very nicely for years as the RIPE database for IP address allocations back in the mid-'90s. Thankfully it has since been tackled by real software engineers.]
The post Scripting is automation, but automation is not scripting appeared first on Plexxi.