|By JP Morgenthal||
|May 3, 2014 09:45 AM EDT||
Sometimes you just need to stop and contemplate Larry Ellison’s comment about the computer industry in general and cloud computing specifically.
“The computer industry is the only industry that is more fashion-driven than women’s fashion. Maybe I’m an idiot, but I have no idea what anyone is talking about. What is it? It’s complete gibberish. It’s insane. When is this idiocy going to stop?” ―Larry Ellison
Cloud computing and DevOps are certainly taking the world by storm with promises of transforming the way businesses access and use technology. Cloud computing consolidates compute resources into shared data centers, so that businesses consume applications and infrastructure delivered from outside their own. DevOps is transforming the processes around delivering applications and data to their user base by instituting greater automation and collaboration between the parties responsible for building and deploying them.
When the two are combined, a business can implement a design, build, test, stage, deploy and operate process that can target a variety of infrastructure and application platforms. In turn, that process makes it easier to implement disaster recovery, high availability and greater scalability in the most economically appropriate manner.
Make no bones about it, I am a believer in and an evangelist of this approach. That said, I am also a pragmatist and believe the hype needs to be tempered. For example, there has been significant hype around the new Docker technology. Docker is an open source project backed by a commercial company of the same name; it enables applications to be containerized so that they can be packaged and reused across platforms. If you read the reviews, Docker is the greatest contribution to virtualization since the emulator, destined to revolutionize the process I defined above. After getting hands-on with the technology (which Docker admits is still not ready for production enterprise deployments), what I found was a very good technology for homogenizing deployments across Linux distributions. However, it does not run on Solaris, Windows or AIX. Yet the Docker fanbois are out in force diminishing the value of automation tools, such as Chef, that can operate across the heterogeneous environments you would find in any Fortune 2000 business.
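To make the packaging idea concrete, the recipe for a container is a short Dockerfile. This is a minimal sketch; the base image, file names and commands are illustrative assumptions, not drawn from any particular project:

```dockerfile
# Illustrative Dockerfile: bundle an application and its dependencies
# into a single reusable Linux container image.

# Base Linux distribution plus runtime (assumed here: Python on Debian)
FROM python:2.7

# Install dependencies, then copy in the application code
COPY requirements.txt /app/
RUN pip install -r /app/requirements.txt
COPY . /app/
WORKDIR /app

# Command the container runs when started
CMD ["python", "app.py"]
```

The resulting image is built once (`docker build -t myapp .`, with a hypothetical image name) and runs identically on any Linux host with a Docker daemon, which is exactly the portability, and the Linux-only limitation, described above.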
My point is that the speed with which cloud computing and DevOps are penetrating IT creates a vacuum that fills with both accurate and questionable information. Business velocity is a metric that measures a business's ability to absorb the many changes that seem to follow adoption of cloud, DevOps or both. Business velocity comprises three key elements:
- The desire of the business for IT to change
- The range of time required to fully embrace and deliver based on a change
- Competitive pressures due to industry events and competitors
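The three elements above can be thought of as combining into a rough readiness score. The following sketch is purely illustrative: the function name, scales and weights are my assumptions, not a formula from any standard, but it shows how a long delivery window drags velocity down while competitive pressure can pull it back up:

```python
# Hypothetical "business velocity" score built from the three elements
# listed above. All names, scales and weights are illustrative assumptions.

def business_velocity(desire, time_to_deliver_months, competitive_pressure):
    """Combine the three elements into a rough 0-1 readiness score.

    desire: 0-1, how strongly the business wants IT to change
    time_to_deliver_months: months required to fully embrace the change
    competitive_pressure: 0-1, urgency from industry events and competitors
    """
    # Longer delivery windows constrain velocity; 24 months is roughly
    # the two business cycles discussed below.
    time_factor = max(0.0, 1.0 - time_to_deliver_months / 24.0)
    # Competitive pressure can override a constrained timeline, so it
    # carries the largest weight in this sketch.
    return 0.3 * desire + 0.2 * time_factor + 0.5 * competitive_pressure

# A retailer facing an online giant: strong pressure, long timeline.
print(round(business_velocity(0.6, 18, 0.9), 2))
```

The exact numbers are beside the point; the structure is what matters: time constraints cap velocity, and competitive pressure is the factor most able to force the pace.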
While it may be advantageous to implement the build-to-operate process, it may not be pragmatic in the current business cycle. For example, retailers cannot absorb significant changes like this three months before the Christmas shopping season, and banks cannot absorb changes of this ilk three months before year-end close. In these situations the window for training and implementation is limited, so it could take upwards of two business cycles to get the right resources in place and then implement the proposed changes. Their business velocity is constrained accordingly.
That said, many retailers are struggling to compete against online retailers, and specifically against the giant, Amazon. Two years is two years too long to implement changes that would give them better inventory control, a better customer experience and the agility to run a brick-and-mortar retail chain in a manner that competes with an online-only presence. For these businesses, competitive pressure overrides the constrained business velocity and forces them to absorb these changes much more quickly.
While this is seemingly common sense, many businesses fail to stop and take account of their business velocity before engaging in projects that use cloud and DevOps. The results are often less than stellar because the projects get interrupted by competing priorities, a lack of appropriate time or resources, or simply the hurdle created by the parts of the business that are risk-averse and see no value in the change. The latter happens far more often than people realize in large organizations, where individuals often put concerns about their own roles and stature above the needs of the business.
A little pragmatism goes a long way. Sure, everyone wants to be part of the cloud and DevOps train because, as Larry stated, "...we're more fashion-driven than women's fashion." It is incumbent upon management to take an honest accounting of your business's velocity as a means of tempering how and when you embrace these technologies in your own organization.