By JP Morgenthal
May 3, 2014 09:45 AM EDT
Sometimes you just need to stop and contemplate Larry Ellison’s comment about the computer industry in general and cloud computing specifically.
“The computer industry is the only industry that is more fashion-driven than women’s fashion. Maybe I’m an idiot, but I have no idea what anyone is talking about. What is it? It’s complete gibberish. It’s insane. When is this idiocy going to stop?” ―Larry Ellison
Cloud computing and DevOps are certainly taking the world by storm with promises of transforming the way businesses access and use technology. Cloud computing represents the consolidation of compute resources inside the data center and the consumption of applications and infrastructure from outside it. DevOps is transforming the processes around delivering applications and data to users by instituting more automation and greater collaboration between the parties responsible for building and deploying.
When combined, businesses can successfully implement a design, build, test, stage, deploy and operate process that can span various infrastructure and application platforms. In turn, businesses can use this process to more easily implement disaster recovery, high availability and greater scalability in the most economically appropriate manner.
Make no bones about it, I am a believer in and an evangelist of this approach. That said, I’m also a pragmatist and believe that the hype needs to be tempered. For example, there has been significant hype around the new Docker technology. Docker is an open source project backed by a commercial company of the same name. Docker enables applications to be containerized so that they can be packaged and reused across platforms. If you read the reviews, Docker is the greatest contribution to virtualization since the emulator and is going to revolutionize the process I defined above. After getting hands-on with the technology, which Docker admits is still not ready for production enterprise deployments, what I found was a very good technology for homogenizing Linux distributions. However, it doesn’t operate on Solaris, Windows or AIX. Yet, the Docker fanbois are out in force diminishing the value of automation tools, such as Chef, that can operate across a heterogeneous environment, such as the ones you would find in any Fortune 2000 business.
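To make the containerization point concrete, here is a minimal, illustrative Dockerfile sketch; the base image, application file and port are hypothetical examples, not taken from any real deployment:

```dockerfile
# Illustrative only: package a hypothetical Python web app so the
# same image runs unchanged on any Linux host that supports Docker.
FROM ubuntu:14.04

# Install the runtime the app needs (Python, as an example)
RUN apt-get update && apt-get install -y python

# Copy the hypothetical application into the image
COPY app.py /opt/app/app.py

# Declare the port the app listens on and the command that starts it
EXPOSE 8000
CMD ["python", "/opt/app/app.py"]
```

Built once with `docker build -t myapp .` and started with `docker run -p 8000:8000 myapp`, the same image runs on any Docker-capable Linux distribution. That is the "homogenizing Linux distributions" property described above, and it also illustrates the limit: the host must still be Linux, which is why Solaris, Windows and AIX estates remain outside its reach.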
My point is that the velocity with which cloud computing and DevOps are penetrating IT creates a vacuum that is filled with both accurate and questionable information. Business velocity is a metric used to measure the ability of a business to absorb the many changes that tend to follow adoption of cloud, DevOps or both. Business velocity comprises three key elements:
- The desire of the business for IT to change
- The range of time required to fully embrace and deliver based on a change
- Competitive pressures due to industry events and competitors
While it may be advantageous to implement the build-to-operate process, it may not be pragmatic in the current business cycle. For example, retailers cannot absorb significant changes like this three months prior to the Christmas shopping season. Banks cannot absorb changes of this ilk three months before year-end close. In these situations the window available for training and implementation is limited, so it could take upwards of two business cycles to get the right resources in place and then implement the proposed changes. Hence, their business velocity is constrained by these factors.
That said, many retailers are struggling to compete against online retailers, and specifically against the giant, Amazon. Two years is two years too long to implement changes that will allow them better inventory control, a better customer experience and the agility to run a brick-and-mortar retail chain in a manner that competes with an online-only presence. For these businesses, competitive pressures override the constrained business velocity and force the business to absorb these changes much more quickly.
While this is seemingly common sense, many businesses fail to stop and take account of their business velocity before engaging in projects that use cloud and DevOps. The results are often less than stellar because such projects get interrupted by competing priorities, a lack of appropriate time or resources, or simply the hurdles created by the parts of the business that are risk averse and don’t see value in the change. The latter happens far more often than people realize in large organizations; individuals often put concerns for their own roles and stature above the needs of the business.
A little pragmatism goes a long way. Sure, everyone wants to be part of the cloud and DevOps train because, as Larry stated, “…we’re more fashion-driven than women’s fashion.” It is incumbent upon management to take an honest accounting of your business’s velocity as a means of tempering how and when you will embrace these technologies in your own organization.