
The Evolution of Cloud Computing


Definitions of cloud computing are easy to find, but a single, authoritative definition is hard to come by. Perhaps the best work in this area was done by Böhm et al. By compiling the characteristics of 17 different scholarly and industrial definitions, the authors identified five primary characteristics of cloud computing, allowing a definition such as: "Cloud computing is a service that delivers scalable hardware and/or software solutions via the Internet or other network on a pay-per-usage basis." (The essential elements are service delivery, scalability, hardware and/or software, network access, and pay-per-use pricing.)

Cloud computing can further be broken down into three common types: SaaS, PaaS, and IaaS. SaaS (Software as a Service) allows users to log into and utilize preprogrammed software that is owned and maintained by the service provider. PaaS (Platform as a Service) gives users tools and languages owned and maintained by the service provider that can be used to build and deploy customized applications. IaaS (Infrastructure as a Service) provides users with storage and processing, allowing users full control over the use of that infrastructure. There are other divisions of cloud computing, but these are the most common.
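The division of responsibility among the three models can be sketched as a simple data structure. This is an illustrative summary of the definitions above, not any provider's official taxonomy; the layer names are my own shorthand:

```python
# Rough map of who manages which layers in each common cloud service
# model, following the SaaS/PaaS/IaaS definitions above. Illustrative
# only; layer names are informal.
SERVICE_MODELS = {
    "SaaS": {
        "provider_manages": ["application", "runtime", "os", "hardware"],
        "user_manages": ["data", "configuration"],
    },
    "PaaS": {
        "provider_manages": ["runtime", "os", "hardware"],
        "user_manages": ["application", "data"],
    },
    "IaaS": {
        "provider_manages": ["hardware", "virtualization"],
        "user_manages": ["os", "runtime", "application", "data"],
    },
}

def user_responsibilities(model: str) -> list:
    """Return the layers a customer controls under a given service model."""
    return SERVICE_MODELS[model]["user_manages"]

print(user_responsibilities("IaaS"))  # the user controls the most layers
```

Reading the table top to bottom, each model hands the user progressively more control, and correspondingly more operational responsibility.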

Conceptual Origins of Cloud Computing
Looking back, cloud computing appears to have been the end goal of many computer pioneers in the 1960s, or at least the goal of the early experiments that would eventually become the Internet.

There are three main figures commonly cited as laying the conceptual framework for cloud computing: John McCarthy, J.C.R. Licklider, and Douglas F. Parkhill.

McCarthy first proposed in 1957 that time-sharing of computing resources might allow companies to sell excess computing capacity and so maximize utilization of the resource. He even imagined that computation might one day be organized as a utility.

Licklider, director of the Information Processing Techniques Office at the Advanced Research Projects Agency, highlighted some of the promise and challenges of cloud computing in a 1963 memo addressed to the "Members and Affiliates of the Intergalactic Computer Network." Specifically, he described the ability to send a problem to a network of computers that could pool their resources to solve it, and the need for a shared language to let those computers talk to one another.

In 1966 Parkhill published "The Challenge of the Computer Utility," which identified many of the challenges facing cloud computing, such as scalability and the need for large bandwidth connections. He also initiated a comparison with electric utilities.

Why We Are in Cloud Computing Time
If cloud computing has been around for so long conceptually, why does it seem like a revolutionary idea at all? Because only now are we in cloud computing time.

Science fiction scholars commonly use the shorthand "steam engine time" to describe the phenomenon that ideas pop up several times but don't catch on for many years. They point out that the Romans knew what steam engines were and could make them, but it wasn't until 1600 years later that the technology came to fruition. The world just wasn't ready for steam engines. The same is true of cloud computing.

The necessary elements that had to be in place before cloud computing could become a reality were the presence of very large datacenters, high-speed Internet connectivity, and the acceptance of cloud computing as a viable model for supplying IT needs.

The presence of very large datacenters is a crucial piece of the foundation of cloud computing. To offer cloud services at a competitive price, suppliers must have datacenters large enough to capture economies of scale, which can reduce costs by 80-86% compared with the medium-sized datacenters many companies previously ran. Many of these very large datacenters were originally built for internal use by companies that would later become cloud providers, such as Amazon, Google, and Microsoft.
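A back-of-the-envelope check makes the scale economics concrete. Using a hypothetical unit cost of $1.00 for a medium-sized datacenter, the 80-86% reduction cited above implies a very large datacenter pays roughly $0.14-$0.20 for the same unit of capacity:

```python
# Illustrative arithmetic for the economies-of-scale figure cited above.
# The $1.00 baseline is hypothetical; only the 80-86% range comes from
# the text.
MEDIUM_UNIT_COST = 1.00  # assumed cost per unit of capacity

def large_dc_unit_cost(savings: float) -> float:
    """Unit cost after applying a fractional economies-of-scale saving."""
    return MEDIUM_UNIT_COST * (1.0 - savings)

for savings in (0.80, 0.86):
    print(f"{savings:.0%} savings -> ${large_dc_unit_cost(savings):.2f} per unit")
```

A 5-7x cost advantage of this kind is what lets providers undercut in-house datacenters and still profit.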

Almost universal access to high-speed Internet connectivity is crucial to cloud computing. If your data is bottlenecked getting to and from the cloud, it simply can't be a practical solution for your IT needs.

Finally, potential users must see cloud computing as a viable solution for their IT needs. People need to be able to trust that a distant, seemingly ethereal company will meet their urgent IT needs on a daily basis. This cultural work was done by many disparate influences, from MMOs to Google, which expanded acceptance of online resources beyond the IT community. Another crucial but oft-neglected part of this cultural work was performed by peer-to-peer computing, which introduced many people to the notion that they could use the resources of other computers via the Internet.

Cloud Computing Timeline: Who, When, and Why
Many good timelines of cloud computing exist (several are listed in my resources section), but it is still worth giving a basic timeline to show the evolution of cloud computing service offerings:

  • 1999: Salesforce launches its SaaS enterprise applications
  • 2002: Amazon launches Amazon Web Services (AWS), offering both artificial and human intelligence for problem solving via the Internet
  • 2006: Google launches Google Docs, a free, web-based competitor to Microsoft Office
  • 2006: Amazon launches Elastic Compute Cloud (EC2) and Simple Storage Service (S3), sometimes described as the first IaaS
  • 2007: Salesforce launches Force.com, often described as the first PaaS
  • 2008: Google App Engine launched
  • 2009: Microsoft launches Windows Azure

Armbrust et al. note many motives that drive companies to launch cloud computing services, including:

  • Profit: By taking advantage of cost savings from very large datacenters, companies can underbid competitors and still make significant profit
  • Leverage existing investment: For example, many of the applications in AWS were developed for internal use first, then sold in slightly altered form for additional revenue
  • Defend a franchise: Microsoft launched Windows Azure to help maintain competitiveness of the Windows brand
  • Attack a competitor: Google Docs was launched partly as an attack on Microsoft's profitable Office products
  • Leverage customer relationships: Windows Azure gives existing clients a branded cloud service that plays up perceived reliability of the brand, constantly emphasizing that it is a "rock-solid" cloud service

These are the motives that lead competitors to offer cloud computing services, but what drives companies and individuals to adopt cloud computing, and what barriers still stand in the way of full adoption?

The Cloud Computing Market: Where It's At, and Where It's Going
According to a study by IT trade group CompTIA, up to 80% of businesses use some form of cloud computing, although the degree of use varies widely. IBM's studies show that only 8% of businesses believe cloud computing currently has a significant impact on their business, but that figure is expected to grow to more than 30% in the next three years.

Cloud computing is often sold on the basis of price, but the primary benefit companies are seeking from cloud computing, according to recent surveys, is flexibility. With the huge swings caused by viral phenomena on the Internet, companies can see demand for their site and services fluctuate wildly in a short period of time. Cloud computing gives companies the flexibility to purchase computing resources on demand. A more conventional benefit of cloud computing's flexibility is the ability to avoid hiring and firing IT personnel for short-term projects.
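The elasticity argument can be sketched in a few lines: because capacity is rented on demand, it can track a wildly fluctuating load instead of being provisioned up front for the worst case. The per-instance throughput figure below is hypothetical:

```python
import math

# Toy sketch of on-demand elasticity. An assumed per-instance
# throughput converts a fluctuating request rate into the number of
# instances to rent at any moment.
REQUESTS_PER_INSTANCE = 500  # hypothetical requests/second one instance handles

def instances_needed(requests_per_second: int) -> int:
    """Instances to rent for the current load (at least one stays up)."""
    return max(1, math.ceil(requests_per_second / REQUESTS_PER_INSTANCE))

for load in (200, 5_000, 120_000):  # quiet day, normal day, viral spike
    print(f"{load:>7} req/s -> {instances_needed(load)} instances")
```

Under fixed provisioning, the viral-spike capacity would sit idle on quiet days; under the elastic model, the quiet-day bill shrinks to a single instance.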

One of the major obstacles to full adoption of cloud computing services remains security concerns. Although cloud-based security solutions exist, there is still a perception that cloud computing puts data at risk compared to private datacenters and increases the operational impact of denial-of-service attacks.

Despite these concerns, however, all sectors of the cloud computing market are expected to thrive in the near future, with revenue in nearly all sectors doubling within the next 3-5 years.

About the Author

Dr. Matthew Candelaria is a professional writer with more than five years' experience writing copy in industries such as law, medicine, technology and computer security. For more information about him and his work, visit www.writermc.com.
