Basic Cloud Computing Patterns for Application Development

Design patterns help not only in the development process but across the application development life cycle

Over the past few years, the evolution of the cloud has settled the question of whether cloud is the right strategy. The key challenge now is leveraging cloud capabilities and features in a way that both drives innovation and solves business problems. If we compare the different cloud migration strategies executed over time, we find many similarities: each involves a cloud assessment as well as a consideration of application development approaches. Even though the business cases differ, we can still link the proposed or implemented cloud-based solutions to a set of design patterns. The most common definition describes a design pattern as 'a widely used concept in computer science to describe good solutions to recurring problems in an abstract form.' Any abstract solution to recurring problems in the domain of cloud computing can be referred to as a cloud computing pattern, independent of concrete providers, products and programming languages.

The following are some basic application architecture patterns. Most of these were initially referred to as cloud best practices; as we come across real-world implementations, we can readily identify these patterns in them.

Composite Application
At a high level, traditional application architecture has to deal with challenges such as difficulty integrating with other applications and a lack of flexibility to support changing functionality over the application lifecycle. Since applications in a cloud environment can be scaled individually, it is usually a good option to divide application functionality into multiple components that can later be integrated to form a unified application.

Composite applications are one of the main elements of service-oriented architecture (SOA) and help with contextual collaboration. This approach makes applications extensible right from the beginning. Integrating other applications is also simplified, because the same integration techniques used between the individual components can be reused.

(Figure: Example of a Composite Application for a Travel Booking Process)

The key to a successful implementation of this pattern is achieving the right balance in the distribution of functionality across components. With too few components, integrating new functionality and changing the application flexibly takes extra time and increases the likelihood of errors. If, on the other hand, functionality is distributed among too many components, communication overhead degrades the application's performance. The composite application pattern, used along with loose coupling (explained in the next section), helps extract the benefits of cloud features such as elasticity, flexible payment models and standardized management.
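As a rough illustration (not taken from the original article), the travel booking process above can be decomposed into independently deployable components behind small service interfaces. The component names, interfaces and values below are hypothetical placeholders for real services.

```python
# Hypothetical sketch of a composite travel-booking application.
# Each component exposes a narrow service interface and could be
# deployed and scaled independently in the cloud.

class FlightBookingComponent:
    def reserve(self, origin: str, destination: str, date: str) -> dict:
        # In a real system this would call a flight inventory service.
        return {"flight_id": "FL-123", "status": "reserved"}

class HotelBookingComponent:
    def reserve(self, city: str, check_in: str, nights: int) -> dict:
        # In a real system this would call a hotel reservation service.
        return {"hotel_id": "HT-456", "status": "reserved"}

class PaymentComponent:
    def charge(self, amount: float, currency: str = "USD") -> dict:
        # In a real system this would call a payment gateway.
        return {"payment_id": "PAY-789", "status": "charged"}

class TravelBookingProcess:
    """Composite application that orchestrates the individual components."""

    def __init__(self):
        self.flights = FlightBookingComponent()
        self.hotels = HotelBookingComponent()
        self.payments = PaymentComponent()

    def book_trip(self, origin: str, city: str, date: str, nights: int) -> dict:
        flight = self.flights.reserve(origin, city, date)
        hotel = self.hotels.reserve(city, date, nights)
        payment = self.payments.charge(amount=500.0)
        return {"flight": flight, "hotel": hotel, "payment": payment}

if __name__ == "__main__":
    print(TravelBookingProcess().book_trip("PNQ", "BLR", "2016-11-01", 2))
```

Because each component hides its implementation behind a small interface, any one of them can later be replaced, scaled out, or moved to a different cloud service without changing the orchestrating process.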

Loose Coupling
In essence, loose coupling isolates the various layers and components of your application so that each component interacts asynchronously with the others and treats them as a "black box." The key principle behind this pattern is to reduce the set of assumptions components make about each other when they exchange information, which eventually results in better scalability.

Decoupling your components, building asynchronous systems and scaling horizontally become very important in the context of the cloud. This not only allows you to scale out by adding more instances of the same component, but also lets you design innovative hybrid models in which a few components continue to run on-premises while the other components take advantage of cloud scale, using the cloud for additional compute power and bandwidth.

(Figure: Sample illustration of decoupling components using queues and AWS-specific tactics. Ref: Whitepaper on Architecting for the AWS Cloud: Best Practices.)

AWS-specific techniques for implementing this best practice are as follows:

  1. Use Amazon SQS to isolate components
  2. Use Amazon SQS as a buffer between components
  3. Design every component so that it exposes a service interface, is responsible for its own scalability in all appropriate dimensions, and interacts with other components asynchronously
  4. Bundle the logical construct of a component into an Amazon Machine Image (AMI) so that it can be deployed more often
  5. Make your applications as stateless as possible. Store session state outside of the component (in Amazon SimpleDB, if appropriate)

Loose coupling normally comes at some cost in performance, because asynchronous communication through messages lengthens the communication path and adds overhead. This trade-off between loose coupling and performance has to be weighed, but the performance impact can usually be addressed by scaling resources out.
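As a minimal sketch of the SQS buffering technique listed above (assuming boto3 is installed, AWS credentials are configured, and a queue with the hypothetical name "order-processing" already exists), a producer component can hand work to a consumer component asynchronously:

```python
# Minimal sketch: two loosely coupled components communicating via Amazon SQS.
# Assumes boto3 is installed, AWS credentials are configured, and a queue
# named "order-processing" already exists (the name is hypothetical).
import json
import boto3

sqs = boto3.resource("sqs")
queue = sqs.get_queue_by_name(QueueName="order-processing")

def producer_component(order: dict) -> None:
    """Front-end component: enqueue work and return immediately."""
    queue.send_message(MessageBody=json.dumps(order))

def consumer_component() -> None:
    """Back-end component: poll the queue and process messages at its own pace."""
    for message in queue.receive_messages(MaxNumberOfMessages=10, WaitTimeSeconds=20):
        order = json.loads(message.body)
        print("processing order", order["id"])
        message.delete()  # acknowledge so the message is not redelivered

if __name__ == "__main__":
    producer_component({"id": 42, "item": "flight-ticket"})
    consumer_component()
```

The producer never calls the consumer directly; either side can be scaled out, replaced, or taken offline briefly while messages simply accumulate in the queue.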

Elastic Component
As an application is componentized, its components are distributed among multiple compute nodes. These nodes track system utilization using metrics such as CPU load, memory usage, or network I/O to drive scaling decisions. When the utilization of the compute nodes exceeds a specified threshold, additional nodes hosting the same application component are provisioned.

In the cloud, elasticity can be implemented in three ways:

  1. Proactive Cyclic Scaling: Periodic scaling that occurs at fixed intervals
  2. Proactive Event-Based Scaling: Scaling performed just when you are expecting a big surge of traffic due to a scheduled business event
  3. Auto-Scaling Based on Demand: Scaling triggered automatically when monitored utilization metrics cross a threshold
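As a hedged sketch of the demand-based option (not from the article; the group name "web-tier" and the 70% threshold are hypothetical, and boto3 with configured AWS credentials is assumed), the loop below checks average CPU utilization via CloudWatch and raises the desired capacity of an Auto Scaling group when the threshold is exceeded:

```python
# Sketch: demand-based scaling driven by a CPU utilization threshold.
# Assumes boto3, configured AWS credentials, and an existing Auto Scaling
# group named "web-tier" (hypothetical).
from datetime import datetime, timedelta
import boto3

cloudwatch = boto3.client("cloudwatch")
autoscaling = boto3.client("autoscaling")

GROUP = "web-tier"
CPU_THRESHOLD = 70.0  # percent

def average_cpu(group: str) -> float:
    """Average CPU utilization of the group over the last 10 minutes."""
    stats = cloudwatch.get_metric_statistics(
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "AutoScalingGroupName", "Value": group}],
        StartTime=datetime.utcnow() - timedelta(minutes=10),
        EndTime=datetime.utcnow(),
        Period=300,
        Statistics=["Average"],
    )
    points = stats["Datapoints"]
    return sum(p["Average"] for p in points) / len(points) if points else 0.0

def scale_if_needed(group: str) -> None:
    """Provision an additional instance when utilization crosses the threshold."""
    if average_cpu(group) > CPU_THRESHOLD:
        desc = autoscaling.describe_auto_scaling_groups(AutoScalingGroupNames=[group])
        current = desc["AutoScalingGroups"][0]["DesiredCapacity"]
        autoscaling.set_desired_capacity(
            AutoScalingGroupName=group, DesiredCapacity=current + 1
        )

if __name__ == "__main__":
    scale_if_needed(GROUP)
```

In practice the same effect is usually achieved declaratively with Auto Scaling policies and CloudWatch alarms rather than a hand-rolled loop; the sketch only makes the threshold-based decision visible.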

Other Cloud Computing Patterns
The following are some other commonly used cloud computing patterns:

Stateless Component
In regular component-based applications in the cloud, the chances of failure increase because components can be distributed across multiple nodes, and components are added or removed to address scalability needs as demand changes. 'Stateless Component' is a pattern in which components do not keep any internal state; instead, external persistent storage is used for state management.
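A minimal sketch of the idea follows, assuming boto3 with configured credentials and a hypothetical DynamoDB table named "sessions" as the external store (the article mentions Amazon SimpleDB; DynamoDB is used here purely for illustration):

```python
# Sketch: a stateless request handler that keeps no session state in memory.
# Session state lives in an external store (a hypothetical DynamoDB table
# named "sessions"), so any instance of the component can serve any request.
import boto3

sessions = boto3.resource("dynamodb").Table("sessions")

def handle_request(session_id: str, item_to_add: str) -> list:
    """Load state, apply the change, and write the state back externally."""
    record = sessions.get_item(Key={"session_id": session_id}).get("Item")
    cart = record["cart"] if record else []
    cart.append(item_to_add)
    sessions.put_item(Item={"session_id": session_id, "cart": cart})
    return cart  # nothing is retained on this instance after returning

if __name__ == "__main__":
    print(handle_request("user-42", "flight-ticket"))
```

Because no instance holds state of its own, instances can be added, removed, or replaced after a failure without losing user sessions.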

Map-Reduce
The Map-Reduce pattern is used to meet performance requirements for complex queries on large data sets, since most conventional storage solutions do not support such queries natively. Map-Reduce is often used to query large amounts of weakly structured or unstructured data for analysis purposes. For example, it can be used for the analysis of web service logs to determine user access statistics, or the analysis of order information to find popular products.
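To make the pattern concrete, here is a toy, single-process sketch (not a distributed implementation; the log lines and field format are made up) of the web-log example: the map phase emits a count per user for each log line, and the reduce phase sums the counts per user.

```python
# Toy MapReduce sketch: user access statistics from web server log lines.
# Runs in a single process; a real implementation (e.g., Hadoop or Amazon EMR)
# would distribute the map and reduce phases across many nodes.
from collections import defaultdict

log_lines = [
    "2016-08-01 10:00:01 user=alice GET /products",
    "2016-08-01 10:00:05 user=bob GET /cart",
    "2016-08-01 10:00:09 user=alice GET /checkout",
]

def map_phase(line: str):
    """Emit (user, 1) for each log line."""
    user = line.split("user=")[1].split()[0]
    yield (user, 1)

def reduce_phase(pairs):
    """Sum the counts emitted for each user."""
    totals = defaultdict(int)
    for user, count in pairs:
        totals[user] += count
    return dict(totals)

if __name__ == "__main__":
    mapped = (pair for line in log_lines for pair in map_phase(line))
    print(reduce_phase(mapped))  # {'alice': 2, 'bob': 1}
```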

Design patterns help not only in the development process but across the application development life cycle. In their abstract form, patterns apply to the challenges that developers of cloud applications face today, independent of the actual technologies and cloud services being used. Applying them to the cloud lets your application extract the maximum benefit from cloud platforms.

About the Author

Mahesh Kumar is currently working as a Senior Tech Lead at Harbinger. He is a member of the Technology Forum and Proposal Engineering Group at Harbinger Systems, and an active contributor to the technology arm of Harbinger's Marketing division. Mahesh has over seven years of experience in the design and development of enterprise applications in the BI, healthcare and eLearning domains. His core technology expertise is in Java, J2EE, Java frameworks and libraries, Android, Big Data and the Cloud. He is frequently invited as a guest speaker at management colleges and universities.

Mahesh Kumar holds a Bachelor of Engineering in Information Technology from the University of Pune, India.
