Basic Cloud Computing Patterns for Application Development

Design patterns help not only in the development process but across the application development life cycle

Over the past few years, the evolution of cloud computing has largely settled the question of whether the cloud is the right strategy. The key challenge that remains is leveraging cloud capabilities and features in a way that both drives innovation and solves business problems. If we compare the different cloud migration strategies executed over time, we find many similarities: each involves a cloud assessment as well as a consideration of application development approaches. Even though the business cases differ, we can still link the proposed or implemented cloud-based solutions to a set of design patterns. A design pattern is most commonly defined as ‘a widely used concept in computer science to describe good solutions to recurring problems in an abstract form.' Any abstract solution to recurring problems in the domain of cloud computing can be referred to as a cloud computing pattern that is independent of concrete providers, products and programming languages.

The following are some basic application architecture patterns. Most of these were initially referred to simply as cloud best practices. As we examine real-world implementations, we can readily identify these patterns in them.

Composite Application
On a higher level, traditional application architecture has to deal with challenges such as difficulty integrating with other applications and a lack of flexibility in supporting changing functionality over the application lifecycle. Since components in a cloud environment can be scaled individually, it is always a good option to divide application functionality into multiple components that can later be integrated to form a unified application.

Composite applications are one of the main elements of service-oriented architecture (SOA) and help enable contextual collaboration. This approach makes applications extensible right from the beginning. Integrating other applications is also simplified, since the same integration techniques are used inside individual applications.

Example of a Composite Application for a Travel Booking Process

The key to a successful implementation of this pattern is achieving the correct balance in the distribution of functionality across multiple components. With too few components, each component stays large and monolithic, so integrating new functionality and changing the application flexibly takes extra time and is more error-prone. On the other hand, if the functionality is distributed among too many components, communication overhead degrades application performance. The composite application pattern, used together with loose coupling (explained in the next section), helps extract the benefits of cloud features like elasticity, payment models and standardized management.
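To make the idea concrete, here is a minimal Python sketch of a composite travel booking application. The component and method names (FlightService, HotelService, TravelBookingApplication, book_trip) are hypothetical and only illustrate how independently deployable components can be integrated behind a thin application layer:

```python
class FlightService:
    """Component responsible only for flight reservations.
    Could be deployed and scaled independently in the cloud."""
    def book_flight(self, origin, destination, date):
        # In a real system this would call an airline reservation API.
        return {"type": "flight", "from": origin, "to": destination, "date": date}

class HotelService:
    """Component responsible only for hotel reservations."""
    def book_room(self, city, check_in, nights):
        return {"type": "hotel", "city": city, "check_in": check_in, "nights": nights}

class TravelBookingApplication:
    """Composite application that integrates the individual components."""
    def __init__(self, flights, hotels):
        self.flights = flights
        self.hotels = hotels

    def book_trip(self, origin, destination, date, nights):
        # The booking process is composed from the underlying services.
        return [
            self.flights.book_flight(origin, destination, date),
            self.hotels.book_room(destination, date, nights),
        ]

if __name__ == "__main__":
    app = TravelBookingApplication(FlightService(), HotelService())
    print(app.book_trip("PNQ", "SFO", "2015-11-03", nights=3))
```

Because each component hides its implementation behind a small interface, any one of them can later be replaced, scaled out or moved to a different environment without changing the others.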

Loose Coupling
In essence, loose coupling isolates the various layers and components of your application so that each component interacts asynchronously with the others and treats them as a "black box." The key principle of this pattern is to reduce the set of assumptions components make about each other when exchanging information, which ultimately results in better scalability.

Decoupling your components, building asynchronous systems and scaling horizontally become very important in the context of the cloud. This not only allows you to scale out by adding more instances of the same component, but also lets you design innovative hybrid models in which a few components continue to run ‘on-premise' while the other components take advantage of ‘cloud-scale', using the cloud for additional compute power and bandwidth.

The following is a sample illustration of decoupling components using queues and AWS-specific tactics:

Ref: Whitepaper on Architecting for the AWS Cloud: Best Practices.

AWS-specific techniques for implementing this best practice are as follows (a minimal code sketch follows the list):

  1. Use Amazon SQS to isolate components
  2. Use Amazon SQS as a buffer between components
  3. Design every component in a way that it exposes a service interface and is responsible for its own scalability in all appropriate dimensions and interacts with other components asynchronously
  4. Bundle the logical construct of a component into an Amazon Machine Image so that it can be deployed more often
  5. Make your applications as stateless as possible. Store session state outside of component (in Amazon SimpleDB, if appropriate)
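As a rough illustration of points 1-3, here is a minimal Python sketch using boto3 and Amazon SQS. The queue URL, region and message handling logic are assumptions made purely for illustration; a producer component posts work to the queue and a consumer component processes it asynchronously, so each side can scale independently:

```python
import boto3

# Assumed, illustrative queue URL; in practice this comes from configuration.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/booking-requests"

sqs = boto3.client("sqs", region_name="us-east-1")

def submit_booking_request(payload):
    """Producer component: hands work to the queue and returns immediately."""
    sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=payload)

def process_booking_requests():
    """Consumer component: pulls work from the queue at its own pace,
    so it can be scaled out independently of the producer."""
    response = sqs.receive_message(
        QueueUrl=QUEUE_URL,
        MaxNumberOfMessages=10,
        WaitTimeSeconds=20,  # long polling reduces empty responses
    )
    for message in response.get("Messages", []):
        handle(message["Body"])  # hypothetical business logic
        # Delete only after successful processing so failed work is retried.
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=message["ReceiptHandle"])

def handle(body):
    print("processing booking request:", body)
```

The queue acts as the buffer between the two components: if the consumer falls behind, messages simply wait, and more consumer instances can be added to drain the backlog.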

Loose coupling normally comes with some performance cost, because asynchronous communication using messages lengthens the communication path and adds overhead. This trade-off between loose coupling and performance has to be weighed, but the cost can usually be offset by scaling resources out.

Elastic Component
As an application is componentized, its components are distributed among multiple compute nodes. These nodes track system utilization using metrics such as CPU load, memory usage or network I/O to drive scaling decisions. When the utilization of the compute nodes exceeds a specified threshold, additional nodes hosting the same application component are provisioned.

In the cloud, elasticity can be implemented in three ways:

  1. Proactive Cyclic Scaling: Periodic scaling that occurs at a fixed interval
  2. Proactive Event-Based Scaling: Scaling just before an expected surge of traffic due to a scheduled business event
  3. Auto-Scaling Based on Demand: Scaling in response to measured utilization (a simplified sketch follows this list)
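Below is a simplified, provider-agnostic Python sketch of the demand-based approach. The threshold values and the get_average_cpu, add_node and remove_node helpers are hypothetical placeholders for the monitoring and provisioning APIs of an actual cloud platform:

```python
import time

SCALE_UP_THRESHOLD = 0.75    # add capacity above 75% average CPU
SCALE_DOWN_THRESHOLD = 0.25  # remove capacity below 25% average CPU
MIN_NODES, MAX_NODES = 2, 10

def autoscale_loop(get_average_cpu, add_node, remove_node, node_count):
    """Periodically compare measured utilization against thresholds and
    provision or release nodes hosting the same application component."""
    while True:
        cpu = get_average_cpu()  # e.g. read from the platform's monitoring service
        if cpu > SCALE_UP_THRESHOLD and node_count < MAX_NODES:
            add_node()
            node_count += 1
        elif cpu < SCALE_DOWN_THRESHOLD and node_count > MIN_NODES:
            remove_node()
            node_count -= 1
        time.sleep(60)  # re-evaluate once per minute
```

Managed services such as AWS Auto Scaling implement this control loop for you; the sketch only shows the decision logic that sits behind the pattern.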

Other Cloud Computing Patterns
The following are some other commonly used cloud computing patterns:

Stateless Component
In component-based applications in the cloud, the chances of failure increase because components can be distributed across multiple nodes, and components are added or removed to address scalability needs as demand changes. ‘Stateless Component' is a pattern in which components do not hold any internal state; instead, external persistent storage is used for state management. A minimal sketch of the idea follows.
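In the Python sketch below, the shopping-cart component and the in-memory stand-in for the external store are hypothetical; in a real deployment the store would be an external service shared by all instances (for example Amazon SimpleDB or another database), so any instance can serve any request:

```python
class ExternalSessionStore:
    """Stand-in for an external persistent store (e.g. a key-value service).
    A real deployment would use a network-backed service shared by all
    component instances, not an in-process dictionary."""
    def __init__(self):
        self._data = {}

    def load(self, session_id):
        return self._data.get(session_id, {})

    def save(self, session_id, state):
        self._data[session_id] = state

class ShoppingCartComponent:
    """Stateless component: every instance can handle any request because
    all session state lives in the external store, not in the component."""
    def __init__(self, store):
        self.store = store

    def add_item(self, session_id, item):
        state = self.store.load(session_id)
        state.setdefault("items", []).append(item)
        self.store.save(session_id, state)
        return state

store = ExternalSessionStore()
instance_a = ShoppingCartComponent(store)
instance_b = ShoppingCartComponent(store)
instance_a.add_item("session-42", "flight PNQ-SFO")
print(instance_b.add_item("session-42", "hotel, 3 nights"))  # sees the same state
```

Because instances hold no state of their own, they can be added, removed or replaced after a failure without losing any session data.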

Map-Reduce
The Map-Reduce pattern is used to meet performance requirements for complex queries on large data sets, since most conventional storage solutions do not support such queries natively. Map-Reduce is often used to query large amounts of weakly structured or unstructured data for analysis purposes. For example, it can be used to analyze web server logs to determine user access statistics, or to analyze order information to find popular products. A single-process sketch of the idea follows.
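Here is a minimal, single-process Python sketch of the map-reduce idea applied to log analysis. The log lines are made up for illustration, and a real implementation would distribute the map and reduce phases across many nodes (for example with Hadoop):

```python
from collections import defaultdict

# Illustrative web server log lines: "<client> <path> <status>"
LOG_LINES = [
    "10.0.0.1 /booking 200",
    "10.0.0.2 /booking 200",
    "10.0.0.3 /search 500",
    "10.0.0.1 /search 200",
]

def map_phase(line):
    """Map: emit (key, 1) pairs, here keyed by the requested path."""
    _, path, _ = line.split()
    yield path, 1

def reduce_phase(key, values):
    """Reduce: aggregate all values emitted for one key."""
    return key, sum(values)

# Shuffle: group intermediate pairs by key before reducing.
grouped = defaultdict(list)
for line in LOG_LINES:
    for key, value in map_phase(line):
        grouped[key].append(value)

access_counts = dict(reduce_phase(k, v) for k, v in grouped.items())
print(access_counts)  # {'/booking': 2, '/search': 2}
```

The same map and reduce functions can be handed to a distributed framework unchanged, which is what makes the pattern attractive for large, weakly structured data sets.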

Design patterns help not only in the development process but across the application development life cycle. In their abstract form, patterns apply to the challenges cloud application developers face today, independent of the actual technologies and cloud services being used. Applying them to the cloud lets your application extract maximum benefit from cloud platforms.

More Stories By Mahesh Kumar

Mahesh Kumar is currently working as a Senior Tech Lead at Harbinger. He is a member of the Technology Forum and the Proposal Engineering Group at Harbinger Systems, and an active contributor to the technology arm of Harbinger's Marketing division. He has over seven years of experience in the design and development of enterprise applications in the BI, healthcare and eLearning domains. His core technology expertise is in Java, J2EE, Java frameworks and libraries, Android, Big Data and cloud. He is frequently invited as a guest speaker at management colleges and universities.

Mahesh Kumar holds a Bachelor of Engineering in Information Technology from the University of Pune, India.
