
@CloudExpo: Article

In-Memory Data Grids and Cloud Computing

The promise of the cloud is a reduction in total cost of ownership

The use of in-memory data grids (IMDGs) for scaling application performance has rapidly increased in recent years as firms have seen their application workloads explode. This trend runs across nearly every vertical market, touching online applications for financial services, ecommerce, travel, manufacturing, social media, mobile, and more. At the same time, many firms are also looking to leverage cloud computing to meet the challenge of ever-increasing workloads. One of the fundamental promises of the cloud is elastic, transparent, on-demand scalability -- a key capability that has become practical with the use of in-memory data grid technology. As such, IMDGs are becoming a vital factor in the cloud, just as they have been for on-premise applications.

What makes IMDGs such a good fit with cloud computing? The promise of the cloud is a reduction in total cost of ownership. Part of that reduction comes from the ability to quickly provision and use new server capacity (without having to own the hardware). The essential synergy between IMDGs and the cloud derives from their common elasticity. IMDGs can scale out their memory-based storage and performance linearly as servers are added to the grid, and they can gracefully scale back when fewer servers are needed. IMDGs take full advantage of the cloud's ability to easily spin up or remove servers, enabling cloud-hosted applications to be quickly and easily deployed on an elastic pool of cloud servers that delivers scalable performance and maintains fast data access even as workloads grow. This is an ideal solution for fast-growing companies and for applications whose workloads create widely varying demands (like online flowers for Mother's Day, concert tickets, etc.). These companies no longer need to provision space, power, and cooling for new hardware to meet fluctuating workloads. Instead, with a few button clicks, they can start up an IMDG-enabled cloud architecture that transparently meets their performance demands at a cost based solely on usage.
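The elastic scale-out described above can be illustrated with a minimal sketch (a toy model, not any particular IMDG's implementation). A consistent-hash ring assigns keys to servers, so adding a server relocates only roughly 1/N of the data while everything else stays in place:

```python
import hashlib
from bisect import bisect_right

def _hash(s: str) -> int:
    """Stable 64-bit hash for keys and ring points."""
    return int.from_bytes(hashlib.md5(s.encode()).digest()[:8], "big")

class HashRing:
    """Minimal consistent-hash ring: adding a node moves only ~1/N of keys."""
    def __init__(self, nodes, vnodes=100):
        self.vnodes = vnodes
        self.ring = {}            # ring point -> owning node
        for n in nodes:
            self.add(n)

    def add(self, node):
        for i in range(self.vnodes):
            self.ring[_hash(f"{node}#{i}")] = node
        self._points = sorted(self.ring)

    def owner(self, key):
        h = _hash(key)
        i = bisect_right(self._points, h) % len(self._points)
        return self.ring[self._points[i]]

keys = [f"cart:{i}" for i in range(10_000)]
ring = HashRing(["node1", "node2", "node3"])
before = {k: ring.owner(k) for k in keys}
ring.add("node4")                 # elastic scale-out: one extra server
moved = sum(1 for k in keys if ring.owner(k) != before[k])
print(f"{moved} of {len(keys)} keys migrate (~1/4 expected)")
```

Note that every key that moves lands on the new server; no data shuffles between the existing servers, which is what makes the scale-out graceful.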

Expanding on the promise of the cloud, some in-memory data grids can span both on-premise and cloud environments to provide seamless "cloud bursting" for handling high workloads. Let's say your e-commerce application stores shopping carts in an IMDG to give customers fast response times. To spur sales, your marketing group plans to run a special online sales event. Because projected traffic is expected to double during this event, additional web servers will be needed to handle the workload. Of course, maintaining fast response times as the workload increases is essential to success. By deploying your web app in the cloud and connecting it to your on-premise server farm with an IMDG, you can seamlessly double your traffic-handling capacity without interrupting current shopping activity on your site. You don't even need to make changes to your application. The combined deployments transparently work together to serve web traffic, and data freely flows between them within the IMDGs at both sites.
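The cloud-bursting pattern can be sketched in a few lines (a hypothetical toy model, not ScaleOut's actual replication protocol): two sites mirror each other's updates, so a cart written by an on-premise web server is readable from a cloud web server added during the sales event:

```python
class Site:
    """One IMDG deployment (a set of servers at one location)."""
    def __init__(self, name):
        self.name = name
        self.store = {}           # in-memory key/value storage at this site

class BurstingGrid:
    """Toy grid spanning on-premise and cloud sites. Writes land at the
    site that handled the request; replication (modeled synchronously
    here) mirrors each update to the peer site, so either site can
    serve any shopping cart."""
    def __init__(self, on_prem, cloud):
        self.sites = [on_prem, cloud]

    def put(self, handling_site, key, value):
        handling_site.store[key] = value
        for peer in self.sites:   # replicate to the peer site
            if peer is not handling_site:
                peer.store[key] = value

    def get(self, handling_site, key):
        return handling_site.store.get(key)

on_prem = Site("on-prem")
cloud = Site("cloud")
grid = BurstingGrid(on_prem, cloud)

# A cart created by an on-prem web server...
grid.put(on_prem, "cart:alice", ["roses", "vase"])
# ...is visible to a cloud web server during the burst.
print(grid.get(cloud, "cart:alice"))
```

A production grid would replicate asynchronously and handle conflicts and failures, but the application-level picture is the same: neither deployment's code changes, and data flows freely between the two sites.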

These synergies form a solid basis for making 2014 a watershed year for IMDGs in the cloud. But, there's another big trend that will further drive adoption. As the discussion around "Big Data" analysis heats up, the emerging combination of Big Data and cloud computing - cloud-based analytics - promises to fundamentally change the technology of data mining, machine learning and many other analytics use cases. In 2014, we expect to see the trend toward in-memory, predictive analytics sharply increase, and cloud computing will be a fundamental enabler of that trend.

IMDGs integrate memory-based data storage and computing to make real-time data analysis easily accessible to users and help extend a company's competitive edge. IMDGs automatically take full advantage of the cloud's elasticity to run analytics in parallel across cloud servers with lightning-fast performance. Now it's possible to host a real-time analytics engine in the cloud and provide on-demand analytics to a wide range of users, from SaaS services for mobile devices to business simulations for corporate users. Or, maybe you want to spin up servers with, say, a terabyte of memory, load the grid, run analytics across that data, and then release the resources. In an extreme example, chemistry researchers recently used Amazon Web Services to achieve a "petaflop" of computing power running an analysis of 205,000 molecules in just one week. The elasticity of the cloud again makes the difference by providing the equivalent of a parallel processing supercomputer at your fingertips without the huge capital investment (the entire run cost about $33,000).
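The parallel, data-local analytics described here follow a map-reduce shape: each grid server computes a partial result over the data resident in its own memory, and the partials are combined into a global answer. A toy Python sketch, with a thread pool standing in for the grid's servers and hypothetical order records as the data set:

```python
from concurrent.futures import ThreadPoolExecutor

# Data already partitioned across servers: each partition holds the
# order records resident in one server's memory.
partitions = [
    [{"region": "US", "total": 120.0}, {"region": "EU", "total": 80.0}],
    [{"region": "US", "total": 60.0}],
    [{"region": "EU", "total": 40.0}, {"region": "US", "total": 20.0}],
]

def map_partition(orders):
    """Runs locally on each grid server: partial sums per region."""
    out = {}
    for o in orders:
        out[o["region"]] = out.get(o["region"], 0.0) + o["total"]
    return out

def reduce_results(partials):
    """Combine the per-server partials into a global answer."""
    merged = {}
    for p in partials:
        for region, total in p.items():
            merged[region] = merged.get(region, 0.0) + total
    return merged

with ThreadPoolExecutor() as pool:   # stands in for the grid's servers
    partials = list(pool.map(map_partition, partitions))
totals = reduce_results(partials)
print(totals)   # {'US': 200.0, 'EU': 120.0}
```

Because the map step touches only memory-resident data and never moves the data to the computation, adding servers shortens each partition's scan and the analysis scales out with the grid.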

To sum up, in 2014 we expect firms to adopt cloud computing and cloud-hosted IMDGs at a rapid rate, and the trends of in-memory computing and data analytics will converge to enable fast adoption of in-memory data grid technology in public, private, and hybrid cloud environments. Enterprises that take advantage of this convergence are expected to enjoy a quantum leap in the value of their data without the need to break their IT budgets.

More Stories By William Bain

Dr. William L. Bain is founder and CEO of ScaleOut Software, Inc. Bill has a Ph.D. in electrical engineering/parallel computing from Rice University, and he has worked at Bell Labs research, Intel, and Microsoft. Bill founded and ran three start-up companies prior to joining Microsoft. In the most recent company (Valence Research), he developed a distributed Web load-balancing software solution that was acquired by Microsoft and is now called Network Load Balancing within the Windows Server operating system. Dr. Bain holds several patents in computer architecture and distributed computing. As a member of the Seattle-based Alliance of Angels, Dr. Bain is actively involved in entrepreneurship and the angel community.

