MemSQL Announces New Tiered Database Platform for Combined Real-Time and Historical Data Analysis

New Integrated Architecture Delivers World's Fastest In-Memory Database Processing for Both Operations and Analytics, Plus Rapid-Transfer Flash-Optimized Column Store for Deep Analysis

SAN FRANCISCO, CA -- (Marketwired) -- 02/06/14 -- MemSQL, the leader in distributed in-memory database technology, today announced MemSQL v3.0, which combines the world's fastest in-memory row store with a new highly compressed column store. This integration provides a tiered storage architecture that leverages memory for real-time transactions and analytics, and a flash-optimized column store for deep analysis. MemSQL v3.0 provides access to both storage formats through an ANSI SQL interface with built-in utilities for transferring data between formats to eliminate the costly and time-consuming extract, transform and load (ETL) process. This capability leads to shorter development cycles and reduced latency. The result is a single database for both real-time and historical data in a fast, scalable and flexible platform.
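
The release itself contains no code, but because MemSQL exposes both storage formats through a MySQL-wire-compatible ANSI SQL interface, the tiered layout it describes can be sketched with any standard MySQL client. The Python sketch below uses the pymysql driver; the host name, schema, table names and the CLUSTERED COLUMNSTORE key clause (as documented for later MemSQL releases) are illustrative assumptions, not details taken from the announcement.

```python
# Illustrative sketch only: connection details, table names and the
# columnstore DDL are assumptions. MemSQL speaks the MySQL wire protocol,
# so a standard driver such as pymysql can issue the SQL.
import pymysql

conn = pymysql.connect(host="memsql-aggregator.example.com", port=3306,
                       user="app", password="secret", database="analytics")

with conn.cursor() as cur:
    # Hot tier: in-memory row store for fresh, rapidly changing data.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS events_hot (
            event_id BIGINT PRIMARY KEY,
            user_id  BIGINT,
            ts       DATETIME,
            payload  VARCHAR(1024)
        )
    """)
    # Cold tier: flash-optimized, highly compressed column store.
    # The CLUSTERED COLUMNSTORE key syntax follows later MemSQL
    # documentation and is shown here as an assumption.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS events_cold (
            event_id BIGINT,
            user_id  BIGINT,
            ts       DATETIME,
            payload  VARCHAR(1024),
            KEY (ts) USING CLUSTERED COLUMNSTORE
        )
    """)
conn.commit()
```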

As a customer of MemSQL, CPXi has been in production with the new columnar architecture for almost six months. CPXi is a global digital media holding company providing multi-screen messaging that leverages display, social, mobile and video advertising at scale and serves billions of managed impressions daily.

"The new capabilities in MemSQL v3.0 allow CPXi to leverage historical data stores and use them to shape a real-time response," said Gil Resh, senior vice president of product and technology at CPXi. "We measure online engagement over time and capture vast amounts of data -- more than 250 billion records with hundreds of different dimensions -- to make millisecond decisions about which ads to show, even while the webpage is loading."

With most database offerings, organizations must use separate databases for transactions, analytical processing and data warehousing. This siloed storage approach creates separate collections of data that cannot be analyzed together without costly ETL, a slow and error-prone process. Moreover, separate data stores require engineers to develop on multiple platforms and create custom connectors for databases that were not designed for interoperability.

"Enterprises are increasingly seeking to make the smartest possible decision in the critical moment, in order to increase efficiency, reduce operational risk, and maximize income potential. Such enterprises need to leverage Big Data with real-time analysis of their data so they can best serve both their customers and their bottom lines," said IDC analyst Carl Olofson. "The purpose of MemSQL v3.0 technology is to enable enterprises to process coordinated transactional and up-to-the-minute analytical workloads. This ensures data timeliness and integrity while obviating the need for complex ETL or CDC data transfers, thus saving resources and eliminating related operational delays, frustration and human error."

MemSQL v3.0 has a simplified data infrastructure that makes it possible to maintain one active dataset, allowing users and applications to conduct real-time analysis that incorporates both live and historical data. Companies can use MemSQL's in-memory row store to ingest streaming data and run time-sensitive analytics. As data volumes grow, data can be moved to the new flash-optimized column store for highly compressed long-term storage and for deep analysis. This tiered storage architecture provides greater flexibility, and the result is a simpler, more cost-effective and less latency-prone data infrastructure.
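
To make the tiering concrete, a periodic job could copy aging rows out of the in-memory row store into the column store and then trim the hot tier, all in plain SQL. The release mentions built-in transfer utilities but does not name them, so the sketch below falls back on a standard INSERT ... SELECT; it reuses the hypothetical events_hot and events_cold tables from the earlier sketch, and the seven-day retention window is likewise an assumption.

```python
# Hypothetical age-out job: table names, the retention window and the use
# of INSERT ... SELECT (standing in for MemSQL's built-in transfer
# utilities, which the release does not name) are assumptions.
HOT_RETENTION_DAYS = 7

def age_out_hot_rows(conn):
    """Copy rows older than the retention window into the column store,
    then delete them from the in-memory row store."""
    with conn.cursor() as cur:
        cur.execute("""
            INSERT INTO events_cold (event_id, user_id, ts, payload)
            SELECT event_id, user_id, ts, payload
            FROM events_hot
            WHERE ts < NOW() - INTERVAL %s DAY
        """, (HOT_RETENTION_DAYS,))
        cur.execute("""
            DELETE FROM events_hot
            WHERE ts < NOW() - INTERVAL %s DAY
        """, (HOT_RETENTION_DAYS,))
    conn.commit()  # commit the copy and the trim together
```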

"Our customers' top priorities are speed and scalability,'" said Eric Frenkiel, co-founder and CEO of MemSQL. "Our new flash-optimized column store means customers can combine real-time and historical data for added flexibility, extracting the full value of their data."

Key Features

MemSQL v3.0 features a tiered storage architecture that allows companies to combine in-memory row and flash-optimized columnar engines to capture, store and query hundreds of terabytes of data in real time. Highlights include the following:

  • Workloads can access data in both row and column stores (see the query sketch after this list).
  • MemSQL v3.0 stores data in a highly compressed format that can be deployed in memory, on flash or on disk.
  • Built-in utilities transfer data seamlessly between storage tiers.
  • MemSQL v3.0 is horizontally scalable on commodity hardware.
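
A single query that spans both tiers, as the first bullet above describes, might look like the following. It again assumes the hypothetical events_hot and events_cold tables and an open pymysql connection from the earlier sketches, and it uses plain ANSI SQL (UNION ALL) rather than anything MemSQL-specific.

```python
# Combined real-time + historical query across both storage tiers.
# Table names and schema are the same illustrative assumptions as above.
COMBINED_DAILY_ACTIVES = """
    SELECT d, COUNT(DISTINCT user_id) AS active_users
    FROM (
        SELECT DATE(ts) AS d, user_id FROM events_hot   -- in-memory row store
        UNION ALL
        SELECT DATE(ts) AS d, user_id FROM events_cold  -- flash column store
    ) AS all_events
    GROUP BY d
    ORDER BY d
"""

def combined_daily_actives(conn):
    with conn.cursor() as cur:
        cur.execute(COMBINED_DAILY_ACTIVES)
        return cur.fetchall()
```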

Availability
MemSQL v3.0 will be available for download in Q2 2014. For more information regarding an enterprise license including full support and services, please contact [email protected] or (855) 4-MEMSQL (463-6775).

About MemSQL

MemSQL is the database for fast data processing. With a single platform, companies can converge live data with their data warehouse to accelerate applications and power real-time operational analytics. MemSQL's Big Data platform brings speed, scale, and simplicity to enterprise customers worldwide. Based in San Francisco, MemSQL is a Y Combinator company funded by prominent venture capitalists and angel investors, including Accel Partners, Khosla Ventures, Data Collective, First Round Capital and IA Ventures. For more information, please visit www.memsql.com.
