


MemSQL Eliminates ETL and Delivers Real-Time Advantage to Ad Tech Leader CPXi

New Tiered Storage Architecture Leverages 250 Billion Rows of Data for Real-Time Bidding and Offers Hundreds of Thousands of Dollars in Annual Savings

SAN FRANCISCO, CA -- (Marketwired) -- 03/20/14 -- MemSQL, the leader in distributed in-memory database technology, today announced that global digital-media holding company CPXi recently deployed MemSQL v3.0 to put all of the company's data -- 250 billion rows of real-time and historical data -- to work in its real-time bidding operations. CPXi provides multi-screen messaging that leverages display, social, mobile and video advertising at scale and serves billions of managed impressions daily. Extract, transform and load (ETL) is an expensive, time-intensive process that required CPXi to maintain extra machines and storage. By switching from Hadoop to MemSQL, CPXi eliminated its ETL processes and cut 50 percent of its Amazon Elastic Compute Cloud (EC2) instances and 50TB of storage, leading to hundreds of thousands of dollars in annual savings.

"As we tested multiple vendors, we came to see that we don't just have Big Data, we have Huge Data, which brings new problems and complications," said Mike Zacharski, chief operating officer at CPXi. "With its real-time queries, cost-effective data accessibility and reliability, MemSQL provided the right solution to meet our unique business requirements -- including our large data sets -- so that we can operate more effectively and efficiently and push innovation forward within the highly competitive ad tech space."

Digital media clients are increasingly demanding real-time bidding (also known as programmatic ad buying), a dynamic auction process that ingests and analyzes billions of data points in real time to produce fast and accurate bids and more targeted ads. But many media companies are struggling to implement real-time bidding because of the sheer volume of data and velocity of processing that it requires.

Before MemSQL, CPXi was one of those companies: loading data from its tiered architecture into its analysis tools was an expensive, cumbersome process that could take 12 to 24 hours, which meant that data aged and became less relevant before it could be analyzed. MemSQL eliminated that problem: CPXi now performs a front-line ingest into an in-memory row store, then transfers the data in real time to a column store for analysis. This consolidated, tiered storage architecture with a unified SQL interface eliminates CPXi's ETL process and reduces the complexity of its database infrastructure. The architecture also helps CPXi scale up staffing by exposing a familiar ANSI SQL interface, so new employees can come up to speed swiftly.
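
To make the tiering concrete, the following is a minimal SQL sketch of the pattern described above: in-memory rowstore ingest, a periodic hand-off to a disk-backed columnstore, and analytics through a single SQL interface. The table and column names are hypothetical, not CPXi's actual schema, and the columnstore key clause follows MemSQL's documented syntax of that era; this is an illustrative sketch, not the company's implementation.

-- Hypothetical schema for illustration only.
-- Front-line ingest lands in an in-memory rowstore table (MemSQL's default table type).
CREATE TABLE impressions_live (
    impression_id BIGINT NOT NULL,
    campaign_id   INT NOT NULL,
    bid_price     DECIMAL(10, 4),
    event_time    DATETIME NOT NULL,
    PRIMARY KEY (impression_id)
);

-- Historical data is kept in a disk-backed columnstore table suited to analytical scans.
CREATE TABLE impressions_history (
    impression_id BIGINT NOT NULL,
    campaign_id   INT NOT NULL,
    bid_price     DECIMAL(10, 4),
    event_time    DATETIME NOT NULL,
    KEY (event_time) USING CLUSTERED COLUMNSTORE
);

-- Moving aged rows from the hot tier to the analytical tier is a plain
-- INSERT ... SELECT inside the same database, so no external ETL job is needed.
INSERT INTO impressions_history
SELECT * FROM impressions_live
WHERE event_time < NOW() - INTERVAL 1 HOUR;

DELETE FROM impressions_live
WHERE event_time < NOW() - INTERVAL 1 HOUR;

-- Both tiers answer the same ANSI SQL, e.g. a same-day campaign rollup:
SELECT campaign_id, COUNT(*) AS impressions, AVG(bid_price) AS avg_bid
FROM impressions_history
WHERE event_time >= CURDATE()
GROUP BY campaign_id;

In this sketch the transfer would be driven by a scheduled job; the point is simply that tiering and analysis stay inside one SQL system rather than crossing an ETL boundary between separate platforms.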

"Performance and reliability are crucial in the ad tech space, and CPXi needed the ability to scale out and analyze data quickly in order to provide the most accurate results for their customers," said Eric Frenkiel, co-founder and CEO of MemSQL. "We find our customers have Big Data challenges that require solutions that provide immediate business value. CPXi is a testament to the power of our database, and we're proud to be its valued partner in helping it to better serve customers."

About MemSQL
MemSQL is the database for fast data processing. With a single platform, companies can converge live data with their data warehouse to accelerate applications and power real-time operational analytics. MemSQL's Big Data platform brings speed, scale, and simplicity to enterprise customers worldwide. Based in San Francisco, MemSQL is a Y Combinator company funded by prominent venture capitalists and angel investors, including Accel Partners, Khosla Ventures, Data Collective, First Round Capital and IA Ventures. For more information, please visit the MemSQL website.


