MemSQL Eliminates ETL and Delivers Real-Time Advantage to Ad Tech Leader CPXi

New Tiered Storage Architecture Leverages 250 Billion Rows of Data for Real-Time Bidding and Offers Hundreds of Thousands of Dollars in Annual Savings

SAN FRANCISCO, CA -- (Marketwired) -- 03/20/14 -- MemSQL, the leader in distributed in-memory database technology, today announced that global digital-media holding company CPXi recently deployed MemSQL v3.0 to put all of the company's data -- 250 billion rows of real-time and historical data -- to work in its real-time bidding operations. CPXi provides multi-screen messaging that leverages display, social, mobile and video advertising at scale and serves billions of managed impressions daily. Its extract, transform and load (ETL) process was expensive and time-intensive, requiring extra machines and storage to run. By switching from Hadoop to MemSQL, CPXi eliminated those ETL processes, cut 50 percent of its Amazon Elastic Compute Cloud (EC2) instances and 50TB of storage, and realized hundreds of thousands of dollars in annual savings.

"As we tested multiple vendors, we came to see that we don't just have Big Data, we have Huge Data, which brings new problems and complications," said Mike Zacharski, chief operating officer at CPXi. "With its real-time queries, cost-effective data accessibility and reliability, MemSQL provided the right solution to meet our unique business requirements -- including our large data sets -- so that we can operate more effectively and efficiently and push innovation forward within the highly competitive ad tech space."

Digital media clients are increasingly demanding real-time bidding (also known as programmatic ad buying), a dynamic auction process that ingests and analyzes billions of data points in real time to produce fast and accurate bids and more targeted ads. But many media companies are struggling to implement real-time bidding because of the sheer volume of data and velocity of processing that it requires.

Before MemSQL, CPXi was one of those companies: loading data from its tiered architecture into its analysis tools was an expensive, cumbersome process that could take 12 to 24 hours, meaning data aged and lost relevance before it could be analyzed. MemSQL eliminated that problem: CPXi now ingests data directly into an in-memory rowstore and transfers it in real time to a columnstore for analysis. This consolidated, tiered storage architecture with a unified SQL interface eliminates CPXi's ETL process and simplifies its database infrastructure. It also helps CPXi scale its staff, since the familiar ANSI SQL interface lets new employees come up to speed swiftly.
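To make the tiered layout concrete, here is a minimal sketch of how a rowstore-to-columnstore pipeline can be driven entirely through SQL. The database, table, and column names, the schema, and the one-hour cutoff are invented for illustration and are not CPXi's actual setup; the connection assumes MemSQL's MySQL-compatible wire protocol, and the columnstore DDL reflects MemSQL syntax of that era, which may vary by version.

# Illustrative sketch only: hypothetical database, table, and column names.
# Assumes a MySQL-protocol connection to a MemSQL cluster; columnstore DDL
# syntax may differ between MemSQL versions.
import pymysql

conn = pymysql.connect(host="memsql-aggregator", user="app",
                       password="***", database="adtech")
cur = conn.cursor()

# Front-line ingest lands in an in-memory rowstore table (fast writes).
cur.execute("""
    CREATE TABLE IF NOT EXISTS impressions_ingest (
        event_time  DATETIME NOT NULL,
        campaign_id BIGINT NOT NULL,
        bid_price   DECIMAL(10, 4),
        KEY (event_time)
    )
""")

# Historical rows live in a disk-backed columnstore table for analytical scans.
cur.execute("""
    CREATE TABLE IF NOT EXISTS impressions_history (
        event_time  DATETIME NOT NULL,
        campaign_id BIGINT NOT NULL,
        bid_price   DECIMAL(10, 4),
        KEY (event_time) USING CLUSTERED COLUMNSTORE
    )
""")

# The rowstore-to-columnstore transfer is ordinary SQL rather than an external
# ETL job: copy rows older than an (invented) one-hour cutoff, then trim the
# ingest table. Both tables stay queryable through the same SQL interface.
cur.execute("""
    INSERT INTO impressions_history
    SELECT event_time, campaign_id, bid_price
    FROM impressions_ingest
    WHERE event_time < NOW() - INTERVAL 1 HOUR
""")
cur.execute("""
    DELETE FROM impressions_ingest
    WHERE event_time < NOW() - INTERVAL 1 HOUR
""")
conn.commit()

Because the ingest table and the history table answer the same ANSI SQL dialect, the transfer step replaces a separate ETL job, and analysts or new hires can query live and historical data through one familiar interface.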

"Performance and reliability are crucial in the ad tech space, and CPXi needed the ability to scale out and analyze data quickly in order to provide the most accurate results for their customers," said Eric Frenkiel, co-founder and CEO of MemSQL. "We find our customers have Big Data challenges that require solutions that provide immediate business value. CPXi is a testament to the power of our database, and we're proud to be its valued partner in helping it to better serve customers."

About MemSQL
MemSQL is the database for fast data processing. With a single platform, companies can converge live data with their data warehouse to accelerate applications and power real-time operational analytics. MemSQL's Big Data platform brings speed, scale, and simplicity to enterprise customers worldwide. Based in San Francisco, MemSQL is a Y Combinator company funded by prominent venture capitalists and angel investors, including Accel Partners, Khosla Ventures, Data Collective, First Round Capital and IA Ventures. For more information, please visit www.memsql.com.


