
Continuent Releases Tungsten Replicator 3.0: Enables Real-Time Data Loading from MySQL, MariaDB and Oracle to Hadoop

Percona Live MySQL Conference 2014, Booth #300--Continuent, Inc., a leading provider of open source database clustering and replication solutions, today announced the immediate availability of Continuent Tungsten Replicator 3.0, a progressive, open source replication solution for Hadoop. Recognizing that fast decision-making depends on real-time data movement, Tungsten Replicator 3.0 replicates transactions from operational database systems such as MySQL, MariaDB and Oracle® to Hadoop in real time. This produces carbon-copy tables and enables analytic views to be executed in Hadoop environments including Hive.

By automatically reading the DBMS and forwarding transactions as soon as they commit, Tungsten Replicator reduces the load on operational systems, minimizes the amount of data that needs to move between locations, and allows transactions to quickly enter the data warehouse. When used together with Apache Sqoop, businesses can provision initial data and then leverage Tungsten Replicator to replicate transactions incrementally in real time. This provides continuous data loading, so existing data and subsequent changes can easily be applied into Hadoop. To learn more, visit Continuent at the Percona Live MySQL Conference and Expo in booth #300 on April 1st, at Collaborate 14 in booth #1632 on April 7th, or at the Red Hat Summit in booth #917A on April 14th.
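The provision-once, then replicate-incrementally pattern described above can be sketched in a few lines. This is an illustrative model only, not Tungsten Replicator's actual interface: the function name, the tuple-based change log, and the row layout are all hypothetical, standing in for a Sqoop bulk load followed by a stream of committed row changes that materialize a carbon-copy table.

```python
# Illustrative sketch of incremental change-data apply (not Tungsten's API).
# A base snapshot (e.g. bulk-provisioned via Sqoop) is merged with an
# ordered stream of committed row changes to rebuild the carbon-copy table
# that an analytic view in Hive would query.

def apply_changes(snapshot, changes):
    """Merge a keyed snapshot with an ordered change stream.

    snapshot: dict mapping primary key -> row dict
    changes:  iterable of (op, key, row) tuples, op in {"INSERT", "UPDATE", "DELETE"}
    Returns the resulting carbon-copy table as a dict.
    """
    table = dict(snapshot)             # start from the bulk-provisioned data
    for op, key, row in changes:
        if op == "DELETE":
            table.pop(key, None)       # tolerate deletes of already-absent keys
        else:                          # INSERT and UPDATE both upsert the row
            table[key] = row
    return table

# Example: one committed transaction updates a price, adds a row, deletes a row.
base = {1: {"sku": "A", "price": 10}, 2: {"sku": "B", "price": 20}}
log = [("UPDATE", 1, {"sku": "A", "price": 12}),
       ("INSERT", 3, {"sku": "C", "price": 30}),
       ("DELETE", 2, None)]
print(apply_changes(base, log))
# -> {1: {'sku': 'A', 'price': 12}, 3: {'sku': 'C', 'price': 30}}
```

Because only the change log moves after the initial load, the operational database is read once per commit rather than re-scanned, which is the source of the reduced load the release describes.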

“Today, organizations are struggling to achieve real-time integration between MySQL/MariaDB and Hadoop and Oracle and Hadoop,” said Robert Hodges, CEO, Continuent. “Rapid loading of data into Hadoop enables effective analytics and is becoming increasingly crucial, as Hadoop itself raises the bar for real-time query through technology like the Cloudera-sponsored Impala project, Spark, and HBase. With a flexible topology, our low cost, open source based solution provides organizations with a high-performance, low-impact way to transfer data from multiple upstream systems into Hadoop and to handle billions of transactions daily in order to derive demonstrable business value.”

“Having tried traditional ETL tools and slow data-scraping techniques that put a heavy load on operational systems, we found that we were unable to meet demand,” said Chris Schneider, Database Architect at Groupon. “Hadoop has clearly changed the landscape of data management by providing a central data hub that receives data from across the business; however, there is now a wide range of services that need to move OLTP data efficiently into Hadoop. With Tungsten Replicator we can quickly and flexibly move data out of operational databases into Hadoop where we run analytics that answer important business questions on timelines matching the needs of our users.”

About Continuent

Continuent is a leading provider of database clustering and replication, enabling enterprises to run business-critical applications on cost-effective open source software. Continuent Tungsten provides enterprise-class high availability, globally redundant data distribution and real-time heterogeneous data integration in cloud and on-premises environments. Continuent customers represent the most innovative and successful organizations in the world, handling billions of transactions daily across a wide range of industries. For more information visit www.continuent.com, follow us on Twitter @Continuent or call (866) 998-3642.

MySQL is a trademark of Oracle Corporation and/or its affiliates. Oracle is a registered trademark of Oracle Corporation and/or its affiliates. Groupon is a trademark of Groupon Inc. Other names may be trademarks of their respective owners.


Copyright © 2009 Business Wire. All rights reserved. Republication or redistribution of Business Wire content is expressly prohibited without the prior written consent of Business Wire. Business Wire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
