
MapR Webcast to Feature Live Demo of HP Vertica Analytics Platform on MapR

MapR Technologies, Inc., provider of the top-ranked distribution for Apache™ Hadoop®, today announced a live webcast, co-hosted with HP Vertica, that will demonstrate the HP Vertica Analytics Platform on MapR on Tuesday, June 3. Subject matter experts from both companies will show how customers can leverage SQL-on-Hadoop. The session will include use cases, a live demo, and a discussion of the challenges and solutions involved in monitoring and managing SQL-on-Hadoop deployments.

The HP Vertica Analytics Platform on MapR, which is generally available, is a high-performance, interactive SQL-on-Hadoop solution that tightly integrates the HP Vertica data analytics platform with the enterprise-grade MapR Distribution for Apache Hadoop. All data can be explored in the Hadoop environment, and a unified management console enables monitoring and management of all services running on the same cluster. Customers gain faster insight from big data by applying their existing SQL skills and business intelligence (BI) tools in a highly optimized and efficient environment.

“Unlocking hidden insights in big data by combining the enterprise-grade MapR Distribution for Hadoop with HP Vertica gives business users the fastest time to value by leveraging their existing SQL skills and built-in analytic functions,” said Jon Posnik, vice president of business development at MapR Technologies. “This partnership is a shining example of the new analytics stack, where the power of massively scalable and blazing-fast analytic platforms is tightly integrated and made available with the full breadth and scale of Hadoop.”

HP Vertica Analytics Platform on MapR provides 100% ANSI SQL compliance, advanced interactive analytic capabilities, and deep BI and ETL tool support, improving analyst productivity through expanded exploration of semi-structured as well as traditional structured data.
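To make the SQL-skill-reuse claim concrete, here is a minimal, purely illustrative sketch (not from the release): it uses Python's built-in sqlite3 as a stand-in for the Vertica platform so it is self-contained, and shows that a plain ANSI SQL aggregate is all an analyst needs to write. The table and data are hypothetical.

```python
import sqlite3

# sqlite3 stands in for HP Vertica here purely so the sketch runs anywhere;
# the point being illustrated is that the same standard SQL an analyst
# already knows applies to data that happens to live in Hadoop.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clicks (user_name TEXT, page TEXT, n INTEGER)")
conn.executemany(
    "INSERT INTO clicks VALUES (?, ?, ?)",
    [("alice", "home", 3), ("alice", "cart", 1), ("bob", "home", 5)],
)

# A plain ANSI SQL aggregate -- no Hadoop-specific dialect required.
rows = conn.execute(
    "SELECT user_name, SUM(n) FROM clicks GROUP BY user_name ORDER BY user_name"
).fetchall()
print(rows)  # [('alice', 4), ('bob', 5)]
```

In a real deployment the same query text would be submitted to the Vertica engine through a standard BI or ODBC/JDBC connection rather than sqlite3.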

“HP recognizes the new go-to-market potential with the MapR enterprise-grade, production Hadoop platform,” said Chris Selland, vice president of marketing and business development at HP Vertica. “By integrating the HP Vertica Analytics Platform on MapR, our companies can provide powerful analytics and SQL tightly integrated with the full power and breadth of data in Hadoop, giving joint customers new and deeper insights into their business more quickly.”

The HP Vertica Analytics Platform on MapR optimizes SQL-on-Hadoop, featuring:

Lower Total Cost of Ownership

  • Use same hardware for running both MapR and HP Vertica with no pre-allocation of nodes
  • Get better data protection with significantly less hardware than other distributions of Hadoop
  • Monitor and manage Hadoop and Vertica services via the MapR Control System

Fastest, Most Open SQL-on-Hadoop

  • Achieve faster performance across a broader range of data types than other SQL-on-Hadoop solutions
  • Use complete and open ANSI SQL, POSIX, and NFS standards
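The POSIX/NFS point can be illustrated with a small hypothetical sketch: because MapR exposes the cluster over NFS, ordinary file I/O works against cluster data with no Hadoop-specific API. A local temporary directory stands in for the NFS mount point (typically something like /mapr/&lt;cluster&gt; in practice) so the sketch is runnable anywhere.

```python
import os
import tempfile

# Stand-in for an NFS-mounted MapR volume; in production this would be the
# actual mount point, and the same POSIX calls would apply unchanged.
mount_point = tempfile.mkdtemp()

# Write a CSV with plain file operations -- no Hadoop client library needed.
path = os.path.join(mount_point, "events.csv")
with open(path, "w") as f:
    f.write("user,clicks\nalice,3\nbob,5\n")

# Read it back the same way; any POSIX tool or library could do this.
with open(path) as f:
    rows = [line.strip().split(",") for line in f][1:]

total_clicks = sum(int(clicks) for _, clicks in rows)
print(total_clicks)  # 8
```

The mount-point path is an assumption for illustration; the release itself only asserts that MapR supports the POSIX and NFS standards.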

Most Complete Analytics

  • Run and visualize exploratory analytics on semi-structured data and operationalize insights in a single step on a unified platform
  • Analyze data in place with the richest set of built-in analytic functions directly on Hadoop
  • Take advantage of a tightly integrated solution with no connectors required

Enterprise-Grade Reliability

  • Take advantage of the only distribution to offer self-healing high availability for Hadoop
  • Use unique, native, consistent point-in-time snapshots and mirrors from MapR for data recovery and reliability

Availability and Webinar

Customers interested in learning more about the HP Vertica Analytics Platform on MapR can contact [email protected] and [email protected].

Join MapR and HP Vertica on Tuesday, June 3, at 9 a.m. PT / 12 p.m. ET to learn the benefits of a SQL-on-Hadoop analytics solution that provides the highest-performing, tightly integrated platform for operational and exploratory analytics. For more information and to register, click here.

About MapR Technologies
MapR delivers on the promise of Hadoop with a proven, enterprise-grade platform that supports a broad set of mission-critical and real-time production uses. MapR brings unprecedented dependability, ease-of-use and world-record speed to Hadoop, NoSQL, database and streaming applications in one unified distribution for Hadoop. MapR is used by more than 500 customers across financial services, retail, media, healthcare, manufacturing, telecommunications and government organizations as well as by leading Fortune 100 and Web 2.0 companies. Amazon, Cisco, Google and HP are part of the broad MapR partner ecosystem. Investors include Lightspeed Venture Partners, Mayfield Fund, NEA, and Redpoint Ventures. MapR is based in San Jose, CA. Connect with MapR on Facebook, LinkedIn, and Twitter.


Copyright © 2009 Business Wire. All rights reserved. Republication or redistribution of Business Wire content is expressly prohibited without the prior written consent of Business Wire. Business Wire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
