
New Revolution Analytics Big Data Big Analytics Platform Super-Charges the Next-Generation Enterprise

STRATA CONFERENCE+HADOOP WORLD, Booth #77 -- Revolution Analytics, the only commercial provider of open source R software, today announced the availability of Revolution R Enterprise 7 (RRE 7), the only Big Data Big Analytics platform powered by R, the standard for modern analytics. The new platform is now integrated with more data and compute environments and features "write once, deploy anywhere" functionality, allowing data analysts and IT teams to more fully utilize a variety of data management platforms such as Hadoop and second-generation enterprise data warehouses (EDW). These new capabilities act as a super-charger to accelerate growth, optimize operations, and expedite data insight and discovery.

“Recent analyst reports predict the total data store will grow to 40 zettabytes by the year 2020,” said David Rich, CEO of Revolution Analytics. “The Big Data wave is swelling and open source R is essential to glean real-time data and discover hidden patterns in data to power game-changing business decisions. Revolution R Enterprise delivers performance, scalability, portability and ease-of-use for R so that Big Data Big Analytics is far simpler to create and deploy while also cost effective, low risk and future proof.”

Click to Tweet: New @RevolutionR Big Data Big Analytics Platform expedites #analytics, super-charges #BigData #Hadoop @Teradata http://bit.ly/RRE-7

Supports More Compute Environments with Lower Engineering Costs

With a multitude of data and compute environments in use today, RRE 7 gives analysts the ability to write code once and deploy it anywhere: across data management platforms, enterprise data warehouses, grids, clusters, servers and workstations, without re-engineering costs. RRE 7 is the industry's first Big Data Big Analytics platform to include a library of Big Data-ready algorithms that run inside the Cloudera and Hortonworks Hadoop platforms and in Teradata databases, with the highest possible performance.

“The Big Data technology marketplace is varied and rapidly evolving, and CIOs need to make smart decisions today that will continue to pay dividends tomorrow,” said Ben Woo, founder and managing director of Neuralytix. “With Revolution R Enterprise’s open platform and ‘write once deploy anywhere’ capabilities, it's as if the investment on predictive analytics comes with a warranty, to always make the best use of Hadoop and database platforms, and ultimately empower more users across the organization to drive new business insights now and in the future.”

The new RRE 7 platform includes a library of Parallel External Memory Algorithms (PEMAs): pre-built, parallelized, external-memory versions of the most common statistical and predictive analytics algorithms. Revolution R Enterprise includes PEMAs for data processing, data sampling, descriptive statistics, statistical tests, data visualization, simulation, machine learning and predictive models. All are accessible from easy-to-use R functions, and all deliver the maximum possible performance by harnessing the parallel processing power of the host data platform, without the need to move data anywhere.
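The product's PEMA implementations are proprietary, but the external-memory idea can be sketched in a few lines of Python (all names here are illustrative, not Revolution R Enterprise's actual API): each chunk of data is reduced to small sufficient statistics, and because those statistics combine associatively, chunks can be processed in any order or in parallel, and the full dataset never has to fit in memory.

```python
# Conceptual sketch of a Parallel External Memory Algorithm (PEMA):
# each chunk yields small sufficient statistics (n, sum, sum of squares)
# that combine associatively, so chunks can be processed in parallel and
# the data never needs to be held in memory at once. Hypothetical names;
# not the RRE 7 API.

def process_chunk(chunk):
    """Reduce one chunk of data to its sufficient statistics."""
    return (len(chunk), sum(chunk), sum(x * x for x in chunk))

def combine(a, b):
    """Merge two partial results; order does not matter."""
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

def finalize(stats):
    """Turn the accumulated statistics into mean and (population) variance."""
    n, s, ss = stats
    mean = s / n
    variance = ss / n - mean * mean
    return mean, variance

def chunked_mean_var(chunks):
    total = (0, 0.0, 0.0)
    for chunk in chunks:            # could equally be a parallel map + reduce
        total = combine(total, process_chunk(chunk))
    return finalize(total)

# Example: the data arrives in three chunks instead of all at once.
chunks = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
mean, var = chunked_mean_var(chunks)
```

The same shape generalizes to the model-fitting algorithms the release lists: as long as an algorithm's per-chunk contribution can be merged associatively, it can run wherever the data platform schedules it.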

Moves the Computation to the Data for High-Performance Analytics

Performing analytics in place, by bringing the computation to the data, is essential for performance, especially with Big Data. Teradata in-database analytics enables organizations to use the power of the Teradata database as a massively parallel and scalable R platform for advanced data processing and statistical modeling with Big Data. By moving the computation to the data, the entire data set can be included in the analysis, which drives faster results, lower latency, expanded capabilities, and reduced costs and risks. Teradata is the first database to support Revolution R Enterprise PEMAs.
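To see why moving the computation to the data reduces latency and cost, consider a simple group-by aggregation over a partitioned table. In this hedged Python sketch (a toy data layout, not Teradata's actual interface), each partition computes a small local summary where the rows live, and only those summaries cross the network to the coordinator:

```python
# Conceptual sketch of "moving the computation to the data": each database
# partition aggregates its own rows locally, and only the tiny per-partition
# summaries -- not the raw rows -- travel to the coordinator.
# Hypothetical data layout; not Teradata's actual interface.

from collections import Counter

# Rows spread across three partitions, as a parallel database would store them.
partitions = [
    [("east", 10), ("west", 5), ("east", 2), ("west", 1)],
    [("east", 7),  ("west", 3), ("west", 4), ("east", 6)],
    [("east", 1),  ("west", 9), ("east", 4), ("west", 2)],
]

def local_aggregate(rows):
    """Runs where the data lives: sum amounts per region for one partition."""
    totals = Counter()
    for region, amount in rows:
        totals[region] += amount
    return totals

# Only the per-partition summaries (two numbers each) cross the network.
partials = [local_aggregate(rows) for rows in partitions]

grand_total = Counter()
for p in partials:
    grand_total.update(p)           # Counter.update adds counts together

rows_shipped = sum(len(p) for p in partials)           # summary rows moved
rows_in_place = sum(len(rows) for rows in partitions)  # raw rows that stayed put
```

Here six summary rows move instead of twelve raw rows; on real tables the ratio is billions of rows to a handful of aggregates, which is where the latency and cost savings come from.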

“Revolution Analytics and Teradata have partnered to enable R analytics to be run in parallel within Teradata,” said Bill Franks, chief analytics officer, Teradata. “This brings unprecedented scale and performance to R users. Our clients will find this to be a compelling offer.”

With RRE 7, R-powered analytics can now be invoked inside both Cloudera's Hadoop distributions (CDH3/CDH4) and Hortonworks Data Platform 1.3. By eliminating the need to move data out of the Hadoop environment into the conventional storage that R-based analysis would otherwise require, RRE 7 allows predictive analytics written in R to run sooner and faster, directly where the data lives. This pushes data analytics beyond simple summaries, queries, ETL and data visualization to produce game-changing insights from data managed within a Hadoop environment.

“By enabling R-powered Big Data analytics, Cloudera customers are able to easily build and deploy predictive analytic models, gleaning insight from massive amounts of data stored and managed within the Cloudera Big Data environment,” said Tim Stevens, vice president, Business and Corporate Development, Cloudera.

“Enterprises now demand from their analytics platform higher capacity infrastructure at lower costs while also working with existing systems,” said Shaun Connolly, vice president of corporate strategy at Hortonworks. “The integration of Revolution R Enterprise 7 with Hortonworks Data Platform further enriches the modern data architecture, by providing advanced, predictive analytics directly within the Hadoop environment.”

Opens R to Business Users and Extends the Impact of Predictive Analytics

Through a recent integration with Alteryx Strategic Analytics software, RRE 7 broadens the reach of R directly to business users. Using an intuitive workflow, users who understand their unique business challenges can make analytics-driven decisions without the need to rely on coding or R experts, helping companies to close the analytic skills gap and benefit from increased analytic insight across more business units.

“Revolution Analytics and Alteryx have collaborated to open up the world of predictive analytics to data analysts through a combination of the Alteryx environment with simple drag-and-drop R-based predictive tools and Revolution Analytics’ scalable R platform,” said George Mathew, president and COO at Alteryx. “Revolution R Enterprise 7 delivers on that combination and provides the scalability that modern analysts using Big Data require - broadening the use of predictive analytics and delivering incredible business value.”

Scales Big Data Big Analytics with Customized Techniques

Revolution R Enterprise 7 delivers unprecedented integration capabilities with the broader ecosystem, empowering a brand new generation of organizations to scale their big data platform, deploy smarter, faster analytics to discover new insights, and drive better business decisions. The new Big Data Big Analytics techniques provide data analysts with more powerful tools to generate and visualize the most reliable predictions and inferences. The following capabilities have been optimized to scale as big as needed:

  • Ensemble Models for Decision Forests—a powerful machine learning technique to produce forecasts, predictions and recommendations.
  • Stepwise Regression—now available for logistic regression and Generalized Linear Models (GLM), stepwise regression automates the selection of the most important or relevant variables for inclusion in a predictive model.
  • Decision Tree Visualization—capabilities that make it easier for analytic consumers to understand relationships and correlations within the data. Revolution R Enterprise delivers an interactive Big Data decision tree visualizer that is unique in the marketplace.
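The stepwise regression capability above can be illustrated with a minimal forward-selection sketch. This is the textbook greedy procedure for ordinary least squares only, with an illustrative stopping threshold in place of a formal criterion such as AIC; the product's stepwise machinery for logistic and GLM models is assumed to be far more elaborate:

```python
# Forward stepwise selection sketch: start from an intercept-only model,
# repeatedly add the variable that most reduces residual sum of squares,
# and stop when no candidate improves the fit enough. Shown for plain
# least squares; a toy stand-in for RRE 7's stepwise regression.
import numpy as np

def rss(X, y):
    """Residual sum of squares of an OLS fit of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

def forward_select(X, y, min_improvement=1e-6):
    """Greedily add the column that most reduces RSS until nothing helps."""
    n, p = X.shape
    selected = []
    current = np.ones((n, 1))                  # intercept-only model
    best_rss = rss(current, y)
    while True:
        candidates = [j for j in range(p) if j not in selected]
        if not candidates:
            break
        scores = []
        for j in candidates:
            trial = np.hstack([current, X[:, j:j + 1]])
            scores.append((rss(trial, y), j))
        new_rss, j = min(scores)
        if best_rss - new_rss < min_improvement:
            break                              # no candidate helps enough
        selected.append(j)
        current = np.hstack([current, X[:, j:j + 1]])
        best_rss = new_rss
    return selected

# y depends on columns 0 and 2 only; stepwise should pick exactly those.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = 3.0 * X[:, 0] - 2.0 * X[:, 2] + rng.normal(scale=0.1, size=200)
chosen = forward_select(X, y, min_improvement=1.0)
```

The same greedy loop is what makes the technique attractive at scale: each step needs only model fits, which a platform with parallel external-memory algorithms can run over data too large for a single machine.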

Revolution R Enterprise 7 is available now to select customers, and will be available to all subscribers and new customers on December 13, 2013. For more information on Revolution R Enterprise 7, please visit www.revolutionanalytics.com/products/rre.

About Revolution Analytics

Revolution Analytics, with its Revolution R Enterprise (RRE) software, is the innovative leader in Big Data Big Analytics. RRE is powered by the R language, the de facto standard for what Gartner describes as Modern Analytics. RRE is used by enterprises with massive data, performance and multi-platform requirements that need to drive down the cost of Big Data. RRE runs on industry-leading data platforms, and integrates with business intelligence, data visualization, web and mobile apps to build solutions that drive game changing business insights and value.

About Open Source R

R is the most widely used statistical language, with more than two million users worldwide. Top university talent is graduating with R skills, ready to help global enterprises innovate and realize value from Big Data. Revolution Analytics contributes to the growing R community with open-source contributions, user group sponsorships, and free Revolution R Enterprise licenses for academia.

Copyright © 2009 Business Wire. All rights reserved. Republication or redistribution of Business Wire content is expressly prohibited without the prior written consent of Business Wire. Business Wire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
