Actian Unshackles Data Scientists to Provide More Accurate Analytics Faster with Deep R Integration

Actian Corporation (“Actian”), the first company to assemble an end-to-end big data analytics platform that runs natively on Hadoop, now gives data scientists blending and enrichment capabilities and highly performant R analytics on a single platform, for faster decisions and improved accuracy.

The new integration adds R to Actian’s growing list of hundreds of analytic functions. Data scientists can now execute R in parallel – both in-database and on Hadoop – and access any data source at the point where an R algorithm runs. Actian frees data scientists from the shackles of low-performing R with highly parallelized, low-latency analytics on its Actian Analytics Platform. This expands the breadth of possible queries and leads to faster, more accurate results.
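
As a purely illustrative sketch (using only base R and its parallel package, not Actian’s own operators or syntax), the pattern being scaled out is a function applied independently to each partition of a table; the platform’s contribution is running that same function next to the data, in-database or across Hadoop nodes, rather than on a single workstation:

  library(parallel)

  # Hypothetical scoring function, applied independently to each data partition
  score_partition <- function(df) {
    within(df, churn_risk <- 1 / (1 + exp(-(0.03 * calls - 0.01 * tenure))))
  }

  # Stand-in data: pretend partitions of a customer table
  customers <- data.frame(
    id     = 1:100000,
    calls  = rpois(100000, 5),
    tenure = sample(1:120, 100000, replace = TRUE),
    part   = rep(1:8, length.out = 100000)
  )
  partitions <- split(customers, customers$part)

  # Locally, base R fans the partitions out across CPU cores; a parallel
  # platform would instead run the same function where the data lives.
  cl <- makeCluster(detectCores())
  scored <- do.call(rbind, parLapply(cl, partitions, score_partition))
  stopCluster(cl)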

Actian combines highly performant R analytics with data blending and enrichment capabilities to accelerate the entire analytics process for data scientists, replacing onerous data preparation and enrichment cycles with rapid-iteration data discovery and analytics refinement. For SPSS and SAS users experimenting with R as a low-cost alternative, Actian now provides a comparable environment that addresses the full analytics spectrum at an affordable price point.

With KNIME already deeply embedded in its analytics platform, Actian gives data scientists direct access to more than 1,500 KNIME functions along with a rich library of user-contributed R functions. This powerful combination, coupled with a drag-and-drop interface, amplifies the value of open-source analytics capabilities.

“Actian is freeing data scientists to do what they do best,” said Mark Milani, Senior Vice President of Product Engineering for Actian. “Our continued investment in end-to-end analytic solutions, modern architecture, and extensibility enables stunning breakthroughs in data science productivity on Big Data. Actian’s R solutions dramatically expand the accessibility of high-performance R so a broader pool of practitioners can deliver faster and more powerful analytic results for their organizations.”

Actian makes R reusable, accessible, and consumable, not just for data scientists, but for less technical users as well. With a user-friendly drag-and-drop interface, access for SQL users, and the ability to share workflows, data scientists can work with business analyst colleagues to build a more productive, broader-based analytics team:

  • SQL users can run R scripts in parallel through R embedded in Actian’s MPP analytics database.
  • Data scientists can develop code that runs in parallel in data flows or on Hadoop using R operators.
  • Other users can add new data sets and blend them using shared workflows.
  • R users can execute high-performance R through in-database R packages, without SQL knowledge.

"Integration of R with the Actian Analytics Platform will help make R accessible to business analysts, as well as data scientists," said Matt Aslett, research director, data platforms and analytics, 451 Research. "Having the ability to run analytics queries that take advantage of R without analysts needing to be retrained as R programmers should enable Actian Analytics Platform users to focus on the approach that delivers the most efficient use of their existing tools and skills."

To speed time to value, Actian also provides analytic accelerators that give companies a jump start in business areas like market basket analysis, media mix modeling, churn prediction, and ad optimization. Identifying value-added data sources and combining them with Actian’s pre-defined blueprints and analytic functions enables R users to deploy analytic applications in days instead of months.
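
As a purely illustrative sketch of the kind of analysis such an accelerator pre-packages, the market basket case can be expressed in a few lines of open-source R using the arules package (the bundled Groceries dataset and the thresholds below are examples, not Actian’s blueprint):

  library(arules)

  # Groceries ships with arules: roughly 9,800 point-of-sale transactions
  data("Groceries")

  # Mine association rules with illustrative support and confidence thresholds
  rules <- apriori(Groceries,
                   parameter = list(support = 0.01, confidence = 0.5))

  # Inspect the five rules with the highest lift
  inspect(head(sort(rules, by = "lift"), 5))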

Actian will exhibit its R integration, Customer Churn Prediction, and other big data analytics solutions at Booth #204 at TDWI World Chicago, May 13-16.

About Actian: Accelerating Big Data 2.0™

Actian transforms big data into business value for any organization – not just the privileged few. Actian provides transformational business value by delivering actionable insights into new sources of revenue, business opportunities, and ways of mitigating risk with high-performance in-database analytics complemented with extensive connectivity and data preparation. The 21st century software architecture of the Actian Analytics Platform delivers extreme performance on off-the-shelf hardware, overcoming key technical and economic barriers to broad adoption of big data. Actian also makes Hadoop enterprise-grade by providing high-performance data enrichment, visual design and SQL analytics on Hadoop without the need for MapReduce skills. Among tens of thousands of organizations using Actian are innovators using analytics for competitive advantage in industries like financial services, telecommunications, digital media, healthcare and retail. The company is headquartered in Silicon Valley and has offices worldwide. Stay connected with Actian Corporation at www.actian.com or on Facebook, Twitter and LinkedIn.

Actian, Big Data for the Rest of Us, Accelerating Big Data 2.0 and Actian Analytics Platform are trademarks of Actian Corporation and its subsidiaries. All other trademarks, trade names, service marks, and logos referenced herein belong to their respective companies.
