
Alpine 4.0 Puts an End to Big Data's Compromise With a Single Platform for All Enterprise Data

Alpine Data Labs' New Enterprise Platform Allows Data Teams to Access All Enterprise Data, Work Bi-Directionally With Hadoop, Discover and Share Insights at the Speed of Business

SAN FRANCISCO, CA -- (Marketwired) -- 07/22/14 -- Alpine Data Labs announced today the introduction of Alpine Chorus 4.0, the industry's first Advanced Analytics enterprise platform that enables universal data discovery and search and bi-directional integration between Hadoop and all major data platforms, and offers compatibility with Spark and Cloudera 5.

Alpine Chorus 4.0 brings innovation in data discovery, query parallelization and machine learning in distributed environments. The company also introduces a first-of-its-kind life cycle management facility for Hadoop and non-Hadoop platforms that allows sophisticated machine learning algorithms to be run and managed simply across heterogeneous data systems such as Cloudera, MapR and Pivotal HD, or databases like PostgreSQL, Oracle and Greenplum. Complimentary access to Alpine Chorus 4.0 can be found at

The Network Effect of Insights
"Research shows that only 4% of enterprises get business value out of their Big Data investment," says Joe Otto, President and CEO at Alpine Data Labs. "The current industry solutions encourage a siloed and non-scalable approach to Big Data and that simply limits progress. We focus on building the most comprehensive and scalable platform that enterprises can use to achieve Big Data ROI and to better connect people, data and insights. From helping people quickly visualize and work with any data, to running models 100 times faster on Spark, to operationalizing the deployment of real-time models via standards like PMML, customers using Alpine Chorus innovate faster because they can easily run deep algorithms at Big Data scale and in a timeframe of business relevance."

The new solution boasts over 100 new features and furthers the company's advantage in the field of Advanced Analytics. With Alpine Chorus 4.0, data scientists and engineers can be productive on any data -- Hadoop or not; business users are engaged early and quickly add value to the advanced analytics conversation; and finally, executives rely on a standard platform to build repeatable, secure and reusable analytical practices.

Over the last 6 months alone, the company has tripled its customer base and has grown by over 200% in the financial services, online media, government, retail and manufacturing sectors.

Data Discovery Made Simple
Most organizations cut into their competitive advantage early in the analytical process because their data scientists can't easily discover, assemble and transform data before working with it. That process can take months: moving data is not simple, and working with Hadoop data often requires new skill sets.

Alpine Chorus 4.0's universal data discovery capability allows users to search, find and use data regardless of where it is. Using Alpine Chorus' "Google-like" search, users can find and browse any file, model, workflow, comment, dataset, etc. -- and when data is found, they can visualize it through powerful heat maps, scatter plots and histograms, all without data movement.

"This functionality alone made our team more effective. It allowed us to assemble and understand data quickly, without the complexity of working with MapReduce, or Pig or SQL," says Ron Rasmussen, CTO & SVP Engineering at Xactly Corp. "Our ability to work rapidly and iterate at Big Data scale is core to helping us deliver the best products to our customers."

Big Data Analytics at the Speed of Business
"Removing Hadoop's complexity will give any company a head start, but it's not enough," says Steven Hillion, co-founder and Chief Product Officer at Alpine Data Labs. "Once enterprises have identified the data they want to work with, they need to interrogate it without being encumbered by performance issues."

In this new release, the company unveils its Parallel Analytics Engine, a virtual layer that now executes all of Alpine Chorus' algorithms with multiple levels of parallelism. This includes the Workflow Graph Optimizer, which parses analytics workloads and deploys them in parallel to maximize the use of available resources, and the Polymorphic Data Service, which decides at run-time how to optimize queries for each type of data platform. These innovations, unique to Alpine Data Labs, represent the most efficient way to run sophisticated machine learning algorithms on a variety of distributed systems. They also made it possible for Alpine Chorus 4.0 to be the first Advanced Analytics platform to be certified on CDH5 and Spark, benchmarked running complex algorithms up to a hundred times faster than previously possible.
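
To make the idea concrete, the Python sketch below mimics, in miniature, what run-time query dispatch and parallel workflow execution look like; every name, platform and step in it is invented for illustration and is not an Alpine Chorus internal.

    # Conceptual sketch only: a toy "polymorphic" dispatcher that picks a
    # platform-specific execution strategy at run time and runs independent
    # workflow stages in parallel. All names here are hypothetical.
    from concurrent.futures import ThreadPoolExecutor

    # Hypothetical per-platform strategies: each knows how to push a step
    # down to its engine instead of pulling the data out of it.
    STRATEGIES = {
        "hadoop":     lambda step: f"run {step} as a MapReduce/Spark job",
        "postgresql": lambda step: f"run {step} as in-database SQL",
        "greenplum":  lambda step: f"run {step} as a parallel Greenplum query",
    }

    def execute_step(platform: str, step: str) -> str:
        """Choose the execution strategy for this platform at run time."""
        return f"{step}@{platform}: {STRATEGIES[platform](step)}"

    # Independent stages of an analytic workflow run concurrently, which is
    # the idea behind the Workflow Graph Optimizer.
    workflow = [("hadoop", "feature_extraction"),
                ("postgresql", "customer_join"),
                ("greenplum", "aggregation")]

    with ThreadPoolExecutor() as pool:
        for result in pool.map(lambda stage: execute_step(*stage), workflow):
            print(result)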

"With Alpine Chorus 4.0 customers can work on important analytical issues at Big Data speed and keep the business engaged because of the solution's visual, powerful and collaborative approach," says Amr Awadallah, ‎Founder and CTO at Cloudera. "Alpine Chorus is a showcase for analytics innovation in the Big Data Era and we're excited that it features the power of Cloudera 5."

The Internet of People
"The key to analytical excellence is collaboration," says Dan Vesset, Vice President of IDC's Business Analytics research. "Collaboration often gets a bad name because it sounds too abstract. However, our research shows that effective cross-enterprise collaboration has a determinant role in helping Big Data projects succeed and return value. Alpine Data Labs is leading the way here."

The new features in Alpine Chorus 4.0 make the benefits of collaboration very tangible:

  • Data scientists can tap into the innovation of their business counterparts at every point in the analytics process through user-generated data: comments, tags, links and documents applied to models, workflows, datasets and sandboxes.
  • Business analysts can easily and visually understand data science work through collaborative analytics workspaces, communicating and iterating in real time, increasing the value of, and confidence in, their analysis.
  • Data and IT engineers rely on GitHub-like version control, job scheduling and data management capabilities, and can operationalize Big Data Analytics in a secure and consistent manner.
  • Executives benefit from a platform that is innovative, open and secure because all interactions in Alpine Chorus are recorded and auditable.

Alpine Chorus 4.0 rests on key new technological breakthroughs:

1) Visualize Before You Analyze: Universal Search, Interactive Visualizations and Data Augmentation add a layer of understanding on top of any data.
2) Transform and Query Without Extraction: Alpine Chorus' comprehensive library of transformation operators -- from simple filters and variable and null-value replacement to pivot, multi-join and normalization functions -- is accessible via a SQL editor or visual, drag-and-drop icons. All of Alpine Chorus' operators run in place and in parallel.
3) Manage Data In and Out of Hadoop: Data can be moved into Hadoop to build Big Data lakes, and out of Hadoop to write the results of large-scale computation back to operational systems.
4) Do Predictive Analytics Natively on Big Data: All of Alpine Chorus' algorithms are written and optimized to execute in parallel, making analysis at Big Data speed a reality (see the PySpark sketch after this list).
5) Work With the Latest Innovations: Embraces Data Science standards for real-time scoring (PMML) and supports and contributes to open source platform technologies (Spark, Sqoop, MADlib, MLlib, etc.). First Advanced Analytics platform to be certified on Spark and Cloudera CDH 5.
6) Extend and Productionize Models: The Alpine Chorus REST API is available to run and edit user-defined functions (UDFs) as part of an end-to-end analytic workflow (an example API call follows this list).
7) Manage the Full Analytics Life Cycle: GitHub-like version control (copy workflow, history capture, revert capability), check-in, commenting, model review and tracking, job scheduling, and data management.
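
As referenced in item 4 above, the short PySpark sketch below illustrates what running predictive analytics natively, and in parallel, on Big Data looks like in practice. It uses open source Spark MLlib rather than Alpine Chorus' own operators, and the file path and column names are placeholders.

    # A minimal PySpark sketch, assuming a Spark installation and a CSV file
    # at the path shown; path and column names are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.classification import LogisticRegression

    spark = SparkSession.builder.appName("native-big-data-analytics").getOrCreate()

    # Read the data where it already lives; no extraction to a desktop tool.
    df = spark.read.csv("hdfs:///data/transactions.csv", header=True, inferSchema=True)

    # Assemble numeric columns into a feature vector (column names are illustrative).
    assembler = VectorAssembler(inputCols=["amount", "age", "visits"], outputCol="features")
    train = assembler.transform(df).select("features", "label")

    # Training itself is distributed: it runs in parallel across the cluster's executors.
    model = LogisticRegression(labelCol="label", featuresCol="features").fit(train)
    print(model.coefficients)

    spark.stop()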
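
As noted in item 6, workflows can also be driven programmatically over REST. The snippet below is a hedged illustration using Python's requests library; the host, endpoints, bearer token and workflow ID are hypothetical placeholders rather than documented Alpine Chorus routes.

    # A hedged illustration of driving a workflow over REST with the standard
    # "requests" library. The host, endpoints, token and workflow id are
    # hypothetical placeholders, not documented Alpine Chorus routes.
    import requests

    BASE = "https://chorus.example.com/api"            # hypothetical host
    HEADERS = {"Authorization": "Bearer <api-token>"}  # placeholder credential

    # Trigger a run of an existing analytic workflow (hypothetical endpoint).
    run = requests.post(f"{BASE}/workflows/42/runs", headers=HEADERS, timeout=30)
    run.raise_for_status()
    run_id = run.json()["id"]

    # Poll the run's status and print the result summary (also hypothetical).
    status = requests.get(f"{BASE}/workflows/42/runs/{run_id}", headers=HEADERS, timeout=30)
    print(status.json())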

For more, try Alpine for free @

Alpine Chorus is the world's first Enterprise Platform for Advanced Analytics on Big Data and Hadoop. With Alpine, data scientists and business analysts can work with large data sets and develop and collaborate on models at scale, without having to write code or download software. Leaders in all industries, from Financial Services to Healthcare, use Alpine to outsmart their competition. Maybe you should too. Find out more at:

Image Available:


