Alpine 4.0 Puts an End to Big Data's Compromise With a Single Platform for All Enterprise Data

Alpine Data Labs' New Enterprise Platform Allows Data Teams to Access All Enterprise Data, Work Bi-Directionally With Hadoop, Discover and Share Insights at the Speed of Business

SAN FRANCISCO, CA -- (Marketwired) -- 07/22/14 -- Alpine Data Labs announced today the introduction of Alpine Chorus 4.0, the industry's first Advanced Analytics enterprise platform that enables universal data discovery and search, bi-directional integration between Hadoop and all major data platforms, as well as compatibility with Spark and Cloudera 5.

Alpine Chorus 4.0 brings innovation in data discovery, query parallelization and machine learning in distributed environments. The company also introduces a first-of-its-kind life-cycle management facility for Hadoop and non-Hadoop platforms, which allows sophisticated machine learning algorithms to be run and managed simply across heterogeneous data systems such as Cloudera, MapR and Pivotal HD, or databases like PostgreSQL, Oracle and Greenplum. Complimentary access to Alpine Chorus 4.0 is available at http://start.alpinenow.com

The Network Effect of Insights
"Research shows that only 4% of enterprises get business value out of their Big Data investment," says Joe Otto, President and CEO at Alpine Data Labs. "The current industry solutions encourage a siloed and non-scalable approach to Big Data and that simply limits progress. We focus on building the most comprehensive and scalable platform that enterprises can use to achieve Big Data ROI and to better connect people, data and insights. From helping people quickly visualize and work with any data, to running models 100 times faster on Spark, to operationalizing the deployment of real-time models via standards like PMML, customers using Alpine Chorus innovate faster because they can easily run deep algorithms at Big Data scale and in a timeframe of business relevance."

The new solution boasts over 100 new features and furthers the company's advantage in the field of Advanced Analytics. With Alpine Chorus 4.0, data scientists and engineers can be productive on any data -- Hadoop or not; business users are engaged early and quickly add value to the advanced analytics conversation; and finally, executives rely on a standard platform to build repeatable, secure and reusable analytical practices.

Over the last 6 months alone, the company has tripled its customer base and has grown by over 200% in the financial services, online media, government, retail and manufacturing sectors.

Data Discovery Made Simple
Most organizations cut into their competitive advantage early in the analytical process because their data scientists can't easily discover, assemble and transform data before working with it. That process can take months, because moving data is not simple and when it comes to working with Hadoop data, new skill sets need to be acquired.

Alpine Chorus 4.0's universal data discovery capability allows users to search, find and use data regardless of where it is. Using Alpine Chorus' "Google-like" search, users can find and browse any file, model, workflow, comment, dataset, etc. -- and when data is found, they can visualize it through powerful heat maps, scatter plots and histograms, all without data movement.

"This functionality alone made our team more effective. It allowed us to assemble and understand data quickly, without the complexity of working with MapReduce, or Pig or SQL," says Ron Rasmussen, CTO & SVP Engineering at Xactly Corp. "Our ability to work rapidly and iterate at Big Data scale is core to helping us deliver the best products to our customers."

Big Data Analytics at the Speed of Business
"Removing Hadoop's complexity will give any company a head start, but it's not enough," says Steven Hillion, co-founder and Chief Product Officer at Alpine Data Labs. "Once enterprises have identified the data they want to work with, they need to interrogate it without being encumbered by performance issues."

In this new release, the company unveils its Parallel Analytics Engine, a virtual layer that now executes all of Alpine Chorus' algorithms with multiple levels of parallelism. This includes the Workflow Graph Optimizer, which parses analytics workloads and deploys them in parallel to maximize the use of available resources; and the Polymorphic Data Service, which decides at run-time how to optimize queries for each type of data platform. These innovations, unique to Alpine Data Labs, represent the most efficient way to run sophisticated machine learning algorithms on a variety of distributed systems. They also made it possible for Alpine Chorus 4.0 to be the first Advanced Analytics platform to be certified on CDH5 and Spark, benchmarked running complex algorithms up to one hundred times faster than previously possible.
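To illustrate the general idea behind a workflow-graph optimizer -- running the independent operators of a workflow DAG in parallel waves rather than one at a time -- here is a minimal sketch. The function names and graph structure are hypothetical and are not Alpine's actual implementation:

```python
# Sketch: schedule a workflow DAG so that operators whose dependencies
# are satisfied run concurrently. Illustrative only, not Alpine's code.
from concurrent.futures import ThreadPoolExecutor

def run_workflow(graph, run_node):
    """graph maps each node to the set of nodes it depends on.
    All nodes whose dependencies are complete run as one parallel wave."""
    remaining = {n: set(deps) for n, deps in graph.items()}
    completed, order = set(), []
    with ThreadPoolExecutor() as pool:
        while remaining:
            # every node whose dependencies are all finished is ready now
            ready = [n for n, deps in remaining.items() if deps <= completed]
            if not ready:
                raise ValueError("cycle detected in workflow graph")
            # dispatch the whole ready wave in parallel
            list(pool.map(run_node, ready))
            for n in ready:
                completed.add(n)
                order.append(n)
                del remaining[n]
    return order

# Example: both loads run side by side; 'join' waits for both of them.
graph = {"load_a": set(), "load_b": set(),
         "join": {"load_a", "load_b"}, "model": {"join"}}
waves = run_workflow(graph, lambda n: None)
```

The payoff is that the two load operators occupy the cluster simultaneously instead of serially, which is the kind of resource-maximizing parallelism the release describes.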

"With Alpine Chorus 4.0 customers can work on important analytical issues at Big Data speed and keep the business engaged because of the solution's visual, powerful and collaborative approach," says Amr Awadallah, Founder and CTO at Cloudera. "Alpine Chorus is a showcase for analytics innovation in the Big Data Era and we're excited that it features the power of Cloudera 5."

The Internet of People
"The key to analytical excellence is collaboration," says Dan Vesset, Vice President of IDC's Business Analytics research. "Collaboration often gets a bad name because it sounds too abstract. However, our research shows that effective cross-enterprise collaboration plays a determining role in helping Big Data projects succeed and return value. Alpine Data Labs is leading the way here."

The new features in Alpine Chorus 4.0 make the benefits of collaboration very tangible:

  • Data scientists can tap into the innovation of their business counterparts at every point in the analytics process through user-generated data: comments, tags, links and documents applied to models, workflows, datasets and sandboxes.
  • Business Analysts can easily and visually understand data science work through collaborative analytics workspaces, communicating and iterating in real-time, increasing the value and confidence of their analysis.
  • Data and IT engineers rely on GitHub-like version control features, job scheduling and data management capabilities, and can operationalize Big Data Analytics in a secure and consistent manner.
  • Executives benefit from a platform that is innovative, open and secure because all interactions in Alpine Chorus are recorded and auditable.

Alpine Chorus 4.0 rests on key new technological breakthroughs:

1) Visualize Before You Analyze: Universal Search, Interactive Visualizations and Data Augmentation add a layer of understanding on top of any data.
2) Transform and Query Without Extraction: Alpine Chorus' comprehensive library of transformation operators -- from simple filters and variable and null-value replacement operators to pivot, multi-join and normalization functions -- is accessible via a SQL editor or visual, drag-and-drop icons. All of Alpine Chorus' operators run in place and in parallel.
3) Manage Data In and Out of Hadoop: Data can be sent into Hadoop to build Big Data lakes, and out of Hadoop to write the results of large-scale computation to operational systems.
4) Do Predictive Analytics Natively on Big Data: All of Alpine Chorus' algorithms are written and optimized to execute in parallel, making analysis at Big Data speed a reality.
5) Work With the Latest Innovations: Embraces data science standards for real-time scoring (PMML) and supports and contributes to open source platform technologies (Spark, Sqoop, MADlib, MLlib, etc.). First Advanced Analytics platform to be certified on Spark and Cloudera CDH 5.
6) Extend and Productionize Models: The Alpine Chorus REST API is available to run and edit user-defined functions (UDFs) as part of an end-to-end analytic workflow.
7) Manage the Full Analytics Life Cycle: GitHub-like version control (workflow copying, history capture, revert capability), check-in, commenting, model review and tracking, job scheduling, and data management.
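As a rough illustration of driving a workflow through a REST API like the one described above, the sketch below assembles an HTTP request for launching a run with UDF parameters. The endpoint path, token scheme and payload fields are all assumptions for the sake of the example, not Alpine's documented API:

```python
import json

# Hypothetical REST call construction. The API root, URL layout and
# payload shape are illustrative assumptions, not Alpine's real contract.
API_ROOT = "https://chorus.example.com/api"

def build_run_request(workflow_id, udf_params, token):
    """Assemble the URL, headers and JSON body for launching a workflow run."""
    url = f"{API_ROOT}/workflows/{workflow_id}/runs"
    headers = {"Authorization": f"Bearer {token}",
               "Content-Type": "application/json"}
    body = json.dumps({"udf_parameters": udf_params})
    return url, headers, body

# Example: run workflow 42 with a custom threshold passed to a UDF.
url, headers, body = build_run_request(42, {"threshold": 0.8}, "secret")
```

Separating request construction from transmission like this keeps the payload easy to test and log before it is ever sent over the wire.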

For more, try Alpine for free at http://start.alpinenow.com

ABOUT ALPINE DATA LABS
Alpine Chorus is the world's first Enterprise Platform for Advanced Analytics on Big Data and Hadoop. With Alpine, data scientists and business analysts can work with large data sets, develop and collaborate on models at scale without having to use code or download software. Leaders in all industries, from Financial Services to Healthcare, use Alpine to outsmart their competition. Maybe you should too. Find out more at: www.alpinenow.com

Image Available: http://www2.marketwire.com/mw/frame_mw?attachid=2643012
