Pivotal Advances Hadoop Offering, Integrates In-Memory Processing and Delivers Business Data Lake Architecture for the Enterprise

Pivotal HD 2.0 and Pivotal GemFire XD deliver the most advanced big data analytics platform on the market and fortify the next-generation capabilities of data-driven applications and analytics

SAN FRANCISCO, March 17, 2014 /PRNewswire/ --

Summary

  • Building on its architecture for Business Data Lakes, Pivotal today announced the availability of Pivotal HD 2.0, a commercially supported enterprise distribution of the Apache Hadoop stack, now rebased and hardened on Apache Hadoop 2.2.
  • Pivotal also announced the general availability of Pivotal GemFire XD, an in-memory database seamlessly integrated with Pivotal HD 2.0 that brings high concurrency, real-time transactions and in-memory analytical processing to power next generation big data applications.
  • Pivotal HD 2.0 expands analytic use cases with integration and support for GraphLab, MADlib, and popular languages and formats such as R, Python, Java, and Parquet, creating a powerful, easy-to-use analytical platform for data scientists and analysts in Hadoop.
  • Pivotal is hosting a webinar on March 27, 2014 that expands on today's news. Registration for Pivotal's Next Generation Business Data Lake webinar is available at http://bit.ly/1i8on7g.

 

For more information, visit www.gopivotal.com.

Pivotal, the software company at the intersection of big data, PaaS, and agile development, today announced the release of Pivotal™ HD 2.0 and the general availability of Pivotal GemFire™ XD. The combination of Pivotal HD 2.0, the HAWQ™ query engine, and GemFire XD constitutes the foundation of the Business Data Lake architecture, the big data application framework for enterprises, data scientists, analysts, and developers that provides a more flexible, faster way to develop data-savvy software than is possible with Hadoop alone.

Breaking New Ground In Real-Time for Apps, Data and Analytics

Behind every leading enterprise is real-time analytics that drives real-time advantage and intelligence. Available today, Pivotal GemFire XD brings GemFire's proven in-memory technology to Pivotal HD 2.0 and HAWQ. Pivotal GemFire technology enables businesses to make prescriptive decisions in real time, such as stock trading, fraud detection, intelligence for energy companies, routing for the telecom industry, and scaling reservations for the world's largest annual movement of people on the planet.

Also new within Pivotal HD is the world's first enterprise integration of GraphLab, an advanced set of graph analytics algorithms that enables data scientists and analysts to apply popular techniques for insight, such as PageRank, collaborative filtering, and computer vision.
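
To make concrete the kind of graph computation GraphLab is built to run at scale, the sketch below is a tiny, pure-Python power-iteration PageRank. It is purely illustrative (the toy graph and parameters are hypothetical) and does not use the GraphLab API.

    # Illustrative PageRank via power iteration; not the GraphLab API.
    def pagerank(edges, damping=0.85, iterations=50):
        """edges: list of (source, target) pairs describing a directed graph."""
        nodes = {n for edge in edges for n in edge}
        out_degree = {n: 0 for n in nodes}
        for src, _ in edges:
            out_degree[src] += 1

        # Start with a uniform distribution over all nodes.
        rank = {n: 1.0 / len(nodes) for n in nodes}
        for _ in range(iterations):
            incoming = {n: 0.0 for n in nodes}
            for src, dst in edges:
                incoming[dst] += rank[src] / out_degree[src]
            # Damped update: random-jump term plus mass received from in-links.
            rank = {n: (1 - damping) / len(nodes) + damping * incoming[n]
                    for n in nodes}
        return rank

    print(pagerank([("a", "b"), ("b", "c"), ("c", "a"), ("a", "c")]))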

Pivotal released Pivotal HD with HAWQ last year to radically increase the speed of analysis for Hadoop queries; HAWQ was designed from the ground up as a massively parallel SQL processing engine optimized specifically for analytics.
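
Because HAWQ descends from the Greenplum/PostgreSQL engine, it generally accepts connections from standard PostgreSQL client libraries. The minimal Python sketch below assumes such a connection over the PostgreSQL wire protocol; the host, database, user, and table names are hypothetical.

    import psycopg2  # standard PostgreSQL client library

    # Hypothetical connection details for a HAWQ master node.
    conn = psycopg2.connect(host="hawq-master.example.com", port=5432,
                            dbname="analytics", user="gpadmin")
    cur = conn.cursor()

    # An ordinary analytical SQL query; HAWQ plans and executes it in
    # parallel across the Hadoop cluster, transparently to the client.
    cur.execute("""
        SELECT product_id, SUM(amount) AS revenue
        FROM sales
        GROUP BY product_id
        ORDER BY revenue DESC
        LIMIT 10
    """)
    for product_id, revenue in cur.fetchall():
        print(product_id, revenue)

    cur.close()
    conn.close()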

With Pivotal HD 2.0, new improvements to HAWQ include:  

  • MADlib Machine Learning Library – Unlock deeper predictive insights faster with more than 50 in-database analytic algorithms (see the sketch following this list);
  • Language Translation – Leverage the full power of R, Python, and Java to express business logic and procedures that would otherwise be cumbersome in SQL;
  • Parquet Support – Beta support for reading and writing Parquet files, bringing the power of HAWQ's SQL query engine to this popular open file format.
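
A minimal sketch of what in-database machine learning with MADlib can look like from Python, assuming the same PostgreSQL-protocol connection as above. The table, column, and output names are hypothetical; the madlib.linregr_train call follows the interface documented by the Apache MADlib project.

    import psycopg2

    conn = psycopg2.connect(host="hawq-master.example.com",
                            dbname="analytics", user="gpadmin")
    cur = conn.cursor()

    # Train a linear regression entirely inside the database; only the
    # fitted coefficients leave the cluster.
    cur.execute("""
        SELECT madlib.linregr_train(
            'houses',                    -- source table (hypothetical)
            'houses_model',              -- output table for the model
            'price',                     -- dependent variable
            'ARRAY[1, size, bedrooms]'   -- independent variables (with intercept)
        )
    """)
    cur.execute("SELECT coef FROM houses_model")
    print(cur.fetchone()[0])

    conn.commit()
    cur.close()
    conn.close()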

Supporting Quotes

Nik Rouda, Senior Analyst, ESG

"The combined release of Pivotal HD 2.0 and GemFireXD, introducing real-time SQL queries, makes it easier for developers to get live insights on streaming data sources without having to learn new tools. Not least, the range of new programming language support gives an even richer portfolio of advanced analytics capabilities. Pivotal is well-positioned to change the nature of how big data gets done in the enterprise."

Josh Klahr, Vice President, Product Management, Pivotal

"When it comes to Hadoop, other approaches in the market have left customers with a mishmash of un-integrated products and processes. Building on our industry-leading SQL-on-Hadoop offer, HAWQ, Pivotal HD 2.0 is the first platform to fully integrate proven enterprise in-memory technology, Pivotal GemFire XD, with advanced services on Hadoop 2.2 that provide native support for a comprehensive data science toolset. Data driven businesses now have the capabilities they need to gain a massive head start toward developing analytics and applications for more intelligent and innovative products and services."

About Pivotal
Pivotal, committed to open source and open standards, recently introduced Pivotal One, the world's first comprehensive multi-cloud Enterprise PaaS. The company is also a leading provider of application and data infrastructure software, agile development services, and data science consulting. Follow Pivotal on Twitter @gopivotal, LinkedIn, and G+.

©2014 Pivotal Software, Inc. All rights reserved. Pivotal, GemFire and HAWQ are trademarks and/or registered trademarks of Pivotal Software, Inc. in the United States and/or other countries. This release contains "forward-looking statements" as defined under the Federal Securities Laws. Actual results could differ materially from those projected in the forward-looking statements as a result of certain risk factors, including but not limited to: (i) adverse changes in general economic or market conditions; (ii) delays or reductions in information technology spending; (iii) the relative and varying rates of product price and component cost declines and the volume and mixture of product and services revenues; (iv) competitive factors, including but not limited to pricing pressures and new product introductions; (v) component and product quality and availability; (vi) fluctuations in VMware, Inc.'s operating results and risks associated with trading of VMware stock; (vii) the transition to new products, the uncertainty of customer acceptance of new product offerings and rapid technological and market change; (viii) risks associated with managing the growth of our business, including risks associated with acquisitions and investments and the challenges and costs of integration, restructuring and achieving anticipated synergies; (ix) the ability to attract and retain highly qualified employees; (x) insufficient, excess or obsolete inventory; (xi) fluctuating currency exchange rates; (xii) threats and other disruptions to our secure data centers or networks; (xiii) our ability to protect our proprietary technology; (xiv) war or acts of terrorism; and (xv) other one-time events and other important factors disclosed previously and from time to time in the filings of EMC Corporation, the parent company of Pivotal, with the U.S. Securities and Exchange Commission. EMC and Pivotal disclaim any obligation to update any such forward-looking statements after the date of this release.

Logo - http://photos.prnewswire.com/prnh/20130910/SF76762LOGO

SOURCE Pivotal
