Pivotal Changes the Economics of Big Data Forever with New "Pivotal Big Data Suite" Offering

Industry's First Big Data "Mega Bundle" Gives Customers Easy and Flexible Access to Pivotal's Hadoop Data Platform, MPP Relational Database and In-Memory Transaction Database with a Single Subscription Price

SAN FRANCISCO, April 2, 2014 /PRNewswire/ --

Summary

  • Pivotal today announced the availability of the Pivotal Big Data Suite, an annual subscription-based software, support, and maintenance package that bundles Pivotal Greenplum Database, Pivotal GemFire, Pivotal SQLFire, Pivotal GemFire XD, Pivotal HAWQ, and Pivotal HD into a flexible pool of big and fast data products for customers.
  • The Pivotal Big Data Suite includes an enterprise-class, advanced set of technologies that fully support Apache Hadoop and HDFS. With a cumulative contract minimum, Pivotal will make Pivotal HD, the world's most advanced distribution of Apache Hadoop, available on an unlimited basis at no extra cost, including support. This gives customers a unique opportunity to use any combination of Pivotal HD and Pivotal's world-class big and fast data products in one offering.
  • Unlike patchwork, multi-vendor solutions, the Pivotal Big Data Suite delivers a unified set of capabilities grounded in over three decades of expertise and development in market-leading data management and analytical intelligence. It allows companies to use the technologies that are right for their business, without fear of penalty or waste of investment.
  • Pivotal Big Data Suite is available today. Priced aggressively, the subscription is based on the number of cores, is offered on two- and three-year terms, and requires a cumulative contract minimum.
Pivotal today introduced the industry's first big data "Mega Bundle" that gives customers easy and flexible access to Pivotal's Hadoop data platform, MPP relational database and in-memory transaction database with a single subscription price. More at www.gopivotal.com

Pivotal today announced the availability of the Pivotal™ Big Data Suite, an annual subscription-based software, support, and maintenance package that bundles Pivotal Greenplum Database, Pivotal GemFire, Pivotal SQLFire, Pivotal GemFire XD, Pivotal HAWQ, and Pivotal HD into a flexible pool of big and fast data products for customers. Pivotal Big Data Suite is available today through an aggressively priced subscription model based on the number of cores, on two- and three-year terms.

The merging of traditional and next-generation data infrastructure technologies is known as a "Business Data Lake" - a business priority for CIOs in 2014 and beyond, and something that can be uniquely delivered by a company like Pivotal, with its broad data and analytics expertise. The Pivotal Big Data Suite fills a gap in the market, helping enterprises capitalize on explosive data growth by offering a multi-faceted data portfolio with a "use it as you need it" pricing model. This enables organizations to flexibly move their investment from one technology to another within the suite, at any time. Per-core pricing ensures that data that is simply being stored is not taxed, which will be important as enterprises seek to consolidate more and more information into a Business Data Lake. With the Pivotal Big Data Suite, data-driven companies now have the ability to store everything, analyze anything and build the right thing without fear or penalty when making data technology investment decisions.
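
To make the "use it as you need it" model concrete, here is a minimal sketch of how a fixed pool of subscribed cores might be shifted between products in the suite without buying additional licenses. This is an illustration only; the product names come from the announcement, but the core counts and the CorePoolSketch class are hypothetical and not part of any Pivotal tooling.

    import java.util.LinkedHashMap;
    import java.util.Map;

    /** Minimal sketch: reallocating a fixed pool of subscribed cores across suite products. */
    public class CorePoolSketch {

        private final int subscribedCores;                     // total cores covered by the subscription
        private final Map<String, Integer> allocation = new LinkedHashMap<>();

        public CorePoolSketch(int subscribedCores) {
            this.subscribedCores = subscribedCores;
        }

        /** Assigns cores to a product, refusing any change that would exceed the subscribed pool. */
        public void allocate(String product, int cores) {
            int others = 0;
            for (Map.Entry<String, Integer> e : allocation.entrySet()) {
                if (!e.getKey().equals(product)) {
                    others += e.getValue();
                }
            }
            if (others + cores > subscribedCores) {
                throw new IllegalArgumentException(
                        "Allocation exceeds subscribed pool of " + subscribedCores + " cores");
            }
            allocation.put(product, cores);
        }

        public static void main(String[] args) {
            // Hypothetical example: a 200-core subscription split across three products,
            // then rebalanced toward HAWQ within the same pool.
            CorePoolSketch pool = new CorePoolSketch(200);
            pool.allocate("Greenplum Database", 120);
            pool.allocate("GemFire", 40);
            pool.allocate("HAWQ", 40);

            pool.allocate("Greenplum Database", 80);   // shrink one workload...
            pool.allocate("HAWQ", 80);                 // ...and grow another, staying within 200 cores
            System.out.println(pool.allocation);
        }
    }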

Unlimited, Fully Supported Apache Hadoop
In the face of today's corporate mandate to manage, store and analyze big data, Hadoop has emerged as a leading tool used increasingly by enterprises around the world as the core of their data management solution. With the rapid proliferation of Hadoop-based data storage requirements, traditional software consumption models are significant, cost-prohibitive obstacles to enterprise success in big data. Only Pivotal offers an unlimited subscription to Pivotal HD, today's leading enterprise Hadoop distribution. With the Pivotal Big Data Suite, unlimited Pivotal HD empowers enterprises to stretch their Business Data Lakes to store everything and analyze anything, without fear of runaway license and support costs.
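
Because Pivotal HD is a distribution of Apache Hadoop, data lands in it through the standard HDFS interfaces. The sketch below uses the stock Hadoop FileSystem API to write a record into a data lake directory; the NameNode address, path, and payload are invented for illustration, and a real deployment would pick up the cluster address from core-site.xml rather than hard-coding it.

    import java.net.URI;
    import java.nio.charset.StandardCharsets;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    /** Minimal sketch: landing a record in HDFS with the standard Hadoop FileSystem API. */
    public class DataLakeIngestSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Hypothetical NameNode address for illustration only.
            FileSystem fs = FileSystem.get(URI.create("hdfs://namenode.example.com:8020"), conf);

            Path target = new Path("/data-lake/raw/events/2014-04-02/sample.json");
            try (FSDataOutputStream out = fs.create(target, true /* overwrite */)) {
                out.write("{\"event\":\"order-created\",\"amount\":42.50}\n"
                        .getBytes(StandardCharsets.UTF_8));
            }
            System.out.println("Wrote " + fs.getFileStatus(target).getLen() + " bytes to " + target);
        }
    }

The hadoop-client libraries are assumed to be on the classpath.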

The Pivotal Big Data Suite ties all of the company's leading data technologies together, enabling customers to leverage any one of them, whenever and wherever they need it. Existing Pivotal Greenplum Database customers can continue to leverage the powerful analytical data capabilities they love while extending their use of the Pivotal data portfolio: everything from real-time in-memory capabilities via Pivotal GemFire®, Pivotal SQLFire, and Pivotal GemFire XD, to lightning-fast queries via Pivotal HAWQ™ over Pivotal HD. Unified, simplified, and flexible licensing gives organizations the agility to use the technology that is right for their business.
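
As an illustration of the query side, HAWQ provides SQL over data stored in Pivotal HD and, because it descends from Greenplum Database (and therefore PostgreSQL), it can typically be reached with a standard PostgreSQL JDBC driver. The host, port, database, table, and credentials below are hypothetical, and the driver is assumed to be on the classpath; this is a sketch, not Pivotal's documented client setup.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    /** Minimal sketch: querying HAWQ over JDBC via the PostgreSQL wire protocol. */
    public class HawqQuerySketch {
        public static void main(String[] args) throws Exception {
            // Hypothetical HAWQ master host, database, and credentials.
            String url = "jdbc:postgresql://hawq-master.example.com:5432/analytics";
            try (Connection conn = DriverManager.getConnection(url, "gpadmin", "secret");
                 Statement stmt = conn.createStatement();
                 // Hypothetical table assumed to be backed by data stored on Pivotal HD.
                 ResultSet rs = stmt.executeQuery(
                         "SELECT region, count(*) AS orders FROM fact_orders GROUP BY region")) {
                while (rs.next()) {
                    System.out.println(rs.getString("region") + ": " + rs.getLong("orders"));
                }
            }
        }
    }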

Enabling the Data Driven Enterprise and Leveraging Cloud Computing

Data sits at the center of any business; it is the lens through which modern disruptors unearth answers to their most important business questions and the basis for a new class of applications that better serve the needs of customers. In addition to the Pivotal Big Data Suite, Pivotal leverages Pivotal One, the world's first comprehensive multi-cloud enterprise platform-as-a-service, and its agile development unit, Pivotal Labs, to deliver the most robust and compelling offering for customers to transform their business and find value in today's era of the cloud.

Supporting Quotes

Shawn Rogers, Vice President of Research at Enterprise Management Associates
"The Pivotal Big Data Suite meets the needs of companies demanding flexibility and innovation concurrently. The suite delivers an array of functionality combined with a subscription-based model that allows customers to leverage the technology they need without traditional licensing constraints. By providing unlimited and supported Pivotal HD, Pivotal has found a way to encourage data growth and reduce risk for its customers enabling them to thrive and innovate."

Paul Maritz, CEO, Pivotal
"Enterprise customers are ready to move onto the next generation of data computing that gives them the speed, scale and efficiency they need to stay relevant in the market. They should be able to take advantage of modern database technologies and use them collectively without fear of penalty or waste of investment. With the Pivotal Big Data Suite, we are taking the lead for the industry by removing the technical barriers to data off the plates of our customers. Now the choice isn't about Hadoop or a SQL database, in-memory or real-time processing, but efficiency and value."

Availability
Pivotal Big Data Suite is available now. For pricing, please visit http://www.gopivotal.com/contact.

About Pivotal
Pivotal, committed to open source and open standards, recently introduced Pivotal One, the world's first comprehensive multi-cloud Enterprise PaaS. The company is also a leading provider of application and data infrastructure software, agile development services, and data science consulting. To learn more, visit www.gopivotal.com. Follow Pivotal on Twitter @gopivotal, LinkedIn, and G+.

© 2014 Pivotal Software, Inc. All rights reserved.

Pivotal, GemFire and HAWQ are trademarks or registered trademarks of Pivotal Software, Inc. in the U.S. and/or other countries.

This release contains "forward-looking statements" as defined under the Federal Securities Laws. Actual results could differ materially from those projected in the forward-looking statements as a result of certain risk factors, including but not limited to: (i) adverse changes in general economic or market conditions; (ii) delays or reductions in information technology spending; (iii) the relative and varying rates of product price and component cost declines and the volume and mixture of product and services revenues; (iv) competitive factors, including but not limited to pricing pressures and new product introductions; (v) component and product quality and availability; (vi) fluctuations in VMware, Inc.'s operating results and risks associated with trading of VMware stock; (vii) the transition to new products, the uncertainty of customer acceptance of new product offerings and rapid technological and market change; (viii) risks associated with managing the growth of our business, including risks associated with acquisitions and investments and the challenges and costs of integration, restructuring and achieving anticipated synergies; (ix) the ability to attract and retain highly qualified employees; (x) insufficient, excess or obsolete inventory; (xi) fluctuating currency exchange rates; (xii) threats and other disruptions to our secure data centers or networks; (xiii) our ability to protect our proprietary technology; (xiv) war or acts of terrorism; and (xv) other one-time events and other important factors disclosed previously and from time to time in the filings of EMC Corporation, the parent company of Pivotal, with the U.S. Securities and Exchange Commission. EMC and Pivotal disclaim any obligation to update any such forward-looking statements after the date of this release.

www.gopivotal.com

 

Photo - http://photos.prnewswire.com/prnh/20140402/SF96173
Logo - http://photos.prnewswire.com/prnh/20130910/SF76762LOGO

SOURCE Pivotal Software, Inc.

More Stories By PR Newswire

Copyright © 2007 PR Newswire. All rights reserved. Republication or redistribution of PRNewswire content is expressly prohibited without the prior written consent of PRNewswire. PRNewswire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.

Latest Stories
"Qosmos has launched L7Viewer, a network traffic analysis tool, so it analyzes all the traffic between the virtual machine and the data center and the virtual machine and the external world," stated Sebastien Synold, Product Line Manager at Qosmos, in this SYS-CON.tv interview at 19th Cloud Expo, held November 1-3, 2016, at the Santa Clara Convention Center in Santa Clara, CA.
In addition to all the benefits, IoT is also bringing new kind of customer experience challenges - cars that unlock themselves, thermostats turning houses into saunas and baby video monitors broadcasting over the internet. This list can only increase because while IoT services should be intuitive and simple to use, the delivery ecosystem is a myriad of potential problems as IoT explodes complexity. So finding a performance issue is like finding the proverbial needle in the haystack.
In his keynote at 18th Cloud Expo, Andrew Keys, Co-Founder of ConsenSys Enterprise, provided an overview of the evolution of the Internet and the Database and the future of their combination – the Blockchain. Andrew Keys is Co-Founder of ConsenSys Enterprise. He comes to ConsenSys Enterprise with capital markets, technology and entrepreneurial experience. Previously, he worked for UBS investment bank in equities analysis. Later, he was responsible for the creation and distribution of life sett...
The WebRTC Summit New York, to be held June 6-8, 2017, at the Javits Center in New York City, NY, announces that its Call for Papers is now open. Topics include all aspects of improving IT delivery by eliminating waste through automated business models leveraging cloud technologies. WebRTC Summit is co-located with 20th International Cloud Expo and @ThingsExpo. WebRTC is the future of browser-to-browser communications, and continues to make inroads into the traditional, difficult, plug-in web ...
20th Cloud Expo, taking place June 6-8, 2017, at the Javits Center in New York City, NY, will feature technical sessions from a rock star conference faculty and the leading industry players in the world. Cloud computing is now being embraced by a majority of enterprises of all sizes. Yesterday's debate about public vs. private has transformed into the reality of hybrid cloud: a recent survey shows that 74% of enterprises have a hybrid cloud strategy.
Redis is not only the fastest database, but it has become the most popular among the new wave of applications running in containers. Redis speeds up just about every data interaction between your users or operational systems. In his session at 18th Cloud Expo, Dave Nielsen, Developer Relations at Redis Labs, shared the functions and data structures used to solve everyday use cases that are driving Redis' popularity.
Internet-of-Things discussions can end up either going down the consumer gadget rabbit hole or focused on the sort of data logging that industrial manufacturers have been doing forever. However, in fact, companies today are already using IoT data both to optimize their operational technology and to improve the experience of customer interactions in novel ways. In his session at @ThingsExpo, Gordon Haff, Red Hat Technology Evangelist, will share examples from a wide range of industries – includin...
WebRTC is the future of browser-to-browser communications, and continues to make inroads into the traditional, difficult, plug-in web communications world. The 6th WebRTC Summit continues our tradition of delivering the latest and greatest presentations within the world of WebRTC. Topics include voice calling, video chat, P2P file sharing, and use cases that have already leveraged the power and convenience of WebRTC.
Without lifecycle traceability and visibility across the tool chain, stakeholders from Planning-to-Ops have limited insight and answers to who, what, when, why and how across the DevOps lifecycle. This impacts the ability to deliver high quality software at the needed velocity to drive positive business outcomes. In his general session at @DevOpsSummit at 19th Cloud Expo, Phil Hombledal, Solution Architect at CollabNet, discussed how customers are able to achieve a level of transparency that e...
Much of the value of DevOps comes from a (renewed) focus on measurement, sharing, and continuous feedback loops. In increasingly complex DevOps workflows and environments, and especially in larger, regulated, or more crystallized organizations, these core concepts become even more critical. In his session at @DevOpsSummit at 18th Cloud Expo, Andi Mann, Chief Technology Advocate at Splunk, showed how, by focusing on 'metrics that matter,' you can provide objective, transparent, and meaningful f...
"We build IoT infrastructure products - when you have to integrate different devices, different systems and cloud you have to build an application to do that but we eliminate the need to build an application. Our products can integrate any device, any system, any cloud regardless of protocol," explained Peter Jung, Chief Product Officer at Pulzze Systems, in this SYS-CON.tv interview at @ThingsExpo, held November 1-3, 2016, at the Santa Clara Convention Center in Santa Clara, CA.
"We are the public cloud providers. We are currently providing 50% of the resources they need for doing e-commerce business in China and we are hosting about 60% of mobile gaming in China," explained Yi Zheng, CPO and VP of Engineering at CDS Global Cloud, in this SYS-CON.tv interview at 19th Cloud Expo, held November 1-3, 2016, at the Santa Clara Convention Center in Santa Clara, CA.
Data is the fuel that drives the machine learning algorithmic engines and ultimately provides the business value. In his session at 20th Cloud Expo, Ed Featherston, director/senior enterprise architect at Collaborative Consulting, will discuss the key considerations around quality, volume, timeliness, and pedigree that must be dealt with in order to properly fuel that engine.
Between 2005 and 2020, data volumes will grow by a factor of 300 – enough data to stack CDs from the earth to the moon 162 times. This has come to be known as the ‘big data’ phenomenon. Unfortunately, traditional approaches to handling, storing and analyzing data aren’t adequate at this scale: they’re too costly, slow and physically cumbersome to keep up. Fortunately, in response a new breed of technology has emerged that is cheaper, faster and more scalable. Yet, in meeting these new needs they...
"Once customers get a year into their IoT deployments, they start to realize that they may have been shortsighted in the ways they built out their deployment and the key thing I see a lot of people looking at is - how can I take equipment data, pull it back in an IoT solution and show it in a dashboard," stated Dave McCarthy, Director of Products at Bsquare Corporation, in this SYS-CON.tv interview at @ThingsExpo, held November 1-3, 2016, at the Santa Clara Convention Center in Santa Clara, CA.