By PR Newswire
March 17, 2014 01:00 PM EDT
SAN FRANCISCO, March 17, 2014 /PRNewswire/ --
- Building on its architecture for Business Data Lakes, Pivotal today announced the availability of Pivotal HD 2.0, a commercially supported enterprise distribution of the Apache Hadoop stack, now rebased and hardened on Apache Hadoop 2.2.
- Pivotal also announced the general availability of Pivotal GemFire XD, an in-memory database seamlessly integrated with Pivotal HD 2.0 that brings high concurrency, real-time transactions and in-memory analytical processing to power next generation big data applications.
- Pivotal HD 2.0 expands analytic use cases with integration and support of GraphLab, MADlib, and popular languages and formats such as R, Python, Java, and Parquet, creating a powerful, easy-to-use analytical platform for data scientists and analysts working in Hadoop.
- Pivotal is hosting a webinar on March 27, 2014 that expands on today's news. Registration to Pivotal's Next Generation Business Data Lake webinar can be found at http://bit.ly/1i8on7g.
Pivotal, the software company at the intersection of big data, PaaS, and agile development, today announced the release of Pivotal™ HD 2.0 and the general availability of Pivotal GemFire™ XD. The combination of Pivotal HD 2.0, the HAWQ™ query engine, and GemFire XD constitutes the foundation of the Business Data Lake architecture, the big data application framework for enterprises, data scientists, analysts and developers that provides a more flexible, faster way to develop data-savvy software than Hadoop alone allows.
Breaking New Ground In Real-Time for Apps, Data and Analytics
Behind every leading enterprise is real-time analytics driving real-time advantage and intelligence. Available today, Pivotal GemFire XD brings GemFire's proven in-memory technology to Pivotal HD 2.0 and HAWQ. Pivotal GemFire technology enables businesses to make prescriptive decisions in real time, such as stock trading, fraud detection, intelligence for energy companies, routing for the telecom industry, or scaling reservations for the world's largest annual movement of people.
Also new within Pivotal HD is the world's first enterprise integration of GraphLab, an advanced set of algorithms for graph analytics that enables data scientists and analysts to leverage popular algorithms for insight, such as PageRank, collaborative filtering and computer vision.
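To make the graph-analytics use case concrete, here is a toy power-iteration PageRank in Python. This is an illustrative sketch only, not GraphLab code: GraphLab's value is running algorithms like this in parallel at cluster scale, while this version works on a small in-memory adjacency dict.

```python
def pagerank(graph, damping=0.85, iters=50):
    """Toy power-iteration PageRank.

    graph: {node: [outgoing neighbors]}. Returns {node: rank}, summing to 1.
    """
    nodes = list(graph)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        # Every node starts each round with the teleportation share.
        new = {v: (1.0 - damping) / n for v in nodes}
        for v, outs in graph.items():
            if outs:
                share = damping * rank[v] / len(outs)
                for w in outs:
                    new[w] += share
            else:
                # Dangling node: spread its rank uniformly over all nodes.
                for w in nodes:
                    new[w] += damping * rank[v] / n
        rank = new
    return rank

if __name__ == "__main__":
    g = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
    print({k: round(v, 3) for k, v in pagerank(g).items()})
```

In this three-node example, "c" ends up with the highest rank because it receives links from both "a" and "b".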
Pivotal released Pivotal HD with HAWQ last year to radically increase the speed of analysis for Hadoop queries; it was designed from the ground up as a massively parallel SQL processing engine optimized specifically for analytics.
With Pivotal HD 2.0, new improvements to HAWQ include:
- MADlib Machine Learning Library – Unlock deeper predictive insights faster with over 50 in-database analytic algorithms;
- Language Translation – Leverage the full power of R, Python and Java to express business logic and procedures that would be cumbersome in SQL;
- Parquet Support – Beta support for reading and writing Parquet files, bringing the power of HAWQ's SQL query engine to this popular open file format.
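As a rough client-side illustration of the features above, the sketch below composes the kind of SQL a HAWQ session might issue: a Parquet-backed table definition and a call to MADlib's documented in-database logistic-regression trainer. Table and column names here are hypothetical, and actually executing these statements would of course require a connection to a running HAWQ cluster.

```python
def parquet_table_ddl(table, columns):
    """DDL for a HAWQ append-only table stored in the Parquet format.

    columns: list of (name, sql_type) pairs.
    """
    cols = ", ".join(f"{name} {ctype}" for name, ctype in columns)
    return (f"CREATE TABLE {table} ({cols}) "
            f"WITH (appendonly=true, orientation=parquet);")

def madlib_logregr_train_sql(source, model, dep_col, indep_cols):
    """SQL invoking MADlib's in-database logistic-regression trainer.

    The independent variables are passed as an array expression with an
    intercept term, per MADlib convention.
    """
    indep = f"ARRAY[1, {', '.join(indep_cols)}]"
    return (f"SELECT madlib.logregr_train("
            f"'{source}', '{model}', '{dep_col}', '{indep}');")

if __name__ == "__main__":
    print(parquet_table_ddl("clicks", [("user_id", "bigint"),
                                       ("age", "float8"),
                                       ("clicked", "boolean")]))
    print(madlib_logregr_train_sql("clicks", "clicks_model",
                                   "clicked", ["age"]))
```

The point of the MADlib model is that training runs inside the database, next to the data, rather than requiring an export to an external analytics tool.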
Nik Rouda, Senior Analyst, ESG
"The combined release of Pivotal HD 2.0 and GemFire XD, introducing real-time SQL queries, makes it easier for developers to get live insights on streaming data sources without having to learn new tools. Not least, the range of new programming language support gives an even richer portfolio of advanced analytics capabilities. Pivotal is well-positioned to change the nature of how big data gets done in the enterprise."
Josh Klahr, Vice President, Product Management, Pivotal
"When it comes to Hadoop, other approaches in the market have left customers with a mishmash of un-integrated products and processes. Building on our industry-leading SQL-on-Hadoop offering, HAWQ, Pivotal HD 2.0 is the first platform to fully integrate proven enterprise in-memory technology, Pivotal GemFire XD, with advanced services on Hadoop 2.2 that provide native support for a comprehensive data science toolset. Data-driven businesses now have the capabilities they need to gain a massive head start toward developing analytics and applications for more intelligent and innovative products and services."
- Read the Pivotal Blog on How Enterprises Can Get More Out of Hadoop With a Data Lake http://ow.ly/uBnMf
- Case Study: Scaling Reservations for the World's Largest Train System, China Railways Corporation
- Pivotal Big Data Product Page: http://bit.ly/1i8p54o
Pivotal, committed to open source and open standards, recently introduced Pivotal One, the world's first comprehensive multi-cloud Enterprise PaaS. The company is also a leading provider of application and data infrastructure software, agile development services, and data science consulting. Follow Pivotal on Twitter @gopivotal, LinkedIn, and G+.
©2014 Pivotal Software, Inc. All rights reserved. Pivotal, GemFire and HAWQ are trademarks and/or registered trademarks of Pivotal Software, Inc. in the United States and/or other countries. This release contains "forward-looking statements" as defined under the Federal Securities Laws. Actual results could differ materially from those projected in the forward-looking statements as a result of certain risk factors, including but not limited to: (i) adverse changes in general economic or market conditions; (ii) delays or reductions in information technology spending; (iii) the relative and varying rates of product price and component cost declines and the volume and mixture of product and services revenues; (iv) competitive factors, including but not limited to pricing pressures and new product introductions; (v) component and product quality and availability; (vi) fluctuations in VMware, Inc.'s operating results and risks associated with trading of VMware stock; (vii) the transition to new products, the uncertainty of customer acceptance of new product offerings and rapid technological and market change; (viii) risks associated with managing the growth of our business, including risks associated with acquisitions and investments and the challenges and costs of integration, restructuring and achieving anticipated synergies; (ix) the ability to attract and retain highly qualified employees; (x) insufficient, excess or obsolete inventory; (xi) fluctuating currency exchange rates; (xii) threats and other disruptions to our secure data centers or networks; (xiii) our ability to protect our proprietary technology; (xiv) war or acts of terrorism; and (xv) other one-time events and other important factors disclosed previously and from time to time in the filings of EMC Corporation, the parent company of Pivotal, with the U.S. Securities and Exchange Commission. EMC and Pivotal disclaim any obligation to update any such forward-looking statements after the date of this release.