By Business Wire
August 11, 2014 12:00 AM EDT
GE (NYSE:GE) announced today that, in collaboration with Pivotal, it has created the first-ever industrial data lake, an approach that helps global industrial customers such as airlines, railroads, hospitals and utilities better access, analyze and store terabytes or even petabytes of industrial-strength Big Data, which is far more complex than other types of new and emerging content and information available today. With a 2,000x improvement in time to analysis, major industrial companies can spend less time and money managing data-intensive processes and focus instead on turning the data into actionable insight that increases the productivity of assets and operations. Today GE is using the Industrial Internet data lake to manage and analyze full-flight data from customers that include many of the world’s largest airlines.
“Big Data is growing so fast that it is outpacing the ability of current tools to take full advantage of it,” said Bill Ruh, vice president, GE Software. “Working with Pivotal, we have created a unique industrial data approach that merges information technology (IT) with operational technology (OT) to better match the productivity and efficiency needs of our customers so they get the most value out of their mission-critical information.”
“Big and fast data is a critical piece of how modern industry is reinventing itself in order to innovate and compete,” said Paul Maritz, CEO of Pivotal. “The new industrial data lake architecture answers the call for the fast and highly scalable management of the unique industrial big data that is helping global enterprises transform their operations and build a new class of applications.”
Developed on foundational software elements from Pivotal, the industrial data lake will integrate with Predix™, GE's software platform for the Industrial Internet that provides a standard and secure way to connect machines, analytics, data and people and is built for the unique scale of industrial data. This announcement builds on the strategic partnership between GE and Pivotal to jointly develop a new data architecture that meets the unique requirements of industrial data and critical infrastructure operations.
All information must be converted into recognizable formats before it can be used, a process that has become the bottleneck in managing industrial Big Data. Conventional approaches such as data warehouses can be too slow, expensive and inflexible, with nearly 80 percent of project time spent gathering and preparing data for analysis. [source: IDC whitepaper]
All of GE’s Predictivity™ software solutions benefit from this industrial data lake approach. It has been piloted in various industrial settings, including GE Aviation, where airplane engines are a fertile ground for Big Data collection and analysis. Using its Flight Efficiency Services, GE collects real-time data generated by the aircraft and its systems and runs advanced analytics on this data to help airlines run their operations more efficiently. For customers like AirAsia, this means savings of more than one percent of their fuel bill each year.
“Gathering and analyzing data to improve our customers’ operations is no longer a futuristic concept, but a real process underway today, and growing in magnitude,” said David Joyce, president & CEO, GE Aviation.
In a 2013 pilot, GE Aviation collected information on 15,000 flights from 25 different airlines, at roughly 14 gigabytes of metrics per flight. With the industrial data lake approach, GE integrated terabytes of full-flight data for the first time in the industry, producing a tenfold cost saving and cutting analysis time from months to days. GE expects the data collection to grow to 10 million flights and 1,500 terabytes of full-flight operational data by 2015.
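As a rough, back-of-envelope illustration (the flight and per-flight figures come from the release; the arithmetic and variable names are ours), the pilot's raw data volume works out to roughly 210 terabytes:

```python
# Back-of-envelope check of the 2013 pilot's data volume,
# using the figures quoted in the release.
flights = 15_000        # flights collected in the 2013 pilot
gb_per_flight = 14      # approx. gigabytes of metrics per flight

total_gb = flights * gb_per_flight
total_tb = total_gb / 1_000  # using decimal (SI) terabytes

print(f"{total_tb:,.0f} TB of raw flight metrics")  # ≈ 210 TB
```

This is the scale at which conventional warehouse-style preparation becomes the bottleneck the release describes, and why the data lake's "store first, structure on read" approach pays off.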
Learn more about GE’s industrial data lake at http://www.gesoftware.com/industrial-data-lake.
GE (NYSE:GE) works on things that matter. The best people and the best technologies taking on the toughest challenges. Finding solutions in energy, health and home, transportation and finance. Building, powering, moving and helping to cure the world. Not just imagining. Doing. GE works. For more information, visit the company's website at www.ge.com.
About GE Software
GE Software connects brilliant machines with best-in-class minds and big data analytics to deliver on our vision for the Industrial Internet. Powered by GE’s Predix platform, our Predictivity software solutions unlock valuable insights and increased productivity through asset performance management for customers across diverse industries including aviation, rail, energy and healthcare. For more, visit GE Software's website at www.gesoftware.com.
Pivotal, the company at the intersection of big data, platform-as-a-service (PaaS), and agile development, helps companies transform into great software companies. Pivotal offers a complete portfolio of products that converge apps, data and analytics along with Pivotal’s comprehensive PaaS platform, powered by Cloud Foundry®.
©2014 Pivotal Software, Inc. Pivotal is a registered trademark of Pivotal Software, Inc. in the United States and/or other countries.