

Making Apache Hadoop Less Retro: Bringing Standards to Big Data

The platform’s penetration into enterprises has not lived up to Hadoop’s game-changing business potential

Ten short years ago, Apache Hadoop was a small project deployed on a handful of machines at Yahoo; within a few years, it had become the backbone of Yahoo's data infrastructure. Today, the Apache Hadoop market is forecast to surpass $16 billion by 2020.

This might lead you to believe that Apache Hadoop is currently the backbone of data infrastructures for all enterprises; however, widespread enterprise adoption has been shockingly low.

While the platform is a key technology for gaining business insights from organizational Big Data, its penetration into enterprises has not lived up to Hadoop's game-changing business potential. In fact, "Despite considerable hype and reported successes for early adopters, 54 percent of survey respondents report no plans to invest [in Hadoop] at this time, while only 18 percent have plans to invest in Hadoop over the next two years," said Nick Heudecker, research director at Gartner.

These findings demonstrate that although the open source platform is proven and popular among seasoned developers who need a technology that can power large, complex applications, its fragmented ecosystem has made it difficult for enterprises to extract value from their Apache Hadoop investments.

Another glaring barrier to adoption is the rapid, fragmented growth of Apache Hadoop components and platform distributions, which slows Big Data ecosystem development and stunts enterprise implementation.

For legacy companies, platforms like Apache Hadoop can seem daunting and risky. If these enterprises cannot identify up front the baseline business value they stand to gain from a technology, they are unlikely to invest - and this is where industry standards come into play.

Increasing adoption of Apache Hadoop, in my opinion, will require platform distributions to stop asking legacy corporations to technologically resemble Amazon, Twitter or Netflix. Widespread interoperability standards - covering compatibility across platform distributions and the management and integration applications built on them - would let Big Data application and solution providers guarantee enterprises an official bare-minimum level of functionality and interoperability for their Apache Hadoop investments.
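To make the idea of a guaranteed bare minimum concrete, here is a minimal sketch of the kind of check a provider or enterprise might run against a distribution. It assumes only Hadoop's public client APIs (Configuration, FileSystem, VersionInfo) and the standard configuration files on the classpath; the BaselineProbe class is purely illustrative and not part of any formal specification.

// Hypothetical baseline probe: uses only Hadoop's public, stable client APIs,
// so it should behave the same on any distribution that honors them.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.util.VersionInfo;

public class BaselineProbe {
    public static void main(String[] args) throws Exception {
        // Report which Hadoop build the distribution ships.
        System.out.println("Hadoop version: " + VersionInfo.getVersion());
        System.out.println("Build: " + VersionInfo.getBuildVersion());

        // Load the cluster configuration from the standard locations
        // (core-site.xml and friends on the classpath).
        Configuration conf = new Configuration();

        // Resolve the default filesystem and confirm the root path is visible -
        // the kind of bare-minimum behavior a runtime standard could guarantee.
        FileSystem fs = FileSystem.get(conf);
        System.out.println("Default filesystem: " + fs.getUri());
        System.out.println("Root path exists: " + fs.exists(new Path("/")));
    }
}

Identical output from a probe like this across distributions is exactly the sort of behavior an interoperability standard would pin down in writing.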

This baseline of technological expectation will also benefit companies looking to differentiate their offerings. Standards within this open source-based Big Data technology will let application developers and enterprises build data-driven applications more easily: standardizing the commodity components of an Apache Hadoop platform distribution spurs the creation of more applications, which boosts the entire ecosystem.
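As a rough illustration of what coding to that standardized baseline looks like, the sketch below is the classic word-count job written only against Hadoop's stable org.apache.hadoop.mapreduce API. The PortableWordCount class name and the input/output paths are placeholders, but because nothing here is vendor-specific, the same jar should run unchanged on any distribution that honors the standard interfaces.

// A minimal word-count job written only against Hadoop's stable MapReduce API.
// Nothing here depends on a particular vendor distribution - the portability
// that a common standard is meant to protect.
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class PortableWordCount {

    // Emits (word, 1) for every token in the input split.
    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer tokens = new StringTokenizer(value.toString());
            while (tokens.hasMoreTokens()) {
                word.set(tokens.nextToken());
                context.write(word, ONE);
            }
        }
    }

    // Sums the counts emitted for each word.
    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "portable word count");
        job.setJarByClass(PortableWordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // input directory
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // output directory
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}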

A real-world illustration of standardization in practice comes from the container shipping industry, which grew significantly once universal guidelines were in place. After the International Organization for Standardization (ISO) adopted a formal shipping container standard to ensure the safe and efficient transport of containers, trade increased more than 790 percent over 20 years - an incredible case for unifying and optimizing an entire ecosystem to ensure its longevity.

To help today's enterprise buyers harness the estimated 4ZB of data the world is generating, the open data community will need to work together to support standardization across Apache Hadoop, giving new adopters confidence in their investment - regardless of the industry they serve.

From platform distributions to application and solution providers and system integrators, known standards to operate within will not only help sustain this piece of the Big Data ecosystem; they will also define how these pieces interoperate and integrate more simply, to the benefit of the ever-important enterprise.

More Stories By John Mertic

John Mertic is Director of Program Management for ODPi and Open Mainframe Project at The Linux Foundation. Previously, he was director of business development software alliances at Bitnami. He comes from a PHP and Open Source background, being a developer, evangelist, and partnership leader at SugarCRM, board member at OW2, president of OpenSocial, and frequent conference speaker around the world. As an avid writer, he has published articles on IBM Developerworks, Apple Developer Connection, and PHP Architect, and authored the book The Definitive Guide to SugarCRM: Better Business Applications and the book Building on SugarCRM.


