Apache #Hadoop and #BigData Standards | @CloudExpo #IoT #M2M #BI #ML

The platform’s penetration into enterprises has not lived up to Hadoop’s game-changing business potential

Making Apache Hadoop Less Retro: Bringing Standards to Big Data

Ten short years ago, Apache Hadoop was just a small project deployed on a few machines at Yahoo; within a few years, it had become the backbone of Yahoo's data infrastructure. Today, the Apache Hadoop market is forecast to surpass $16 billion by 2020.

This might lead you to believe that Apache Hadoop is currently the backbone of data infrastructures for all enterprises; however, widespread enterprise adoption has been shockingly low.

While the platform is a key technology for gaining business insights from organizational Big Data, its penetration into enterprises has not lived up to Hadoop's game-changing business potential. In fact, "Despite considerable hype and reported successes for early adopters, 54 percent of survey respondents report no plans to invest [in Hadoop] at this time, while only 18 percent have plans to invest in Hadoop over the next two years," said Nick Heudecker, research director at Gartner.

These findings demonstrate that although the open source platform is proven and popular among seasoned developers who need a technology that can power large, complex applications, its fragmented ecosystem has made it difficult for enterprises to extract value from their Apache Hadoop investments.

Another glaring barrier to adoption is the rapid, fragmented growth of Apache Hadoop components and platform distributions, which slows Big Data ecosystem development and stunts enterprise implementation.

For legacy companies, platforms like Apache Hadoop seem daunting and risky. If these enterprises aren't able to initially identify the baseline business value they stand to gain from a technology, they are unlikely to invest - and this is where the value of industry standards comes into play.

Increasing adoption of Apache Hadoop, in my opinion, will require platform distributions to stop asking legacy corporations to technologically resemble Amazon, Twitter or Netflix. Through compatibility across platform distributions and across application offerings for management and integration, widespread interoperability standards would let Big Data application and solution providers guarantee enterprises an official bare minimum of functionality and interoperability for their Apache Hadoop investments.

This baseline of technological expectation will also benefit companies looking to differentiate their offerings. Standards within this open source Big Data technology will enable application developers and enterprises to build data-driven applications more easily - standardizing the commodity components of an Apache Hadoop platform distribution spurs the creation of more applications, which boosts the entire ecosystem.
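To make the idea of a guaranteed baseline concrete, here is a hypothetical sketch (in Python, for illustration only; the version numbers and the notion of a pinned baseline are assumptions, not part of any actual specification): an application verifies that the platform it runs on reports at least the standardized minimum version before relying on baseline-guaranteed behavior.

```python
# Hypothetical sketch: check a reported platform version against a
# standardized baseline. All versions here are illustrative examples.

def parse_version(version: str) -> tuple:
    """Turn a dotted version string like '2.7.3' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def meets_baseline(reported: str, baseline: str) -> bool:
    """True if the reported platform version is at or above the baseline."""
    return parse_version(reported) >= parse_version(baseline)

# Illustrative use: suppose a standard pins 2.7.0 as the guaranteed floor.
BASELINE = "2.7.0"
print(meets_baseline("2.7.3", BASELINE))  # True  - safe to proceed
print(meets_baseline("2.6.5", BASELINE))  # False - fall back or abort
```

With a published baseline like this, an application vendor can test against one floor instead of certifying separately against every distribution - which is exactly the commodity work a standard removes.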

A real-world illustration of standardization in practice comes from the container shipping industry, which grew significantly once universal guidelines were implemented. When the International Organization for Standardization (ISO) adopted a formal shipping container standard to ensure the safe and efficient transport of containers, trade increased by more than 790 percent over the following 20 years - a compelling case for unifying and optimizing an entire ecosystem to ensure its longevity.

To help today's growing pool of enterprise buyers harness the estimated 4ZB of data the world is generating, the open data community will need to work together to foster standardization across Apache Hadoop, giving new adopters confidence in their investment - regardless of the industry they serve.

From platform distributions to application and solution providers and system integrators, known standards in which to operate will not only help sustain this piece of the Big Data ecosystem pie, but will also define how these pieces interoperate and integrate more simply, for the benefit of the ever-important enterprise.

More Stories By John Mertic

John Mertic is Director of Program Management for ODPi and Open Mainframe Project at The Linux Foundation. Previously, he was director of business development software alliances at Bitnami. He comes from a PHP and Open Source background, being a developer, evangelist, and partnership leader at SugarCRM, board member at OW2, president of OpenSocial, and frequent conference speaker around the world. As an avid writer, he has published articles on IBM Developerworks, Apple Developer Connection, and PHP Architect, and authored the book The Definitive Guide to SugarCRM: Better Business Applications and the book Building on SugarCRM.
