Databricks Launches “Certified Spark Distribution” Program to Recognize Vendors Committed to Supporting the Apache Spark Application Ecosystem

Databricks, the company founded by the creators of Apache Spark, the next-generation Big Data engine, today announced the “Certified Spark Distribution” program for vendors with a commercial Spark distribution. Certification indicates that a vendor’s Spark distribution is compatible with the open source Apache Spark distribution, enabling “Certified on Spark” applications - those certified to work with Apache Spark - to run on the vendor’s distribution out of the box.

“One of Databricks’ goals is to ensure users have a fantastic experience. Our belief is that having the community work together to maintain compatibility and therefore facilitate a vibrant application ecosystem is crucial to this vision,” said Ion Stoica, Databricks CEO. “We first launched the ‘Certified on Spark’ program to help build a robust ecosystem of innovative applications on top of Apache Spark. The ‘Certified Spark Distribution’ program is the other half of the equation, recognizing vendors that are committed to providing a home for these applications to allow the ecosystem to flourish.”

In keeping with the open source nature of Spark, the certification process is fully transparent, lightweight, and 100% free, built on open-source tests - a mirror image of the “Certified on Spark” process for Spark applications. Vendors fill out a short questionnaire and then simply execute a set of open-source tests - developed and maintained by the community and used to test each release of Apache Spark - against their build of Spark to demonstrate compatibility.

“Certification shouldn’t be used as a tool for lock-in: Certified Spark Distributions are not required to ship all the bits of Apache Spark or to be open source, and they are not prevented from innovating significantly within and around Spark,” said Arsalan Tavakoli-Shiraji, Business Development Lead at Databricks. “They simply need to maintain compatibility with Apache Spark to support the application ecosystem.”

As part of the certification program launch, five vendors have completed the certification process: DataStax, Hortonworks, IBM, Oracle, and Pivotal - industry leaders that have recognized and embraced the power of Spark when integrated with their respective platforms. Each of these vendors put their distributions through the certification process, which included a host of integration tests to ensure full compatibility with the latest Apache Spark release.

“One of the big risks faced by open source projects is fragmentation among distributors. Fragmentation is bad for both users and application developers, and ultimately for the growth of the project,” said Matei Zaharia, Databricks CTO and VP of the Spark project at Apache. “We are delighted that these partners - along with others in the certification pipeline - share our vision of an undivided Spark platform based directly around Apache, and will ensure that all applications built on Apache Spark run on their distributions.”

Vendors interested in certifying their Spark distribution should visit www.databricks.com and select “Apply for Certification.” Enterprise users can also visit the Databricks site regularly to see the latest set of certified distributions and applications, and read “spotlight” blog articles that provide deep dives into the Spark ecosystem by newly certified vendors.

All the inaugural members will be on hand at the upcoming Spark Summit, June 30th to July 2nd in San Francisco, to discuss how Spark helps them better serve their customers. Additionally, an “Application Spotlight” segment will highlight innovative “Certified on Spark” applications.

Supporting Quotes:

"DataStax is strongly committed to making Cassandra and Spark the best combination for today's online applications," said Robin Schumacher, VP of products at DataStax. "We have demonstrated that commitment with the integration work we have contributed back to both open source communities as well as the certified versions of Spark and Cassandra we provide in DataStax Enterprise for production environments."

“We support the fact that the Apache Spark project provides enterprises with an additional processing engine in Hadoop to execute in-memory algorithms for advanced analytics,” said John Kreisa, vice president of strategic marketing at Hortonworks. “We applaud Databricks’ vision to ensure Spark is fully integrated on YARN, which enterprises have adopted as the data OS for Hadoop.”

"Pivotal's open source credentials are quite extensive - Apache-compatible Hadoop, MADLib, RabbitMQ, CloudFoundry - and now we've added Spark to that set," said Sarabjeet Chugh, Head of Hadoop Product Management at Pivotal. "Additionally, we recognize the importance of a unified community to enable the ecosystem to grow and so are thrilled to back this effort."

About Databricks

Databricks was founded by the creators of Apache Spark, who have been working for the past six years on cutting-edge systems to analyze and process Big Data. They believe that Big Data is a tremendous opportunity that is still largely untapped, and are actively working to revolutionize what enterprises can do with it. Databricks is venture-backed by Andreessen Horowitz. For more information, visit http://www.databricks.com.

Copyright © 2009 Business Wire. All rights reserved. Republication or redistribution of Business Wire content is expressly prohibited without the prior written consent of Business Wire. Business Wire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
