Splice Machine Announces Advisory Board

Team of database and technology icons brought together to counsel Splice Machine in pursuit of a new generation of real-time enterprise applications

SAN FRANCISCO, Aug. 26, 2014 /PRNewswire/ -- Splice Machine, provider of the only Hadoop RDBMS, today announced its initial Board of Advisors, which includes technology luminaries Roger Bamford, Mike Franklin, Marie-Anne Neimat and Ken Rudin.

"We are honored and pleased to have support and expert guidance from some of the smartest and most experienced individuals in the database space," said Monte Zweben, co-founder and CEO, Splice Machine. "We are confident that the deep knowledge and unique backgrounds of our advisors, in combination with the skill sets of the Splice Machine team, give us the foundation to accelerate our footprint in the market."

Franklin, a widely published academic leader with an extensive knowledge of databases, along with Bamford, Neimat and Rudin, three industry leaders, will advise Splice Machine on its product roadmap and go-to-market strategy.

Roger Bamford is a database luminary and until recently, the Principal Architect of Server Technologies at Oracle. Mr. Bamford, known as the father of Oracle RAC, was an original member of Oracle's database team. Holder of dozens of patents relating to database and clustering technology, he was responsible for many software innovations relating to the architecture and performance of relational databases at Oracle.

Michael Franklin is the Siebel Professor of Computer Science and Chair of the Computer Science Division at UC Berkeley. He is also Director of the UC Berkeley AMPLab, where the open-source data analytics cluster computing framework, Apache Spark, was created. Dr. Franklin has more than 30 years in the database and distributed systems fields as a faculty member, entrepreneur, architect, researcher, consultant, and software developer.

"Over the past 30 years or so, it's been fascinating to see all of the new disruptive innovations in the database industry, but what Splice Machine is doing is truly unique," said Franklin. "As they gain traction in the marketplace, customers of legacy transactional database systems will have a very viable alternative to their costly, scale-up databases." 

In-memory database pioneer Marie-Anne Neimat co-founded TimesTen, Inc., the first company to develop and commercialize an in-memory relational database. Neimat brings a track record of database innovation, having served as Vice President of Engineering for three Oracle databases: Oracle TimesTen In-Memory Database and Oracle Berkeley Database, both acquired by Oracle, and Oracle NoSQL Database. Neimat holds several patents and has authored many publications in refereed conferences and journals.

Ken Rudin, head of analytics for Facebook, is an entrepreneur with a unique combination of leadership skills, analytical strength, and technology and marketing expertise. Ken honed his database knowledge at Oracle, where he served as General Manager of Data Warehousing and Parallel Systems. He also served as VP and General Manager of the Siebel CRM OnDemand Division and VP of Marketing for Siebel Analytics. He further enhanced his skills at Zynga, where he served as VP of Analytics and Platform Technologies. His passion for evangelizing new ideas will help Splice Machine increase its share of voice as a leader in the space.

"The database market is undergoing fundamental shifts in customer needs," said Rudin. "Increasingly, applications and analytics require real-time data and transactions to truly provide deeper insights and respond in real-time. The Splice Machine team stands out to me as having the right combination of vision, product knowledge and expertise to unlock those insights and emerge as a disruptive force in databases by powering a whole new generation of data-driven applications for the enterprise."

The Splice Machine database enables companies to replace traditional RDBMSs that are too costly or too difficult to scale. A full-featured, transactional Hadoop RDBMS with ANSI SQL compatibility, Splice Machine moves Hadoop beyond its batch-analytics heritage to power operational applications and real-time analytics.

To download Splice Machine and try out The Hadoop RDBMS for yourself, visit http://www.splicemachine.com/download.

About Splice Machine

Splice Machine's Hadoop RDBMS is designed to scale real-time applications using commodity hardware without application rewrites. The Splice Machine database is a modern, scale-out alternative to traditional RDBMSs, such as Oracle®, MySQL™, IBM DB2® and Microsoft SQL Server®, that can deliver over a 10x improvement in price/performance. As a full-featured Hadoop RDBMS with ACID transactions, the Splice Machine database helps customers power real-time applications and operational analytics, especially as they approach Big Data scale.

© 2014 Splice Machine, Inc. All rights reserved. Splice Machine and the Splice Machine logo are trademarks or registered trademarks of Splice Machine, Inc.; all other logos and trademarks mentioned are the property of their respective owners.

SOURCE Splice Machine

