
What Will It Really Take to Provide an Agile Big Data Infrastructure?

Meet CodeFutures Corporation at Cloud Expo

"An agile approach to data is really a requirement of just about any application, but even more so with Big Data," stated Cory Isaacson, CEO/CTO of CodeFutures Corporation, in this exclusive Q&A with @ThingsExpo conference chair Roger Strukhoff. "What the enterprise needs is a data platform that can adapt to changing requirements in a flexible and rapid manner. This has not been the case with existing databases."

Cloud Computing Journal: With the MapDB announcement, it sounds as though, to some degree, you're bringing the Java programming language into the 21st-century modern enterprise. To what degree do you agree with that statement?

Cory Isaacson: This is a very good assessment. While Java has always been a capable language, there have been many barriers when it comes to delivering full-featured database technology. For example, APIs like JDBC have worked for many years, but fast performance, convenience, and tight integration between Java applications and the database were not there. With MapDB, Java developers now have the convenience and agility of the native Java Collections API combined with the power of a very fast database engine.
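To illustrate the point about the Collections API, here is a minimal sketch in plain Java. The class `EventCounter` and its method names are hypothetical, but the pattern is the one the interview describes: application code written against the standard `ConcurrentMap` interface, so that an in-memory `ConcurrentHashMap` could, in principle, be swapped for a database-backed map (MapDB's maps implement these same `java.util` interfaces) without changing the application logic.

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

// Application code written against the standard ConcurrentMap interface.
// A ConcurrentHashMap here could be swapped for a persistent,
// database-backed map exposing the same interface, leaving the
// record/countOf logic untouched.
public class EventCounter {
    private final ConcurrentMap<String, Long> counts;

    public EventCounter(ConcurrentMap<String, Long> backingStore) {
        this.counts = backingStore;
    }

    public void record(String event) {
        // merge() is an atomic read-modify-write on any ConcurrentMap
        counts.merge(event, 1L, Long::sum);
    }

    public long countOf(String event) {
        return counts.getOrDefault(event, 0L);
    }

    public static void main(String[] args) {
        EventCounter counter = new EventCounter(new ConcurrentHashMap<>());
        counter.record("login");
        counter.record("login");
        counter.record("purchase");
        System.out.println(counter.countOf("login"));    // 2
        System.out.println(counter.countOf("purchase")); // 1
    }
}
```

Coding against the interface rather than a concrete store is what makes the "database as a Collection" idea agile: the storage engine can change without rippling through the application.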

Cloud Computing Journal: You emphasize a lightweight, agile approach. To what degree is this simply a requirement of Big Data applications and to what degree do you think agility is required in general for the modern enterprise? Does the real-time (or almost real-time) nature of a lot of Big Data also drive a need for agility?

Isaacson: An agile approach to data is really a requirement of just about any application, but even more so with Big Data. What the enterprise needs is a data platform that can adapt to changing requirements in a flexible and rapid manner. This has not been the case with existing databases. For example, the NoSQL engines move in this direction, but with a Big Data store it is really tough to gain agility.

Added to this need are the burgeoning real-time data requirements. With a real-time data flow, it is critical to know what is happening now; it is not enough to get results from a typical historical time window (such as days or weeks after something has occurred). The reason an agile approach is so critical is because the results needed from real-time requirements are also likely to change at a rapid pace, more than with traditional enterprise or Big Data applications.

Cloud Computing Journal: What benefits will your customers and your product receive from the open-source approach?

Isaacson: MapDB is freely available under the Apache 2.0 license, which allows customers (or anyone) to use the product as they see fit. Being open source also means that we receive major feedback from users, enabling extremely fast support and stability in the product. We often hear within days or hours if there is an issue with a pre-release version, so it can be addressed before it makes its way into a final release. The support of the community is vital.

Cloud Computing Journal: You mention support of databases up to 100GB in size - are there "typical" databases of this size that you encounter? In other words, what sorts of applications and initiatives are driving databases of this size?

Isaacson: We have seen MapDB used for everything from pure in-memory workloads to large disk-based databases. So I would not say there is a "typical" size, but the good news is that developers can comfortably scale their databases as needed, from small to very large, without needing to be concerned with the details.
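As a toy illustration of the memory-to-disk spectrum, the sketch below persists a map to a file using plain Java serialization and reads it back. This is not how MapDB works internally (it uses its own storage engine rather than `ObjectOutputStream`); it only shows the idea that the same `Map` data can live in memory or on disk while the calling code stays the same. The class name `DiskBackedMapDemo` is invented for this example.

```java
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.HashMap;

// Toy demo: move a map between memory and disk. A real embedded engine
// such as MapDB does this transparently and incrementally, so the data
// can outgrow RAM without the application changing how it reads/writes.
public class DiskBackedMapDemo {

    static void save(HashMap<String, Long> map, Path file) throws IOException {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(Files.newOutputStream(file))) {
            out.writeObject(map); // serialize the whole map to disk
        }
    }

    @SuppressWarnings("unchecked")
    static HashMap<String, Long> load(Path file)
            throws IOException, ClassNotFoundException {
        try (ObjectInputStream in =
                 new ObjectInputStream(Files.newInputStream(file))) {
            return (HashMap<String, Long>) in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        Path file = Files.createTempFile("store", ".bin");
        HashMap<String, Long> map = new HashMap<>();
        map.put("users", 42L);

        save(map, file);                               // spill to disk
        HashMap<String, Long> reloaded = load(file);   // read it back
        System.out.println(reloaded.get("users"));     // 42

        Files.delete(file);
    }
}
```

The limitation of this naive approach, serializing the entire map on every save, is exactly why a real engine stores and updates records incrementally on disk.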

Cloud Computing Journal: What sort of increased interest in Big Data have you seen over the past year or so, and what sort of questions do you anticipate from customers at Cloud Expo?

Isaacson: The interest in Big Data is explosive right now, especially given the new real-time element, and new data generators such as the Internet of Things.

Virtually every customer we are working with has some sort of Big Data initiative or requirement; it is becoming tightly integrated into business strategies and planning. Big Data will be a vital commodity in the economy from here on out and it will affect almost every business from large to small.

The types of questions we anticipate at Cloud Expo are:

  • What will it really take to provide an agile Big Data infrastructure?
  • How can real-time data flows be leveraged for real-time strategic advantage?
  • What capabilities will enable application developers to respond faster to business requirements, without the restrictive nature of current database technologies? In other words, how can we make the job for developers easier, while enabling far more powerful Big Data access?

In addition to all of these questions, we expect to get quite a few regarding our upcoming technology releases - we'll have many things to discuss with technologists and managers while at the event.

More Stories By Elizabeth White

News Desk compiles and publishes breaking news stories, press releases and latest news articles as they happen.


