Using DBaaS to Align Database Management with Your Application Delivery Processes

Today’s application development teams are moving fast, but is your team being held back by slow database management practices and tools? For many organizations, this is a significant problem that deserves as much attention as getting application code changes flowing smoothly into production. In fact, they are tightly interdependent challenges that need to be solved together.

Your Database Environment: Perception vs. Reality

Think about the application development process for a moment. How long does it take you to stand up a new copy of your database? Are you sure you have configured everything correctly and consistently? Do your developers and testers have access to production data for realistic testing? For most organizations, it’s unlikely that the answers to these questions are “minutes”, “very certain”, and “yes”.

It’s also critical to embrace the distributed and multi-instance nature of your organization’s data landscape. Gone are the days of a single, large and controlled database that sits underneath all of your applications. Even in a single application team, it’s a fairly universal truth that you will have multiple environments. From individual developer sandboxes, to integration and test infrastructure, and through to production clusters (and even DR copies), we aren’t really working with a single monolithic database environment. Each one of these environments means a new deployment of your database configuration.

Mapping Database Management to Application Delivery Processes

Data is different from code. Data represents the persistent state of your application, yet dynamically changes and grows as the application is used. Code, on the other hand, is much more of a static entity that can be versioned cleanly and deployed through configuration management tools. Many advanced approaches and tools have been developed to manage configuration drift, and ensuring that the code that is expected to be deployed is, in fact, the code deployed is now easier than ever. It’s the dynamic nature of data that makes managing it different from code deployments.

So what can you do about this difference? First, understand that there are aspects of your database environment that can be managed in a similar way as code and configuration. They fall into two buckets: the definition (configuration) of the database system itself and the definition of the schema / database code. Next, think about how the data being generated by your production application can be re-purposed efficiently for testing both new versions of your applications and the processes you will use to deploy your applications.

One great example of these concepts rolled into a consistent line of thinking is the work of Martin Fowler and Pramod Sadalage on Evolutionary Database Design. Evolutionary Database Design is a methodology for ensuring that your database environment is able to adapt and evolve along with your application. It advocates several specific approaches to ensuring that the database layer is an integral part of the entire application delivery pipeline.

While the Evolutionary Database Design pattern is certainly only one of many approaches to take, it’s well thought through and worth reviewing to find some key themes:

  1. Database instances need to be (re)generated on-demand – Development, integration, test, staging, and production environments should be consistent.
  2. Database schemas should be defined as “refactorings” – Your schema should be created by a series of evolutionary migrations from one version to the next.
  3. Actual data matters – Use test data early in the application’s lifetime, but be prepared to use copies of production data later.
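The “refactorings” theme above can be sketched as an ordered list of migrations that a small runner applies exactly once, in sequence. This is a minimal illustration using SQLite; the migration names, table layout, and `schema_version` bookkeeping table are all hypothetical choices, not a prescribed tool:

```python
import sqlite3

# Hypothetical ordered migrations: each entry evolves the schema one step.
MIGRATIONS = [
    ("001_create_users", "CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)"),
    ("002_add_email", "ALTER TABLE users ADD COLUMN email TEXT"),
]

def migrate(conn: sqlite3.Connection) -> list:
    """Apply any migrations not yet recorded, in order; return those applied."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (name TEXT PRIMARY KEY)")
    done = {row[0] for row in conn.execute("SELECT name FROM schema_version")}
    applied = []
    for name, ddl in MIGRATIONS:
        if name not in done:
            conn.execute(ddl)
            conn.execute("INSERT INTO schema_version VALUES (?)", (name,))
            applied.append(name)
    conn.commit()
    return applied

conn = sqlite3.connect(":memory:")
print(migrate(conn))  # first run applies both migrations, in order
print(migrate(conn))  # second run is a no-op: []
```

Because each environment replays the same ordered list, a developer sandbox and a production cluster arrive at the same schema version by construction, which is the point of treating schema changes as refactorings.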

How does DBaaS Fit?

A Database-as-a-Service (DBaaS) platform can play a large role in how you advance your application delivery processes. Specifically, DBaaS can address two of the three themes from above: instance creation and data portability.

Instance creation is perhaps the easiest to comprehend, because that’s what most DBaaS solutions are solving at their core. Think of the definition of your database environment as configuration metadata (or a blueprint); deploying a new database instance from that blueprint on a DBaaS platform should result in a consistent database engine deployment for each instantiation. Whether the environment’s specification is as simple as a single node or as complex as a multi-tier cluster configured for sharding, it should be just as easy to get a new environment online.
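As a rough sketch of the blueprint idea, the snippet below models a blueprint as immutable configuration metadata and a provisioning call as a pure function of that blueprint. The `Blueprint` and `provision` names are illustrative, not a real DBaaS API:

```python
from dataclasses import dataclass

# Hypothetical blueprint: immutable configuration metadata for a DB environment.
@dataclass(frozen=True)
class Blueprint:
    engine: str
    version: str
    nodes: int = 1
    sharded: bool = False

def provision(bp: Blueprint, env: str) -> dict:
    """Stub of a DBaaS 'create instance' call: every environment gets
    exactly the configuration the blueprint declares."""
    return {"env": env, "engine": f"{bp.engine}:{bp.version}",
            "nodes": bp.nodes, "sharded": bp.sharded}

bp = Blueprint(engine="postgres", version="15", nodes=3, sharded=True)
dev = provision(bp, "dev")
prod = provision(bp, "prod")
# Only the environment label differs; the engine configuration is identical.
assert {k: v for k, v in dev.items() if k != "env"} == \
       {k: v for k, v in prod.items() if k != "env"}
```

The design point is that the blueprint, not the operator, carries the configuration, so dev, test, and production deployments cannot drift apart at creation time.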

The second role that a DBaaS platform can play is in moving the data itself through the application lifecycle process. It should support operations like replica creation and database cloning, again via a self-service UI and APIs. These two functions are often thought of as “operational” considerations for a database environment, but if you look at the whole application development system, they become surprisingly useful to the entire team.

First, let’s define them:

[Figure: Moving Data and DB Config through the Application Lifecycle]

  • Cloning is when a snapshot of the database system is created and then transferred into a new “instance”. It’s a point-in-time copy created on demand.
  • Replication is the synchronous or asynchronous replay of database changes against a “copy” of the database, which can then be detached from the “master” database to operate as a standalone system.

Both of these features offer a similar end result: you get a copy of the database that you can use for other purposes. Cloning takes a bit longer, but can easily happen off hours to minimize impact on the production environment. Replication can ensure a much faster “turn up” of the new environment with up-to-date data, but needs to be created ahead of the request for an instance. Regardless of the differences, they both provide a foundation with which each developer, tester and release manager can easily get a version of the production database when it’s needed.
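That trade-off can be summarized in a tiny decision helper. The function name and the strategy labels are hypothetical; real platforms expose these operations differently:

```python
# Hypothetical helper: choose how to produce a database copy based on
# freshness requirements and whether a standing replica already exists.
def copy_strategy(needs_latest_data: bool, replica_available: bool) -> str:
    if needs_latest_data and replica_available:
        # Detaching an existing replica gives a fast turn-up with
        # up-to-date data, but the replica had to be created in advance.
        return "detach-replica"
    # A clone is a point-in-time snapshot (e.g. last night's), slower to
    # produce but easy to schedule off-hours with no standing infrastructure.
    return "restore-clone"

assert copy_strategy(True, True) == "detach-replica"
assert copy_strategy(True, False) == "restore-clone"
assert copy_strategy(False, True) == "restore-clone"
```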

Think about that in the context of bug fixing: all too frequently, developers are handed user reports of an issue that is hard to reproduce without a copy of the production data in place to replicate the situation. If it were easy for a developer to instantiate their own copy of the production environment (on demand) from a clone created each night, that developer could resolve the bug much more effectively. Not only that, but the testing process could verify that the proposed patch does, in fact, solve the problem with real-world data as input.
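A client sketch of that nightly-clone workflow might look like the following. Every class and method name here is invented for illustration; a real DBaaS would expose equivalent calls over its own API:

```python
import datetime

# Hypothetical stub of a DBaaS client: a developer spins up a private
# sandbox from last night's clone to reproduce a production-only bug.
class DBaaSClient:
    def nightly_clone(self, db: str, day: datetime.date) -> str:
        """Return the identifier of the point-in-time clone taken that night."""
        return f"{db}-clone-{day.isoformat()}"

    def instantiate(self, clone_id: str, owner: str) -> dict:
        """Create a standalone instance for `owner`, seeded from the clone."""
        return {"instance": f"{owner}-sandbox", "source": clone_id}

client = DBaaSClient()
clone = client.nightly_clone("orders", datetime.date(2018, 11, 12))
sandbox = client.instantiate(clone, owner="dev-alice")
print(sandbox["source"])  # orders-clone-2018-11-12
```

Wrapped in an API like this, "get me production data" stops being a ticket filed with operations and becomes a self-service step in the developer's own workflow.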

Summarizing

You can move code easily, but data and databases are a much harder challenge without the right tools. A well-designed Database-as-a-Service (DBaaS) environment can play a key role in helping you be more nimble in your treatment of data services within your application development pipeline. A DBaaS should offer solutions to both the “coarse-grained” configuration management / automation challenges of the database engines themselves and the need for self-service access to copies of the dynamic data that your applications generate.

The right DBaaS solution should give you:

  • Self-service access to new DB instances for your development and testing teams
  • The ability to move database engine configurations through the application development pipeline
  • An easy way to pull data back into staging, test, and dev environments so your teams can work with real-world data
  • Support for production rollout of application releases through replication and/or database cloning
  • APIs that make these operations an integral part of your continuous integration and deployment processes


About the Author

Bob Gourley writes on enterprise IT. He is a founder of Crucial Point and publisher of CTOvision.com.
