The DevOps Database | Part 2

Applying Systems Thinking to Database Change Management

In my first post in this series, I discussed the underpinning principles of all DevOps patterns as eloquently stated by Gene Kim, author of "The Phoenix Project."  In this post I'd like to dig a little deeper into The First Way.  As a refresher:

The First Way: Systems Thinking - This Way stresses the performance of the entire system of value delivery. Instead of becoming laser-focused on the part of the process for which an individual or team is responsible, the individual or team works to understand the entire process from requirements generation to customer delivery. The goal is to eliminate the delivery impediments that arise when a project transitions from one isolated silo to another. Understanding the entire system allows business, development, and operations to work towards a common goal in a consistent manner.

When we started Datical, our first step was to perform extensive market validation. We quickly learned that database schema management was going to be a tough nut to crack. In the scores of conversations we had with people across the ALM spectrum, we learned that the process for managing and updating the database schema that supports an application was at best murky and at worst a black box. So how do we elevate a process owned and understood by a few people in an organization to the level of visibility required to understand it as part of a larger system?

What follows is a list of Systems Thinking concepts that we've rallied around in building our database change management solution, Datical DB. Instead of a black box, database change management can become a transparent and flexible part of your value delivery system.

Start With Reality
When beginning a new project based on previous development, don't rely on a stack of SQL scripts sitting on a file server or even in a source code control system. As you know, databases evolve over time, and sometimes out-of-process changes happen. When a database schema is modified to resolve an error condition or performance degradation, these alterations are usually handled in a support ticketing system, and you can never be certain they made it back into the stack of scripts used to build out a fresh environment. In light of this, any database change management solution should start by generating a baseline from the working system: your production schema. This ensures that design and development activity take into account not only the schema objects generated in Dev, but also those that originated at other stops in the system.
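
As a minimal sketch of the baseline-first idea, the snippet below dumps the DDL of every object actually present in a live database. It uses Python's standard-library sqlite3 as a stand-in for a production connection; Datical DB's own baselining is richer, and the file names here are hypothetical.

import sqlite3

def snapshot_schema(db_path, baseline_path):
    """Dump the DDL of every object actually present in the database."""
    conn = sqlite3.connect(db_path)
    try:
        # sqlite_master holds the authoritative DDL for tables, indexes,
        # views, and triggers -- the schema as it really is, not as the
        # scripts on the file server say it should be.
        rows = conn.execute(
            "SELECT type, name, sql FROM sqlite_master "
            "WHERE sql IS NOT NULL ORDER BY type, name"
        ).fetchall()
    finally:
        conn.close()
    with open(baseline_path, "w") as f:
        for obj_type, name, ddl in rows:
            f.write("-- %s: %s\n%s;\n\n" % (obj_type, name, ddl))

if __name__ == "__main__":
    snapshot_schema("production.db", "baseline.sql")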

Don't Script! Model!
To keep this blog post short(er), I'll point you to a blog post I wrote a few months ago on modeling vs. scripting. The short version, sketched below: instead of hand-writing the SQL to change a schema, you declare the schema you want and let the tool derive the change.
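
Here is a toy illustration of the difference, with hypothetical table and column names. The model declares the desired state as data; the ALTER statement is computed rather than hand-written.

# The model: the schema you want, declared as data (hypothetical names).
desired = {"customer": {"id": "INTEGER", "name": "TEXT", "email": "TEXT"}}
# The schema the target database currently has.
current = {"customer": {"id": "INTEGER", "name": "TEXT"}}

def diff_to_sql(current, desired):
    """Derive ALTER statements for columns in the model but not the database."""
    stmts = []
    for table, cols in desired.items():
        for col, col_type in cols.items():
            if col not in current.get(table, {}):
                stmts.append("ALTER TABLE %s ADD COLUMN %s %s;"
                             % (table, col, col_type))
    return stmts

print("\n".join(diff_to_sql(current, desired)))
# Prints: ALTER TABLE customer ADD COLUMN email TEXT;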

Version Twice, Deploy Once
The first level of versioning any database change management tool should provide is versioning of the gold standard. Versioning the scripts or model you use to build a fresh database instance for your application provides a ton of benefits: you can track how your schema has evolved over time and across releases; you can tightly couple a schema definition to the version of the application it supports, using the same branch/tag/merge workflow you use for your application code; and you can quickly stand up a new database instance that you know is correct for any released version or experimental branch you are working in. The second level of versioning takes place in each database instance. Tracking the version of the schema deployed on a specific instance makes deployment and troubleshooting much easier. Will this application build work with the schema on this instance? What changes need to be applied to this database to catch it up to the latest version? Were the right changes applied to this test database to validate a closed defect? If you are tracking the individual changes applied in a database and the impetus for those changes, these questions can be answered quickly and easily.
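
As a rough sketch of that second level, the snippet below records each applied change in a tracking table inside the instance itself, so redeploying is an automatic no-op. The table and change names are hypothetical, and Datical DB keeps far richer metadata.

import sqlite3

# An ordered list of (change id, DDL) pairs -- hypothetical changes.
CHANGES = [
    ("1.0-create-customer",
     "CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)"),
    ("1.1-add-email",
     "ALTER TABLE customer ADD COLUMN email TEXT"),
]

def deploy(conn):
    """Apply only the changes this instance hasn't seen yet."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS schema_changelog ("
        "change_id TEXT PRIMARY KEY, "
        "applied_at TEXT DEFAULT CURRENT_TIMESTAMP)")
    applied = {row[0] for row in
               conn.execute("SELECT change_id FROM schema_changelog")}
    for change_id, ddl in CHANGES:
        if change_id in applied:
            continue  # already on this instance
        conn.execute(ddl)
        conn.execute("INSERT INTO schema_changelog (change_id) VALUES (?)",
                     (change_id,))
    conn.commit()

conn = sqlite3.connect(":memory:")
deploy(conn)  # applies both changes
deploy(conn)  # no-op: the changelog says this instance is current

With the changelog in place, "what changes does this database need?" becomes a simple set difference between the gold standard and what the instance reports about itself.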

Unify Your Modes of Delivery
We've found that a lot of uncertainty was generated when database updates were handled by several individuals using their own tools and methods to effect the required changes. To remove this uncertainty, all database changes must be executed using the same tools and processes across departments and individuals. The tricky part: a continuous integration system has different requirements than a developer working iteratively or a DBA processing a batch of changes in a headless environment during a maintenance window. To unify your database change process, the solution you use should be accessible to all of these individuals while maintaining the consistency of deployment activities. That's why Datical DB provides a rich GUI experience for developers that helps them craft and deploy changes in dev environments; tight integrations with popular build and release automation frameworks that preserve the frameworks' workflows while providing Datical DB functionality; and a command line interface that allows users in headless environments to deploy changes in the exact same manner that the GUI or a third-party integration would. By unifying the modes of delivery you are constantly testing your release practices. By the time the production push rolls around, your system works.
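
One common way to get that unification, sketched below with hypothetical names: keep a single deployment routine and expose it through every mode of delivery, so the GUI, the CI plugin, and the headless CLI all exercise exactly the same code path.

import argparse

def deploy(target, version, dry_run=False):
    """The single code path every mode of delivery calls."""
    action = "Would apply" if dry_run else "Applying"
    print("%s schema version %s to %s" % (action, version, target))
    # ...the same change-tracking deploy logic sketched earlier runs here...

def main():
    # Headless/CI entry point; a GUI would import and call deploy() directly.
    parser = argparse.ArgumentParser(description="Deploy schema changes")
    parser.add_argument("target", help="database connection alias")
    parser.add_argument("--version", default="latest")
    parser.add_argument("--dry-run", action="store_true")
    args = parser.parse_args()
    deploy(args.target, args.version, dry_run=args.dry_run)

if __name__ == "__main__":
    main()

Because the CLI and the CI integration are thin wrappers over the same function, every developer deploy and every nightly build is also a rehearsal of the production release.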

More Stories By Pete Pickerill

Pete Pickerill is Vice President of Products and Co-founder of Datical. Pete is a software industry veteran who has built his career in Austin’s technology sector. Prior to co-founding Datical, he was employee number one at Phurnace Software and helped lead the company to a high profile acquisition by BMC Software, Inc. Pete has spent the majority of his career in successful startups and the companies that acquired them including Loop One (acquired by NeoPost Solutions), WholeSecurity (acquired by Symantec, Inc.) and Phurnace Software.
