By Pete Pickerill
March 26, 2014 12:00 PM EDT
In the final post in this series about bringing DevOps patterns to database change management, we’re going to discuss the Third Way. Here’s a refresher on the Third Way from the introductory post in this series:
The Third Way: Culture of Continual Experimentation & Learning – This way emphasizes the benefits that can be realized through embracing experimentation, risk-taking, and learning from failure. By adopting this kind of attitude, experimentation and risk-taking lead to innovation and improvement while embracing failure allows the organization to produce more resilient products and sharpen skills that allow teams to recover more quickly from unexpected failure when it does occur.
The Third Way is by far the most intriguing of the “The Ways” to me. I’ve spent the lion’s share of my career in early stage start-ups where cycles of experimentation, learning, and failure are the norm. When bringing a new product or service to market, your latest release is never your last. It may not even be the last release this week. You are constantly experimenting with new workflows and technology, learning about your target market, and getting more valuable information from your early failures than your early successes. The Third Way is crucial to the success of an early stage company.
While the benefits of the Third Way still apply to more established companies and product lines, practicing it becomes more difficult. The potential negatives of experimentation and risk-taking are much harder to stomach when you have a large base of paying customers with SLAs. This aversion to risk is most acute when you’re talking about your data platform, where outages, performance problems and data loss are not an option. Complicating matters further is how difficult it can be to unwind the database changes that were made to support a specific version of your app. Application code can usually be uninstalled and replaced with the previous working version fairly simply should problems arise. Reverting the database changes that support that version of the application is more akin to defusing a bomb. Database changes must be reverted delicately and meticulously to avoid errors and omissions that could negatively impact your data platform.
What DBAs and release managers need to facilitate experimentation and risk taking on the data platform is a special combination of tools and process. This combination should make it easy to identify the root cause of issues, quickly remediate problems caused by application schema structure, and revert to a previous version of the schema safely. When we started Datical, we spent several hours in conversation and at white boards exploring these unique needs and hammering out a path to usher the Third Way into regular database activity.
A Rollback Designed With Every Change
The biggest problem with experimentation in the data platform is how difficult it is to move backward and forward through your schema’s version history. We feel the best way to bring more flexibility to the process of upgrading and reverting schema is an attitude shift. Your rollback strategy for each database change must become as important as the change itself. The best time to craft your rollback strategy is when the change itself is being designed. When the motivation for the change is fresh in your mind and the dependencies of the object being created, dropped or altered are clearly mapped out, a developer or DBA can better craft a rollback strategy for which every contingency has been considered. This leads to a stronger safety net and makes your application schema as agile and easily managed as your application code. The database no longer prevents you from being bold; instead, you can move quickly and safely between versions to accommodate experimentation.
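One way to make the rollback a first-class part of each change is to record the forward and reverse steps together, in the same change definition. Here is a minimal sketch of that idea in Python; the structure, names and SQL are illustrative assumptions, not Datical’s actual format:

```python
from dataclasses import dataclass


@dataclass
class Change:
    """A schema change paired with the rollback that undoes it.

    Because both directions are authored at the same time, the rollback
    is designed while the change's dependencies are still fresh in mind.
    """
    change_id: str
    apply_sql: list      # statements that move the schema forward
    rollback_sql: list   # statements that restore the previous state


# A hypothetical change: the rollback is written alongside the change,
# not improvised later under pressure.
add_email = Change(
    change_id="42-add-email-column",
    apply_sql=["ALTER TABLE users ADD COLUMN email VARCHAR(255)"],
    rollback_sql=["ALTER TABLE users DROP COLUMN email"],
)
```

With every change carrying its own reverse steps, a deployment history becomes a two-way street: each applied change knows exactly how to undo itself.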
The Richness of Model Based Comparison & Remediation
I’m a native Texan, born and raised in Austin. I know the jokes about how proud and vocal Texans can be about their home state. That being said, I spend more time telling people about the model based approach Datical has applied to Database Change Management than I do telling them how wonderful it is that I was fortunate enough to be born in the best state in the country. The advantages of the model based approach really shine through when it comes to experimentation and troubleshooting. The model allows you to annotate all objects and modifications to your application’s schema with the business reason that prompted them. This detailed history is invaluable when designing new changes or refactoring your schema as part of an experimental exercise. You immediately know what objects are most crucial, what your dependencies are, what areas you need to tread lightly in, and what areas are ripe for experimentation due to the changing needs of your business. Designing intelligently dramatically reduces risk.
Troubleshooting with models is also dramatically faster and more reliable than other methods. Programmatically comparing models allows you to determine the differences between two databases much more quickly than manually comparing diagrams or SQL scripts. You know with certainty exactly what has changed and what is missing in a fraction of the time that human review takes.
Once you have identified the differences, remediation is as simple as plugging one, some or all of the determined differences into the model and deploying those changes to the non-compliant instance.
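The comparison step described above can be sketched very simply. Treating a schema model as a mapping of tables to column sets (a deliberate simplification; a real model also carries indexes, constraints, and the business annotations mentioned earlier), a programmatic diff reduces to set arithmetic:

```python
def diff_models(expected: dict, actual: dict) -> dict:
    """Compare two schema models (table name -> set of column names)
    and report what is missing or extra in the actual database."""
    missing_tables = set(expected) - set(actual)
    extra_tables = set(actual) - set(expected)
    column_drift = {}
    for table in set(expected) & set(actual):
        missing = expected[table] - actual[table]
        extra = actual[table] - expected[table]
        if missing or extra:
            column_drift[table] = {"missing": missing, "extra": extra}
    return {
        "missing_tables": missing_tables,
        "extra_tables": extra_tables,
        "column_drift": column_drift,
    }


# Hypothetical example: the model says what should exist; the
# database snapshot shows what actually does.
model = {"users": {"id", "email"}, "orders": {"id", "user_id"}}
database = {"users": {"id"}}  # a non-compliant instance

drift = diff_models(model, database)
```

The result tells you with certainty what is missing (here, the `orders` table and the `users.email` column), and each entry in the drift report maps directly to a remediation change to deploy against the non-compliant instance.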
Flexible Quick Rollback
If you’ve taken our advice to implement your rollback strategy when you implement a change, recovering from disaster becomes testable, fast, and simple. Before going to production or a sensitive environment, you should always test your rollback steps in dev, test and staging. This will allow you to make any tweaks or changes to your rollback strategy before you are in a pinch. Think of it like testing your smoke alarm. Hopefully you’ll never need it, but it’s nice to know that it’ll work if you do.
Let’s say the worst happens. You deploy a new set of database changes and the application performance degrades or errors are logged. The decision is made to revert the entire installation to the previous version. Because you have carefully designed, tested and refined your rollback strategy, rolling back the database changes becomes a push button operation or a single invocation of a command line tool. No more running disparate SQL scripts or undoing changes on the fly. You can be confident that your database has been returned to the same state it was in before the upgrade.
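Under the hood, a push-button rollback amounts to replaying each applied change’s recorded rollback steps in reverse order of application, so dependent changes are undone before the changes they depend on. A minimal sketch, assuming each entry in the deployment history carries its own rollback SQL (as in the change-pairing idea earlier in this post):

```python
def rollback(applied_changes, execute):
    """Revert changes in reverse order of application. An error raised
    by `execute` stops the loop immediately, so no later (i.e., earlier
    -applied) change is touched after a failure."""
    for change in reversed(applied_changes):
        for stmt in change["rollback_sql"]:
            execute(stmt)


# Hypothetical deployment history, oldest change first. Instead of a
# real database connection, collect the statements to show the order.
log = []
history = [
    {"id": "1-create-users", "rollback_sql": ["DROP TABLE users"]},
    {"id": "2-add-email",
     "rollback_sql": ["ALTER TABLE users DROP COLUMN email"]},
]
rollback(history, log.append)
# The column is dropped before the table it belongs to.
```

Because the rollback steps were authored, tested and refined alongside the changes themselves, this single invocation replaces the old ritual of hunting down disparate SQL scripts under pressure.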
The database has long been handled with kid gloves, and for good reason. Data is a precious resource for consumers and businesses. Consumers provide data to businesses and trust that it will be kept safe and used to offer them better products and services. Businesses rely on data to strategize, grow, and become more efficient and profitable. It is the lifeblood of our economy. As our ability to collect and process data grows, the speed at which an enterprise must act on what it learns from that data grows with it. Data must be kept safe, but the database must become more agile to accommodate the growing pressure for faster value realization of business intelligence initiatives. DevOps patterns hold the key to this necessary agility while maintaining or improving the security and integrity of data stores. The database and DBAs need to be brought into the application design process earlier and must be treated as the first-class stakeholders that they are. Companies that acknowledge this and move to adopt DevOps patterns and include their database teams will be at a distinct competitive advantage over those that don’t. Don’t miss the boat!