|By Ed Featherston||
|March 11, 2014 09:00 AM EDT||
When was the last time you had a technology conversation that did not include the word ‘cloud'? Gartner predicts that by 2016 the bulk of IT spend will be for the cloud. Gartner also believes that ‘nearly half of large enterprises will have hybrid cloud deployments by the end of 2017.' Cloud technology continues to evolve at breakneck speed, and business wants to move to the cloud just as fast. This presents significant challenges for technologists, who need to ensure the business doesn't go crashing into a brick wall while moving at these speeds.
Don't Let It Be by Accident
I will lead with a personal bias. Over the years, I have been brought into too many clients to help solve problems, only to discover that the root cause was an accidental architecture. You may ask: how does one create an architecture by accident? In Beware the Accidental Cloud Architecture, Bruce Tierney (a director for Oracle's SOA Suite) describes it quite well. He discusses how many lines of business (LOBs) will outsource applications and installations to a variety of SaaS vendors, frequently bypassing the IT organization entirely.
The net result is the "evolution of an architecture that was not planned but grew ad-hoc by incrementally adding one new connection after another to a cloud vendor into a new ‘accidental cloud architecture.'" The challenge becomes trying to integrate these siloed applications from different vendors, each with its own environments and tools. The integration itself becomes ad hoc. The costs inherent in dealing with the results of an accidental architecture far outweigh any perceived benefits.
The perception among many LOBs is that by going to the cloud they no longer need IT or architecture. In an earlier blog, Why Do I Need an Architect?, I discussed the on-again / off-again viewpoints on the need for architecture. The client I mentioned in that blog was among those that suffered from an accidental architecture. As many have heard me say before, no technology negates the need for proper planning and design. Someone must always be looking at the bigger picture, or failure will surely follow.
If we can agree we don't want our architectures to be accidental, we can now discuss a long-standing debate: Is architecture a science or an art?
The Argument for Science
At this point, you may be saying, ‘of course architecture is a science; it's technology.' First, I would argue that architecture is more than just technology. Technology is a key component that architecture addresses, but architecture's key driver is the business, and it includes process, management, and discipline as well as technology. At the risk of sounding clichéd, whether it's a science also depends on how you define science. Tom Graves recently wrote a blog discussing this subject, The Science of Enterprise Architecture. In it he puts forth a very compelling argument about ‘science as portrayed' vs. ‘science as practiced,' the inherent differences between them, and how they apply to architecture.
For those expecting architecture to be ‘science as portrayed,' there is an implied absolute certainty and control that is not realistically achievable. ‘Science as practiced' is more real-world and applicable to architecture. Every solution and architecture is a series of trade-offs; there are no perfect solutions you can be absolutely certain about. What you can achieve is a level of effective certainty and control, through the trade-offs made during design. By treating architecture as ‘science as practiced,' you will achieve far higher levels of control and certainty than an accidental architecture ever could.
One must be sure to set the proper expectations. Absolute certainty will not be achieved. Effective certainty can be achieved. It provides significant benefits over accidental architectures where there will be no certainty of outcomes.
The Argument for Art
I will admit another personal bias here. There is a part of me that views architecture (and technology in general) as a form of creativity and art. How many of us, when explaining or working on an architecture with a group, gravitate to a whiteboard and start drawing? Recently I was having this discussion on Twitter with several folks. One of them, Ruth Malan, pointed me to an interesting article in The New York Times, Architecture and the Lost Art of Drawing. The article is about building architecture, but I found it very relevant to technology architecture as well.
There was one quote in particular that struck me: ‘Drawings are not just end products: they are part of the thought process of architectural design. Drawings express the interaction of our minds, eyes and hands.' I find that statement true of technology architecture as well. Architecting systems is a science, a discipline, and a process, and that process is a creative one. This may sound counter to it being a science and a discipline, but I think it's part of the balance that helps produce quality systems. When trying to solve a complex problem, how often have you been encouraged to ‘think outside the box,' or to be creative? That, to me, is asking for the artist in you to emerge.
Are Your Architectures Science, Art or Accidental?
I sincerely hope we can all agree: no one wants accidental architectures. As you can tell, I view good architecture as a successful combination of science and art. In the ever-growing and expanding cloud universe, I find that balance even more critical. Science as practiced gives us the level of effective certainty needed to deliver robust and stable systems. Creativity enables us to think outside the box, giving us the opportunity to make leaps in leveraging technology for the benefit of the business that might not have been achieved otherwise.