Greedy (IT) Algorithms

Greedy algorithms can result in the right solution in the end, but rarely do

Don and I were having a discussion with our oldest son the other night about writing a chess program. There are myriad options for implementing the learning aspects of a chess program, but this is not a task for the timid. He ended up proposing a much simpler solution (this was just an exercise in ‘can I write it’, after all) that would have essentially used a very greedy algorithm: one that made a decision regarding the computer’s next move based on the current state of the board and what move would give it the most benefit right now.
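
For the curious, a minimal sketch of that "very greedy" move chooser might look something like the following. The Move type, the board dictionary, and the piece values here are simplified, hypothetical stand-ins; a real chess program would need proper move generation and a far richer evaluation.

```python
from collections import namedtuple

# Hypothetical, simplified types used only for illustration -- a real chess
# program would need full board and move-generation machinery.
Move = namedtuple("Move", ["piece", "source", "target"])

PIECE_VALUE = {"P": 1, "N": 3, "B": 3, "R": 5, "Q": 9}  # rough material values

def greedy_move(board, legal_moves):
    """Pick whichever legal move grabs the most material *right now*,
    with no look-ahead at what the opponent can do in reply."""
    def immediate_gain(move):
        captured = board.get(move.target)   # opposing piece on the target square, if any
        return PIECE_VALUE.get(captured, 0)
    return max(legal_moves, key=immediate_gain)

# Tiny example: taking the queen looks best locally, even if it walks into a trap.
board = {"d5": "Q", "e5": "P"}              # squares occupied by opposing pieces
moves = [Move("N", "c3", "d5"), Move("N", "c3", "e4"), Move("P", "d4", "e5")]
print(greedy_move(board, moves))            # Move(piece='N', source='c3', target='d5')
```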

For those of you not familiar with the concept, here’s a good definition from Wikipedia:

A greedy algorithm is any algorithm that follows the problem solving meta-heuristic of making the locally optimal choice at each stage with the hope of finding the global optimum.

The relevant (for this discussion at least) part of the definition is “making the locally optimal choice at each stage with the hope of finding the global optimum.” Greedy algorithms, it turns out, are not just for programmers and developers, and they imitate real life more closely than we might think.
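
A tiny worked example makes the pitfall concrete: making change for 6 using coins worth 4, 3, and 1, where the greedy choice is always to grab the largest coin that still fits.

```python
def greedy_change(amount, coins=(4, 3, 1)):
    """Always take the largest coin that still fits -- the locally optimal choice."""
    used = []
    for coin in coins:            # coins listed largest-first
        while amount >= coin:
            used.append(coin)
            amount -= coin
    return used

print(greedy_change(6))   # [4, 1, 1] -- three coins
# The global optimum is [3, 3] -- two coins. Grabbing the 4 looked best at that
# stage, but it ruled out the better overall answer.
```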

See, IT often makes its decisions using greedy algorithms. Choices are made based on what is optimal right now with the hope that it will also be optimal in the long run. Greedy algorithms can turn out to be “the right choice” in the long run, but it’s often because of luck (karma, chance, whatever) and not because of any great strategic thinking on the part of the decision makers.

We often see the use of greedy algorithms to make decisions when there is a problem with application delivery: performance, security, reliability. Because there is a problem right now that needs to be solved, well, right now, IT uses a very greedy algorithm that chooses a solution that is optimal now, but is often not the right choice in the long run.


HOW TO LOSE AT (IT) CHESS

What generally happens is this: an application has reached capacity, but usage is still growing. IT immediately executes a greedy algorithm, the result of which is to purchase a load balancer and a second server. Problem solved. But as usage grows so do attacks, and suddenly there crops up another problem, this time one of application security. IT again executes a greedy algorithm and comes up with a web application firewall. An acquisition and a deployment later and voila! problem solved.

But relatively soon after that performance begins to be a problem. IT, falling back on what it knows best, executes yet another greedy algorithm and comes up with another locally optimal solution: application acceleration. Another point solution is purchased and deployed, with a little more difficulty due to the growing complexity in the network and application architecture, and voila! problem solved.

You can see that as this continues, the number of point solutions employed to solve each new problem will continue to grow, increasing the costs of managing and maintaining the architecture while simultaneously making it more and more difficult to troubleshoot any issues that arise in the future.

Using a greedy algorithm to make decisions about IT’s “next move” does not guarantee the optimal long term solution; it merely means that the optimal solution right now has been chosen.

That’s tactical thinking, not strategic thinking. And to be successful at (IT) chess you have to fit those tactical moves – which are necessary – into a broader strategy.


SACRIFICE THE PAWN

IT, like a chess grandmaster, needs to think more strategically about solutions for the delivery of applications. Rather than viewing the chessboard architecture of IT as a set of connected but disparate squares, it needs to be viewed as a holistic battlefield across which a strategy can be employed that results in long-term success.

We know that deploying applications requires long-term consideration for scalability, reliability, security, and performance. At some point one or all of these concerns will be a problem in need of a solution in the data center. Rather than look at the architecture of the data center with a greedy eye, it is more efficient – financially, architecturally, and from a management point of view – to look at the architecture with an eye toward an optimal solution that affords an opportunity later on to “make the winning move” when other issues crop up.

Early on you may have a need only for scalability or reliability. A load balancer is certainly an answer to those needs, but it is not the globally optimal solution. You need to sacrifice the pawn now in order to take the king later: investing in a platform now rather than acquiring a point solution lays the foundation for a successful strategy for winning the IT chess game in the long run. A unified platform that allows IT to deploy additional solutions when (and if) needed reduces overall costs in the long run, simplifies the architecture – making troubleshooting and management less resource-intensive – and improves the performance of applications by eliminating extraneous devices in the infrastructure that can add latency and points of failure.

Greg Ness said it best in his recent guest post here on DevCentral: “As the cloud tears down silos, one trick pony solutions (including freeware) will have an uphill battle for relevancy, especially as enterprises tear down silos. [emphasis added]”

Chess grandmasters aren’t one trick ponies; they don’t use a single ‘trick’ to win, they employ a strategy. Greedy algorithms can’t compete with a grand strategy because they don’t look far enough ahead. They buy for the now; they don’t invest in the future.

IT needs to stop using greedy algorithms if it is to architect the next-generation data center, and start using a strategy that takes advantage of innovations in evolving network and application network infrastructure to construct a data center capable of implementing solutions that are globally optimal. While it is possible to win using greedy algorithms, a sound, thoughtful strategy with an eye toward the future of the entire data center will almost always trump the immediate, locally optimal solution.