Is the Enterprise Datacenter a Dying Breed?

There is no doubt in my mind that we will continue to grow our own datacenter.

As an SDN provider focused on the datacenter, we spend a good amount of time understanding the state of datacenters today, tomorrow, and some time into the future.

There is no question that the use of Software as a Service (SaaS) applications in the cloud is growing rapidly. Plexxi itself is a shining example: across all functional areas, few of the applications we use are run in-house.

There are many reasons why we picked cloud-based applications for our needs. As a small company, in many cases there is a very simple economic choice to make: paying for a cloud-based service is simply cheaper than building your own infrastructure. Creating a datacenter infrastructure is not cheap, and maintaining it and the applications that run on top of it is a serious investment. When you are small, that overhead is hard to carry, and per-user charges for a cloud-based application are much easier to swallow.
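
To make that economic trade-off concrete, here is a minimal back-of-the-envelope sketch. Every number in it is hypothetical, chosen only to illustrate the shape of the comparison, not taken from our actual costs.

```python
# Back-of-the-envelope comparison of SaaS per-user pricing versus
# running the same application in-house. All numbers are hypothetical.

def saas_annual_cost(users: int, per_user_monthly: float) -> float:
    """Total yearly spend on a per-user SaaS subscription."""
    return users * per_user_monthly * 12

def onprem_annual_cost(hardware: float, lifetime_years: int,
                       yearly_ops: float) -> float:
    """Hardware amortized over its lifetime, plus yearly ops/admin cost."""
    return hardware / lifetime_years + yearly_ops

if __name__ == "__main__":
    users = 50  # a small company
    saas = saas_annual_cost(users, per_user_monthly=25.0)
    onprem = onprem_annual_cost(hardware=60_000.0, lifetime_years=4,
                                yearly_ops=40_000.0)  # power, space, admin time
    print(f"SaaS:    ${saas:,.0f}/year")    # $15,000/year
    print(f"On-prem: ${onprem:,.0f}/year")  # $55,000/year
    # At this size the per-user charge wins easily; the crossover only
    # comes with a much larger user count.
```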

But as small as we are, we have clear needs for in-house datacenter resources, and we are not even in a very compute- or storage-intensive business. We have built a mini datacenter in our test environment. This is where we do our scaling tests and our integration testing with external systems, and even run big data applications as part of the test and development cycle. We have a growing environment in which we validate larger and larger systems through simulation.

We are extremely focused on making sure that all our applications are as tightly integrated as they can be. We constantly chase our application providers for hooks and integrations that allow us to create a seamless environment with clear workflows from one application to another. Some of these integrations can only be done on non-cloud versions of the applications we use. Our use of some of these applications is heavy enough that access performance is becoming an issue. Productivity loss is hard to measure but very real.

There is no doubt in my mind that we will continue to grow our own datacenter. There are some things we have to run in-house to ensure a controlled environment with dedicated access; others will be more hybrid, with local cache and proxy versions of cloud-based applications, as sketched below.
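
As a rough illustration of what that hybrid can look like, here is a minimal read-through cache sketch in Python. The fetch_from_cloud function and the TTL value are hypothetical stand-ins for whatever SaaS API and freshness requirement apply; the point is the shape of the local-cache-in-front-of-the-cloud arrangement, not any particular implementation.

```python
import time
from typing import Any, Callable

class ReadThroughCache:
    """Serve hot items locally; go to the cloud service only on a miss
    or when a cached entry has gone stale."""

    def __init__(self, fetch: Callable[[str], Any], ttl_seconds: float = 300.0):
        self._fetch = fetch      # call into the cloud-based application
        self._ttl = ttl_seconds  # how long a local copy stays fresh
        self._store: dict[str, tuple[float, Any]] = {}

    def get(self, key: str) -> Any:
        entry = self._store.get(key)
        if entry is not None:
            fetched_at, value = entry
            if time.monotonic() - fetched_at < self._ttl:
                return value     # local hit: no WAN round trip
        value = self._fetch(key)  # miss or stale: ask the cloud service
        self._store[key] = (time.monotonic(), value)
        return value

# Hypothetical usage, with a stand-in for a real SaaS call:
def fetch_from_cloud(key: str) -> str:
    return f"document {key} fetched from the cloud"

cache = ReadThroughCache(fetch_from_cloud, ttl_seconds=60.0)
print(cache.get("report-42"))  # first access goes to the cloud
print(cache.get("report-42"))  # second access is served locally
```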

This week I read an article in which Intel's CIO Kim Stevenson talks about Intel's own datacenter infrastructure. Intel is of course somewhat unique in the sense that it creates one of the most critical pieces of datacenter resources, but really it is a big multinational like so many others that have compute and storage needs for their business.

In the article, Kim articulates some of the key reasons why the enterprise datacenter will not disappear. A direct quote: “That’s because the company runs mission-critical applications for developing intellectual property, manufacturing, customer service, and product development, and thus far, these work better internally,” followed by “the company is very sensitive about its proprietary data.” Those two quotes capture the key reasons to keep certain things in-house: access, performance, flexibility, customization, security, and locality. The first few will improve with better cloud environments and better access to them, but the last few will be much harder to overcome.

The size of Intel's datacenters is quite impressive: 630,000 Xeon cores across 50,000 servers, with utilization close to 90% throughout the day. That would be one heck of a compute workload to place into the cloud. Yes, Intel is large. But there are so many others like them, some with perhaps even heavier compute and storage requirements than Intel: large pharmaceuticals performing chemical research and analysis, oil and gas companies feeding huge amounts of data into their compute centers in search of natural resources, and banks, insurance companies, and credit card companies storing millions and billions of transactions and trying to find patterns in an attempt to understand us better and sell us more.
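
A quick back-of-the-envelope calculation on those quoted figures gives a feel for the scale. The 32-core instance size at the end is a hypothetical choice, used only to put the number in cloud terms.

```python
# Scale of Intel's fleet, using the figures quoted above.
cores = 630_000
servers = 50_000
utilization = 0.90

print(f"Cores per server:        {cores / servers:.1f}")       # 12.6
print(f"Busy cores at any time:  {cores * utilization:,.0f}")  # 567,000
print(f"Core-hours per day:      {cores * utilization * 24:,.0f}")

# For scale only: how many hypothetical 32-core cloud instances would
# it take to host the busy portion of that fleet?
instance_cores = 32
print(f"32-core instances needed: {cores * utilization / instance_cores:,.0f}")
```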

There is no question that many of our applications will move to the cloud. Pure economics will drive that. But at the same time, there will continue to be resistance, for a long time to come, to moving certain applications and data into the cloud. And as Intel's numbers show, those represent very significant amounts of resources.

The enterprise datacenter will continue to exist and grow for a long time to come. Where and how we run our applications will shift, with more of them moving into the cloud. The boundary between local and cloud will blur, with some applications fully in the cloud, others fully local, and many in a hybrid between the two for performance, security, scaling, or elasticity reasons. And it is there that we as an industry creating datacenter infrastructures need to focus.

[Today's fun fact: The 4th of July is (not surprisingly) the day with the highest hot dog consumption in the US, a staggering 150 million on that one day alone. For tomorrow, happy 4th to all in the US and a happy Friday to everyone else. As for Saturday: Hup Holland Hup.]



About the Author

Marten Terpstra is a Product Management Director at Plexxi Inc. Marten has extensive knowledge of the architecture, design, deployment and management of enterprise and carrier networks.
