By Marten Terpstra
July 3, 2014 06:00 AM EDT
As an SDN network provider focused on the datacenter, we spend a good amount of time understanding the state of data centers today, tomorrow, and some distance into the future.
There is no question that the use of Software as a Service (SaaS) applications in the cloud is growing rapidly. Plexxi itself is a shining example: across all functional areas, few of the applications we use are hosted in-house.
There are many reasons why we picked cloud-based applications for our needs. As a small company, in many cases there is a very simple economic choice to make: paying for a cloud-based service is simply cheaper than building your own infrastructure. Creating a datacenter infrastructure is not cheap, and maintaining it and the applications that run on top of it is a serious investment. When you are small, that overhead is hard to carry, and per-user charges for a cloud-based application are much easier to swallow.
But as small as we are, we have clear needs for in-house datacenter resources, even though we are not in a particularly compute- or storage-intensive business. We have built a mini datacenter in our test environment. This is where we do our scaling tests, our integration testing with external systems, and even run big data applications as part of the test and development cycle. It is a growing environment in which we validate larger and larger systems through simulation.
We are extremely focused on making sure that all our applications are as tightly integrated as they can be. We constantly chase our application providers for hooks and integrations that let us create a seamless environment with clear workflows from one application to another. Some of these integrations can only be done on non-cloud versions of the applications we use. Our use of some of these applications is heavy enough that access performance is becoming an issue; productivity loss is hard to measure but very real.
There is no doubt in my mind that we will continue to grow our own datacenter. Some things we have to run in-house to ensure a controlled environment with dedicated access; others will be more hybrid, with local cache and proxy versions of cloud-based applications.
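The cache-and-proxy hybrid boils down to a simple pattern: answer locally when you can, go to the cloud service only when you must. A minimal sketch of that idea (purely illustrative; `TTLCache` and `fetch` are hypothetical names, not any product's API):

```python
import time

class TTLCache:
    """Minimal local cache in front of a remote (cloud) fetch function.

    Illustrative sketch of a local cache/proxy for a cloud application;
    real deployments add eviction, invalidation, and write-through.
    """

    def __init__(self, fetch, ttl_seconds=300):
        self.fetch = fetch      # function that performs the cloud round-trip
        self.ttl = ttl_seconds  # how long a local answer stays valid
        self.store = {}         # key -> (value, expiry timestamp)

    def get(self, key):
        entry = self.store.get(key)
        if entry is not None:
            value, expires = entry
            if time.time() < expires:
                return value    # served locally, no WAN round-trip
        value = self.fetch(key)  # cache miss or expired: hit the cloud
        self.store[key] = (value, time.time() + self.ttl)
        return value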
This week I read an article in which Intel’s CIO Kim Stevenson talks about Intel’s own datacenter infrastructure. Of course Intel is somewhat unique in that it creates some of the most critical pieces of datacenter resources, but in the end it is a big multinational like so many others with compute and storage needs for its business.
In the article, Kim articulates some of the key reasons why the enterprise datacenter will not disappear. A direct quote: “That’s because the company runs mission-critical applications for developing intellectual property, manufacturing, customer service, and product development, and thus far, these work better internally,” followed by “the company is very sensitive about its proprietary data.” In just two quotes, you have the key reasons to keep certain things in-house: access, performance, flexibility, customization, security, and locality. The first few will improve with better cloud environments and better access to them, but the last few will meet much stiffer resistance.
The size of Intel’s datacenters is quite impressive: 630,000 Xeon cores across 50,000 servers, with utilization close to 90% throughout the day. That would be one heck of a compute workload to place into the cloud. Yes, Intel is large. But there are so many others like them, some with perhaps even heavier compute and storage requirements: large pharmaceuticals performing chemical research and analysis, oil and gas companies feeding huge amounts of data into their compute centers in search of natural resources, banks, insurance companies, and credit card companies storing millions and billions of transactions and mining them for patterns in an attempt to understand us better and sell us more.
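Those published numbers are easy to sanity-check with a little arithmetic (the per-server figure is an average I derived, not one Intel stated):

```python
# Figures quoted from the article
cores = 630_000        # Xeon cores
servers = 50_000       # physical servers
utilization = 0.90     # roughly 90% throughout the day

cores_per_server = cores / servers   # average density per machine
busy_cores = cores * utilization     # cores in use at any given time

print(cores_per_server)       # 12.6
print(round(busy_cores))      # 567000
```

Over half a million cores busy around the clock is the workload that would have to find a new home in the cloud.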
There is no question that many of our applications will move to the cloud; pure economics will drive that. But at the same time there will continue to be resistance to moving certain applications and data into the cloud for a long time to come. And as Intel’s numbers show, those represent very significant amounts of resources.
The enterprise datacenter will continue to exist and grow for a long time to come. Where and how we run our applications will shift toward the cloud, and the boundary between local and cloud will blur: some applications fully in the cloud, others fully local, and many in a hybrid between the two for performance, security, scaling, or elasticity reasons. It is there that we as an industry creating datacenter infrastructures need to focus.
[Today's fun fact: The 4th of July is (not surprisingly) the day with the highest hot dog consumption in the US, a staggering 150 million on that one day alone. For tomorrow, happy 4th to all in the US and a happy Friday to everyone else. As for Saturday: Hup Holland Hup.]