By Don MacVittie
June 22, 2011 09:15 AM EDT
There’s a whole lot of talk about cloud revolutionizing IT, a whole lot of argument about public versus private cloud, and even a considerable amount of talk about what belongs in the cloud – but not much about helping you determine which applications and storage are good candidates to move there, considering all of the angles that matter to IT. This blog will focus on storage, and the next one on applications, because I don’t want to bury you in a blog as long as a feature-length article.
It amazes me when I see comments like “no one needs a datacenter” while the definition of what, exactly, cloud is still carries the weight of debate. For the purposes of this blog, we will limit the definition of cloud to Infrastructure as a Service (IaaS) – VM containers and the things that support them, or storage containers and the things that support them. My reasoning is simple: the only other big category of “cloud” at the moment is SaaS, and since SaaS has been around for about a decade, you should already have a decision-making process for outsourcing a given application to such a service. Salesforce.com and Google Docs are examples of what is filtered out by saying “SaaS is a different beast.” Hosting services and CDNs are a chunk of the market, but increasingly they are starting to look like IaaS as they add functionality to meet the demands of their IT customers. So we’ll focus on the “new and shiny” aspects of cloud that have picked up a level of mainstream support.
Cloud storage is here, and seems to be pretty ready for prime time if you stick with the major providers. That’s good news, since instability in such a young market could spell its doom. The biggest problem with cloud storage is that almost every large provider implements it as SOAP calls. This was an architectural decision that had to be made, but with CIFS, NFS, iSCSI, and FCoE available, it wasn’t necessary to choose a web service interface, so I’m unsure why nearly all vendors did. Conveniently, almost as soon as it became clear that the enterprise needed cloud storage to look like all the other storage in its datacenters, cloud storage gateways hit the market. In most cases these were brand-new products from brand-new companies, but some, like our ARX Cloud Extender, are additions to existing storage products that help leverage both internal and external assets. Utilizing a cloud storage gateway allows you to treat cloud storage like NAS or SAN disk (depending upon the gateway and cloud provider), which really opens it up to whatever use you need to make of the storage within your enterprise.
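The gap a gateway closes is easy to sketch in code. Below, a plain dict stands in for the provider’s object store (reached only through whole-object put/get calls, the shape of the web services APIs), and a temporary directory stands in for the gateway-mounted share; all names here are illustrative, not any vendor’s actual API:

```python
import os
import tempfile

# A dict stands in for the provider's object store, reachable only
# through whole-object put/get calls -- the shape of the web services APIs.
object_store = {}

def put_object(bucket, key, data):
    object_store[(bucket, key)] = data

def get_object(bucket, key):
    return object_store[(bucket, key)]

put_object("backups", "2011/06/fileserver.tar", b"backup bytes")

# The gateway presents the same bytes as an ordinary file on a mounted
# share, so existing tools and scripts can use them unchanged.
mount = tempfile.mkdtemp()  # stands in for the gateway-mounted share
path = os.path.join(mount, "fileserver.tar")
with open(path, "wb") as f:
    f.write(get_object("backups", "2011/06/fileserver.tar"))
```

Everything below the gateway speaks object put/get; everything above it just sees a file path, which is why the rest of your architecture doesn’t have to change.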
Generally speaking, WAN communications are slower than LAN communications. You can concoct situations where the LAN is slower, but most of us definitely do not live in those worlds. While compression and deduplication of data going over the WAN can help, it is best to assume that storage in the cloud will have slower access times than anything but tape sitting in your datacenter.
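The slowdown is easy to quantify with back-of-envelope math. A sketch for a 10 GB data set, where the link speeds and the 3:1 reduction ratio are illustrative assumptions, not measurements:

```python
# Back-of-envelope transfer times for a 10 GB data set.
# Link speeds and the 3:1 reduction ratio are assumed, not measured.

def transfer_seconds(size_gb, link_mbps, reduction_ratio=1.0):
    bits = size_gb * 8 * 1000**3 / reduction_ratio
    return bits / (link_mbps * 1000**2)

lan_s = transfer_seconds(10, 1000)                       # 1 Gbps LAN
wan_s = transfer_seconds(10, 50)                         # 50 Mbps WAN link
wan_opt_s = transfer_seconds(10, 50, reduction_ratio=3)  # dedupe + compression

print(f"LAN: {lan_s:.0f}s, WAN: {wan_s:.0f}s, WAN with 3:1 reduction: {wan_opt_s:.0f}s")
```

Even with a healthy reduction ratio, the WAN path is several times slower than the LAN – which is why access-time expectations matter so much in the selection criteria below.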
Also, if you’re going to be storing data on the public Internet, it needs to be protected from prying eyes. Once it leaves your building, there are so many places and ways it can be viewed that the only real option is to encrypt it on the way out. Products like ARX Cloud Extender take care of that for you; check with your cloud storage gateway vendor, though, to be certain it does the same. If it does not, an encryption engine will have to be installed to encrypt data before it is sent out of the datacenter. If you end up having to do this, be aware that encryption can have huge impacts on compression and deduplication, because it changes the bytes themselves. Purchasing a gateway that does all three makes the order of operations work correctly – dedupe, then compress, then encrypt.
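You can see why the order of operations matters with a few lines of code. This sketch uses a toy XOR keystream as a stand-in for real encryption (never use it as actual encryption) – the point is only that ciphertext, like real encrypted data, looks random to a compressor:

```python
import random
import zlib

def toy_stream_cipher(data, seed=42):
    # Stand-in for real encryption: XOR against a seeded pseudorandom
    # keystream. Like real ciphertext, the output looks random to zlib.
    rng = random.Random(seed)
    return bytes(b ^ rng.randrange(256) for b in data)

payload = b"2011-06-22 09:15 backup OK; no changes detected\n" * 1000

good = toy_stream_cipher(zlib.compress(payload))  # compress, THEN encrypt
bad = zlib.compress(toy_stream_cipher(payload))   # encrypt first: nothing left to squeeze

print(len(payload), len(good), len(bad))
```

Compress-then-encrypt shrinks the repetitive payload dramatically; encrypt-then-compress leaves it at full size (slightly larger, in fact, from compression overhead). The same effect hits deduplication, since encrypted copies of identical blocks no longer match.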
With that said, utilizing compression, TCP optimizations, and deduplication will reduce the performance hit to manageable levels, and encryption mitigates the data-in-flight security risk going to the cloud, and the data-at-rest security risk while stored in the cloud. Between the two, it makes cloud storage appealing – or at least usable.
Our criteria, then, are pretty straightforward: we want to send data to the cloud that either has no access-time constraints (e.g., can be slower to open) or will be faster than existing technology. These criteria leave you a couple of solid options for piloting cloud storage usage.
1. Backups or Archival Storage. They’re stored off-site, they’re encrypted, and even though they’re slower than LAN-based backups, they’re still disk-to-disk (D2D), so they’re faster to back up and much faster for selective restores than tape libraries. If you already have a D2D solution in-house, this may be less appealing, but getting a copy into the cloud means that one major disaster can’t take out both your primary datacenter and its backups.
2. Infrequently Accessed or Low-Priority Storage. All that chaff that comes along with the goldmine of unstructured data in your organization is kept because you will need some of it again some day, and IT is not in a position to predict which files you will need. By setting up a cloud storage share or LUN and using tiering or some other mechanism to direct those files to it, the files are saved, but they’re not chewing up local disk. That increases available disk in the datacenter but keeps the files available in much the same manner as archival storage. Implemented in conjunction with directory virtualization, the movement of these files can be invisible to end users, as they will still appear in the same place in the global directory but will physically be moved to the cloud if they are infrequently accessed.
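A minimal sketch of the second option’s mechanics, assuming an age-based policy and a gateway-mounted share; the function name, paths, and 180-day threshold are illustrative – a real deployment would use the tiering engine’s own policy controls rather than a hand-rolled script:

```python
import os
import shutil
import time

def tier_cold_files(local_dir, cloud_dir, max_idle_days=180):
    """Move files not accessed in max_idle_days from local disk to the
    gateway-mounted cloud share. Threshold and paths are illustrative."""
    cutoff = time.time() - max_idle_days * 86400
    moved = []
    for entry in os.scandir(local_dir):
        if entry.is_file() and entry.stat().st_atime < cutoff:
            shutil.move(entry.path, os.path.join(cloud_dir, entry.name))
            moved.append(entry.name)
    return moved

# Demo against temporary directories standing in for the two tiers.
import tempfile
local, cloud = tempfile.mkdtemp(), tempfile.mkdtemp()
open(os.path.join(local, "old_report.doc"), "w").close()
os.utime(os.path.join(local, "old_report.doc"), (0, 0))  # "last accessed" 1970
open(os.path.join(local, "active.doc"), "w").close()
moved = tier_cold_files(local, cloud)
```

With directory virtualization layered on top, the moved file keeps its old logical location in the global namespace, so the user never knows it went anywhere.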
Cloud storage is no more a panacea than any other technical solution; there’s just some stuff that should not be moved to the cloud today, and perhaps not ever.
1. Access-Time-Sensitive Files. Don’t put your database files in the cloud (though you might check out Microsoft Azure or Oracle’s equivalent offering). You won’t like the results. Remember: just because you can doesn’t mean you should.
2. Data Critical to Business Continuity. Let’s face it: one major blow that takes out your WAN connection takes out your ability to access what’s stored in the cloud. So be careful that data needed for normal operation of the business is not off-site. It’s bad enough if access to the Internet is down and public websites running on datacenter servers are inaccessible, but those files critical to the business – be it phone orders, customer support, whatever – must be available if the business is to keep its doors open. Redundant WAN connections can mitigate this issue (with a price tag, of course), but even those are not proof against every eventuality that impacts only your Internet connectivity.
With cloud storage gateways and directory virtualization, there are definite “win” points for the use of cloud storage. Without directory virtualization, there are still some definite scenarios that will improve your storage architecture without breaking your back implementing them. Without a cloud storage gateway, most cloud storage is essentially useless to enterprise IT (other than AppDev) because none of your architecture knows how to make use of web services APIs.
But if you implement a cloud storage gateway and choose wisely, you can save more in storage upgrades than the cost of the cloud storage. This is certainly true when you would otherwise have to upgrade local storage, and cloud storage instead just grows a little with a commensurate monthly fee. Since fee schedules and even what is charged for (bytes in/out, bytes at rest, phase of the moon) change over time, check with your preferred vendor to make certain cloud is the best option for your scenario. But remember that deploying a directory virtualization tool will increase utilization, and that tiering can help remove data from expensive tier-one disk, possibly decreasing one of your most expensive architectural needs – the fast-disk upgrade.
Next time we’ll look at applications from a general IT perspective, and I’m seriously considering extending this to a third blog discussing AppDev and the cloud.