
The Future of Cloud Is Hybrid... and Seamless

Right now, hybrid cloud models are disconnected and managed manually

It's probably no surprise that I have long advocated the position that hybrid cloud would eventually become "the standard" architecture with respect to, well, cloud computing. As the dev/ops crowd at Gluecon was recently reminded by the self-styled "most obnoxious man in cloud," Josh McKenty, you can only add to what exists in the data center. You can't simply rip and replace; forklifts are not allowed, and allowances must be made for how to integrate with existing systems, no matter how onerous that might be. The future is, as he put it, open and closed, traditional and modern, automated and human.

I would add that, with respect to cloud, it is both public and private.

Hybrid cloud models were inevitable for all these reasons and more. Suffice to say that there is unlikely to be a technology that will turn data centers into the green fields every starry-eyed young architect and engineer wishes they could be.

So if the question is no longer what cloud model will ultimately win the hearts and minds of the enterprise, the question must turn to other more tactical concerns, such as integrating the two models into a seamless, well-oiled machine.

Right now, hybrid cloud models are disconnected and managed manually. Oh, there are scripts and APIs, yes. But those are mainly concerned with provisioning and management. They aren't about actually using the cloud as the extension of the data center it was promised to be. They're still separate entities, for the most part, and treated as such. They're secondary and tertiary data centers. Stand-alone centers of computing that remain as disconnected operationally as they are physically.

They aren't a data center fabric, yet, even though the unicorn and rainbow goal of hybrid cloud is to achieve just that: distributed resources that act as a single, unified entity. Like a patchwork quilt, sewn from many different blocks but in the end, a single cohesive product. If not in topology, then in usage. Which is the point of many technologies today: abstraction. Abstraction enables the decoupling of interface from implementation, applications from networks, and control from data.

Doing so liberates applications (which are ultimately the reason for everything we do) from being bound to a given location, frees resources to meld with the broader data center fabric, and offers the business greater freedom.

But it isn't just the applications that must be unchained from the data center jail. The numerous services within the network that support those applications must also be set free. Security. Availability. Identity. Access. Performance. Applications are not islands; they are worlds unto themselves, composed of a variety of network and application services that must accompany them as they traverse these new, unfettered boundaries.

As Barrett Lyon, founder of Defense.Net, put it so well in his recent blog, what we need is to seamlessly merge these environments without concern for their physical separation:

By having such a solid foundation, the next step is to seamlessly merge the DDoS defense network with F5’s hardware to create the world’s first true hybrid cloud. The vision is that customers can create their own local DDoS defense, and when volumetric attacks hit, at a specific point they’re “automatically” offloaded to the cloud.


Why Defense.Net and F5: The Hybrid Cloud

Barrett's proposal for a hybrid DDoS model carries shades of cloud bursting for applications, but goes a step further with the notion that hybrid cloud (at least for DDoS) should be seamless. And why shouldn't it be? The definition of a cloud broker includes exactly this capability: to seamlessly automate the provisioning of services and applications based on some relevant criteria. For DDoS, that criterion is certainly bandwidth consumption. For applications, it may be demand and capacity, or it might consider costs and the location of the user.
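The "automatically offloaded at a specific point" behavior Barrett describes can be reduced to a simple threshold decision with hysteresis. The sketch below is purely illustrative - the thresholds and the `next_defense_mode` function are assumptions, not anything F5 or Defense.Net has published - but it shows the shape of the trigger logic that would sit in front of a DNS or BGP redirection:

```python
# Hypothetical sketch: divert traffic to a cloud scrubbing tier when
# ingress volume crosses an attack threshold, and bring it back only
# after traffic subsides well below it (hysteresis prevents flapping).

OFFLOAD_THRESHOLD_GBPS = 10.0  # assumed volumetric-attack trigger
RECOVER_THRESHOLD_GBPS = 2.0   # assumed all-clear level


def next_defense_mode(current_mode: str, ingress_gbps: float) -> str:
    """Return 'local' or 'cloud' given the current mode and traffic level."""
    if current_mode == "local" and ingress_gbps >= OFFLOAD_THRESHOLD_GBPS:
        return "cloud"  # volumetric attack: offload to cloud scrubbing
    if current_mode == "cloud" and ingress_gbps <= RECOVER_THRESHOLD_GBPS:
        return "local"  # attack subsided: resume local DDoS defense
    return current_mode  # otherwise hold steady


# In practice the mode change would be enacted by a DNS swing or a
# BGP announcement that reroutes traffic through the scrubbing center.
print(next_defense_mode("local", 12.0))
```

The hysteresis gap between the two thresholds matters: a single threshold would bounce traffic back and forth as an attack ramps up and down, which is exactly the kind of manual babysitting a seamless hybrid model is supposed to eliminate.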

The criteria themselves are less important than the capability to achieve this functionality: to seamlessly take advantage of a data center distributed across multiple environments, both on-premise and cloud, both public and private. We've seen the beginnings of these types of seamless integrations with cloud identity federation - the use of standards like SAML to extend access control to applications that reside beyond the corporate borders but within its overall perimeter.
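A cloud broker's placement decision over those criteria - capacity, cost, user location - can be sketched as a scoring function over candidate environments. Everything below is a hypothetical illustration (the `Environment` fields and the tie-breaking order are my assumptions, not any vendor's broker logic):

```python
# Hypothetical cloud-broker sketch: among environments with enough
# headroom for the demand, prefer the cheapest, then the closest to users.
from dataclasses import dataclass


@dataclass
class Environment:
    name: str
    free_capacity: float  # 0.0-1.0 fraction of capacity still available
    cost_per_hour: float  # relative cost of running the workload here
    user_distance: float  # 0.0 (near users) to 1.0 (far from users)


def choose_environment(envs, demand: float) -> Environment:
    """Return the viable environment with the lowest (cost, distance)."""
    viable = [e for e in envs if e.free_capacity >= demand]
    if not viable:
        raise RuntimeError("no environment can absorb the demand")
    return min(viable, key=lambda e: (e.cost_per_hour, e.user_distance))


envs = [
    Environment("on-premise", free_capacity=0.1, cost_per_hour=1.0, user_distance=0.2),
    Environment("public-cloud", free_capacity=0.8, cost_per_hour=1.5, user_distance=0.5),
]

# On-premise is cheaper, but lacks headroom for this burst of demand,
# so the broker transparently places the workload in the public cloud.
print(choose_environment(envs, demand=0.3).name)
```

The point is not this particular ranking, but that the decision is codified at all: once placement is a function rather than a ticket queue, the hybrid environment starts to behave like the single fabric described above.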

Corporate borders are expanding. They must necessarily include all manner of cloud environments, and those environments cannot continue to be disconnected operational islands. If the future is hybrid and composable, then we ought to be able to manage such an environment more seamlessly, with greater attention to architectures that not only accept that premise but exploit it to the advantage of IT and the business.


More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.

Latest Stories
The essence of cloud computing is that all consumable IT resources are delivered as services. In his session at 15th Cloud Expo, Yung Chou, Technology Evangelist at Microsoft, demonstrated the concepts and implementations of two important cloud computing deliveries: Infrastructure as a Service (IaaS) and Platform as a Service (PaaS). He discussed from business and technical viewpoints what exactly they are, why we care, how they are different and in what ways, and the strategies for IT to transi...
The Internet of Things is clearly many things: data collection and analytics, wearables, Smart Grids and Smart Cities, the Industrial Internet, and more. Cool platforms like Arduino, Raspberry Pi, Intel's Galileo and Edison, and a diverse world of sensors are making the IoT a great toy box for developers in all these areas. In this Power Panel at @ThingsExpo, moderated by Conference Chair Roger Strukhoff, panelists discussed what things are the most important, which will have the most profound e...
All organizations that did not originate this moment have a pre-existing culture as well as legacy technology and processes that can be more or less amenable to DevOps implementation. That organizational culture is influenced by the personalities and management styles of Executive Management, the wider culture in which the organization is situated, and the personalities of key team members at all levels of the organization. This culture and entrenched interests usually throw a wrench in the work...
Keeping pace with advancements in software delivery processes and tooling is taxing even for the most proficient organizations. Point tools, platforms, open source and the increasing adoption of private and public cloud services requires strong engineering rigor - all in the face of developer demands to use the tools of choice. As Agile has settled in as a mainstream practice, now DevOps has emerged as the next wave to improve software delivery speed and output. To make DevOps work, organization...
Extreme Computing is the ability to leverage highly performant infrastructure and software to accelerate Big Data, machine learning, HPC, and Enterprise applications. High IOPS Storage, low-latency networks, in-memory databases, GPUs and other parallel accelerators are being used to achieve faster results and help businesses make better decisions. In his session at 18th Cloud Expo, Michael O'Neill, Strategic Business Development at NVIDIA, focused on some of the unique ways extreme computing is...
My team embarked on building a data lake for our sales and marketing data to better understand customer journeys. This required building a hybrid data pipeline to connect our cloud CRM with the new Hadoop Data Lake. One challenge is that IT was not in a position to provide support until we proved value and marketing did not have the experience, so we embarked on the journey ourselves within the product marketing team for our line of business within Progress. In his session at @BigDataExpo, Sum...
Virtualization over the past years has become a key strategy for IT to acquire multi-tenancy, increase utilization, develop elasticity and improve security. And virtual machines (VMs) are quickly becoming a main vehicle for developing and deploying applications. The introduction of containers seems to be bringing another and perhaps overlapped solution for achieving the same above-mentioned benefits. Are a container and a virtual machine fundamentally the same or different? And how? Is one techn...
Niagara Networks exhibited at the 19th International Cloud Expo, which took place at the Santa Clara Convention Center in Santa Clara, CA, in November 2016. Niagara Networks offers the highest port-density systems, and the most complete Next-Generation Network Visibility systems including Network Packet Brokers, Bypass Switches, and Network TAPs.
Web Real-Time Communication APIs have quickly revolutionized what browsers are capable of. In addition to video and audio streams, we can now bi-directionally send arbitrary data over WebRTC's PeerConnection Data Channels. With the advent of Progressive Web Apps and new hardware APIs such as WebBluetooh and WebUSB, we can finally enable users to stitch together the Internet of Things directly from their browsers while communicating privately and securely in a decentralized way.
Information technology (IT) advances are transforming the way we innovate in business, thereby disrupting the old guard and their predictable status-quo. It’s creating global market turbulence. Industries are converging, and new opportunities and threats are emerging, like never before. So, how are savvy chief information officers (CIOs) leading this transition? Back in 2015, the IBM Institute for Business Value conducted a market study that included the findings from over 1,800 CIO interviews ...
DevOps is often described as a combination of technology and culture. Without both, DevOps isn't complete. However, applying the culture to outdated technology is a recipe for disaster; as response times grow and connections between teams are delayed by technology, the culture will die. A Nutanix Enterprise Cloud has many benefits that provide the needed base for a true DevOps paradigm.
"We host and fully manage cloud data services, whether we store, the data, move the data, or run analytics on the data," stated Kamal Shannak, Senior Development Manager, Cloud Data Services, IBM, in this SYS-CON.tv interview at 18th Cloud Expo, held June 7-9, 2016, at the Javits Center in New York City, NY.
What sort of WebRTC based applications can we expect to see over the next year and beyond? One way to predict development trends is to see what sorts of applications startups are building. In his session at @ThingsExpo, Arin Sime, founder of WebRTC.ventures, will discuss the current and likely future trends in WebRTC application development based on real requests for custom applications from real customers, as well as other public sources of information,
Interoute has announced the integration of its Global Cloud Infrastructure platform with Rancher Labs’ container management platform, Rancher. This approach enables enterprises to accelerate their digital transformation and infrastructure investments. Matthew Finnie, Interoute CTO commented “Enterprises developing and building apps in the cloud and those on a path to Digital Transformation need Digital ICT Infrastructure that allows them to build, test and deploy faster than ever before. The int...
Whether you like it or not, DevOps is on track for a remarkable alliance with security. The SEC didn’t approve the merger. And your boss hasn’t heard anything about it. Yet, this unruly triumvirate will soon dominate and deliver DevSecOps faster, cheaper, better, and on an unprecedented scale. In his session at DevOps Summit, Frank Bunger, VP of Customer Success at ScriptRock, discussed how this cathartic moment will propel the DevOps movement from such stuff as dreams are made on to a practic...