By Ofir Nachmani
February 22, 2014 03:00 PM EST
The ever-growing web, together with the "consumerization of IT," gives end users an abundance of options and full discretion. We now understand the vital need to spread knowledge and information across a variety of applications and platforms in order to thrive. Fortunately, we now have the opportunity to use multiple clouds.
There are various incentives for multi-cloud deployment, including regulations, high availability, and global presence, to name a few. But when looking at the main factors, two key words come to mind: evolution and freedom.
IT has made incredible advances over the past decade, and from the looks of it, there are no signs of slowing down. Seemingly infinite public IaaS resources have translated into an endless supply of new applications. While the environment is made up of many intricacies, countless accommodating innovations are being adopted on a regular basis. That is the beauty of our dynamic IT environment: there is always something new on the table.
The consumerist habit of buying an application for a dollar and tossing it out a day later without blinking has made its way into the business world. Enterprises and end users alike now expect the bottom-up adoption of cloud computing to materialize on their work desks. With risk-free trials lowering the odds of cloud lock-in, the world of IT is becoming more and more heterogeneous. For critics of change this may be hard to swallow; nonetheless, it is out of our hands, and acceptance is the healthiest choice.
Dave McCrory, the researcher and cloud pundit who coined the term "Data Gravity," intended it to help IT consumers assess the potential for vendor lock-in, insisting that data stay as "free" as possible. Not long ago, database storage options were limited to on-premises deployments or colocation centers, leaving little flexibility when budget or data requirements played a significant role. The evolution of IT has given users the choice and opportunity to store and scale their relational databases on platforms such as Amazon RDS and Google Cloud SQL, or with innovative startups like ScaleBase. Users now have the freedom, and the responsibility, to store and read data on different platforms and to switch according to the needs of their application.
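That freedom to switch depends on keeping the database endpoint out of the application code. A minimal sketch of the idea, with hypothetical hostnames and credentials standing in for real endpoints:

```python
# Minimal sketch: keep the database endpoint in configuration so that
# switching between managed database platforms is a one-line change.
# The hostnames below are hypothetical placeholders, not real endpoints.
import os

DB_ENDPOINTS = {
    "aws-rds": "mydb.abc123.us-east-1.rds.amazonaws.com",
    "google-cloud-sql": "10.0.0.5",  # e.g. a private-network address
}

def build_dsn(provider: str, user: str, password: str, dbname: str) -> str:
    """Build a MySQL connection string for whichever provider is selected."""
    host = DB_ENDPOINTS[provider]
    return f"mysql://{user}:{password}@{host}:3306/{dbname}"

# The active provider comes from the environment, so moving the data tier
# from one cloud to another never touches application code.
provider = os.environ.get("DB_PROVIDER", "aws-rds")
print(build_dsn(provider, "app", "secret", "orders"))
```

With this arrangement, re-pointing an application from one managed database to another is a configuration change rather than a code change.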
Developers and DevOps teams today recognize their responsibility for, and the importance of, building a platform that is separate from the underlying infrastructure. Application containers and platforms such as Docker and CloudBees enable that independence. This delineation is crucial when discussing cloud adoption, and Ravello Systems' post on the subject discusses it in depth.
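The same delineation can be expressed in application code: deployment logic targets a small, provider-neutral interface, and each cloud gets a thin adapter. The provider classes below are illustrative stand-ins, not wrappers around any real SDK:

```python
# Sketch of separating the platform from the underlying infrastructure:
# the deployment routine knows only an abstract interface, never a vendor.
from abc import ABC, abstractmethod

class ComputeProvider(ABC):
    @abstractmethod
    def launch(self, image: str) -> str:
        """Start a container/instance from an image; return its identifier."""

class AwsProvider(ComputeProvider):
    def launch(self, image: str) -> str:
        return f"aws:{image}"

class GoogleProvider(ComputeProvider):
    def launch(self, image: str) -> str:
        return f"gcp:{image}"

def deploy(provider: ComputeProvider, image: str) -> str:
    # No vendor-specific code here: swapping clouds means swapping adapters.
    return provider.launch(image)

print(deploy(AwsProvider(), "myapp:1.0"))
print(deploy(GoogleProvider(), "myapp:1.0"))
```

Adding a third cloud then means writing one more adapter, not rewriting the deployment pipeline.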
There will always be gaps to bridge when coordinating between cloud capabilities. In fact, in terms of network speed, Google has demonstrated that its platform can complement the AWS cloud (based on an actual integration case study). Still, the benefits of utilizing multiple platforms heavily outweigh the impediments. As mentioned above, the continuous cultivation and upgrading of cloud solutions presents end users with the best-quality, cutting-edge offerings. The option to take advantage of every vendor's prime offer puts end users in an ideal position: if one vendor has a more attractive offer than the one you currently use, you want to be sure that you can switch easily.
Complexity is definitely a challenge. Generally speaking, utilizing multiple clouds is perceived to create a much more complex environment than a single platform. However, the level of complexity is not contingent on the number of platforms or clouds in a system. If cloud management is in place, including policies, automation and transparency, moving to a multi-cloud configuration should generate lower marginal costs. It may seem easier said than done, but in research of my own I learned that Emind has managed to move applications between clouds in a matter of weeks. On the topic of complexity, transparency plays a significant role. With endless processes, policies, failures and successes, transparency lets you draw conclusions that lead to greater understanding, education and improvement, making it one of the most important elements in cloud management.
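The "policies plus automation" point can be made concrete with a toy example: given a price table and a placement policy, the cheapest compliant provider is chosen automatically, so an additional cloud is one more table row rather than new operational burden. The prices and the policy below are hypothetical, purely for illustration:

```python
# Illustrative sketch: policy-driven provider selection.
# Hypothetical $/hour for a comparable instance class on each cloud.
PRICES = {
    "aws": 0.10,
    "gcp": 0.09,
    "azure": 0.11,
}

def pick_provider(prices: dict, allowed: set) -> str:
    """Return the cheapest provider that the placement policy allows."""
    candidates = {p: cost for p, cost in prices.items() if p in allowed}
    return min(candidates, key=candidates.get)

# Policy example: an (illustrative) data-residency rule excludes one provider.
policy_allowed = {"aws", "azure"}
print(pick_provider(PRICES, policy_allowed))
```

Once such a policy is encoded, re-evaluating where a workload should run becomes a routine automated decision instead of a manual project.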
Don't be fooled: the cloud vendors, as well as the blogger who typed these words, understandably want to lock you in. It is therefore extremely important to evaluate all of the risks involved and to keep your potential transition costs low. Do not fear change. It is fun and advantageous to have more options and to utilize multiple solutions. There will never be a one-stop shop for the cloud. Internalizing that cloud-era IT is an extremely heterogeneous environment promotes greater flexibility and innovation.
To learn more about the factors and players shaping today's heterogeneous IT environment, join Ofir Nachmani's session at the next IGT Meetup Conference, which will take place at the Google Campus in Tel Aviv. Free registration here.