
Research Challenges in Cloud Infrastructures for 2009

Ignacio Martín Llorente's Blog

One of the most relevant contributions of cloud computing is the Infrastructure as a Service (IaaS) model. In my opinion, there are a number of research challenges in cloud infrastructures that will need to be addressed in 2009. The open research issues mainly concern new virtualization technologies to enable efficient, dynamic and scalable Cloud operation and interoperation.

Cloud computing enables the deployment of an entire IT infrastructure without the associated capital costs, paying only for the capacity actually used. The new “Infrastructure as a Service” paradigm has been introduced to respond better to changing computing demands, allowing capacity to be added and removed in order to meet peak or fluctuating service loads. Amazon Elastic Compute Cloud (Amazon EC2), GoGrid and FlexiScale are examples of cloud providers of elastic capacity, offering an interface for the remote management of virtualized server instances within their proprietary infrastructures. These commercial clouds do not disclose any detail about the internal management of the virtual machines or the physical infrastructure.
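The economic argument for elastic capacity can be made concrete with a small back-of-the-envelope calculation. The sketch below compares provisioning for the peak against paying per server-hour; the demand figures and hourly rate are invented for illustration and do not come from any provider's actual pricing.

```python
# Illustrative comparison of fixed (peak-provisioned) versus pay-per-use
# elastic capacity for a fluctuating workload. All figures are invented.

def fixed_cost(demand, rate):
    """Provision for the peak: pay for max(demand) servers in every hour."""
    return max(demand) * len(demand) * rate

def elastic_cost(demand, rate):
    """Pay-per-use: pay only for the servers actually running each hour."""
    return sum(demand) * rate

demand = [2, 2, 10, 3, 2, 2]   # servers needed in each hour of a busy period
rate = 0.10                    # hypothetical price per server-hour

# A short demand spike makes peak provisioning pay for idle capacity
# (10 servers for 6 hours) while elastic capacity pays for 21 server-hours.
```

The wider the gap between average and peak demand, the stronger the case for the IaaS model.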

Open-source cloud computing tools, such as Eucalyptus and Globus Nimbus, let organizations build and customize their own cloud infrastructure. These tools focus on the client perspective: they are fully functional with respect to cloud-compatible interfaces and provide higher-level functionality for security, contextualization and image management. However, they do not support the dynamic allocation and balancing of computing resources among virtual machines to meet the scalable and dynamic computing requirements of enterprise datacenters, such as flexible support for dynamic virtual machine placement and infrastructure management.

The RESERVOIR Project

RESERVOIR is the main European research initiative in virtualized infrastructures and Cloud Computing. RESERVOIR is a joint research programme coordinated by IBM Haifa with 13 European partners: Elsag Datamat, CETIC, OGF.eeig, SAP Research, Sun Microsystems, Telefonica I+D, Thales, Umea University, University College London, DSA-Research at Universidad Complutense de Madrid, University of Lugano and University of Messina. The aim of the project is to develop the open-source technology to enable the deployment and management of complex IT services across different administrative domains. Its open-source approach will support the definition of open standards for cloud computing, breaking the lock-in imposed by vendors today and allowing any organization to build its own local or public cloud infrastructure. The first-class management entity is a complex service, i.e. a group of interconnected virtual machines with placement constraints that can run across different cloud sites; the federation of cloud providers is one of its main research challenges.

The cloud infrastructure layer in RESERVOIR is the VEE Management layer, which provides the execution of groups of interconnected virtual machines as a service. The project's other two main research activities complement this layer: the Service Management Activity, coordinated by Telefonica I+D, provides service management functionality on top of infrastructure clouds, and the VEE Infrastructure Enablement Activity, coordinated by IBM Haifa, provides virtualization platforms with advanced functionality for performance and reallocation optimization.

In the context of the VEE Management Activity, coordinated by DSA-Research at UCM, the project is conducting research in cloud infrastructures to meet the main challenges in the dynamic and scalable management of virtual machines in datacenters, such as the efficient management of groups of virtual machines within and across sites, elasticity support to meet variations in service workload, dynamic placement algorithms, architectures and placement heuristics for the federation of sites, and enhanced Cloud interfaces.

Private Cloud Infrastructures

A key component in a cloud infrastructure backend is the distributed virtual infrastructure manager (also called an internal cloud or distributed VM Manager), which allows the dynamic placement of virtual machines on a pool of physical resources according to business needs. There is growing interest in the community in these tools for leasing compute capacity from the local infrastructure (see, for example, the conclusions of the Cisco Cloud Computing Research Symposium by Ruben S. Montero, co-leader of the OpenNebula project at DSA-Research, and the cloud computing predictions for the new year by Randy Bias, VP Technology Strategy at GoGrid). The aim of these deployments is not to expose a cloud interface to the world in order to sell capacity over the Internet, but to provide a dynamic and flexible private infrastructure on which to run service workloads.
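The core decision such a VM manager makes on every scheduling cycle is which physical host should receive each pending virtual machine. The following sketch shows one of the simplest placement heuristics, greedy first-fit; the function and resource names are illustrative and do not reflect OpenNebula's actual interfaces or scheduling policy.

```python
# Hypothetical sketch of a greedy first-fit placement heuristic, the kind of
# decision a distributed VM manager makes when mapping VMs to physical hosts.
# Names and capacity units are illustrative, not any real tool's API.

def place_vms(vms, hosts):
    """Assign each VM to the first host with enough free capacity.

    vms   -- list of (vm_id, cpu, mem) resource requests
    hosts -- dict host_id -> {"cpu": free_cpu, "mem": free_mem}
    Returns dict vm_id -> host_id (unplaced VMs map to None).
    """
    placement = {}
    for vm_id, cpu, mem in vms:
        placement[vm_id] = None
        for host_id, free in hosts.items():
            if free["cpu"] >= cpu and free["mem"] >= mem:
                free["cpu"] -= cpu          # reserve capacity on the host
                free["mem"] -= mem
                placement[vm_id] = host_id
                break
    return placement
```

Real placement engines replace the first-fit loop with richer heuristics (consolidation for energy efficiency, spreading for availability, affinity constraints between VMs of the same service), which is precisely where the research challenges listed above arise.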

The OpenNebula VM Manager is a core component of the RESERVOIR VEE Management layer and is being enhanced to meet the demanding requirements of the project's business use cases. This open-source alternative to commercial VM management tools provides efficient, dynamic and scalable management of VMs within datacenters (private clouds) involving large numbers of virtual and physical servers. OpenNebula can interface with a remote cloud site, and it is the only tool able to access Amazon EC2 on demand to dynamically scale the local infrastructure based on actual usage. Furthermore, the integration of OpenNebula and Haizea provides the only distributed virtual infrastructure management solution offering advance reservation of capacity.
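The cloud-bursting behaviour described above boils down to a per-cycle decision: lease remote capacity when local utilization stays high, release it when demand subsides. The sketch below illustrates that decision with invented thresholds; it is not OpenNebula's actual EC2 driver logic, just a minimal model of the policy.

```python
# Hypothetical sketch of a cloud-bursting policy: decide, once per
# monitoring cycle, whether to lease or release remote (e.g. EC2) capacity.
# Thresholds and names are illustrative assumptions, not real tool behaviour.

def scaling_action(utilization, remote_nodes, high=0.85, low=0.40):
    """Return 'lease', 'release', or 'hold' for one monitoring cycle.

    utilization  -- fraction of local capacity currently in use (0.0-1.0)
    remote_nodes -- number of remote instances currently leased
    """
    if utilization > high:
        return "lease"                   # burst out to the external cloud
    if utilization < low and remote_nodes > 0:
        return "release"                 # shrink back onto local capacity
    return "hold"                        # hysteresis band: do nothing
```

Keeping a gap between the `high` and `low` thresholds avoids oscillation, i.e. leasing and releasing instances on every small fluctuation in load.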

Further Research in Cloud Infrastructures

There are many other topics for further research in cloud infrastructures that will be addressed in 2009:

  • Concerning the application of cloud computing, relevant topics are performance and reliability running scientific and business applications in Clouds; content distribution systems using Clouds; and Grid, HPC and data-intensive computing in Clouds.
  • Concerning technologies to enable Cloud Computing, interesting topics are new architectures for Cloud infrastructures; Cloud interfaces, programming models and tools; integration with infrastructures for Grid Computing; SLAs, privacy, security and pricing; management of network capacity; heuristics for energy efficiency and high availability; and advance reservation of capacity.
  • Concerning federation of Cloud Providers, research topics are interoperability and portability between Cloud providers; open business policies framework for relationships between infrastructure providers; and higher value self-awareness, self-knowledge, and self-management capabilities.

Although there exist several commercial clouds selling computing power, there are many open research issues to build the next generation of cloud infrastructures. These topics are mainly related to new technologies to enable efficient, dynamic and scalable Cloud operation and interoperation.


More Stories By Ignacio M. Llorente

Dr. Llorente is Director of the OpenNebula Project and CEO & co-founder at C12G Labs. He is an entrepreneur and researcher in the field of cloud and distributed computing, having managed several international projects and initiatives on Cloud Computing and authored many articles in leading journals and conference proceedings. Dr. Llorente is one of the pioneers and world's leading authorities on Cloud Computing. He has held several appointments as an independent expert and consultant for the European Commission, several companies and national governments. He has given many keynotes and invited talks at the main international events in cloud computing, has served on several Groups of Experts on Cloud Computing convened by international organizations, such as the European Commission and the World Economic Forum, and has contributed to several Cloud Computing panels and roadmaps. He founded and co-chaired the Open Grid Forum Working Group on the Open Cloud Computing Interface, and has participated in the main European projects in Cloud Computing. Llorente holds a Ph.D. in Computer Science (UCM) and an Executive MBA (IE Business School), and is a Full Professor (Catedratico) and the Head of the Distributed Systems Architecture Group at UCM.

