The Role Of Cloud In IT Asset Management

Managing IT Assets In The Cloud

The past few decades have seen a continuous evolution in the way back-end IT infrastructure is managed. Mainframes have given way to client-server architectures, virtualization and now the cloud. With standalone user terminals replaced by internet-connected smartphones and tablets, users today are more than ready to find a solution of their choice on the web instead of relying solely on company infrastructure.

This poses a significant challenge to the way IT assets are managed in an organization and is perhaps the reason why businesses are today investing millions of dollars in upgrading to cloud-based services. According to one estimate, business spending on the public cloud is expected to have reached $23.2 billion in 2017, while spending on the private cloud is expected to have grown to $13.8 billion by the end of the year. This transition changes not only the way IT services are delivered but also the way IT interacts with the rest of your organization.

Managing IT Assets In The Public Cloud

Traditionally, IT asset management (ITAM) tools are used to handle hardware assets, software licenses and configuration management. This changes in a public cloud setup, where an organization may not have any hardware of its own to manage. Moreover, a public cloud provider may not give the organization access to the underlying hardware. This is tricky, since many software applications are licensed based on the number of CPUs in use and their configuration, and it can be challenging to keep such installations in a compliant state in the public cloud.
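To make this concrete, here is a minimal sketch in Python (with made-up product names, entitlement counts and field names) of reconciling discovered installs against licensed CPU cores, flagging installs whose vCPU counts the provider does not expose:

```python
# Hypothetical entitlements: product -> licensed CPU cores in total.
entitlements = {
    "database-server": 16,
    "analytics-suite": 8,
}

installs = [
    # Discovered instances; vcpus is None when the cloud provider
    # does not expose CPU details for that instance type.
    {"product": "database-server", "vcpus": 8},
    {"product": "database-server", "vcpus": None},
    {"product": "analytics-suite", "vcpus": 4},
]

for product, licensed in entitlements.items():
    used = 0
    unknown = 0
    for inst in (i for i in installs if i["product"] == product):
        if inst["vcpus"] is None:
            unknown += 1  # compliance cannot be proven for this install
        else:
            used += inst["vcpus"]
    status = "COMPLIANT" if used <= licensed and unknown == 0 else "AT RISK"
    print(f"{product}: {used}/{licensed} cores, {unknown} unverifiable -> {status}")
```

Even one unverifiable install is enough to put a product "at risk" in an audit, which is exactly the problem opaque public cloud hardware creates.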

While the public cloud may seem limitless at the outset, that is not always the case. For instance, if you use Microsoft SQL Server for your database, Microsoft may let you access virtual CPU data for compliance calculations if you run it on Azure, but the same access may not be available if you wish to run it on Amazon Web Services. Such constraints may force your organization to choose among a handful of public cloud alternatives that are not always ideal for your business.
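As an illustration of why that vCPU data matters, here is a rough per-core licensing calculation in the style of Microsoft's published SQL Server rules (a four-core minimum per VM, licenses sold in two-core packs). The figures are illustrative only; any real calculation should follow your own license agreement:

```python
import math

def core_licenses_needed(vcpus: int) -> int:
    """Sketch of a per-core model: at least four core licenses per VM,
    rounded up to whole two-core packs. Illustrative only."""
    cores = max(vcpus, 4)            # four-core minimum per virtual machine
    return math.ceil(cores / 2) * 2  # round up to a whole two-core pack

for vm in (2, 6, 7, 16):
    print(f"{vm} vCPUs -> {core_licenses_needed(vm)} core licenses")
```

Without reliable vCPU counts from the provider, the input to this arithmetic is a guess, and so is your compliance position.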

One of the biggest issues comes with software licensing. It is important for your organization to let its software vendors know that their software will be deployed on a public cloud. Given the ubiquity of the public cloud today, most software vendors offer licenses that allow for public cloud deployment.

The other area of ITAM that needs to be revisited is contract management. Public cloud providers come with their own SLAs, and these may not always align with the SLAs you promise your customers. It is important to find a provider whose SLA is stronger than the one you offer your own customers.
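A quick back-of-the-envelope calculation shows why. If your service cannot run without the provider, the availabilities multiply (assuming independent failures), so the provider's SLA caps what you can credibly promise. The numbers below are hypothetical:

```python
# Serial dependency: your service is only up when both layers are up.
provider_sla = 0.999   # provider promises 99.9%
your_stack   = 0.9995  # availability of everything you control

effective = provider_sla * your_stack
minutes_per_month = 30 * 24 * 60

print(f"Effective availability: {effective:.4%}")
print(f"Expected downtime: {(1 - effective) * minutes_per_month:.0f} min/month")
# ~99.85% -- below a 99.9% promise to your own customers,
# even though each layer individually looks good.
```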

ITAM For The Private Cloud

Organizations that move their infrastructure to a private cloud mostly do it for one of two reasons: to reduce operational costs or to save on capital expenses. IT asset management plays a far more significant role in the private cloud than in the public cloud, because your organization's IT is responsible for the cloud hardware. As with the public cloud, ITAM is critical for managing the deployed software. This includes mapping software versions to their mandated hardware requirements and making sure that your hardware aligns with the software deployed on it.
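A minimal sketch of that mapping might look like the following, with hypothetical products, versions and hardware minimums:

```python
# Hypothetical mapping of (product, version) to mandated hardware minimums,
# used to verify each private cloud host can properly run what it hosts.
requirements = {
    ("erp-app", "11.2"): {"min_cores": 4, "min_ram_gb": 16},
    ("erp-app", "12.0"): {"min_cores": 8, "min_ram_gb": 32},
}

hosts = [
    {"name": "host-01", "cores": 8, "ram_gb": 32, "runs": ("erp-app", "12.0")},
    {"name": "host-02", "cores": 4, "ram_gb": 8,  "runs": ("erp-app", "11.2")},
]

for host in hosts:
    req = requirements[host["runs"]]
    ok = host["cores"] >= req["min_cores"] and host["ram_gb"] >= req["min_ram_gb"]
    print(f'{host["name"]}: {"OK" if ok else "MISMATCH"}')
```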

This brings us to another critical role played by asset management in the private cloud: resource management. Hardware in the cloud can easily be scaled up or down, which means the infrastructure could be scaled up to a larger CPU count to meet higher demand. If your software is licensed for a hypervisor with fewer CPUs, an audit could put your organization in trouble, even though the scaled-up infrastructure would improve performance for your users. With a proper resource management policy in place, ITAM can make sure that the compliance requirements in a software license are adhered to when designing new deployment models and scaling up infrastructure in the future.
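One way to enforce such a policy is a pre-scaling check that flags any resize pushing a VM past the CPU count its license permits. The sketch below assumes hypothetical products and caps:

```python
# Assumed license caps: product -> maximum licensed vCPUs.
license_cpu_cap = {"erp-app": 8, "reporting-db": 16}

def can_scale(vm_product: str, target_vcpus: int) -> bool:
    """Return True only if the resize stays within the licensed CPU cap."""
    cap = license_cpu_cap.get(vm_product)
    if cap is None:
        return False  # unknown product: block until ITAM classifies it
    return target_vcpus <= cap

print(can_scale("erp-app", 12))       # False -> needs a license upgrade first
print(can_scale("reporting-db", 12))  # True  -> safe to scale up
```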

There is a much larger role for ITAM to play with respect to resource management. Given unfettered access, users are likely to waste computing resources they do not need. This can be discouraged through chargebacks, which are essentially a price tag on the services consumed by each department. While there has been much discussion about the feasibility of chargebacks as a means to curb resource wastage, studies have shown that in the absence of such a mechanism, employees are likely to waste resources they are not being billed for. The job of managing chargebacks, or other forms of resource management, rests with the business's IT department.
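The chargeback itself is simple arithmetic: each department's bill is its metered usage multiplied by a unit rate. The sketch below uses made-up rates and usage figures:

```python
# Assumed unit prices for internal billing.
rates = {"vcpu_hours": 0.05, "gb_storage": 0.02}

# Metered consumption per department (hypothetical).
usage = {
    "finance":   {"vcpu_hours": 1200, "gb_storage": 500},
    "marketing": {"vcpu_hours": 300,  "gb_storage": 2000},
}

for dept, consumed in usage.items():
    bill = sum(consumed[res] * rates[res] for res in consumed)
    print(f"{dept}: ${bill:,.2f} this month")
```

The hard part is not the arithmetic but the metering and the policy around it, which is why this responsibility naturally falls to IT.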

It needs to be pointed out, however, that no organization falls strictly into any one category. The average business today hosts its applications in private clouds, public clouds and hybrid clouds, as well as locally, all at once. The rules and policies required to manage hardware, software licenses and contracts therefore cannot be tied to any one type of service delivery and need to be framed holistically. IT has been in this transition mode for a few years now and is expected to remain so; your ITAM strategy should account for this.

More Stories By Harry Trott

Harry Trott is an IT consultant from Perth, WA. He is currently working on a long-term project in Bangalore, India. Harry has over seven years of experience on cloud- and networking-based projects. He is also working on a SaaS-based startup that is currently in stealth mode.
