VDI - Citrix? VMware? Or should you even care?

VDI deployments bring significant benefits, but is VDI right for your infrastructure?

PCs are part of everyday life in just about every organization. First there’s the purchase of the hardware and the necessary software, followed by an inventory recorded and maintained by the IT department. The same IT department then installs all the required applications before physically delivering the machines to end users. Over time the laptop/PC is maintained by the IT department with software updates, patches, troubleshooting and so on to keep employees productive. Once the PC/laptop becomes outdated, the IT department is tasked with the monotonous job of removing the hardware, deleting sensitive data and uninstalling applications to free up licenses, all so the whole cycle can begin again. Within this vicious circle there are obvious opportunities to better manage resources and avoid unnecessary OPEX and CAPEX costs, one such solution being virtual desktops.

Having witnessed the financial rewards of server virtualization, enterprises are now taking note of the benefits of virtualizing their desktop workloads. Consolidation and centralization are no longer buzzwords used for marketing spin but tangible realities for the IT managers who first took the plunge into what were then the deep, mystical waters of virtualization. They are also realizing that adopting thin clients significantly drives down endpoint hardware costs, not least by extending the lifespan of existing PCs. Indeed, the future of endpoint devices could revolutionize the existing IT office – desks without PCs or laptops, replaced by thin-client-compatible portable iPads? Anything is now possible.

There’s also no doubting that VDI brings further advantages, one being improved security. With data always administered from the datacenter rather than exposed on an end user’s desktop, the risks of data loss or theft are immediately mitigated. No longer can sensitive data potentially walk out of the company’s front doors. Centralized administration also means data can be instantly protected in scenarios where access needs to be limited or copying prevented. For example, a company with numerous outsourcers or contractors on site can quickly restrict or even revoke their data and application access. Indeed, there is nothing stopping an organization from setting up a ‘contractor’ desktop template that can be provisioned instantly and decommissioned the moment the outsourced party’s contract expires.
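
As a rough illustration of that provision-then-decommission workflow, here is a minimal sketch using pyVmomi, VMware’s Python SDK for vSphere, to clone a contractor desktop from a master template. The hostname, credentials and inventory names are placeholders, and a real Horizon/XenDesktop deployment would drive this through the broker’s own tooling rather than raw vSphere calls.

```python
# Sketch: provision a 'contractor' desktop by cloning a master template
# via pyVmomi. All names and credentials below are placeholders.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

def find_by_name(content, vimtype, name):
    """Walk the vCenter inventory and return the first object matching name."""
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vimtype], True)
    return next(obj for obj in view.view if obj.name == name)

ctx = ssl._create_unverified_context()  # lab only; verify certs in production
si = SmartConnect(host="vcenter.example.com", user="administrator",
                  pwd="secret", sslContext=ctx)
content = si.RetrieveContent()

template = find_by_name(content, vim.VirtualMachine, "contractor-template")
pool = find_by_name(content, vim.ResourcePool, "VDI-Pool")
folder = find_by_name(content, vim.Folder, "Contractors")

# Clone spec: place the new desktop in the VDI resource pool and power it on.
spec = vim.vm.CloneSpec(location=vim.vm.RelocateSpec(pool=pool),
                        powerOn=True, template=False)

# Provision a fresh desktop instantly from the master image...
template.CloneVM_Task(folder=folder, name="contractor-desktop-01", spec=spec)

# ...and when the contract expires, the same API decommissions it:
#   vm.PowerOffVM_Task(); vm.Destroy_Task()
Disconnect(si)
```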

By centralizing the infrastructure, fully compliant backup policies also become significantly easier. Whereas crashing PCs and hard drives constantly threaten data loss, the centralized virtual desktop sits on an underlying infrastructure that is continuously backed up. Additionally, with the desktop instance stored on the server rather than bound to a PC’s local storage, recovery from outages is significantly quicker, with even the option of reverting virtual desktops to their last known good states. Imagine the work that could now get done by the employees who customarily bombard the IT helpdesk with countless “help, I’ve accidentally deleted my hard drive” phone calls, not to mention the time it would free up for the helpdesk team. You might even end up with an IT helpdesk that answers the phone instead of sending you straight to voicemail.
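
To make the “last known good state” idea concrete, here is a hedged sketch (same placeholder environment as the previous example) that captures and reverts a desktop state using vSphere snapshots; Horizon and XenDesktop wrap equivalents of these primitives in their own management layers.

```python
# Sketch: capture and revert a "last known good" state for one virtual
# desktop using vSphere snapshots via pyVmomi. Names are placeholders.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

ctx = ssl._create_unverified_context()  # lab only
si = SmartConnect(host="vcenter.example.com", user="administrator",
                  pwd="secret", sslContext=ctx)
content = si.RetrieveContent()

# Locate the user's desktop VM by name.
view = content.viewManager.CreateContainerView(
    content.rootFolder, [vim.VirtualMachine], True)
vm = next(v for v in view.view if v.name == "alice-desktop")

# Capture a known-good baseline (no memory image, quiesce the guest FS).
# A real script would wait for this task to complete before continuing.
vm.CreateSnapshot_Task(name="known-good",
                       description="Post-patch baseline",
                       memory=False, quiesce=True)

# Later, after the inevitable "I've deleted my hard drive" call,
# roll the desktop back in seconds instead of rebuilding a PC:
vm.RevertToCurrentSnapshot_Task()
Disconnect(si)
```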

Additionally, an IT helpdesk team is better utilized with the centralized, server-based approach, which allows both desktop images and specific user data to be maintained without ever visiting the end user’s office. With nothing needing to be installed on the endpoint, deployment becomes far faster and easier with VDI than with a traditional PC desktop rollout. The same applies to the laborious practice of visiting each desktop individually to patch applications, provision and decommission users, or upgrade to newer operating systems. By removing such activities, the OPEX savings are more than substantial.

OPEX savings also come from optimizing the productivity of highly paid non-technical end users, who no longer have to needlessly maintain their own desktop applications and data. Employee productivity can be further improved by centralized control over which applications end users run and full monitoring of their usage, so long gone should be the days of employees downloading torrents or mindlessly chatting away on social networks during working hours. Even the infamously slow start-up time of Windows, which brought with it the traditional yet unofficial morning coffee/cigarette break, can be eradicated by the faster boot times found with VDI. Nor can lack of access to a corporate PC any longer be used as an excuse not to log in from home or elsewhere when required – a manager’s dream and a slacker’s nightmare.

So with all these benefits, where lies the risk or obstacle to adopting a VDI infrastructure for your company? As with most technology, there is rarely a one-size-fits-all solution, and VDI is no different. Before any consideration of VDI, a company must first assess its infrastructure and determine whether VDI would indeed deliver these benefits or instead cause more problems.

One of the first things to look for is whether the organization has a high percentage of end users who manipulate complex or very large files. In other words, if a high proportion of end users constantly need multimedia, 2D or 3D modeling applications, or VoIP, then VDI should possibly be passed over in favour of a well-managed traditional desktop environment. The performance limitations of earlier server-based computing platforms such as Microsoft's Terminal Services with regard to bandwidth, latency and graphics capability are still fresh in the minds of many old-school IT end users, and without the correct pre-assessment those old monsters could rear their ugly heads. An infrastructure with many end users running high-performance or real-time applications should think carefully before going down the VDI route, regardless of what the sales guys claim.
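
As an illustration of what such a pre-assessment might quantify, here is a small back-of-the-envelope Python sketch that flags whether a user population looks VDI-friendly. The per-user bandwidth figures are illustrative assumptions only, not vendor-validated numbers; a real assessment would measure actual workloads.

```python
# Back-of-the-envelope VDI suitability check. The per-user display-protocol
# bandwidth figures below are illustrative assumptions, not measured values.
BANDWIDTH_KBPS = {
    "task_worker": 100,       # office apps, email
    "knowledge_worker": 250,  # browsing, light media
    "multimedia_3d": 2000,    # CAD, video, VoIP-heavy users
}

def assess(users: dict, uplink_mbps: float) -> None:
    """Print a rough verdict given user counts per profile and the
    available uplink between users and the datacenter, in Mbps."""
    total_kbps = sum(BANDWIDTH_KBPS[p] * n for p, n in users.items())
    heavy_share = users.get("multimedia_3d", 0) / max(sum(users.values()), 1)
    print(f"Estimated peak display traffic: {total_kbps / 1000:.1f} Mbps "
          f"of {uplink_mbps} Mbps available")
    if total_kbps / 1000 > uplink_mbps * 0.8:
        print("Verdict: uplink likely saturated - reconsider VDI or upgrade links")
    elif heavy_share > 0.25:
        print("Verdict: high share of multimedia/3D users - pilot carefully")
    else:
        print("Verdict: profile looks VDI-friendly on these assumptions")

# Example: 500 users on a 100 Mbps uplink.
assess({"task_worker": 300, "knowledge_worker": 150, "multimedia_3d": 50},
       uplink_mbps=100)
```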

That said, if after taking all of this into consideration you find your environment is suited to a VDI deployment, the benefits and consequent savings are extensive despite the initial expenditure. As for which solution to choose, that demands another careful consideration, one that needs to be investigated beyond the usual vendor marketing hype.

Firstly, when it comes to server virtualization there is currently no threatening competition (certainly not in enterprise infrastructure) to VMware’s vSphere 4. In desktop virtualization, though, the story has been somewhat different. Those who have deployed Citrix’s XenDesktop certainly know it has better application compatibility than VMware View 3. Add to that the multimedia freeze-framing that would often occur with View 3, and Citrix looked to have cornered a market that initially seemed destined to be monopolized by VMware. Since then VMware has hit back with View 4, which introduced the vastly improved PCoIP display protocol that dwarfs the original RDP-based approach, and simplified both Active Directory integration and the overall installation of the product; yet in performance terms XenDesktop still has the edge. So it comes as no surprise that rumours are rife that VMworld 2010, taking place in a couple of weeks, will be the launching pad for View 4.5 and a consequent onslaught on the Citrix VDI model. Retaliation is bound to follow from Citrix, who seem to have shifted their focus away from the server virtualization realm in favour of the VDI milieu, which can only be better for the clients both are aiming for. Already features such as Offline Desktop, which allows end users to download and run their virtual desktops offline and later resynchronize with the data center, are being developed beyond the beta stage.

The fact remains that quickly provisioning desktops from a master image and instantly administering policies, patches and updates without affecting user settings, data or preferences is an advantage many will find hard to ignore. So while VDI still has many areas for improvement, depending on your infrastructure it may already be an appropriate time to reap its numerous benefits.


About the author: Archie Hendryx is a SAN, NAS, Back Up / Recovery & Virtualisation Specialist.
