SYS-CON MEDIA: Article

Desktop Virtualization... The Right Way (Part 5)

A Bad User Experience Will Break You

The user discussion doesn't end with an understanding of the topology. Desktop virtualization architecture will only get users so far: access to a virtualized desktop. If the virtualized desktop does not provide the required experience in different scenarios, users will find ways of reverting to their traditional model, or will find ways to make life very difficult for you, the architect of this less-than-stellar solution.

Winning these users back is challenging because bad perceptions must be changed, and that takes time. Many of the missteps regarding the user experience stem from improper analysis and planning. To build an environment that is aligned with the user community, understanding the following items is critical.

  • Network Impact: Desktop virtualization requires a network connection, either temporary or permanent depending on the virtual desktop model selected. Understanding the network impact is not a trivial task and will never yield exact numbers, because users perform different activities: typing, printing, browsing, Flash video, WMV video, online Facebook games, etc. However, the Performance Assessment and Bandwidth Analysis white paper should help quantify the impact of each activity and allow an architect to plan appropriately.
  • Peripherals: One of the beauties of a traditional desktop is that it is customizable with peripherals: printers, scanners, webcams, and external drives. These requirements must be understood and supported, but not at the expense of security. For example, should users be able to copy data from the data center to a personal USB storage drive? This might be construed as a security hole. What about allowing a user to copy a file from the USB drive to the data center? This might put the data center at risk for viruses or malware. The justification for certain devices must be determined, but regardless of the outcome, proper security procedures must be put into place.
  • Resources: Users who are not given the proper amount of dedicated resources (CPU and memory) end up with either a desktop that is unusable due to constant delays and sluggish responses from competing resource requests, or a desktop with ample resources that costs the business significant money in unused, idle hardware. Although it is easier to allocate one resource configuration for every user, users have different requirements and should be given different configurations. It is usually better to create three or four resource configurations, for Light, Normal and Power users. With proper analysis of the requirements, users can be placed into one of these few defined configurations.
  • Mobility: A user's requirement for offline mobility plays an important part in the overall analysis. This one requirement significantly limits the possibilities with respect to the most appropriate FlexCast model. Many desktop virtualization models require an active network connection, and an active network connection is not guaranteed for the mobile user. Identifying this group of users allows for the design of an offline model of desktop virtualization.
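To make the network-impact point concrete, a back-of-the-envelope aggregate can be computed by multiplying a per-activity bandwidth estimate by the number of concurrent users performing that activity. The figures below are illustrative placeholders only; real numbers should come from a performance assessment of your own environment (for example, the white paper referenced above).

```python
# Rough per-activity bandwidth estimates, in kilobits per second.
# These values are hypothetical placeholders, not measured figures.
ACTIVITY_KBPS = {
    "typing": 30,
    "browsing": 150,
    "printing": 200,
    "flash_video": 800,
}

def estimate_peak_bandwidth_kbps(activity_counts):
    """Sum (per-activity kbps * concurrent users) into one planning figure."""
    return sum(ACTIVITY_KBPS[activity] * users
               for activity, users in activity_counts.items())

# Example: 100 users typing while 10 users watch Flash video.
peak = estimate_peak_bandwidth_kbps({"typing": 100, "flash_video": 10})
print(peak)  # 100*30 + 10*800 = 11000 kbps
```

A model like this is deliberately pessimistic if you use peak figures for every activity; in practice you would also factor in how many users are active simultaneously.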
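The peripheral security trade-off described above (data leakage in one direction, malware in the other) can be expressed as an explicit policy check. This is a hypothetical sketch of the decision logic, not any vendor's actual policy engine; the field names and the antivirus-scan condition are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class UsbPolicy:
    # Data center -> USB drive: data-leakage risk, blocked by default.
    allow_copy_to_usb: bool = False
    # USB drive -> data center: malware risk, blocked by default.
    allow_copy_from_usb: bool = False
    # When inbound copies are allowed, require an antivirus scan first.
    require_av_scan: bool = True

def is_transfer_allowed(policy, direction, scanned=False):
    """Decide whether a file transfer is permitted under the policy."""
    if direction == "to_usb":
        return policy.allow_copy_to_usb
    if direction == "from_usb":
        if not policy.allow_copy_from_usb:
            return False
        return scanned or not policy.require_av_scan
    raise ValueError(f"unknown direction: {direction}")

# Example: inbound copies allowed, but only after a scan.
policy = UsbPolicy(allow_copy_from_usb=True)
print(is_transfer_allowed(policy, "from_usb"))               # False (unscanned)
print(is_transfer_allowed(policy, "from_usb", scanned=True)) # True
```

Making the default deny-all and requiring each exception to be justified mirrors the article's point: the justification for each device comes first, the security procedure follows regardless of the outcome.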
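The resource bullet's idea of a few defined configurations can be sketched as a tier table plus a classifier that places each user based on measured usage. The tier sizes and thresholds below are invented for illustration; real values would come from the workload analysis the article calls for.

```python
# Three illustrative resource tiers; sizes are assumptions, not recommendations.
RESOURCE_TIERS = {
    "light":  {"vcpus": 1, "ram_gb": 1.5},
    "normal": {"vcpus": 2, "ram_gb": 3.0},
    "power":  {"vcpus": 4, "ram_gb": 8.0},
}

def assign_tier(avg_cpu_pct, peak_ram_gb):
    """Place a user into Light, Normal, or Power from measured usage.
    Thresholds are hypothetical cut-offs for the sketch."""
    if avg_cpu_pct < 20 and peak_ram_gb <= 1.5:
        return "light"
    if avg_cpu_pct < 60 and peak_ram_gb <= 3.0:
        return "normal"
    return "power"

# Example: a task worker averaging 15% CPU and 1.2 GB of RAM.
tier = assign_tier(avg_cpu_pct=15, peak_ram_gb=1.2)
print(tier, RESOURCE_TIERS[tier])  # light {'vcpus': 1, 'ram_gb': 1.5}
```

The point of the sketch is the shape of the decision, not the numbers: a handful of tiers avoids both the one-size-fits-all configuration and a bespoke (and unmanageable) allocation per user.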
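The mobility bullet boils down to one early branching question: does the user need to work without a connection? A minimal sketch of that triage, using the desktop model names from this series (the mapping itself is an illustrative assumption, not an official FlexCast selector):

```python
def recommend_desktop_model(needs_offline_access):
    """Map the offline requirement to a desktop model.
    Illustrative only: a real selection weighs many more factors."""
    if needs_offline_access:
        # Runs on a client hypervisor; no active connection required.
        return "local VM-based desktop"
    # Assumes an active network connection is available.
    return "hosted VM-based desktop"

# Identify the offline group out of a user population.
users = [
    {"name": "field_rep", "needs_offline": True},
    {"name": "office_clerk", "needs_offline": False},
]
offline_group = [u["name"] for u in users if u["needs_offline"]]
print(offline_group)  # ['field_rep']
print(recommend_desktop_model(True))  # local VM-based desktop
```

Even this crude split is useful: once the offline group is identified, the remaining models can be chosen on other criteria without revisiting the connectivity constraint.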


These are some of the most important things to understand regarding users and their experience expectations. If a user believes they are allowed to use a webcam within their virtual desktop and it does not work, that user now has a bad perception. The experience matters to the user, so it must matter to the architect.

More Stories By Daniel Feller

Daniel Feller, Lead Architect of Worldwide Consulting Solutions for Citrix, is responsible for providing enterprise-level architectures and recommendations for those interested in desktop virtualization and VDI. He is charged with helping organizations architect the next-generation desktop, including all flavors of desktop virtualization (hosted shared desktops, hosted VM-based desktops, hosted Blade PC desktops, local streamed desktops, and local VM-based desktops). Many desktop virtualization architecture decisions also focus on client hypervisors and application virtualization.

In his role, Daniel has provided insights and recommendations to many of the world's largest organizations.

In addition to private, customer-related work, Daniel's public initiatives include the creation of best practices, design recommendations, reference architectures, and training focused on the core desktop virtualization concepts. Though he works behind the scenes, you can reach or follow Daniel via Twitter and on the Virtualize My Desktop site.
