Where do you work?

Most of the work done in offices is virtual. That’s to say, it isn’t work that requires a physical process. People work at computers with their minds rather than at a workbench with their hands. If the work is virtual, why lock it in a box? Why confine it to a location? Perhaps this is a throwback to when work wasn’t virtual. In the past, people who wanted to work in multiple places lugged laptop computers around so they could have the same workstation experience wherever they went. They could boot up their laptop and have all their software and data waiting for them. But if something happened to that laptop: disaster. This is where virtualization steps in. Although it comes in different flavors, the core concept is that people can log in to the same ‘computer’ from anywhere, using any device. When working from home, they can use another computer to access the same computing experience they have at the office. Since no critical information or software is stored on the laptop itself, if the laptop dies, it can simply be replaced.

Set your computers free.

Rather than being trapped in a physical box, virtual workstations exist logically in a one-to-many environment. In the data center, groups of servers are combined with specialized software that hosts these virtual workstations. Users then run simple client software on their local computer to access their workstation. Another benefit of virtualization is resiliency. Should a desktop computer have a hardware failure, it can be swapped out for a replacement in minutes, barely missing a beat: nothing to reinstall, no data lost. The user’s ‘computer’ is safe and sound in a secure data center. Choosing the virtualization scheme that provides the highest level of safety and utility is key. In some instances, such as where field workers require mobile computing, a mix of technologies is needed.

The history.

Virtualization is generally believed to have begun in the mainframe days of the late 1960s and early 1970s, when IBM implemented time-sharing: the shared use of computer resources among a large group of users, dramatically reducing cost. This made it possible for organizations, and even individuals, to use a computer without owning one. Text-based terminals with no computing ability of their own were used to access the virtual computers running within an actual computer.

Cost.

The capacity of a single modern server is so large that most workloads cannot effectively use it. In many organizations, the most powerful computers perform the simplest tasks, wasting the resource. The best way to improve resource utilization, and at the same time simplify data center management, is virtualization. It reduces the typical cost drivers: maintenance, overbuilt workstations, lost time, and duplicate licenses.

How it's done.

Starting at the ‘top’, virtualization can be implemented using desktop as a service (DaaS) from a third-party provider. The ‘desktops’ run in virtual machines hosted on infrastructure supplied by a cloud provider. Users access their desktop environment from a wide variety of devices, including PCs, laptops, tablets, and some smartphones, with a consistent user experience across all of them. Workers can easily reach the same applications and data regardless of the device they use.

This bypasses the traditional IT model, in which administrators install an operating system and applications on every employee device and then spend too much time and money installing software, managing upgrades and updates, and securing those devices. The traditional desktop deployment model is also a poor fit for an increasingly mobile and remote workforce.

In the DaaS model, the computing, storage, and network infrastructure are managed by the cloud provider. The organization providing desktops to its employees can manage the desktop operating system, applications, anti-virus software, and other desktop-related tasks itself, or work with a third-party managed desktop service provider. DaaS offers all the benefits of a cloud-based managed service: it eliminates the large up-front cost of an on-premises build, typically using a subscription model that requires no up-front investment, and it offloads all the administrative work of supporting, maintaining, patching, and upgrading to the DaaS supplier.

The alternative, self-hosting the servers on site rather than using a DaaS provider, offers agility and more cost predictability, and performance improves when the servers and clients share the same local network. The trade-off is that the organization bears the initial build-out costs.
As with DaaS offerings, virtual desktop infrastructure (VDI) delivers desktops to devices from a centralized data center, in this case one the organization runs itself. Workstations generally use the remote desktop protocol (RDP) to connect from one device to another. The RDP client software runs either on top of a full operating system (a thick client) or on minimal hardware, such as repurposed older PCs (a thin client). Purpose-built standalone thin-client devices are also available; they are small and inexpensive, making them easy to deploy. In practice, you make the connection and the virtual computer’s desktop appears on your device. You can then use that computer, transfer files, install or uninstall software, and work with it just as you would with your own device. RDP requires two pieces of software: RDP client software running on the computer requesting access, and Remote Desktop Services (RDS) running on the computer being accessed. RDS is a key feature of Microsoft Windows Server and enables users to connect remotely to Windows desktops.
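As an illustration of how simple the client side of an RDP connection can be, the Microsoft Remote Desktop client (mstsc.exe) reads its settings from a small plain-text .rdp file. This is a minimal sketch; the host name and username below are hypothetical placeholders, not values from any real deployment.

```shell
# Create a .rdp connection file that the Microsoft Remote Desktop
# client (mstsc.exe) can open. The host name and username are
# hypothetical placeholders.
cat > office-desktop.rdp <<'EOF'
full address:s:vdi-host.example.com:3389
username:s:jsmith
screen mode id:i:2
EOF

# Show the resulting file
cat office-desktop.rdp
```

Double-clicking such a file on a Windows device opens the remote desktop directly, which is part of what makes thin clients so easy to deploy.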

For connections over the internet, RDP traffic is usually run through a secure network tunnel for additional security.
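One common way to build such a tunnel is SSH local port forwarding. The sketch below assumes an SSH gateway reachable from the internet and a desktop inside the office network; `gateway.example.com`, `desktop01`, and `user` are all hypothetical names.

```shell
# Forward local port 3390 through an encrypted SSH tunnel to the RDP
# port (3389) on a desktop inside the office network.
# gateway.example.com, desktop01, and user are hypothetical placeholders.
ssh -f -N -L 3390:desktop01:3389 user@gateway.example.com

# The RDP client is then pointed at localhost:3390; traffic to the
# desktop travels inside the encrypted tunnel.
```

Enterprise deployments often use a dedicated RDP gateway or VPN instead, but the principle is the same: the RDP session itself is wrapped in an encrypted channel.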

Which to use?

These technologies all make remote connections possible. They provide a desktop that feels like the local desktop users are accustomed to, and from the user’s perspective their performance, in both speed and latency, is much the same. The choice is more a function of the organization’s needs and concerns: as with any IT implementation, utility, security, and compliance decide.