What’s wrong with a physical reference computer?

Spencer Dunford | August 18, 2014

Using a physical device as a reference computer is the long-standing approach to creating disk images, and for good reason: given how imaging processes came to be, it was really the only option. That isn't the case these days. Now that we have so many devices and configurations to support, tying an image to hardware isn't a scalable or sustainable process. Breaking tradition and habit can be a challenge, but there are newer imaging methodologies worth evaluating that can save time, money, and frustration.

Hardware technology is advancing and evolving at a breakneck pace. It isn't like the old days, when we'd have just a couple of models to maintain — hardware diversity and bring-your-own-device (BYOD) trends are forcing IT to adapt to new management methodologies. Legacy imaging methods simply aren't sustainable in this new landscape. From laptops and desktops to tablets and slates, having multiple device types within a company makes image management and deployment increasingly difficult. When you use a physical machine as a reference computer, or rely on sector-based hard-disk images, you have to acquire and maintain additional computers just for the imaging process. Stick with this legacy deployment practice and you can end up with a large image library that becomes increasingly difficult to maintain over time.

No one wants to be the person who has to sift through equipment in the IT closet to find the specific machine that needs to be updated. Creating a separate image for each device model in your environment drives up bandwidth and storage consumption unnecessarily, and the overhead adds up quickly. On top of the time it takes to build separate images for each model, you can spend even more time managing and maintaining those images. In the long run, using a physical machine as a reference computer increases expenses, takes longer, and creates general confusion.

When it comes to image creation, you have to make sure that your reference machine is clean and contains only the operating system, applications, and patches that you want to deploy. Often, when there are problems at the endpoint after a deployment, the issues actually started back at the reference machine. When customers run into problems on their newly deployed endpoints, we can typically trace them back to an error that was never corrected on the reference machine and has now been replicated throughout the environment. Getting the initial reference computer right is critical, and that takes longer to do when the reference computer is a physical machine.

Image updates

Often when an image needs to be updated, the physical device the image was first built on has to be brought back to a usable state. Whether the update contains current patches or new applications that your organization now wants to use, it usually takes time to set the machine up the way you want it. As you have probably experienced, rebuilding your reference machine is an all-day project that takes additional time and effort. You have to make the changes and configurations, go through the process of recapturing the image, and then deploy the updated image back out over the network. What it boils down to is that your image library is outdated, you don't trust the quality of your images, and your organization is at greater security risk.
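
To give a concrete sense of what that recapture step looks like, here is a minimal sketch (in Python, shelling out to the built-in Windows DISM tool) that captures a reference volume into a WIM file. The drive letter, network share, and image name are hypothetical placeholders, and DISM needs to run elevated against an offline or generalized Windows volume, typically from WinPE.

```python
import subprocess
from datetime import date

# Hypothetical values -- adjust for your environment.
CAPTURE_DIR = "D:\\"                                 # offline reference Windows volume
IMAGE_FILE = r"\\imageserver\deploy\reference.wim"   # hypothetical network share
IMAGE_NAME = f"Reference image {date.today():%Y-%m-%d}"

def capture_reference_image() -> None:
    """Capture the reference volume into a WIM using the built-in DISM tool."""
    subprocess.run(
        [
            "dism",
            "/Capture-Image",
            f"/ImageFile:{IMAGE_FILE}",
            f"/CaptureDir:{CAPTURE_DIR}",
            f"/Name:{IMAGE_NAME}",
            "/Compress:max",
        ],
        check=True,  # raise CalledProcessError if DISM reports a failure
    )

if __name__ == "__main__":
    capture_reference_image()
```

Scripting the capture helps, but it doesn't remove the need to rebuild the physical reference machine first — which is exactly the part a virtual reference machine eliminates, as described below.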

If I shouldn’t use a physical device for a reference computer, what should I use?

Virtualization is the answer to the glaring drawbacks of using a physical machine. Using a virtual machine (VM) to create your image leads to a clean, controlled deployment in any environment from the start. For those who have been imaging from a physical device for years, using a VM as a reference machine will require some rethinking. You'll notice, however, that using a VM is quickly becoming a best practice when developing a desktop deployment process. A VM provides a clean, controlled, economical, reliable, and repeatable environment to build from.

Virtual machine benefits

Virtual machines do not contain drivers tied to any physical device. Once the virtual machine is captured, the image carries only the software-specific information that you want propagated to the rest of your environment. Using a VM makes image management and updates easier by separating the software you deploy from the hardware you support. That means fewer images to create and a smaller image library to maintain. Updating images is easy with a VM: simply power up the VM from your desk, apply your changes, and then shut it down. This makes for a faster and easier way to keep updates current and on schedule.
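
As a rough illustration of that update loop, here is a small sketch that drives a Hyper-V reference VM from a script by calling the standard Hyper-V PowerShell cmdlets (Checkpoint-VM, Start-VM, Stop-VM). The VM name is a hypothetical placeholder, and the sketch assumes you apply the actual patches and application changes interactively inside the guest.

```python
import subprocess

VM_NAME = "Win10-Reference"  # hypothetical name of your reference VM

def ps(command: str) -> None:
    """Run one PowerShell command on the Hyper-V host and raise if it fails."""
    subprocess.run(["powershell", "-NoProfile", "-Command", command], check=True)

# Snapshot the current state first so the update can be rolled back if needed.
ps(f"Checkpoint-VM -Name '{VM_NAME}' -SnapshotName 'Before monthly updates'")

# Power the reference VM on, apply patches and application changes in the guest,
# then shut it down cleanly so it is ready to capture.
ps(f"Start-VM -Name '{VM_NAME}'")
input("Apply updates inside the VM, then press Enter to shut it down... ")
ps(f"Stop-VM -Name '{VM_NAME}'")
```

Because the whole cycle happens on one VM at your desk, there is no physical lab machine to track down, rebuild, or keep in a closet.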

Using a virtual machine as a reference computer also allows for a much more flexible set of tools that can be continuously updated over an evolving environment, while minimizing overhead and keeping the process organized. We require it here at SmartDeploy, for these reasons and more. Contact us to chat more about ways to improve your Windows deployment strategy.

Spencer Dunford

Spencer was a SmartDeploy employee.



Ready to get started?

See how easy device management can be. Try SmartDeploy free for 15 days — no credit card required.