Virtualization: Benefits, Challenges
Tuesday, December 8, 2015 @ 03:12 PM gHale
Editor’s Note: Use of virtual systems continues to grow in the manufacturing automation industry. This is the second of a two-part series, excerpted from a Statseeker white paper, looking at the growing nature of virtualization. This report looks at the benefits and challenges of implementing virtualization.
By Frank Williams
Virtualization has significant benefits in both computing and networking, which is why it has been accepted so readily in each. This is especially true in OT networking and control systems, where the rest of the system is intended to live for 30 years while the computer and network components have a lifespan of less than two years.
Virtualization also permits rapid changes and agile re-deployment, which is necessary in the Internet of Things (IoT) environment.
Virtualizing computers and servers, as well as network components, can add a significant measure of safety and robustness to the network.
Storing images of the virtual machines off site, in the cloud, or at another location means that if the site suffers an accident, or the network is destroyed by weather (as Hurricane Katrina destroyed the networks of many petrochemical plants), it is easy to reconstruct the systems from the disk images and be back in business months earlier than if the systems were not virtual. In addition, virtual systems have a failover mode: when a disk fails, the system simply switches to a backup on the fly, and the failed component can be repaired while the system continues to run.
As we have noted, especially in OT systems, such as building automation, factory automation and process control system networks, there is a fundamental issue with lifecycle.
The control system, its I/O, and the final control elements (valves, etc.) are built to last the life of the project, easily 30 years. Unfortunately, thanks to market forces and Moore's Law, computer, server, and network components have a lifecycle of about 18 months. This disparity is infamous in project work, where the time from project initiation to startup of the network and control system can be as long as 48 months. So by the time the end user takes control of the system, it is obsolete by as much as 30 months and may no longer be maintainable.
Virtualization solves this problem by creating virtual machines that run operating systems that are otherwise obsolete and no longer maintained. As an extreme example, a LIMS (Laboratory Information Management System) that runs on Windows 95 would have to be completely rewritten to run on a modern operating system. Running that application in a virtual machine lets the user keep using both the application and the operating system it was written for, without worrying that either is obsolete and unmaintainable.
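As a rough illustration of the legacy-OS scenario above, here is one way such a machine might be booted under an open-source hypervisor such as QEMU. The image file name, memory size, and device models are illustrative assumptions, not details from the white paper:

```shell
# Hypothetical sketch: boot a captured Windows 95 disk image as a VM.
# -m 128: RAM in MB, typical for a Windows 95-era machine
# -hda:   disk image taken from the original physical machine (name is illustrative)
# -vga cirrus: emulated graphics card the old OS has drivers for
# -nic:   emulated NE2000 network card with period driver support
qemu-system-i386 -m 128 -hda lims_win95.qcow2 -vga cirrus \
    -nic user,model=ne2k_pci
```

The point is that the guest sees period-appropriate virtual hardware, so the unmodified application and OS keep running regardless of what the physical host is.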
More Secure Environment
Completely virtualizing your servers and networks gives you a measure of security you did not have before. Virtualization alone will not make your system secure, but it does eliminate much of the opportunity for hardware to be compromised by, say, inserting a USB stick or a CD-ROM or DVD carrying malware. Virtualization also severely reduces the number of physical devices you need to control.
You also gain easier network segmentation and more direct control through policies and procedures. This matters a great deal in a dynamic network whose edges are variable because user devices move in and out of the network.
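To make the segmentation point concrete, a minimal sketch of how segments might be carved out on a Linux host using VLAN sub-interfaces. The interface names, VLAN IDs, and addresses are illustrative assumptions, not from the white paper:

```shell
# Hypothetical sketch: split one physical uplink into two isolated segments.
# VLAN 10 carries control-system traffic; VLAN 20 carries business/IT traffic.
ip link add link eth0 name eth0.10 type vlan id 10   # control-system segment
ip link add link eth0 name eth0.20 type vlan id 20   # business/IT segment
ip addr add 10.0.10.1/24 dev eth0.10
ip addr add 10.0.20.1/24 dev eth0.20
ip link set eth0.10 up
ip link set eth0.20 up
```

In a virtual environment the same separation is applied in software, so segments can be added or re-drawn without rewiring the plant floor.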
While there are great benefits from virtualization, there can also be serious challenges. One is that the IT staff, OT staff, or system administrators must truly know their servers and network. Especially with a virtualization overlay on an existing physical network, the administrator must know exactly what the system is doing, what it needs to do, and how it can grow for future expansion.
You can’t just throw another managed switch on a line and call it good. You need to make sure the data center you are virtualizing has adequate and appropriate electric power, with backup generation in case of outages. You need to make sure the building has adequate heating and cooling resources and that it is secure against physical penetration.
From your design, you need to make sure that the virtualized system has enough availability to operate better than the old system did.
As a part of that understanding, users should consider:
• Changing best practices. System administrators need to borrow from OT systems the concept of FEED (front end engineering design). The virtualized network must be specified at least as thoroughly as you would specify a physical set of hardware and software. A FEED must be clear, complete, and signed off on by all the stakeholders in the system.
• Changing standards. “The nice thing about standards is that there are so many of them,” said legendary computer scientist Andrew S. Tanenbaum. Tanenbaum may be cynical, but he is not wrong. One thing that can bite a virtual system is a change in a standard that makes the way the system is virtualized stop working, or stop working well. The system administrator needs to track standards more closely than when running a standard hardware/firmware system. Another issue virtual systems must deal with is the hardware itself. The fact that a system is virtual is often taken to mean it can run on significantly less costly servers and other hardware. This is far from true; in fact, the hardware and firmware you use in a virtual system need to be much more robust than in a conventional system.
• Changing the architecture of the network. You will have to implement a network information management tool from the very beginning. In a virtual environment it is even more critical than in standard networking to be able to see down into the system: to see all the devices and nodes, virtual or not, that are on your network. The user needs the ability to scale from a small system to a huge system with a variety of interfaces; otherwise, you will drive yourself crazy trying to troubleshoot the virtual system. You will also need to avoid virtual machine sprawl, and storage will need to be centralized rather than located at each computer. In doing that, you must make sure security is not dropped along the way. In a virtual network, combining a network information management tool with a good vulnerability scanner is critical to a proper security implementation.
• New skills and organization for IT and admin staff. You and your staff need training and experience in handling virtualization and virtual networks. A virtual system is not the same as a standard system: it must be operated, designed, and maintained differently, and those skill sets must be available to you before you start virtualizing your systems.
• IoT, the Cloud and Virtualization. Virtualization is ubiquitous, and the sensor-centric networks that make up the IoT are becoming ubiquitous as well. Most data goes to the cloud, where virtual servers and hosted desktops permit DaaS (Data as a Service) applications to be ubiquitous as well.
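The virtual machine sprawl concern raised above boils down to comparing what a discovery tool actually finds on the network against a maintained inventory. A minimal sketch of that comparison, with hypothetical machine names and no particular discovery tool assumed:

```python
# Hypothetical sketch: flag virtual machine sprawl by diffing the VMs
# discovered on the network against the administrator's inventory.
# Machine names and the discovery source are illustrative.

def find_sprawl(inventory: set, discovered: set) -> dict:
    """Return VMs running but not inventoried (sprawl candidates) and
    inventoried VMs that were not found (down or decommissioned)."""
    return {
        "untracked": discovered - inventory,   # candidates for sprawl cleanup
        "missing": inventory - discovered,     # investigate: offline or removed?
    }

inventory = {"hist-server", "lims-vm", "eng-workstation"}
discovered = {"hist-server", "lims-vm", "temp-test-vm"}
report = find_sprawl(inventory, discovered)
print(sorted(report["untracked"]))  # ['temp-test-vm']
print(sorted(report["missing"]))    # ['eng-workstation']
```

In practice the "discovered" set would come from the network information management tool the article recommends; the value of centralizing that data is that this diff can run continuously instead of during an occasional audit.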
Virtualization technology is already in thousands of devices and systems, and with the huge growth of IoT and cloud computing, life in the demanding and intense manufacturing automation environment will become smoother, more efficient, and more profitable.
Automation industry veteran Frank Williams is the chief executive at Statseeker, a provider of network monitoring technology. For more details click here to view the white paper entitled “Virtualizing Your Network Benefits and Challenges”.