1. Context of the Problem
The IT industry's focus on virtualization technology has been increasing in the past years. Virtualization is considered to be among the most significant technologies influencing computing. Although the roots of virtualization date back to the second half of the previous century, the current peak of its popularity makes analysts predict the growing use of virtualization within companies over the next years. Virtualization is predicted to change the computing landscape due to such promising benefits as greater security, infrastructure consolidation, ease of management, lower costs, better employee productivity, and others (Campbell & Jeronimo, 2006).
Virtualization is known to be the simulation of hardware and software. It is a way to simplify an IT infrastructure, creating a more dynamic and flexible environment (Virtualize your IT Infrastructure, 2013). Virtualization can also be viewed as part of an overall enterprise IT trend that includes autonomic computing; it is implemented with the aim to centralize administrative tasks while improving scalability and the utilization of common hardware resources. Virtualized components may include operating systems (OS), hardware platforms, network devices, storage devices, etc. (Scarfone, Souppaya & Hoffman, 2011).
Being a framework for dividing computer resources (software, hardware, time-sharing, etc.) into separate virtual machines, which are able to execute instructions independently of the host operating system, virtualization allows people to run an independent system within the existing operating system using the existing hardware resources. Thus, virtualization makes it possible to run several instances of an operating system concurrently on one computer.
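A practical prerequisite for running several guest operating systems efficiently on one machine is hardware virtualization support in the CPU. As a hypothetical illustration (the function name and sample strings below are ours, not taken from any cited source), a short Python sketch can check the CPU flags that Linux exposes in /proc/cpuinfo for Intel VT-x (`vmx`) or AMD-V (`svm`):

```python
def has_hw_virtualization(cpuinfo_text):
    """Return True if the CPU flags advertise Intel VT-x (vmx) or AMD-V (svm)."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            flags = line.split(":", 1)[1].split()
            return "vmx" in flags or "svm" in flags
    return False

# On a real Linux host, the flags can be read from /proc/cpuinfo:
#     with open("/proc/cpuinfo") as f:
#         print(has_hw_virtualization(f.read()))

sample = "processor\t: 0\nflags\t\t: fpu vme de pse msr vmx sse2"
print(has_hw_virtualization(sample))  # → True for this sample
```

The sketch only parses the flags line; on systems without /proc (e.g., Windows), a hypervisor vendor's own tooling would be consulted instead.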
The roots of virtualization go back to the 1960s, when IBM developed it with the aim of utilizing mainframe hardware by logically partitioning it into virtual machines that could perform many different applications and tasks at the same time. In the 1960s and 1970s, IBM invested a lot of effort and time in developing robust time-sharing solutions. At that time, these services were very expensive because of the resources they consumed (History of Virtualization, 2011). The average consumer knew very little about virtualization, and the technology was not yet developed enough to be produced affordably.
During the 80s and 90s, virtualization was discarded and overlooked during the desktop computing era, when operating systems revolved around x86 hardware platforms. This new age in technology was known as the ‘client/server’ age, and virtualization was nearly forgotten. At that period, the leaders of the PC/desktop and server world were Windows, Linux, and Apple. However, new challenges appeared: the high cost of management and maintenance, disaster and failure protection, and high infrastructure cost. These challenges led to the invention of virtualization for the x86 platform. During the same time frame, businesses were depending more and more on computers for their operations (History of Virtualization, 2011). Companies moved from paper pushing to running their accounting, human resources, and many other industry-specific and custom-built applications on mainframes or microcomputers. The personal computing solution was a more affordable way out. Over time, many organizations expanded their server rooms, introducing increased costs to maintain them.
During the late 90s, the issues with IT infrastructure led VMware to develop its own virtualization platform. This technology shattered the status quo in how physical computers and servers were handled. As technology evolved, x86 resources became underutilized; today, servers typically use only around 10 to 15% of their capacity. The costs of powering these servers, as well as the costs of cooling the server room environment, increased. The ability to run multiple operating systems on one physical box intrigued many IT professionals. Computers based on the x86 design faced the same problems of rigidity and underutilization that mainframes faced in the 1960s (History of Virtualization, 2013). Not only the costs of servers but also the costs of powering network routers and switches and the need for external storage increased. Over time, organizations expanded their infrastructure by deploying storage arrays and implementing storage area network (SAN) solutions.
Some of the problems were related to low infrastructure utilization, increasing physical infrastructure costs, IT management costs, insufficient failover and disaster protection, and others. Virtualization on x86 platforms faced many challenges and obstacles. It was stated that x86 machines were not designed to support complete virtualization, and VMware overcame big challenges in order to create virtual machines out of x86 computers (History of Virtualization, 2013). Because the Windows OS was designed as a single-user operating system, a single application could be installed on it without problems. Compatibility issues arose when other programs were installed: the specific requirements of each program caused many different resource errors, as well as many operating system failures. Over the years, applications have added more complexities and placed more constraints on organizations. The underutilized hardware resources include the central processing unit (CPU), memory, storage, and network. Organizations that used the single-server, single-application method continued to purchase servers that remained well underutilized.
Departments within a single company did not want any common infrastructure. Human Resources and Payroll departments declared that their data was too sensitive to allow another group the potential to use their systems. The need for virtualization was apparent, and, with its introduction in the late 90s by VMware, many of the problems caused by the underutilization of architecture were solved. Companies such as VMware and Microsoft provide virtualization that focuses on the emulation of a computer's physical hardware to improve the performance, management, and reliability of systems.
Today, organizations face high costs due to the single-server, single-application way of thinking. As storage needs increase and new applications create demands, these organizations have found that their server environments have become more and more unmanageable. Virtualization brought the ability to condense multiple physical servers into one server running many virtual machines, allowing the physical server to run at a much higher rate of utilization.
The two most prominent competing corporations in the virtualization market are VMware and Microsoft. VMware was founded by Diane Greene, Dr. Mendel Rosenblum, and Ed Bugnion in 1998. The first Windows and Linux virtual product, named VMware Workstation 1.0, was released in 1999 (Virtualization Pro, 2009). In the same year, VMware introduced a virtualization solution that made it possible to transform the x86 system into an isolated shared hardware infrastructure. The first breakthrough product was VMware ESX, which targeted the servers of enterprise organizations in 2001. Over the years, the company released many other desktop- and server-related products. Microsoft, in response to VMware's releases, acquired virtualization technologies from Connectix Corporation in 2003. Connectix Corporation was at that time a leading provider of virtualization software for Windows- and Macintosh-based computing. In July 2006, Microsoft released its Windows-hosted virtualization program, Microsoft Virtual PC 2004, as a free product. Later, with the launch of the Windows Server 2008 product, Microsoft introduced its Hyper-V solution in direct competition with VMware's x86/x64 virtual infrastructure solutions.
Virtual machines have been implemented to be used like standard servers but with significantly lower management and maintenance costs. It is believed that this technology has huge future potential and will play an essential role in computing. However, alongside its advantages, virtualization presents different problems at different stages of its adoption (planning and analysis, adaptation and the post-adaptation period, and virtual infrastructure maintenance). In order to continue its sweep through companies, the technology has to overcome the I/O performance problems caused by running different virtual machines on one server. As soon as the I/O issues are solved, virtualization will become more useful for production servers and end-user applications. However, to function well in these areas, virtualization security also needs to improve. Virtualization adds many complexities as well, including technical, organizational, and philosophical ones.
Thus, virtualization is considered to be a very powerful tool that provides the ability to reduce the number of servers, lower costs, and enhance information sharing across multiple systems. However, there exist different types of virtualization that can be implemented and that have an impact on different sectors of an organization. If virtualization is properly implemented and utilized, it can bring significant benefits.
2. Statement of the Problem
Hardware utilization has become one of the main issues relevant to today's systems. Many organizations face increasing physical infrastructure and IT management costs, low infrastructure utilization, insufficient failover and disaster protection, and high maintenance of end-user desktops. Compatibility issues were apparent, and, over time, businesses developed the practice of one application per server, with the result that organizations had an average of 10 to 15 servers in their environment to accomplish specific tasks. Experience and awareness related to virtualization is another challenge, as the technical staff of many organizations is almost inexperienced in virtual environments and does not know how to virtualize their own infrastructure. This study will examine virtualization as a suitable replacement for the single-server, single-application method. Therefore, this proposal will demonstrate a clear understanding of virtualization and may be a determining factor for organizations to invest in a virtualized infrastructure by replacing the single-server, single-application methodology.
3. Research Questions
This research will address virtualization as a reliable and suitable replacement for the single-server, single-application method; a good understanding of virtualization is a determining factor that may lead to investment in a virtualized infrastructure. The collected data will determine whether there are any challenges in the IT sphere that may hinder virtualization. The following questions will be used to gather information:
- What are the factors that determine the need for virtualization in the organization?
- How can these factors determine the organization’s need to virtualize?
- How can the implementation of a virtualization environment help prevent low infrastructure utilization, increased physical infrastructure costs, increased IT management costs, software compatibility issues, insufficient failover and disaster protection, and high-maintenance end-user desktops?
- How can virtualization education and training help increase cultural awareness in an organization regarding its information system technology?
4. Significance of the Study
This case study is important because it recognizes that the use of virtualization to virtualize, and in some cases replace, conventional hardware platforms has an impact on overall cost and the organizational environment. This study will help clarify how virtualization is used to enhance an organization's processes and how this technology affects all participants. Businesses are continually changing and looking for ways in which they can alleviate infrastructural issues, and virtualization has shown its effect on overall costs by providing a service that utilizes more of their system resources. These organizations have changed their views on how they can utilize and incorporate this technology into their designs.
A company's infrastructure may contain different servers, applications, and operating systems that have to interact with a large number of computers. Supporting and managing this assortment of mission-critical technology is a complex task that requires a lot of time. In order to increase hardware utilization, reduce costs, optimize the infrastructure of a business network, and speed up routine tasks, the implementation of desktop virtualization may really help. Nowadays, the deployment of virtualization technology is very popular. It is also a cost-effective way of dealing with challenges (Khan, 2011).
This study will further explain the determining factors that will help an organization meet its goals to virtualize its infrastructure. As the technology continues to evolve, more and more organizations will continue to adopt virtualization and utilize more of this technology within their environments. This study will illustrate how such an environment can increase infrastructure utilization, decrease physical infrastructure costs, decrease IT management costs, reduce compatibility issues, improve failover and disaster protection, and lower the maintenance needs of end-users' desktops. Lastly, this study will show the current level of education in relation to virtualization and set the direction for continual improvement in educational programs involving virtualization. While this study may not steer many educational institutions toward increasing or decreasing education related to virtualization, it will help contribute to a better understanding of the technology for all information technology majors.
By virtualizing environments, organizations simplify their IT infrastructural needs by creating a more dynamic and flexible environment. Whether using a Microsoft or a VMware solution, virtualization helps these organizations meet their goals and increase overall productivity. Among its benefits, virtualization helps reduce capital expenses through server consolidation; in essence, by adopting a virtual environment, these organizations are promoting a greener IT infrastructure. Virtualization helps improve operating expenses through virtual automation and minimizes lost revenue by reducing power consumption and associated downtime. Using virtualization software efficiently will help managers make the best possible decisions for their environments. Applications such as Microsoft Exchange, SQL Server, SharePoint, IBM Lotus Notes, or any other complex and unique application will benefit from virtualization.
Depending on the goals of the organization, virtualization works with hardware and software, storage, networking, and security products. The factors present in an organization determine how its infrastructure will be virtualized. For instance, an organization that has 14 servers, all of which are unique and complex, may need to evaluate the need for virtualization. All of these servers may be running specific pieces of software that increase complexity and reinforce the single-server, single-application method. Such organizations need to evaluate virtualization as a solution, and this study will inform them about its significant impacts.
Virtualization provides flexibility: one can allocate more resources to any virtualized server. Whether by adding more memory, providing more CPU cores, or allocating more storage, the flexibility of virtualization helps meet organizational goals. Availability is needed in an organization that requires continual uptime of its servers. The loss of one physical server can be detrimental to an organization; in a virtual infrastructure, one can quickly bring a hosted machine back online. The ability of the IT team to allocate resources in this way is phenomenal. Resource utilization has been an issue with current systems. Applications, with their complex nature, prevent other applications from being loaded on the same machine. As a result, organizations are forced to use the single-server, single-application theory when implementing multiple application platforms. As hardware costs lowered and became cheap enough, virtualization became more cost efficient. The benefit that these organizations obtain from virtualization is the better use of resources: they are able to consolidate all purchased resource requirements.
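The flexibility described above can be modeled with a toy resource-pool sketch. This is an illustrative abstraction only (the `Host` class and its figures are our assumptions, not any vendor's API): a host's cores and memory form a pool from which virtual machines are carved out and resized without touching other guests.

```python
class Host:
    """A physical host whose CPU cores and memory are shared by guest VMs."""

    def __init__(self, cores, mem_gb):
        self.cores, self.mem_gb = cores, mem_gb
        self.vms = {}  # name -> (cores, mem_gb)

    def free(self):
        """Capacity not yet allocated to any guest."""
        used_cores = sum(c for c, _ in self.vms.values())
        used_mem = sum(m for _, m in self.vms.values())
        return self.cores - used_cores, self.mem_gb - used_mem

    def allocate(self, name, cores, mem_gb):
        free_cores, free_mem = self.free()
        if cores > free_cores or mem_gb > free_mem:
            raise ValueError("insufficient capacity on host")
        self.vms[name] = (cores, mem_gb)

    def resize(self, name, cores, mem_gb):
        """Grow or shrink one guest; other guests are unaffected."""
        saved = self.vms.pop(name)
        try:
            self.allocate(name, cores, mem_gb)
        except ValueError:
            self.vms[name] = saved  # roll back on overcommit
            raise

host = Host(cores=16, mem_gb=64)
host.allocate("payroll", cores=4, mem_gb=16)
host.allocate("hr", cores=2, mem_gb=8)
host.resize("payroll", cores=8, mem_gb=24)  # add capacity on demand
print(host.free())  # → (6, 32)
```

Real hypervisors add scheduling, overcommitment, and live migration on top of this idea, but the core pattern is the same: guests draw from a shared pool instead of each owning a whole machine.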
Cost saving is one of the main reasons why an organization ventures into virtualization. The cost to power 30 servers is far greater than the cost to power 2 or 3 virtualization hosts. Organizations that continue to follow the single-server, single-application theory keep expanding their server rooms, which in turn increases the overall cost of maintenance.
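The power-cost comparison above can be made concrete with back-of-the-envelope arithmetic. The wattages, electricity rate, and server counts below are illustrative assumptions, not measured figures from any of the cited sources:

```python
def annual_power_cost(n_servers, watts_each, dollars_per_kwh):
    """Yearly electricity cost of running n servers around the clock."""
    hours_per_year = 24 * 365
    kwh = n_servers * watts_each / 1000 * hours_per_year
    return kwh * dollars_per_kwh

# Assumed figures: 30 physical servers at 400 W each versus
# 3 beefier virtualization hosts at 600 W each, at $0.12/kWh.
before = annual_power_cost(30, 400, 0.12)
after = annual_power_cost(3, 600, 0.12)
print(f"${before:,.0f} -> ${after:,.0f} per year")  # → $12,614 -> $1,892 per year
```

Even before counting cooling, floor space, or maintenance contracts, the compute-power bill alone drops by roughly an order of magnitude under these assumptions.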
Khan (2011) believes that organizations that use virtualization to manage workloads and partition groups of servers into flexible resource pools are well positioned to meet the changing demands of the marketplace. It has been stated that VMware virtualization made it possible to reduce capital expenses through server consolidation and improve operating costs through automation, at the same time minimizing revenue lost to planned and unplanned downtime. VMware virtualization is ready to work with a variety of software and hardware, including security products, storage, and networking (Infrastructure Virtualization and Management, 2013).
Scarfone, Souppaya, and Hoffman (2011) are convinced that the increase in full virtualization products and services is due to their many benefits. The most common reason to adopt full virtualization is operational efficiency: placing a greater load on each computer enables organizations to use existing hardware more efficiently. Full virtualization allows servers to use more of a computer's memory and processing resources than servers running a single OS instance and a single set of services. The authors also find that desktop virtualization provides support for applications running on a particular OS and supports control of OSs (Scarfone, Souppaya & Hoffman, 2011, p. 1). Virtualization also provides improved business continuity, customer service, and service level agreements (SLAs) on IT services, reduces operating expenses, and gives better IT response to requests for new services. Thus, virtualization can reduce costs, improve business continuity, and further business agility. According to statistics, virtualization brings the following operational benefits: system downtime is reduced by 26%; server incidents are decreased by 27%; testing and rework are improved by 26%; application time to market is improved by 22%; the number of new projects is increased by 21% (Business and Financial Benefits of Virtualization, 2011, p. 3, 5).
Thus, virtualization is regarded to have the following advantages:
- Isolation. In case of a virtual machine crash, other machines (a host machine and other virtual machines) are unaffected due to isolation.
- Partitioning. The configuration of operating systems and various application programs in a single physical system saves costs as the company does not have to purchase other physical machines.
- Encapsulation. A single file, which contains the complete environment of a virtual machine, is easy to move, copy, and back up (Khan, 2011).
- Cost. The consolidation of smaller servers into powerful ones makes it possible to reduce costs. Cost reductions cited by VMware range from 29 to 64%.
- Adaptability. It is possible to move processors from one virtual machine to another with the help of autonomic computing-based resource allocation techniques.
- Load balancing. It helps improve performance.
- Security. The compartmentalization of environments with different security requirements into different virtual machines gives the opportunity to select the guest operating system and appropriate tools for each environment.
- Legacy applications. It is possible to continue running legacy applications on the old OS, even after migrating to another operating system (Menasce, 2005).
- Combination. Virtualization is able to combine the power of processes, people, and technology to streamline and automate the management of IT infrastructure, leading to the reduction of administration and support costs and providing the insurance of business continuity and maximum system availability.
- Optimization. Virtualization can optimize existing IT investments in order to improve operational efficiency and gain financial flexibility, which is needed to reach the strategic business goal and focus on innovation (Networking Doesn’t Need a VMWare …, 2012), and others.
Due to these benefits, virtualization has become one of the most interesting technologies and has been rapidly adopted during the past few years. It has attracted the computing world with the variety of its features and has demonstrated many ways in which it may change traditional data centers. Thus, virtualization has become a very interesting and popular research topic.
5. Research Design and Methodology
After analyzing the research questions, the qualitative interview was identified as the most appropriate method to use, since it would provide rich data on virtualization, its need, and its benefits. We conducted semi-structured interviews and used information received from observations, surveys, focus groups, and internal data to conduct the virtualization research. Data for the research were derived from face-to-face interviews with corporate managers, web presentations, and annual reports of different companies. The research design required the use of semi-structured interviews in order to produce scores in a standardized way as well as in-depth insights.
5.1. Qualitative Interview
As we seek to provide a deep understanding of virtualization and organizations' need for it, qualitative research was found to be the most appropriate form for our research topic. Qualitative interviewing is very flexible, as it responds to the direction taken by the interviewee and adjusts the research to emphasize the most important issues that emerge during the interview. This type of interviewing shows greater interest in the interviewee's ideas and beliefs while reflecting the researcher's concerns. In the qualitative interview, the interviewee is encouraged to go off topic, as this helps reveal what points are the most interesting or important to the person and brings out the nuances.
Qualitative interviewing does not require following any schedule or guide; even if guides or schedules exist, interviewers are allowed to depart from them. Interviewing is regarded as an interchange of views between two or more people on a topic of mutual interest. It reflects the centrality of human interaction and emphasizes the social ‘situatedness’ of research data (Kvale, 1996, p. 14). The order of questions may be changed, and the questions may even be paraphrased. A question should motivate the respondent to give precise and full answers and avoid biases caused by conformity, social desirability, and other such constructs (Hoyle et al., 2002, p. 144).
Thus, the qualitative interview is a good choice for this research, as it helps obtain a deeper understanding through detailed information, ascertain experiences and opinions, get answers to puzzling questions, and trace events as they evolve over time. It is important to remember that a good interview is based on good communication skills: it is vital to ask and to listen attentively to the answers, to identify when it is necessary to talk at length and when to be brief, and to see when there is a need to change the subject. The interviews were conducted in response to businesses virtualizing their infrastructure, and the questions asked related to their business challenges.
5.2. Semi-Structured Interviews
The interview depends on the interviewer's communication skills (Clough & Nutbrown, 2007): the ability to listen, make pauses, and ask clearly structured questions. It is of utmost importance to let the interviewee talk freely, and attention should also be given to humor and humility, as they show the trust between participants and the relational aspect of the interview.
Semi-structured interviews are considered non-standardized and are often used in qualitative analysis. The aim of the interviewer is to conduct research in order to test a specific hypothesis (David & Sutton, 2004, p. 87). The interviewer has a list of issues, key themes, and questions that should be covered, but the order of the interview questions can be changed. Hence, the interviewer knows the questions to be covered but is allowed to explore different feelings and thoughts. In our case, the questions asked related to virtualization, the factors that determine the need for virtualization, the implementation of a virtualization environment, and virtualization education and training. In semi-structured interviews, interviewers have an opportunity to tailor the questions to the context of the discussion. If the interviewee changes the subject, the interviewer is allowed to bring him or her back by asking prompt questions.
In order to develop appropriate and well-structured questions, we engaged in observation and informal interviews. The benefits of this type of interviewing are as follows: the prepared questions give the interviewer the opportunity to be competent during the interview, while informants have the freedom to express their own views and provide us with comparable and reliable data.
5.3. Surveys
A survey is a way of gathering information about certain opinions, needs, or characteristics (Tanur, 1982). Surveys conducted for research purposes are characterized in the following way:
- Their purpose is to produce quantitative descriptions of some aspects of the studied population. Survey research is a quantitative method that requires standardized information about the subject under study (groups, organizations or communities, individuals, applications, projects, or systems).
- Information is collected by asking predefined and structured questions. The answers constitute the data to be analyzed.
- Information is collected in the way of generalizing the findings (Pinsonneault & Kraemer, 1993).
We conducted a comprehensive online survey in order to gather data for the research. The people engaged in the survey were IT managers from private- and public-sector organizations. Respondents were asked to evaluate or manage data protection technologies day-to-day, including data replication software, secondary data storage disk or tape systems, and backup and recovery software.
Being a descriptive method, surveys are very useful for data collection. In a survey, researchers select respondents and ask them standardized questions. This can be done in written form completed by the surveyed person, online, by phone, or face-to-face. Surveys give an opportunity to collect data from either a small number of people or a large group. They may be used with all types of variables and make generalization easy (Bell, 1996, p. 68). One of the main benefits of a survey is that it elicits information concerning attitudes (McIntyre, 1999, p. 75).
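Tabulating standardized survey answers of this kind is straightforward. The responses below are invented for illustration (they are not data from our survey); the sketch simply counts Likert-style answers and reports each answer's share:

```python
from collections import Counter

# Hypothetical answers to: "Virtualization reduced our infrastructure costs."
responses = [
    "agree", "strongly agree", "agree", "neutral",
    "agree", "disagree", "strongly agree", "agree",
]

counts = Counter(responses)
total = len(responses)
for answer, n in counts.most_common():
    print(f"{answer:<15} {n:>2}  {100 * n / total:.1f}%")
```

The same counting step underlies the quantitative descriptions that survey research produces, regardless of whether the instrument is delivered on paper, online, or by phone.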
5.4. Focus Groups
Focus group research is a qualitative research method that aims to gather information beyond the reach of quantitative research. Focus groups presuppose organized discussions that explore a certain set of issues, including people's views and experiences. From the methodological point of view, focus group interviews include a group of about 6-7 people with similar experiences and backgrounds. They discuss the issue and feel comfortable taking an active part in a dynamic discussion for 1 or 2 hours.
Focus groups encourage answers which provide a better understanding of opinions, perceptions, and attitudes on the research issues (Hennink, 2007, p. 6). The discussion between focus group participants gives an opportunity to hear and exchange issues that may not emerge during the qualitative interviews.
The focus group methodology is very useful when the researcher wants to explore and examine people's thoughts and ideas, why they think that way, what influences their choices in decision making, and so on. This qualitative research will show a group of people sharing their feelings, thoughts, attitudes, and ideas about virtualization. Organizing a focus group will help explain the need for virtualization.
5.5. Internal Data
Internal data denote information that is useful in decision making. They are usually derived from the company staff, the accounting system, web site reports, market reports, and previous studies. A layout of internal data will explain how to virtualize the infrastructure based on the company's needs. The collected information will relate to application usage and the ways virtualization can help organizations expand their capabilities.
5.6. Observation
Observation is the most common method of getting information about anything. Done in a systematic manner, it may be a very effective tool in the hands of a researcher. The method of observation includes mechanical and human observations. The advantage of this method is that it is not necessary to rely on the accuracy of respondents' reports or on their willingness and ability to be accurate. Hence, data collected through observation are considered objective and accurate (Observation Method).
Observation may be done in either a formal or informal form, but it is important to take into consideration the psychological and physiological factors, which may influence the way an observer interprets observational data (Slack & Rowley, 2001, p. 37).
The data collected by observing the virtual environment will relate to the physical architecture's performance versus the virtual infrastructure's performance. The observations will include a video of a server with response times from startup to finish. Each observation will compare the physical infrastructure with the virtual infrastructure. This study will only explain observational data and will not reference any supporting data.
6. Organization of the Study
This proposal is organized into chapters and sections. The first section of the first chapter is the context of the problem, which contains the history of why, where, and for whom it is a problem. The next section is the statement of the problem, which tells what the proposal will set out to do. Then, there is the section presenting the four research questions that will guide this proposal. The fourth section discusses the significance of this proposal, and the fifth section explains the research design and methodology. The last part of Chapter 1 is the organization of the study.
Chapter 2 will present a literature review and relevant research associated with the problem addressed in this study.
Chapter 3 will deal with factors that determine the need for virtualization in the organization.
Chapter 4 will discuss the factors that determine the organization’s need to virtualize.
Chapter 5 will explain the potential implementation of a virtualization environment, with potential solutions to help prevent low infrastructure utilization, increased physical infrastructure costs, increased IT management costs, software compatibility issues, insufficient failover and disaster protection, and high-maintenance end-user desktops.
Chapter 6 will explain virtualization and the education and training needed to increase cultural awareness in an organization that relies on information system technology.
The main purpose of the literature review is to study the relevant research already conducted in the field of virtualization. The review will serve to identify and better understand the widely used definitions and concepts, review the theories and methods adopted by researchers in the IT industry, and understand the aspects of virtualization itself. IT teams are able to reap the benefits of virtualizing their data centers, including storage systems, clients, servers, networks, firewalls, applications, and business services. In recent years, virtualization has become a widely discussed subject in the academic literature: many insights are provided, and various factors that can determine the need for virtualization are explained. Nowadays, virtualization introduces a new implementation layer to traditional computer networks (Bliekertz, 2010). The main contributors to the field of study are Newton, Campbell, Jeronimo, Anderson, Khan, Menasce, Scarfone, Bell, Workman, Sellers, Geisa, Barr, Connor, Matthew, Marks, Stokes, Violino, and others.
The rapid growth of Internet technologies and their usage has led virtualization to begin replacing current work practices across organizations. In the last few years, virtualization has blossomed due to the efficiencies it offers. It is especially popular because of its technical advantages and considerable cost savings. Ho, Au, and Newton (2003) state that three important elements constitute the supply chain of virtualization. They emerge from virtual knowledge communities, the formation of virtual trading communities, and the integration of inter-organizational processes in the online environment. The consequences and transformations of virtualization have caused a structural change in the relations of different business areas.
Campbell and Jeronimo (2006) indicated the growing significance of virtualization technology, its mainframe system connection, and its use on desktop machines and servers. The authors predict a great future for virtualization as it promises such benefits as greater security, lower costs, infrastructure consolidation, improved productivity of employees, and others. The huge future potential of virtualization in the IT sphere is also mentioned by a variety of researchers (Rosenblum & Garfinkel, 2005; Stokes, 2009; Scarfone et al., 2011). There is an idea that new ways of using virtualization will be developed in the near future, and virtualization itself will include anti-forensics, hardware hypervisors, autonomic computing, and mobile virtualization. Virtualization will also be an attempt at security and privacy improvement (Stokes, 2009).
In the article “The Importance of Virtualization” (2008), Tara Anderson speaks about the reasons why it is advisable to use virtualization. Among them, she mentions resource utilization, flexibility, cost savings, availability, and others. Hence, in most data centers, virtual machines are implemented with the aim of functioning like a usual server while incurring lower management and maintenance costs (History of Virtualization, 2011). The importance of virtualization is also emphasized by Violino (2009). The author discusses disaster recovery and the importance of virtualization as independent concepts and attempts to establish a connection between them. He states that the majority of organizations seek server virtualization as a way to reduce the consumption of energy in data centers, consolidate servers, reduce costs, and increase business agility (p. 1). Violino speaks about the opportunities of virtualization beyond the server, as well as the future of this technology, which may be focused on client devices. The future prospects of virtualization have been studied by Schultz (2009). He emphasizes the fact that virtualization enables enterprises to assemble a single view of a client by bringing together information stored in repositories (p. 1). The next logical step of any company is workspace virtualization.
The effects of virtualization on the performance of an organization and its social influences were examined by Workman (2007). Scarfone, Souppaya, and Hoffman (2011), in the book Guide to Security for Full Virtualization Technologies, speak about the forms of virtualization and focus their attention on full virtualization. They name operational efficiency as one of the key reasons why organizations adopt virtualization: “organizations can use their existing hardware more efficiently by putting more load on each computer” (p. 2-1). It is considered that virtualized data centers are ideal for business continuity because they let operations run round the clock (Geisa, 2006; Schultz, 2009). Since virtualization provides flexibility with time, space, and money (Matthew, 2008), many organizations began to compare the expense of doing the work internally with the cost of having the work outsourced. It has been shown that virtualization played a vital role in computing as it gave the opportunity to reduce long-term hardware, maintenance, software, and operation costs (Geisa, 2006; Sellers, 2009). Connor (2004) speaks about the future prospects of server virtualization as it allows the technology to move from small markets to the mainstream (p. 1). This is the main reason for the increasing rate of virtualization implementation.
A group of researchers (Silwa, 2008; Matthew, 2008; Marks, 2009; Barr, 2009) speaks about the benefits of storage and server virtualization, emphasizing the reduction of utility bills, equipment costs, and software license fees, freed floor space, disaster recovery capability, IT staff reduction, and others. Kevin Lo (2011) adds more benefits, including the testing of software configurations, energy conservation, and support of a cross-platform office. He concludes that virtualization software is available for a variety of needs. Srinivasan V. (n.d.) provides information about the necessity and importance of virtualization for the educational sphere, showing that it helps in delivering student labs and administering online assessments. The researcher speaks about the cost-effective solutions offered by virtualization, which can support the learning, development, and assessment of the next generation.
Dubie (2009) studies the value and concept of desktop virtualization, emphasizing the fact that the successful deployment of server virtualization leads IT managers to believe that desktop virtualization can provide a variety of benefits. Dubie tries to show that desktop virtualization is a technology that should be seriously considered. Although the attention of industry experts is largely focused on server virtualization, Dubie (2009) suggests desktop virtualization as the next big move in virtualization. That is why he thoroughly examines the concept of desktop virtualization and concludes that it is the right path for the mainstream IT department to pursue.
The current economic climate has caused the rapid promotion of virtualization. Virtualization has been shown to have the following benefits: lower utility bills, equipment costs, and software license fees, disaster recovery capability, and a reduction in the number of IT staff (Barr, 2009; Marks, 2009; Matthew, 2008). Geisa (2006) explores the benefits of virtualization from an architectural point of view. His research shows that less hardware, and less expensive hardware, may be used in order to do the same work. The better use of infrastructure leads to operational efficiency and simplified management.
In spite of the variety of benefits that come with the implementation of virtualization, it still has a long way to go (Symantec, 2009). The above-mentioned literature mainly shows the benefits of virtualization. However, despite being intensively promoted, virtualization cannot be applied everywhere (Marks, 2009). When implementing virtualization, many organizations face the threat of overloading a server, which may result in IT failure, increased costs, and downtime. By contrast, running several servers at partial capacity may sometimes be cheaper (Gittlen, 2010). Chickowski (2009) examines cases where utilization increases while server management costs stay the same or even rise because a company implements unfamiliar technologies. It has also been stated that implementing virtualization with false expectations can lead to management and project failure and increased costs (Gittlen, 2010). The negative effects of full virtualization are also mentioned by Scarfone, Souppaya, and Hoffman (2011). They argue that consolidating a large number of systems on a single computer may threaten its security: the simple way of sharing information between computers may turn out to be an attack vector, while the creation and maintenance of security boundaries is a complex process.
What Are the Factors that Determine the Need for Virtualization in the Organization?
The technology of virtualization is considered one of the most important developments in the IT industry, and it has caused a complete reconsideration of the computing industry. The constantly increasing awareness of virtualization’s advantages is driven by government regulation, the economic factor of scarce resources, and growing competition. A growing number of organizations are using virtualization with the aim of reducing power consumption and the need for air conditioning. Virtualization also makes it possible to achieve high availability of critical applications and streamlined application deployment and migration. By simplifying IT operations, virtualization allows IT organizations to respond faster to changing business demands.
Virtualization may be implemented at the following levels:
- Hardware level. At this level, a virtual machine is able to run several guest OSes, which supports training and testing that require networking interoperability between several OSes. It also enables running multiple guest OSes as long as there is enough random access memory (RAM), CPU, and hard disk drive (HDD) space. This approach was introduced by IBM in 1990 and received the name of logical partitioning.
- Operating system level. At this level, only one OS can be virtualized; the guest OS is the host OS. It is similar to having many terminal server sessions where it is not necessary to lock down the desktop. There is a possibility to have the speed of a terminal service (TS) session and gain the benefit of full access to the desktop, with user control of the quotas for RAM, CPU, and HDD. The implementation of virtualization at the operating system level is called Server Virtualization and is characterized by a separate IP address given to each guest OS.
- Application level. This level enables applications to run directly on the host OS without any guest OS. It is called Desktop Virtualization or Application Virtualization. Application Streaming is responsible for delivering applications to the desktop and running them locally. In terminal server computing, these applications run on the server, not locally.
All uses of virtualization are centered around the concept of abstracting physical resources. Before discussing the factors that make organizations seek virtualization, it is worth mentioning its major types, as all of them bring benefits to organizations. There are three major types of virtualization that are widely used by companies and organizations:
1. Server Virtualization. This type receives the greatest attention in the world of virtualization, and it is the starting point of virtualization technology implementation for the majority of organizations. Server virtualization breaks the idea that one server serves one application and lets numerous servers be consolidated onto one physical server. As a result, an organization needs fewer physical servers and achieves much higher utilization of existing hardware (The Different Types of Virtualization).
Server Virtualization gives organizations the ability to create virtual machines that share the physical resources of the underlying server. Having its own OS and required applications, every virtual machine is able to run in isolation. This is supported by a technology called a hypervisor, which sits between the underlying hardware and the virtual machines. Live migration of virtual machines may later bring about load balancing, higher uptimes, and power management.
Thus, server virtualization provides an opportunity to have one server that does the job of multiple servers in the way of sharing single server resources across multiple environments. The software allows a company or organization to host multiple applications and operating systems both in remote locations and locally and free users from geographical and physical limitations.
2. Client/Desktop Virtualization. This type of virtualization deals with a client, workstation desktop, or laptop. It comes in two forms: client-hosted and server-hosted desktops. Client-hosted desktops let organizations create a virtual environment in order to run legacy applications. Server-hosted desktops keep the client OS, data, and applications on data center servers and offer the benefits of better hardware utilization, better accessibility, and easier management. By making client machine management easier and enabling better security, client virtualization attracts the interest of IT organizations (The Different Types of Virtualization).
3. Application Virtualization. Virtual applications are never installed physically. The only thing required for installation on a client machine is the App-V client. The software is cached or streamed locally on demand and then executed. Easy scalability, centralized management, and easy availability and deployment are among the benefits of application virtualization. While an application runs on the server, input gestures are transferred to it, and the screen output travels all the way back to the terminal. There also exists another technique of application virtualization, called Presentation Virtualization, which is supported by several vendors.
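As a rough illustration of the resource-sharing idea behind server virtualization, the following Python sketch models a physical host whose CPU and memory are partitioned among isolated guests by a hypervisor-like allocator. All class names, method names, and resource figures here are hypothetical and serve only to illustrate the concept described above.

```python
class VirtualMachine:
    """A guest with its own OS and a fixed share of host resources."""
    def __init__(self, name, os_name, cpus, ram_gb):
        self.name, self.os_name = name, os_name
        self.cpus, self.ram_gb = cpus, ram_gb

class PhysicalHost:
    """Models a hypervisor partitioning one server among several guests."""
    def __init__(self, cpus, ram_gb):
        self.cpus, self.ram_gb = cpus, ram_gb
        self.vms = []

    def free_cpus(self):
        return self.cpus - sum(vm.cpus for vm in self.vms)

    def free_ram(self):
        return self.ram_gb - sum(vm.ram_gb for vm in self.vms)

    def place(self, vm):
        # Admit a guest only if enough CPU and RAM remain on the host.
        if vm.cpus <= self.free_cpus() and vm.ram_gb <= self.free_ram():
            self.vms.append(vm)
            return True
        return False

host = PhysicalHost(cpus=16, ram_gb=64)
host.place(VirtualMachine("web", "Linux", cpus=4, ram_gb=16))
host.place(VirtualMachine("db", "Windows", cpus=8, ram_gb=32))
print(host.free_cpus(), host.free_ram())  # remaining capacity for further guests
```

Each guest runs its own OS in isolation, yet all of them draw on the same physical pool, which is the essence of the consolidation benefit discussed below.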
In the modern business world, IT enterprises should always follow the latest technologies that enable businesses to run using fewer resources and providing the infrastructure in order to meet the needs of any customer. The need for virtualization in the organization is determined by the fact that every organization seeks to save its time and money. Over the past few years, there has been a constant increase in the number of organizations that use virtualization as part of their IT infrastructure. It is explained by the number of benefits of virtualization. The following benefits are provided by different types of virtualization, and in order to grasp the whole idea of positive factors, which influence the choice to implement virtualization, it is necessary to have a deep look at all of them.
One of the determining factors, as well as one of the benefits of virtualization, is server consolidation. By providing an opportunity to scale the server infrastructure without buying additional pieces of hardware, it leads to more efficient use of resources. Virtualization enables an organization to run the applications of ten servers on a single machine; previously, each application would have required its own physical computer to provide the unique operating system and technical specifications it needs to operate (Burger, 2012).
Virtualization helps organizations save money as it gives the opportunity to reduce the number of servers the organization has to run. This means savings on hardware costs and on the energy needed to run and cool the hardware. Organizations can avoid the massive cost of building a new data center, as the efficiency of the existing one will be dramatically increased. Computers have become multi-tasking and are able to adapt to large workloads. There is also a noticeable reduction in the time needed to send out patches and updates. Moreover, with the ability to be turned off from a centralized location, virtual machines make it possible to reduce utility costs. The implementation of virtualization can reduce hardware costs by 80%. The improvement of hardware utilization alone provides an opportunity to reduce server expenditures and server space requirements by 50-60%. Virtualization results in easier refreshment of server hardware, likewise reducing maintenance costs on older servers with expired warranties. Costs spent on disaster recovery and system management can also be reduced with the help of a virtualized environment.
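The savings figures above can be made concrete with simple back-of-the-envelope arithmetic. The server count, per-server cost, and rack-space numbers below are hypothetical examples chosen for illustration, not data from the cited sources; only the 80% and 50-60% reduction rates come from the text.

```python
servers_before = 10        # hypothetical: physical servers before consolidation
cost_per_server = 5_000    # hypothetical hardware cost per server, in USD

cost_before = servers_before * cost_per_server       # total hardware outlay
cost_after = cost_before - cost_before * 80 // 100   # the 80% reduction cited above
print(cost_before, cost_after)

# The 50-60% reduction in server space, applied to rack units (55% midpoint):
rack_units_before = 40
rack_units_after = rack_units_before - rack_units_before * 55 // 100
print(rack_units_after)
```

Even with these toy numbers, the direction of the effect is clear: consolidation shrinks both the hardware budget and the floor space a data center consumes.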
Virtualization enables organizations to use their servers and space in a more efficient way, leading to the reduced costs on their real estate value, or this space may be just freed up and used for other purposes. It is obvious that the less floor space is consumed by a data center, the fewer utilities (electricity, heating, etc.) it requires.
Another factor is the ability to conserve energy. Besides saving hardware costs, virtualization software enables organizations to save money on energy bills. It is evident that the energy costs of running servers in a data center are higher than the costs of acquiring them. This is the reason why enterprises cite the need for virtualization in order to minimize operating costs.
One more positive feature that boosts the desire to virtualize is that organizations seek to improve manageability and reduce the work of system administration. Virtual machines are much easier to manage than real machines. This is explained by the fact that hardware upgrades can be done through a management console application, whereas with real machines it is necessary to power the machine down, install the hardware, verify the change, and power it up again. Having fewer servers and storage area network devices to manage, the IT department does not need as many system administrators to support the machines. Therefore, administrators may concentrate on more strategic tasks. Furthermore, virtual machine management can be done via the same console, reducing the time required to deploy them.
Backup reduction and recovery time are also considered to make virtualization beneficial. Because virtual machines are basically files, their restoration and backup take much less time than doing the same on several individual machines. Although the files can be huge, they are much more easily restored than real machines of the same specification. In this case, hardware failure does not affect virtual machines in the same way as physical ones. Moreover, virtualization packages use their recovery and backup functionality as a way to improve resilience and business continuity.
Testing software configurations is another way of using virtualization software. This function is used before deploying a software configuration on a live system. In order to see if a program is compatible with the existing setup, it is advisable to test it on a virtual machine. This is very useful for organizations that have legacy applications and have to test systems before deployment. It is also possible to test server-client applications virtually, as virtual machines have the option to interact with one another within virtual networks. Moreover, the use of virtual machines provides rapid deployment through application isolation in a known and controlled environment. It is possible to eliminate such unknown factors as mixed libraries caused by numerous installs. Now, it takes only several minutes to recover from server crashes, as the only thing required is to copy a virtual image.
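The snapshot-and-rollback workflow that makes such testing cheap can be sketched as follows. Real hypervisors persist snapshots of disk and memory state; this minimal model with copied dictionaries only conveys the idea, and all names in it are hypothetical.

```python
import copy

class TestVM:
    """Minimal model of snapshot/rollback used when testing configurations."""
    def __init__(self):
        self.state = {"packages": ["base-os"], "config": {}}
        self._snapshots = []

    def snapshot(self):
        # Save a deep copy of the current state as a known-good baseline.
        self._snapshots.append(copy.deepcopy(self.state))

    def install(self, package):
        # A trial change that might turn out to be incompatible.
        self.state["packages"].append(package)

    def rollback(self):
        # Discard every change made since the last snapshot.
        self.state = self._snapshots.pop()

vm = TestVM()
vm.snapshot()                    # capture the clean baseline
vm.install("experimental-app")   # trial installation on the virtual machine
vm.rollback()                    # test failed? restore the baseline in seconds
print(vm.state["packages"])
```

Because restoring means swapping back a saved copy rather than rebuilding a physical machine, a failed configuration test costs minutes instead of hours.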
An additional positive feature of virtualization is legacy application maintenance. Old applications may have compatibility issues with newer software; thus, a virtual machine can be used in this case. Virtualization also makes the process of software installation easier and makes better use of hardware. It enables organizations to achieve higher rates of hardware utilization, as each server supports enough virtual machines to increase its utilization from the typical 15% to 80% (Burger, 2012).
Cross-platform office support is one more beneficial feature of virtualization. It is a common situation when offices run Macs but need one or two programs designed only for Windows. Virtualization software makes this easy and affordable. Nevertheless, the reverse is hardly possible, as the majority of virtualization applications for personal computers only let one run Linux.
Disaster recovery and dynamic load balancing also contribute to the above-mentioned benefits of virtualization. As server workloads vary, virtualization enables virtual machines to be moved to underutilized servers. This dynamic load balancing brings about the efficient utilization of server resources. System crashes can cause huge economic losses, which is why disaster recovery is a very important component for the IT industry. The technology of virtualization makes it possible to re-image a virtual machine on another server in the case of machine failure.
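A minimal sketch of the dynamic load balancing described above: when a host exceeds a load threshold, one of its virtual machines is migrated to the least-loaded host. The threshold, host names, and load figures are hypothetical, and real schedulers use far more sophisticated placement policies; this greedy single pass only illustrates the principle.

```python
def rebalance(hosts, threshold=0.8):
    """Greedy pass: migrate one VM off each overloaded host.

    `hosts` maps a host name to a list of (vm_name, load) pairs, where
    load is the fraction of the host's capacity that the VM consumes.
    """
    def load(name):
        return sum(l for _, l in hosts[name])

    for name in list(hosts):
        if load(name) > threshold and hosts[name]:
            vm = min(hosts[name], key=lambda pair: pair[1])  # cheapest VM to move
            target = min(hosts, key=load)                    # least-loaded host
            if target != name:
                hosts[name].remove(vm)
                hosts[target].append(vm)
    return hosts

cluster = {
    "host-a": [("web", 0.5), ("db", 0.4)],  # 0.9 total load: overloaded
    "host-b": [("mail", 0.2)],              # 0.2 total load: underutilized
}
rebalance(cluster)
print(cluster)
```

After the pass, the overloaded host has shed a guest onto the idle one, which is exactly the "move virtual machines to underutilized servers" behavior the paragraph describes.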
Among the benefits of virtualization, one should also mention the migration and preservation of legacy software investments, which can be done by populating new designs with several virtualized guest environments, one for forward-looking OS software and one for legacy code. Virtualized environments have a good reputation due to better scalability. As such environments are designed to be scalable, they allow more flexibility when it comes to the company’s growth. Virtualization helps avoid buying additional infrastructure components when implementing new applications and upgrades. Faster deployment is another efficiency of virtualization, as it enables IT staff to create a new virtual machine at the touch of a button.
The technology of virtualization improves the agility of organizations, giving them the opportunity to respond to rapid changes in demand. It also lets organizations deploy new products and services faster, incorporate contract, off-site, and offshore labor, and expand into new markets easily. It is not necessary to have high hardware requirements in order to test new applications. Developing and testing code side-by-side on multiple operating systems or applications reduces testing and development time. Virtualization assists in instantly reloading the test bed from a golden image, which leads to faster cycles of building, testing, and rebuilding. The technology of virtualization makes it possible to reduce the mundane deployment work necessary for production implementation and to decrease procurement time and hardware compatibility problems. It also lets organizations use a single central location for application updates (Mann, 2006, p. 4).
Unplanned issues and planned server maintenance create system downtime and disrupt operations. If the organization is virtualized, it has the possibility to move applications easily and quickly from one server to others before the required maintenance is performed. After completion of the work, the applications can be moved back to their original server. In this way, virtualization simplifies IT operations.
Regardless of the type of operating system the company uses, virtualization software runs on industry-standard Advanced Micro Devices (AMD) and Intel x86-based hosts, providing the advantage of standardization. It also gives access to all resources of physical servers, including memory, networking, storage, CPU, and peripherals.
Because virtualization is an extremely flexible option, it allows IT managers to shrink, expand, or move a virtual computer without modifying the hardware. It becomes possible to move data without affecting access to it. Hence, data is no longer bound to a physical hard drive, which provides companies with greater flexibility to change their data storage environment or grow their own data fields.
The majority of IT departments that belong to small or medium businesses spend at least half of their time on routine administration tasks, including managing and adding new server workloads, developing and launching new applications, or adding new employees. Businesses that have adopted virtualization rate their IT departments as more responsive to changing business needs. When used by IT departments, virtualization has been shown to influence effectiveness positively (VMware, 2009, p. 4-5).
In order to make the best use of virtualization, know what to expect from it in the future, and understand the key factors that influence the choice to implement it, it is worth mentioning its disadvantages:
- Complexity. Virtualization makes a computer harder to manage and troubleshoot; a virtual server can be managed only with good automation tools.
- A need for more powerful hardware. Virtualization uses fewer servers, but they should be more powerful.
- A single point of failure. In case of a server failure, the entire organization is paralyzed.
- Potential security problems. In some cases, it is more difficult to manage security in a virtualized system.
- Greater network demands. Virtualization requires greater bandwidth and network capacity; if a virtualized server is used remotely, it requires redundant bandwidth, which increases running costs.
- Lower tolerance for poor management. Operators of virtualized systems have to manage virtual environments properly; otherwise, the result is a virtual machine jumble that is costly in terms of unnecessary licenses and time.
- Third-party support issues. Not all vendors will be able to provide support for their software running on a virtual server.
Despite the above mentioned disadvantages, organizations seek to virtualize due to a number of significant benefits. Virtual environment management is much easier than expected. It is clear that virtualization is a mature technology that offers enterprises significant business benefits (Mann, 2006, p. 7). There are also some additional benefits which may be brought about by virtualization, but they depend on the infrastructure type, which is needed for a specific industry. The analysis of all benefits of virtualization leads to the conclusio