The term virtualization is familiar to most people with an interest in information technology, yet many do not understand what it really means. Virtualization is a method of running several independent virtual operating systems on a single physical computer. It is essentially a way to maximize the use of physical resources and, in doing so, to maximize the return on hardware investments.

Moore's Law accurately predicted the exponential growth of computing power, but the demands of most workloads have not grown at the same pace: the hardware needed to accomplish a similar computing task keeps shrinking. Because of this, an inexpensive dual-socket, dual-core 1U commodity server can be turned into eight or sixteen virtual servers, each running its own operating system. Virtualization is therefore a reliable way to achieve higher server density. Note, however, that it does not increase computing power; it actually decreases it slightly, because the hypervisor adds some overhead.

Today, a two-socket, four-core server costing, say, $3,000 is far more powerful than an eight-socket, eight-core server that cost something like $30,000 some four years ago. This new hardware power can be exploited by increasing the number of logical operating systems the server hosts. Doing so reduces hardware acquisitions as well as maintenance costs, which can add up to significant savings for any organization or company.

How should you use it? Virtualization is an ideal solution for applications designed for small- to medium-scale use.
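The consolidation economics described above can be sketched in a few lines. The figures are the article's own illustrative numbers, not benchmarks, and the variable names are assumptions for the sketch:

```python
# Rough consolidation arithmetic using the article's illustrative figures.
physical_server_cost = 3000   # modern two-socket, four-core server ($)
vms_per_host = 16             # virtual operating systems one host can run

# One OS per physical box vs. sixteen OS instances sharing one box.
cost_per_os_dedicated = physical_server_cost
cost_per_os_virtualized = physical_server_cost / vms_per_host

# Hardware avoided by consolidating 16 dedicated servers onto one host.
hardware_saved = (vms_per_host - 1) * physical_server_cost

print(f"Cost per OS instance, dedicated:   ${cost_per_os_dedicated}")
print(f"Cost per OS instance, virtualized: ${cost_per_os_virtualized:.2f}")
print(f"Hardware cost avoided:             ${hardware_saved}")
```

Even before counting power, cooling, and maintenance, the per-instance hardware cost drops by an order of magnitude in this scenario.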
It should not be used for high-performance applications, where one or more servers must be clustered together to meet the performance requirements of a single application. In that situation the added overhead and underlying complexity reduce the overall performance of the application. For maximum use of hardware, do not let a virtualized server exceed 50% CPU utilization at peak load.
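The 50%-at-peak rule above translates directly into a simple capacity estimate. This is a minimal sketch, assuming you know each VM's peak CPU demand in cores; the function name and parameters are hypothetical:

```python
def vms_that_fit(host_cores: int, per_vm_peak_cores: float,
                 max_utilization: float = 0.5) -> int:
    """Estimate how many VMs a host can carry while staying at or
    below the utilization ceiling (the article's 50% peak rule)."""
    usable_cores = host_cores * max_utilization
    return int(usable_cores // per_vm_peak_cores)

# A two-socket, four-core host has 8 cores; if each VM peaks at
# half a core, 8 VMs keep peak utilization at or below 50%.
print(vms_that_fit(8, 0.5))
```

Real sizing would also budget memory, storage, and I/O, but CPU headroom is usually the first constraint this rule is meant to protect.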