Whoever owns the information owns the world.
— Nathan Mayer Rothschild
Information has surrounded us throughout history, and every year there is more of it. Humanity has invented various ways of transmitting and storing information, and now, in the age of high technology, we store it electronically. As we know, the minimum unit of information storage is the bit. A bit, or binary digit, holds one of two values: zero or one.
The next significant unit of measurement is the byte, which by convention contains 8 bits. By adding SI prefixes (from Le Système International d'Unités, the International System of Units) we get the amounts of information we are more familiar with, such as 16 gigabytes (written as 16 GB). Where do we store this information? Today the most familiar medium is the hard drive, the first of which, the IBM 350 Disk Storage Unit, was introduced by IBM in 1956 and could store about 3.75 MB.
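To get a feel for how these units relate, here is a minimal Python sketch (an illustration only; it uses decimal SI prefixes, where 1 GB = 10^9 bytes, and the IBM 350 capacity quoted above):

```python
# A small illustration of the units above, using decimal SI prefixes (1 GB = 10**9 bytes).
# Binary prefixes (GiB = 1024**3 bytes) would give slightly different numbers.

BITS_PER_BYTE = 8
GB = 10**9                 # bytes in one gigabyte (SI)
MB = 10**6                 # bytes in one megabyte (SI)

size_bytes = 16 * GB       # the familiar "16 GB"
size_bits = size_bytes * BITS_PER_BYTE
ibm_350_bytes = 3.75 * MB  # approximate capacity of the 1956 IBM 350 Disk Storage Unit

print(f"16 GB = {size_bytes:,} bytes = {size_bits:,} bits")
print(f"That is roughly {size_bytes / ibm_350_bytes:,.0f} IBM 350 units")
# 16 GB = 16,000,000,000 bytes = 128,000,000,000 bits
# That is roughly 4,267 IBM 350 units
```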
Hard drives have become commonplace today; we keep our pictures, videos and documents on them. For the average person, the information stored on hard drives is mostly a memory of the past; for organizations, it is money. We all understand how important it is to keep this information away from prying eyes and to protect it from equipment failures and sudden power outages. For organizations, the complexity of storage grows along with the organization itself, and so do the costs. This is why it came to be recommended that important data be stored in a DPC.
Data processing centers
A DPC (data processing center; other names are also used: DSPC, data storage and processing center, or DC, data center) is a building or room specially allocated for housing server and network equipment. As the name suggests, its main purpose is to store, process and distribute information.
The DPC's roots go back to the computer rooms of the 1940s. In those days computer equipment was bulky and could occupy entire rooms. As technology developed, equipment shrank and the need for separate rooms disappeared. But as the value of stored information rose and volumes grew in the 1970s and 1980s, such rooms came back into use, although few could afford to maintain them.
In 1997-2000, DPCs became a commercial service, a development also boosted by the growth of the Internet. Many companies began to build huge Internet Data Centers (IDCs) to provide various commercial companies with solutions for deploying services on their facilities. This practice was later adopted by small private companies. There were also DPCs for cloud computing, called Cloud Data Centers, but that name did not take root; now they are simply called «data centers».
The increased use of cloud computing caught the attention of government and business organizations and led to DPCs being analyzed for security, accessibility, stability, regulatory compliance, and environmental impact. The development of DPC design requirements, for example, is handled by accredited professional organizations such as the Telecommunications Industry Association.
In Russia, interest in DPCs has grown especially strongly in recent years. A major impetus came from state reforms, such as the obligation to store the personal data of citizens of the Russian Federation within the country (Federal Law «On Personal Data» No. 152-FZ of 27.07.2006, Article 18, Paragraph 5) and the so-called Yarovaya Law, which includes Federal Law No. 374-FZ of 06.07.2016 «On Amending the Federal Law «On Combating Terrorism» and Certain Legislative Acts of the Russian Federation Regarding the Establishment of Additional Counter-Terrorism Measures and Public Security». Article 13 of the latter obliges telecommunication operators to store for a long time not only metadata about messages, audio, video and voice calls, but also the messages themselves, and these are huge amounts of data.
Incidentally, personal data security is a concern not only in Russia but also abroad: for example, on May 25, 2018, Regulation (EU) 2016/679 of the European Parliament and of the Council «On the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC» (the GDPR) came into effect.
DPC classification
Data centers can be classified by a variety of criteria, such as size (large, medium, small, modular and containerized), purpose (corporate, commercial, provider-dependent and regulatory) and compliance with standard requirements. The most significant indicators are reliability and safety, which is why you will most often encounter the Tier rating, written with a Roman numeral from I to IV.
One of the first attempts to standardize the development of DPCs was the association of like-minded people in the Uninterruptible Uptime Users Group (UUUG) in 1989. A little later, in 1993, the Uptime Institute, with the support of associations of companies that build and maintain DPCs, created a base for sharing experience, on which methods of evaluating efficiency were developed. The Uptime Institute obtained the rights to certify DPCs worldwide by Tier at the design, construction and operation stages. Years later, in 2005, the Telecommunications Industry Association (TIA), accredited by the American National Standards Institute, developed its TIA-942 standard, which covers construction, commissioning, safety control, power supply, cooling, redundancy and maintainability.
In 2010, Building Industry Consulting Service International (BICSI) introduced BICSI 002-2010, Data Center Design and Implementation Best Practices, developed with the participation of 150 experts and updated in 2011. While BICSI 002-2010, according to its developers, only complements the existing TIA, ISO/IEC and CENELEC standards, the TIA standard, unlike the Uptime Institute's approach, dictates strict requirements for the design and construction of DPCs. The Uptime Institute and TIA have also agreed that TIA will not use the word Tier in its classification. Only the Uptime Institute has the authority to conduct the evaluation procedure, and it does so for a fee, whereas all of TIA's requirements are publicly available.
Each country, and even each organization, has its own requirements for standards. For example, in the United States the TIA-942 standard is more common, while in Russia there are no dedicated requirements: DPCs are designed according to the requirements for communication facilities, with guidance from TIA-942, the Uptime Institute and the GOST 34 series (national standards of the Russian Federation).
As we now understand, if the word Tier appears in connection with a DPC, it means the facility has passed verification by a third-party organization. There are four progressive levels in total, and each level includes the requirements of the previous one.
- Tier I (N) is the most basic level. There are no raised floors, uninterruptible power supply (UPS) units or backup power sources; the engineering infrastructure is not redundant, and repairs, maintenance work or equipment failures shut down the entire DPC. The expected level of trouble-free operation is 99.671% (1729 minutes of annual downtime).
- Tier II (N+1) is a level that adds redundant capacity. There are raised floors and a backup power supply; central components are redundant, which makes scheduled maintenance and repairs possible, although they can still lead to DPC outages. The expected level of trouble-free operation is 99.741% (1361 minutes of annual downtime).
- Tier III (2N) is a level that allows concurrent maintenance. Scheduled maintenance and repairs (including the expansion and replacement of individual modules) can be carried out without shutting down the DPC; every element of the engineering systems is redundant, and there are several power and cooling distribution paths, though only one is active at a time. The expected level of trouble-free operation is 99.982% (95 minutes of annual downtime).
- Tier IV (2(N+1)) is a level that adds fault tolerance. Scheduled maintenance, any repairs and failures of individual system elements do not lead to a shutdown of the DPC, and the engineering systems are doubly redundant. The expected level of trouble-free operation is 99.995% (26 minutes of annual downtime). The «expected level of trouble-free operation», that is, the time within one year during which the DPC may be non-functional, was derived using special calculations and formulas. The Uptime Institute introduced the term but abandoned the indicator in 2009, when it became clear that the human factor can significantly affect the value. You can estimate the annual downtime yourself, as the sketch below shows.
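The downtime figures quoted above follow from the availability percentages by simple arithmetic: a year contains 525,600 minutes, and the unavailable fraction of that is the expected annual downtime. Here is a minimal Python sketch of this calculation (an illustration only, not the Uptime Institute's formal methodology):

```python
# Convert an availability percentage into expected annual downtime in minutes.
# This is plain arithmetic, not the Uptime Institute's certification methodology.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def annual_downtime_minutes(availability_percent: float) -> float:
    """Expected downtime per year, in minutes, for a given availability."""
    unavailable_fraction = 1 - availability_percent / 100
    return unavailable_fraction * MINUTES_PER_YEAR

tiers = [("Tier I", 99.671), ("Tier II", 99.741),
         ("Tier III", 99.982), ("Tier IV", 99.995)]

for tier, availability in tiers:
    print(f"{tier}: {availability}% -> {annual_downtime_minutes(availability):.0f} min/year")
# Tier I: 99.671% -> 1729 min/year
# Tier II: 99.741% -> 1361 min/year
# Tier III: 99.982% -> 95 min/year
# Tier IV: 99.995% -> 26 min/year
```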
I need a DPC
So we have come to the most interesting part: why would we need a DPC, and do we need one at all? Is it worth paying for, and is it profitable? There are a great many DPCs, but fewer than 500 have been certified by the Uptime Institute. Those that have passed certification and received a reliability Tier can only vouch for the readiness of the infrastructure, not for the quality of the staff; a Tier rating does not rule out the human factor or unforeseen situations. Of course, an organization that cares about its image and has obtained a certificate will be more careful in recruiting, so you are unlikely to encounter unskilled employees in such companies. There is a saying, "if you want something done well, do it yourself", but with that attitude you would have to read a huge pile of literature, conclude contracts with electricity suppliers, and rent or build a suitable room all on your own, which costs a lot of money, not counting the funds needed to maintain the entire infrastructure.
The most important thing is to determine whether a DPC makes sense for your business. If you do not have a website or servers with particularly valuable data whose loss due to a hard drive failure would be critical, you do not need a DPC. If you do have data that needs to be stored, it is easier and cheaper to buy a NAS (Network Attached Storage); many of them have built-in RAID (Redundant Array of Independent Disks), which preserves your data when one of the disks fails.
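To show what that protection costs in capacity, here is a small illustrative Python sketch (a simplification that assumes identical disks; it is not any vendor's sizing tool) of the usable space and fault tolerance of a few common RAID levels found in home NAS units:

```python
# Rough usable capacity and fault tolerance for common RAID levels,
# assuming all disks are the same size. Real NAS implementations may differ.

def raid_summary(level: str, disks: int, disk_tb: float) -> tuple[float, int]:
    """Return (usable capacity in TB, number of disk failures survived)."""
    if level == "RAID 0":   # striping only: all capacity, no redundancy
        return disks * disk_tb, 0
    if level == "RAID 1":   # mirroring: capacity of a single disk
        return disk_tb, disks - 1
    if level == "RAID 5":   # one disk's worth of parity
        return (disks - 1) * disk_tb, 1
    if level == "RAID 6":   # two disks' worth of parity
        return (disks - 2) * disk_tb, 2
    raise ValueError(f"unknown RAID level: {level}")

for level, disks in [("RAID 1", 2), ("RAID 5", 4), ("RAID 6", 4)]:
    usable, survived = raid_summary(level, disks, disk_tb=4.0)
    print(f"{level} with {disks} x 4 TB disks: {usable:.0f} TB usable, "
          f"survives {survived} disk failure(s)")
# RAID 1 with 2 x 4 TB disks: 4 TB usable, survives 1 disk failure(s)
# RAID 5 with 4 x 4 TB disks: 12 TB usable, survives 1 disk failure(s)
# RAID 6 with 4 x 4 TB disks: 8 TB usable, survives 2 disk failure(s)
```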
If your company has a website and a small server, and it is not critical if they are unavailable for a short period of time, you can consider a Tier I or Tier II data center; it would make little sense to pay more for higher Tiers. Why is it better to use a DPC rather than host the website on a server in your office? Remember how often your ISP has problems, how often the electricity in your building is cut off, or how easy it is to simply forget to pay for the Internet. Any equipment failures would also be yours to fix. All of this can be avoided by using a DPC.
If your organization operates 24/7 and the outage of a server or a service is critical, it is worth considering Tier III and Tier IV DPCs. In most cases, however, Tier IV offers little benefit over Tier III. In the Data Center article you can find the same arguments from some of the best experts: Rahul Shevale (Consultant, India) believes that if we are talking about highly reliable data centers, Tier IV is mandatory, but in terms of efficiency and the required infrastructure, Tier III will be more cost-effective and efficient.
Dr. Carlos Garcia de la Noseda considers Tier IV DPCs a waste and believes it is better to build a Tier III+ DPC, recruit quality staff, introduce safety procedures, and keep records of the reliability and quality of the equipment used.
At the same time, William Engle, who was part of the team that built the world's first Tier IV DPC, cited statistics showing that 60% of failures were due to the human factor.
Therefore, Tier IV DPCs only make sense for the banking sector and government organizations; there is no point in overpaying for everyone else. As mentioned above, fewer than 500 DPCs have been certified by the Uptime Institute, and it is not always because a company fears it will fail to achieve the desired level; often it is due to the strict criteria of the auditing organization, for example regarding the floor area of the rooms used. So when choosing a DPC, it is also worth paying attention to which companies already use its services.
Conclusion
The importance of information becomes clear when it is lost and cannot be restored, so it is better to take care of it in advance. We have established that for home use, or for organizations without a website and important data, it is more effective to buy a NAS: there is a great variety of them on the market and they are easy to use.
If you need access from anywhere in the world, or have employees working remotely, or have decided that using a DPC would be more effective, but it is not critical if the company's server, service or website is occasionally unavailable, even during the daytime, consider Tier I and Tier II DPCs. For everyone else, Tier III and Tier IV DPCs will be more suitable, but as we already know, it is not always necessary to overpay for excess reliability.
When choosing a DPC, it is also worth paying attention to the list of organizations that use its services. Location matters as well, especially in the Russian context (personal data of citizens of the Russian Federation may only be stored on the territory of the country). Keep in mind, too, the requirements of certain organizations, for example those processing classified data, although in that case we are most likely not talking about third-party DPCs anyway.
We hope this article has given you new knowledge and helped you understand what DPCs are, what they are needed for, and who needs them.