Transforming Data Center Operations
What does the digitalization of data centers look like? Ulrich Terrahe from dc-ce looks at mastering DC documentation digitally.
What does the digitalization of data centers look like?
Data centers are at the center of an increasingly digitalized society. Every piece of data that is processed anywhere in the world is transmitted via one or more data centers. But what is the status of digitalization within the data center infrastructure?
In short: when it comes to digitalization, data center operation is still in its infancy. The result is extremely high administrative and personnel expenditure for almost all processes.
Printed plans and models specify the tasks to be done, maintenance processes are triggered by telephone or email, and handled with the aid of printed forms. The resulting documentation – filed unmanaged and in hard copy – fills folders and filing cabinets.
The use of intelligent database systems that digitally capture and interlink all of a data center’s documents and data is rarely seen. Technical and operational know-how remains buried in the heads of the specialist personnel and is not easy to pass on. Even simple operational or maintenance processes require a high degree of specialist training or professional experience.
Complicated and time-consuming – From specialists for specialists
All of this stems from the numerous disparate interfaces between the actors involved: from the provision of data by architects and planners, through the companies and manufacturers involved in implementation, to the specialist personnel handling operation and processing. Software solutions such as Building Information Modeling (BIM), Data Center Infrastructure Management (DCIM), and Building Management Systems can support operational processes. But these are proprietary systems that do not represent a holistic solution. Users need expensive basic software and long-term support from the provider. Costly training and professional development of staff is also necessary to use the programs efficiently.
A further problem is the inadequate interplay between the location of objects in the data center and the associated documentation. Obtaining information about a particular object still requires a series of steps. For the retrofitting of a cooling system, for example, information is needed from the plans as well as from audit documents. This multi-level search process alone demands at least a certain amount of experience and know-how, and is time-consuming and at times open-ended.
Virtual Reality – Building Information Modeling, a good starting point
A good place to start is Building Information Modeling (BIM). Here, buildings and data centers are rigorously modeled in 3D, and the objects contained within them are interlinked with information. On the computer monitor, the user can enter and navigate through a virtual clone. To identify a particular object, there is no longer any need for additional 2D or 3D plans, which can be complicated to read. The desired information can be retrieved with a few clicks. The intuitive handling is reminiscent of moving through an adventure game.
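The core idea of interlinking each modeled object with its information can be sketched as a simple lookup structure: one "click" on an object returns all documents linked to it. The following is a minimal, hypothetical illustration; the object names, document titles, and file paths are invented for the example and do not correspond to any existing BIM or DCIM product.

```python
from dataclasses import dataclass, field

@dataclass
class DCObject:
    """An object in the virtual data center model, linked to its documents."""
    object_id: str
    kind: str                   # e.g. "cooling unit", "PDU", "rack"
    room: str                   # a rough location suffices for identification
    documents: dict = field(default_factory=dict)  # document type -> reference

# Hypothetical model: one cooling unit with its linked documentation
model = {
    "CoolingUnit-01": DCObject(
        object_id="CoolingUnit-01",
        kind="cooling unit",
        room="Server Room A",
        documents={
            "datasheet": "docs/cooling-unit-01/datasheet.pdf",
            "maintenance log": "docs/cooling-unit-01/maintenance.csv",
            "audit report": "docs/cooling-unit-01/audit-2023.pdf",
        },
    )
}

def lookup(object_id: str) -> dict:
    """One 'click' on the object returns all linked documents at once."""
    return model[object_id].documents

docs = lookup("CoolingUnit-01")  # plans, audits, and logs in a single step
```

The point of the sketch is the single lookup: instead of the multi-level search through folders and filing cabinets described above, identifying the object is the only step the user has to perform.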
Imprisoned in the old economy
And yet even this idea, good in itself, flounders: not only because (as already mentioned) the basic software for Building Information Modeling is proprietary, but also because it is offered by only a few providers worldwide. Leading products such as AutoCAD, ArchiCAD, MicroStation, and SketchUp have so far struggled to develop even the bare minimum of compatible interfaces.
In order to integrate objects and information from other systems, masses of interfaces need to be written. The result is cumbersome, resource-intensive programs that require a high-performance computer to enable even somewhat fluid navigation of the data center and retrieval of object information. This effect is exacerbated by the accuracy and depth of detail of the images. From the perspective of architecture and planning this is of course necessary, but is it also necessary for data center operations?
Requirements for digitalization of data center operations
In data center operations, such attention to detail in the virtual clone is less important. What matters is an interface with which the user can quickly and intuitively identify objects, in order to then access the relevant information with just a few clicks. For daily operations it is irrelevant whether, for example, the light switch is positioned to the millimeter in the virtual plan, or whether the cooling system is rendered at its exact original dimensions. For identification, a rough photo-realistic clone of the object in its expected place is sufficient. All further information, at the required depth of detail, can be placed on the next information level.
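This "information levels" idea can be sketched as a tiered lookup: a coarse first level that is just enough to recognize the object, with exact data fetched only when the user drills down. Everything below is a hypothetical illustration with invented field names, not the API of any existing system.

```python
# Hypothetical two-level record: level 1 identifies the object,
# level 2 holds the exact operational detail, loaded only on demand.
objects = {
    "cooling-unit-01": {
        "level1": {"label": "Cooling unit", "room": "A",
                   "thumbnail": "rough_photo.jpg"},
        "level2": {"dimensions_mm": (1800, 900, 600),
                   "maintenance_interval_days": 90},
    }
}

def identify(object_id: str) -> dict:
    """First level: just enough to recognize the object in its expected place."""
    return objects[object_id]["level1"]

def details(object_id: str) -> dict:
    """Next information level: exact data, retrieved only when drilling down."""
    return objects[object_id]["level2"]
```

Keeping the first level deliberately coarse is what makes the virtual clone light enough to run fluidly on ordinary hardware; millimeter precision lives one level down, where it is only paid for when actually needed.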
For the operator of a data center, fast and location-independent access to the relevant information is decisive. This should be immediately retrievable in front of the object in the DC. In addition, such access should be possible with next to no prior knowledge.
Play your data center – Learning from other sectors
In the search for solutions, a glance at other sectors can help: The Gaming sector, for example, is developing increasingly sophisticated worlds with great attention to detail. Entire cities are replicated; players dive into worlds in which the borders between reality and vision blur more and more. Tasks need to be fulfilled and information exchanged – with assignments that are at times more complicated than the requirements in data center operation. In a computer game, all static and interactive data are made available regardless of location or time. And all this, not only on high-performance computers, but on PCs, laptops, tablets, and smartphones. What’s more: Players do not need extensive training in order to use the program.
Large community instead of small specialist group
A further prerequisite for success that can be copied from this sector is the provision of the basic software on open-source platforms. Here, people worldwide cooperate, in open dialog, on the further development of the software. The core idea is communal, not the mutual competitive behavior typical of proprietary systems. The innovation process is significantly more fruitful, and access to support is easier and cheaper.
Moreover, the basic applications are limited to a minimum of interfaces. The required information is not embedded in the programs, but is provided as “add-ins” via clearly defined interfaces.
Intuitive, user-oriented, entirely digital
With a “keep it simple” approach, the integration of gaming platforms can solve a range of challenges in data center operations simultaneously. First, it opens the field to more specialists, whose ideas can simplify visualization and documentation management and make them more flexible. Where to date only specialists from the architecture and engineering sectors could be deployed, programmers and database analysts from a strong, young growth sector can now be brought into play.
Furthermore, managing a data center like a game ensures simplified handling: users move intuitively through realistic visualizations and, even with little specialist knowledge, can quickly understand and interpret complex processes. Assistance such as explanations, instructional videos, and other support tools can be integrated digitally. “Remote hands” support from a distance is also possible. With immediate, real-time access to all information and data, situation analysis becomes considerably easier.
New avenues are needed in order to continue to do justice to the increasing requirements of DC operations. The gaming-inspired, open-source approach outlined here could be an interesting, promising, and future-oriented solution in the increasingly digital DC world.
In his almost two decades of experience as a DC planner, Ulrich Terrahe has realized projects of all sizes. His specialization is climate control and cooling technology. With his company dc-ce RZ-Beratung, he is organizer of the annual industry meeting “future thinking” and initiator of the German data center award, the “Deutsche Rechenzentrumspreis”. In 2007, he himself was awarded the “Datacentre Award” in London, and in 2008 he was a member of the jury for the same award.
Please note: The opinions expressed in Industry Insights published by dotmagazine are the author’s own and do not reflect the view of the publisher, eco – Association of the Internet Industry.