

Why automation and orchestration are essential for the software-defined data centre - by Phil Alsop, Editor, DCS Europe

Posted by Volta Newsroom on 07-Aug-2017 09:00:00

First there was virtualisation, then there was Cloud, and now comes the potential end game – the software-defined era. All the talk is of virtualised servers (but not, funnily enough, software-defined servers – unless I’ve missed it), software-defined networking, software-defined storage and software-defined security. And heading to a data centre near you soon… software-defined power at the very least – not sure if software-defined cabinets and racks make sense or not!


Rather than argue the nuances of virtualisation versus Cloud versus software-defined, let’s agree that the objective of all these technologies seems to be to create one big pool of compute, controlled through a single management pane of glass. Theoretically, anyone anywhere in an organisation – with the necessary authority – can stand up an application by simply requesting the required server(s), storage and networking from this compute pool. The management, or orchestration, software will configure the necessary IT infrastructure on which the application will then run – whether for ten minutes, ten days, ten weeks, ten months or ten years.
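To make that idea concrete, here is a minimal, hypothetical sketch of what “requesting the compute pool” might look like. The names (ComputeRequest, ResourcePool) and the numbers are invented purely for illustration and do not refer to any particular orchestration product.

```python
# Hypothetical sketch only: the "one big pool of compute" idea in miniature.
# All names and capacities are invented for illustration.
from dataclasses import dataclass

@dataclass
class ComputeRequest:
    app_name: str
    vcpus: int
    storage_gb: int
    network: str
    lifetime_hours: int  # ten minutes or ten years - the pool doesn't care

class ResourcePool:
    def __init__(self, total_vcpus: int, total_storage_gb: int):
        self.free_vcpus = total_vcpus
        self.free_storage_gb = total_storage_gb

    def provision(self, req: ComputeRequest) -> bool:
        """Carve the requested resources out of the shared pool, if available."""
        if req.vcpus <= self.free_vcpus and req.storage_gb <= self.free_storage_gb:
            self.free_vcpus -= req.vcpus
            self.free_storage_gb -= req.storage_gb
            print(f"{req.app_name}: provisioned {req.vcpus} vCPUs, "
                  f"{req.storage_gb} GB on network '{req.network}'")
            return True
        print(f"{req.app_name}: insufficient capacity in the pool")
        return False

pool = ResourcePool(total_vcpus=512, total_storage_gb=20_000)
pool.provision(ComputeRequest("popup-site", vcpus=8, storage_gb=200,
                              network="dmz", lifetime_hours=24 * 7))
```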

In this brave new world, the data centre infrastructure – most noticeably power and cooling – needs to be similarly ‘modular’ and dynamic. There’s no point in powering an IT resource that isn’t being used, but you need enough power available to cope with the overall anticipated daily workload. Similarly, cooling needs to be flexible – no point in cooling the whole data centre, or even just one aisle, if only one or two cabinets’ worth of IT kit are actually being used.
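A toy sketch of that “cool only what’s actually working” logic might look like the snippet below – the cabinet names, loads and threshold are made up for the sake of the example.

```python
# Hypothetical sketch: match power and cooling to actual use, not to the whole room.
ACTIVE_THRESHOLD_KW = 0.5  # below this draw, treat a cabinet as idle

cabinet_load_kw = {"A01": 6.2, "A02": 0.1, "B01": 4.8, "B02": 0.0}

active = [c for c, kw in cabinet_load_kw.items() if kw >= ACTIVE_THRESHOLD_KW]
idle = [c for c in cabinet_load_kw if c not in active]

print(f"Direct cooling to: {active}")        # cool only where IT kit is working
print(f"Power down / trickle-feed: {idle}")  # no point cooling empty racks
```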

One danger of the software-defined era is that the very complex hardware world sitting under the software layer becomes more and more difficult to manage – to the extent that, should the orchestration layer fail and humans need to intervene and understand what’s going on underneath, very few, if any, will possess the necessary breadth of skills to do so.

And, as with virtualisation and Cloud, simply ‘hiding’ your hardware layer under a software layer might gain you some efficiencies, but if the hardware is old and slow, it will still be old and slow!

Key to the success of any software-defined project is the establishment of a whole string of policies that determine how and when the various physical IT and data centre infrastructure resources are allocated by the orchestration software. So, no matter how automated your data centre, if the humans writing the policies have not thought of every eventuality, there will inevitably be some risks in the system. For example, deciding the priority of who is allocated which resources, when and for how long, is a fairly complicated process, especially in a large organisation.
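By way of illustration only, a fragment of such a human-written allocation policy might be expressed along these lines. The departments, priorities and limits are entirely made up, and a real policy set would have to cover far more eventualities – which is precisely the point.

```python
# Hypothetical sketch: a human-written allocation policy of the kind the
# orchestration layer would enforce. All values invented for illustration.
POLICY = {
    # department: (priority, max_vcpus, max_duration_hours)
    "finance":   (1, 128, None),      # highest priority, no time limit
    "marketing": (2, 32, 24 * 14),    # pop-up workloads, two-week cap
    "testing":   (3, 16, 24 * 2),     # lowest priority, auto-expire after two days
}

def authorise(department: str, vcpus: int, hours: int) -> bool:
    """Return True if the request fits the written policy for that department."""
    if department not in POLICY:
        return False  # an eventuality nobody wrote a policy for
    _, max_vcpus, max_hours = POLICY[department]
    if vcpus > max_vcpus:
        return False
    if max_hours is not None and hours > max_hours:
        return False
    return True

print(authorise("marketing", vcpus=8, hours=24 * 7))   # True
print(authorise("marketing", vcpus=64, hours=24))      # False - over the vCPU cap
print(authorise("legal", vcpus=4, hours=1))            # False - no policy written
```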

Artificial intelligence and, in particular, neural networks, might well be able to automate, and update, the vast majority of the necessary policies, but as we all know from our experiences with various automated telephone helplines, there’s always some query or problem that cannot be dealt with by a robot!

Furthermore, the idea of the software-defined data centre somehow bypassing the IT department – so that, say, the marketing department can decide to build a pop-up website for a week-long promotion, access the necessary infrastructure via the management console, develop and stand up the site, obtain loads of new names for the database and then forget to close the website (!) – will almost certainly lead to chaos. However software-defined you want your data centre to become, it’s important to ensure that the folk who manage that data centre and the IT inside it do not lose control of who is accessing what, when and how (security is rather important…).

So, what’s the major problem with software-defined data centres right now? Well, most of the promotion surrounding this new idea suggests that achieving the ultimate in scalable, flexible, dynamic compute resource requires little more than a wave of the software-defined wand, and everything will fall into place.

There’s no doubting that the promise of the software-defined data centre is huge, but the journey there is far from easy. As with so many decisions to be made concerning the data centre and IT resources in recent times, the major questions are: “How much of the software-defined journey do you want to make alone?” “How much of the software-defined journey can you afford to make on your own – both in terms of available in-house expertise and available funds?” And, perhaps the most crucial question of all: “Am I a retailer/manufacturer/government department etc. or an IT shop?”!

Yes, virtually all organisations today have IT (and one or more data centres) at the heart of what they do, but do these organisations want/need to own this infrastructure or not?

Software-defined IT infrastructure and data centres are among the last steps on the path towards true utility computing – with maybe AI and Machine Learning the ones still to come. Chances are that you’ll need one or more expert partners to support you along the way. Colos, Managed Service Providers, system integrators and IT and data centre hardware and software vendors could all have a part to play as you decide the best way forward for your business.

Any major business migration or transformation process needs to be well planned and particularly well executed – don’t make the mistake of thinking that you can achieve a software-defined data centre at the flick of a switch.
