Data centres overview
Two surprisingly common questions among business leaders who previously had no cause to engage properly with a data centre are:
What is a data centre?
At its most basic, a data centre (or “data center”) is a robust physical facility that securely houses servers belonging to the operator and/or third-party organisations. It shares applications and data via a network of computing and storage resources and comprises a range of components, including switches, storage systems, servers, routers, and security devices.
Data centre infrastructure is typically housed in a facility organised by halls, rows, and racks that are supported by power and cooling systems, backup generators, and cabling plants.
What is a data centre used for?
Data centres are used by organisations of all types to securely house the critical IT hardware that stores, manages, and processes their data. They are also used as safe places from which computers and other digital equipment can communicate with other devices in offices, homes, other data centres, all the way up to the satellites that orbit the Earth.
Everything from business processes to government services and telecommunications depend on these computers interacting in this way and being able to securely exchange data.
However, though certain characteristics are universal, not all data centres are the same.
Types of data centres
There are multiple types of data centre, each with various available service models. Generally, how they are classed is determined by whether one or numerous organisations own the facility, how (and if) it interacts with other data centres, and what technologies are used for the computing and storage contained within.
For most organisations, there are four main types of data centre that are commonly used:
Enterprise data centres:
This type of data centre is custom built for and wholly owned by a single organisation. The organisation controls all operations that take place within and optimises all functions to specifically benefit itself and its end-users. Mostly (but not always), they are constructed on the grounds of the organisation’s HQ.
Managed services data centres:
In contrast to an enterprise data centre, a managed services facility deploys, manages, and monitors all hardware and software on behalf of paying customers. Rather than purchasing the likes of servers or taking responsibility for the installation and upgrading of operating systems and other software, the client organisation leases the equipment and infrastructure, with management of both included in the lease.
Colocation data centres:
In a colocation data centre, an organisation rents space to house their privately owned hardware. As part of the agreement, the colocation data centre hosts the infrastructure, including physical security, cooling, bandwidth, and power while the client provides and manages the hardware contained within the racks.
Cloud data centres:
Effectively, a cloud data centre is a remote version of the traditional bricks-and-mortar equivalent, providing organisations with access to their data over the internet. The cloud provider (think AWS, Microsoft Azure, IBM Cloud, etc.) takes responsibility for updates and maintenance and will own some facilities as well as using colocation data centres.
The difference between an enterprise data centre and a colocation data centre:
As described above, an enterprise data centre is an on-site facility built and owned specifically by and for one organisation. Conversely, a colocation data centre houses privately-owned servers or networking equipment on behalf of third parties.
An enterprise data centre provides an organisation with the specific IT infrastructure it needs and allows for full control over data, physical access to systems, and when and how updates or expansions occur.
Although these benefits are deemed priorities for those organisations that opt for enterprise data centres, they come at a high cost. Significant capital must be set aside for the construction, operation, and maintenance of an enterprise data centre facility which includes budget for the hire, remuneration, and retention of specialist engineers and security personnel. Moreover, if the physical capacity of the data centre is reached and further expansion is, for whatever reason, not possible, potential growth is capped.
It is for these reasons that colocation data centres have surged in popularity in recent years. Service plans provide built-in features such as high-level security, support, redundancy, and connectivity, and with utility costs shared between customers, overall expenditure is significantly reduced. Colocation facilities also offer a range of connectivity options: with multiple ISPs, network services, and cloud platforms to choose from, organisations can easily build network infrastructures that meet their specific needs. By leveraging cross-connections and hybrid/multi-cloud architectures, organisations can also optimise their workloads and enhance both functionality and speed.
What is in a data centre?
In terms of core components, data centres typically consist of servers, routers, storage systems, switches, firewalls, and application delivery controllers.
Together, they provide:
Network infrastructure:
This connects servers (whether physical or virtual), data centre services, storage, and external connectivity to end-users.
Storage infrastructure:
Critical enterprise data is usually held in the data centre, on storage systems ranging from tape to SSDs, with multiple backups provided.
Computing resources:
Servers housed within the facility provide the processing, memory, local storage, and network connectivity that allow applications to run efficiently.
Given how critical these three components are, certain data centre services are deployed to protect the performance and integrity of each. Network security appliances, such as firewalls and intrusion protection systems, safeguard the facility itself, while application delivery controllers maintain application performance to ensure availability and resilience.
As for what’s visible upon entering a data centre, for the most part, it’s long lines of racks containing servers, ventilation, cooling, and fire suppression systems, backup generators, and connections to external networks.
Data centre classifications
The Uptime Institute - the standard bearer for digital infrastructure performance - certifies data centres across the following four tiers:
Tier 1 Data Centre:
Describes a facility with basic site infrastructure that uses a single path for power and cooling with no backup components. A Tier 1 data centre will offer limited protection against physical events and has an expected uptime of 99.671% per year.
Tier 2 Data Centre:
Also known as a redundant-capacity component site infrastructure. Again, it provides a single path for power and cooling but has some redundant and backup components and offers better protection against physical events. Expected uptime is 99.741% per year.
Tier 3 Data Centre:
Or concurrently maintainable site infrastructure. A Tier 3 data centre contains multiple paths for power and cooling and protects against almost all physical events while providing redundant-capacity components and multiple independent distribution paths. Redundant systems allow engineers to service the infrastructure with zero disruption to end-users and the expected uptime is 99.982% per year.
Tier 4 Data Centre:
A fault-tolerant site infrastructure. A Tier 4 data centre has redundant-capacity components and multiple independent distribution paths to enable concurrent maintainability and one fault can occur anywhere in the installation without causing downtime. Expected uptime is 99.995% per year.
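Those percentages are easier to compare when converted into maximum downtime per year. A minimal sketch of that arithmetic, using the uptime figures from the tier descriptions above:

```python
# Convert each tier's expected annual uptime into maximum downtime per year.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes in a non-leap year

tiers = {
    "Tier 1": 99.671,
    "Tier 2": 99.741,
    "Tier 3": 99.982,
    "Tier 4": 99.995,
}

for tier, uptime_pct in tiers.items():
    downtime_minutes = (100 - uptime_pct) / 100 * MINUTES_PER_YEAR
    print(f"{tier}: up to {downtime_minutes:.1f} minutes "
          f"(~{downtime_minutes / 60:.1f} hours) of downtime per year")
```

Run this and the gap between tiers becomes stark: a Tier 1 facility may be down for nearly 29 hours a year, while a Tier 4 facility allows under half an hour.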
Importance of location for a data centre
A major reason organisations use an external data centre is so their servers are off the premises and no longer taking up space. However, access to the facility is still important, and the reassurance of knowing systems can be easily reached 24/7 cannot be overstated.
The physical location of a data centre also plays an important role in latency and data speeds. For example, if a London-based company with customers clustered mainly around southern England stored its data on servers in the USA, the data would have to travel across the Atlantic whenever it was accessed. Needless to say, it would not reach end-users' devices as quickly as it would from servers housed in a UK facility. And this is important: research consistently shows that users will exit websites and applications with slow load times, something which has obvious and serious implications for the businesses in question.
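The distance penalty can be roughed out from physics alone: light in optical fibre travels at roughly 200,000 km/s (about two-thirds the speed of light in a vacuum), so round-trip time grows with route length. A back-of-envelope sketch - the route distances below are illustrative assumptions, and real routes add switching and processing overhead on top:

```python
# Back-of-envelope minimum round-trip latency from fibre distance alone.
SPEED_IN_FIBRE_KM_S = 200_000  # light in fibre: roughly 2/3 of c

def min_rtt_ms(route_km: float) -> float:
    """Theoretical minimum round-trip time in milliseconds for a fibre route."""
    return 2 * route_km / SPEED_IN_FIBRE_KM_S * 1000

# Illustrative route lengths (assumptions, not measured cable paths):
print(f"London -> nearby UK data centre (~50 km): {min_rtt_ms(50):.1f} ms")
print(f"London -> US east coast (~6,000 km): {min_rtt_ms(6000):.1f} ms")
```

Even before any server processing, the transatlantic round trip costs tens of milliseconds per request, and a typical page load involves many such round trips.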
The search engine ranking of your website can also be affected by the physical location of your data. The geo-IP location of an organisation's server may not be the most critical SEO factor to think about, but it is a factor all the same. Unless an organisation's primary customers are located abroad, keeping data as close to the business as possible will enhance SEO.
If your business is looking to house its critical IT hardware in an external data centre, considering these 11 questions will help to ensure the right facilities are shortlisted.
Data centre regulations
Organisations have significant obligations under the Data Protection Act and the General Data Protection Regulation (GDPR), and the penalties for contravention can be severe. It is therefore essential to choose a data centre that meets established security standards and has robust processes for handling data. If there is a breach, ignorance cannot be offered up as an excuse.
To ensure compliance with data protection legislation, it is advisable that organisations house their data in facilities governed by the same rules as they are, i.e. one based in the UK. However, this alone will not guarantee compliance, so organisations should make data security and compliance a central part of any agreed SLAs.