Data Centre Trends To Watch In 2015

Top Five Data Centre Trends to Watch in 2015
What does 2015 hold in store for data centre professionals? Read on as Dimension Data experts discuss the top trends to watch and their expected impact on business.

 
1. Big gets bigger, small gets smaller

This polarisation has dramatic implications for the data centre itself. The first is that organisations need to re-evaluate the amount of data centre space they need. Most are realising that they’ve over-planned their requirements and that, by taking advantage of the cloud, they can reduce their data centre footprints by up to 50%.

The next question organisations need to ask themselves is: ‘Where should my remaining data centre facilities be located?’  Increasingly, there’s a move towards co-locating hosting environments, cloud, and the traditional data centre in very close proximity to one another, to achieve optimal performance.

Many businesses are debating whether they want to own their own data centres at all or, instead, move to a purely opex-based consumption model. These considerations are being fuelled by the fact that, if your economies of scale in the data centre are shrinking along with your physical footprint, it doesn’t make sense to invest in the technologies required to optimise energy consumption.

On the other end of the scale, large data centres, typically owned by industry and cloud provider heavyweights, see nothing but growth on the horizon. Some already sprawl over hundreds of thousands of square metres. These giant facilities are expanding either because they’re absorbing the capacity transferred to them by customers looking to exploit the cloud model, or simply because of the nature and complexity of the businesses they serve.

“The mega data centre business model focuses on efficiency,” Leahy explains. “Their operators invest in fuel cells and the most advanced technologies for power and cooling, so their energy consumption per unit of computing power is much lower than you could achieve in your own data centre.”

2. Unlocking the value of automation

Frank Casey, Dimension Data’s Director for Managed Services, explains: “It’s all about internal maturity. Every client I speak to is at a different level of maturity, and most feel they have a fair amount of work to do. The goal is to move the organisation forward, in a structured manner, to a point where it has a modern, mature operating model that drives consistent service delivery and leverages policy-based standard operating procedures.”

Casey believes automation is key to extracting maximum value from technology. “Most IT teams are focused on making sure that infrastructure is up and running, and, due to workload and time pressures, aren’t thinking about how they can drive greater levels of automation.”  

Another common stumbling block is the existence of a ‘chasm’ between IT leadership and technical experts working in the data centre. Today, CIOs are focused on applications; they’re concerned about how services are being delivered to internal stakeholders and external customers. As mobile devices become users’ de facto point of interface with applications and data, CIOs are concentrating their efforts on ensuring a high-quality user experience and meeting their people’s expectation of instant, anywhere access to information. Organisations need to make sure that, back in the data centre, technical experts are devoting the right level of attention to security and ensuring that data is protected when it’s accessed on mobile devices.

Casey’s advice to CIOs is: “Understand how your customers want to interact with your business and its applications and let your IT teams engage with them directly. Allow them to use the feedback they receive to create user scenarios to determine whether the technology is designed with the customer in mind, and if not, to devise a plan for improvement. This may require new processes, policies, and automation technologies.”

Making informed decisions regarding new technology investments is critical to ensuring you extract maximum value from them. “The market’s shifting away from component-based hardware and software, and moving to bundled technologies that feature high levels of automation. However, it’s not always easy to determine how to get the most out of these technologies, and how they’ll interoperate with the legacy infrastructures and workloads that may be running in a public cloud,” says Casey.

Casey believes that this shift is driving organisations to hire different types of IT skills. “There's no longer this need for specific domain experts; now you need people who can focus on automation and the integration of application programming interfaces with existing technologies.” 

Here, organisations would do well to turn to a managed services provider that’s already invested in the relevant tools and automation technologies. Such a provider can help you to look beyond the technology, and focus on the outcomes that you want to achieve as a business. 

3. Agile IT: it's about exploring the art of the possible
Treb Ryan, Chief Strategy Officer for Dimension Data’s ITaaS Service Unit, believes that IT organisations need to use technology advancements, such as cloud and Agile development methodologies, to explore the art of the possible, boldly test new ideas and approaches, and have fun.

“Many IT organisations still operate their data centres the way they did 20 years ago, using waterfall processes and ITIL-driven systems. They’re missing a great opportunity to take advantage of new architectures and systems that will allow them to be more responsive to their businesses and more cost-effective.”

As an example, he compares the way Google, Yahoo!, or Facebook run their data centres with the approach taken within a typical enterprise data centre. “It's completely different. Things just don't run the same way. If someone at Facebook wants to develop a new application or set up a new workload, they’ll simply ‘slice off’ a piece of architecture, and off they go. There’s no two-year planning cycle, and they don't need to do sizing and business case analysis.”

Ryan explains that cloud and recent technology advances have changed the game; they’ve levelled the playing field and brought this level of agility within the reach of all businesses, not just the big players. “We need to start thinking differently about how we can use these advancements to drive business outcomes,” he says.

Investing in Agile software development technology is a first step in capitalising on these opportunities. IT leaders need to instil a culture of people, processes, and software working together and responding to change, rather than following a rigid plan. “We should apply this thinking not just to our development methodologies, but also to how we operate IT. It’s about identifying a few key areas and focusing on responsive, iterative development. If you make small incremental changes continuously, you create far more stable environments,” says Ryan.  

The concept of consumption-based billing for IT capacity is a key enabler of this transformation. But the advantages extend well beyond cost savings alone; this model empowers IT teams to test new ideas quickly, and unleash innovation, without exposing the business to risk. 

“Developers can try things out without having to put in place formal structures. And if something works, great; if it doesn't, then they can go back and try something else. That's what we love about Agile technology. Agile plays to the benefits of how technologists think − trying things out, seeing what works, making changes, and going back and trying again. As opposed to suggesting a single, ‘correct’ approach which, if not followed precisely, would supposedly lead to trouble further down the line.”

Ryan believes that if organisations can achieve this mind shift, they can look forward to more satisfied end users, who’ll benefit from new features and functions, and greater application stability. 

4. 'Software-driven' transforms into business value
As they look ahead to 2015, the challenge facing many organisations is how to ensure that their efforts in implementing Agile-based methodologies result in tangible business value. 

Colin McNamara, Chief Cloud Architect at Dimension Data subsidiary Nexus, shares some of his insights and experiences: “One of the tools that our Agile software development team uses in its development operations and with clients is value stream mapping. It’s a ‘pencil and paper’ process that allows you to visualise the impact of your activities on the business and prioritise them.”

McNamara cites a recent engagement with an online gaming provider, where value stream mapping delivered an impressive outcome. The IT organisation was responsible for a delay in a recent game release that cost the business over USD 130 million. “We spent about two hours with the client, using sticky notes and a whiteboard, and identified a couple of minor process improvements. We recommended the use of templates and automation, staff logs and release management, and suggested that the client bring in some QA specialists. Through that two-hour value stream mapping engagement, we enabled the client to reduce its release cycle by 30 days.”

McNamara explains that this approach is critical to software development, as the process involves so many moving parts and iterations. “Every two weeks you're going through a different release cycle, so you have to be able to quickly understand the impact of each one on the business.”

Open source technologies like OpenStack and OpenDaylight can also add significant value. McNamara explains how Dimension Data recently brought these technologies to bear for the benefit of a security company that was looking to update its software development process.

“At the time, it had a very hardware-centric software development process for its products. Working with the early implementations of OpenFlow and the OpenDaylight Controller, we enabled the company to optimise a very important element of its software development process: the ability to release products to market more quickly.”

5. It's time to go big in extracting the value of information

While there’s a lot of hype in the market around big data and analytics, most organisations are still in the early stages of deriving more value from their data and translating it into competitive advantage.

Leahy explains: “Most CIOs I speak to have implemented virtualisation and are starting to explore and adopt cloud. But few have attacked the information challenge head on, mainly because that requires engagement with business units. IT teams and the business aren’t used to working so closely.” Leahy believes that it’s time for data centre professionals to move beyond managing infrastructure and trying to find ways to reduce storage costs, and begin to explore the potential new value inherent in information.

Casey agrees: “IT teams should consider handing over basic infrastructure management and data protection tasks to a service provider.  The provider can also assist in ensuring that data is available quickly – for example, in the event of an audit – and avoid any compliance issues arising.” 

Ryan believes that decisions regarding data and storage can’t be made in isolation, especially in the era of analytics and big data. “The type of storage you need will depend on what you want to accomplish with your data and how you want to accomplish it. It’s not as simple as saying, ‘I want to get into big data, so I'll just start accessing the tools.’”

Hadoop and MapReduce are built on the premise that you have fast sequential read access to cheap disks. However, traditional cloud servers don't offer this capability. They’re designed for highly transactional applications. So it’s important to consider what you want to do with your data. Do you want to perform transactions? Or do you want to archive it and later extract it and aggregate it into a big data construct? You also need to think about how applications are optimised to ensure you’re able to get the data back out of the storage environment when you need it.
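The sequential-access economics described above follow directly from how the MapReduce pattern works. As a rough illustration only (a toy, single-process word count, not Hadoop itself), each phase makes one streaming pass over the data:

```python
from collections import defaultdict

# A minimal, single-process sketch of the MapReduce pattern Hadoop popularised:
# map over records sequentially, group intermediate pairs by key, then reduce.

def map_phase(records):
    """Emit (word, 1) pairs from each record, read sequentially."""
    for record in records:
        for word in record.split():
            yield word.lower(), 1

def shuffle(pairs):
    """Group intermediate values by key (Hadoop does this between the phases)."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Aggregate each key's values into a final result."""
    return {key: sum(values) for key, values in grouped.items()}

# Hypothetical input: three log lines standing in for a large, disk-resident dataset.
log_lines = [
    "error disk full",
    "info disk ok",
    "error network down",
]
counts = reduce_phase(shuffle(map_phase(log_lines)))
print(counts)
```

Because every phase only ever streams forward through its input, cheap disks with fast sequential reads are enough; the random-access performance that transactional workloads pay for goes unused.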

Object storage is an area of growing interest on the part of businesses that are testing the big data waters. McNamara explains: “This involves posting your data into a Web server. You can export all your Salesforce data to the object store, feed it through a set of predefined Hadoop or MapReduce processes, and then export it back into the object store.” He adds that the advantage of this approach is that business users are able to gain access to, and extract value from, the information in a timely manner.
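The round trip McNamara describes can be sketched in a few lines. This is a deliberately simplified, in-memory stand-in for an object store (the keys, record fields, and aggregation job are all hypothetical; real stores such as S3 or Swift expose similar put/get-by-key semantics over HTTP), but the export → process → write-back flow is the same:

```python
import json

# Toy in-memory object store; real object stores are keyed blob services over HTTP.
object_store = {}

def put_object(key, data):
    """Store a JSON-serialised object under a key."""
    object_store[key] = json.dumps(data)

def get_object(key):
    """Fetch and deserialise an object by key."""
    return json.loads(object_store[key])

# 1. Export application data (e.g. CRM records) into the object store.
put_object("exports/opportunities.json", [
    {"region": "EMEA", "value": 120},
    {"region": "EMEA", "value": 80},
    {"region": "APAC", "value": 50},
])

# 2. Feed it through a batch aggregation (standing in for a Hadoop/MapReduce job).
records = get_object("exports/opportunities.json")
totals = {}
for record in records:
    totals[record["region"]] = totals.get(record["region"], 0) + record["value"]

# 3. Write the derived result back into the object store for business users to query.
put_object("results/value_by_region.json", totals)
print(get_object("results/value_by_region.json"))
```

The appeal of this design is that the object store decouples the batch-processing environment from the applications on either side of it: producers and consumers only need to agree on keys and formats, not on infrastructure.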