The future of data centres rests on AI: Digital Realty CTO

Image courtesy of Taylor Vick.

With the advent of cloud technology, many enterprises, especially in Asia-Pacific, have migrated their workloads and applications out of their on-prem infrastructure.

However, the data centre market in the region is capitalising on the popularity of artificial intelligence as a way forward, viewing the cloud as an ally rather than a competitor.

Frontier Enterprise sat down with Chris Sharp, Chief Technology Officer of Digital Realty – a provider of data centre, colocation, and interconnection solutions – to discuss the development of the data centre space in APAC, and how AI is making a huge difference to the business.

There’s a lot happening, not only in the data centre space, but in the tech space altogether. Can you share a bit about your experience moving from Equinix to Digital Realty eight years ago, and your involvement with other initiatives as well? How do they complement your work now with Digital Realty?

I’m a technologist at heart, and that’s what has kept me in the data centre sector for so long: the constant innovation and exposure to so many trends. Despite some skepticism in the market suggesting data centres would lose relevance, my last role at Equinix involved working extensively with cloud operators, and I inherited the challenge of determining whether the cloud was a friend or foe. At the time, Steve Smith (CEO and President of Equinix from 2007 to 2018) urged me to tackle this issue and establish a position to explain the evolving landscape.

Though we didn’t discuss it publicly, we formed a group called the cloud acceleration team (CAT). We had some fun with the acronym while focusing on establishing private access to the cloud in collaboration with cloud operators. Our belief was that the world would always be hybrid, with not everything shifting to the public cloud. Our maniacal focus on execution in this area yielded a phenomenal advantage for Equinix, particularly in North America, with the availability of numerous cloud on-ramps. This work also made me more attentive to the evolving needs of customers.

During this time, I developed a market evolution thesis based on a couple of key aspects. To remain relevant as businesses matured, a larger capacity block was necessary. Initially, businesses would start with colocation on a few racks, but now phase one is typically migrating directly to the public cloud. Back then, phase two involved acquiring 30-50 racks in a couple of markets. However, phase three presented a new challenge: businesses had to find another provider and integrate it with their existing infrastructure, causing frustration. Recognising this, I realised that a larger capacity block of power was required, which led me to think differently about the world and formulate a thesis that emphasised capacity blocks and interconnection.

Interconnection has been a long-standing passion of mine throughout my career. I dedicate a significant amount of time to discussing the current state and future direction of interconnection in various public speaking engagements. One of the reasons I was drawn to Digital Realty was the shared belief among the executive group and the board that the world should be open, without always forcing a walled garden approach. We strive to be balanced in our market endeavours, recognising that not all infrastructure will be confined within Digital Realty. Maintaining openness has been a critical component of our strategy. Lastly, it is essential to remain true to our core mission, which has evolved over time. I made a deliberate choice not to compete with my customers.

Developing products outside our core domain proves challenging in terms of maintaining technological relevance and positioning the customer effectively. Instead, we prioritise investing in our customers and setting them up for success. We welcome other companies’ technologies in our facilities, as PlatformDIGITAL’s core principle revolves around enabling and supporting the foundational infrastructure.

Let’s talk about cloud repatriation. Do you see this happening a lot, especially in Asia, where there’s still so much hunger to get onto the cloud? How do you see that trend develop from your end?

Yes, the demand for cloud repatriation is driven by two key factors: economics and security. However, there are technical barriers to building a hybrid environment, which is why we introduced ServiceFabric to the market, and we’re working with other providers to be able to position bare metal inside the facility. While those technical barriers limit the prevalence of cloud repatriation, we do see a lot of interest in doing it. Another catalyst for this trend is not only the hybrid nature of combining private and public environments but also the adoption of multi-cloud strategies. For instance, organisations may prefer Cloud X for analytics and Cloud Y for running their workloads, necessitating a cohesive solution. Data centres serve as the logical point for integrating these hybrid multi-cloud environments, and our focus is on eliminating the technical barriers to that. We see it as a desired end state for more and more people.

Chris Sharp, Chief Technology Officer, Digital Realty. Image courtesy of Digital Realty.

Another aspect we constantly consider is the composition of datasets and the construction of models. Merging various data pools and performing analytics against them provides a competitive advantage. Since intellectual property is crucial for most companies, they prefer to keep it private due to privacy and trust concerns. However, it’s hard to consolidate these datasets into a single hyperscaler. This will serve as a further catalyst for repatriating many of these workloads.

There’s a high-level paper, called the “Data Gravity Index”, which argues that when designing your architecture, you should be aware of where your data is created and stored, and how you perform analytics against it. Often, individuals focus primarily on the architecture without fully considering the implications of accessing, moving, and performing analytics on the data, which can result in accumulated tech debt due to existing investments.

However, we observe a significant shift in customer thinking. Many now recognise the need to approach the market with a different mindset. They understand the importance of making their data store and analytics the focal point of their architecture. Additionally, they seek to design their architecture once and deploy it across multiple locations. Currently, there are only two entities on Earth that offer this capability, and the ability to have a global platform where you can design once and deploy widely is highly valued by customers. As a result, we believe this trend not only drives demand for data centres but also stimulates the repatriation and re-architecture of existing environments. Given the rapid growth and relevance of AI, we anticipate that this trend is here now, and we expect it to continue its momentum without slowing down anytime soon.

Is there going to be a surge in demand for workloads, storage, and compute?

There’s a tremendous amount of additional demand. We’ve seen deployments where substantial investments have been made in GPU farms, but data proximity becomes a limiting factor: those deployments are unable to fully utilise the GPUs’ capabilities due to insufficient data availability. It is crucial to be mindful of this. Many chief data officers and data scientists are compelled to become engineers out of necessity. They start as mathematicians but must also consider architectural aspects to address these failures. This trend has been evident for some time.

During an investor day presentation six years ago, I shared a slide to address skepticism about a technologist working at a real estate company. The slide emphasised the value of master planning, beginning with data or land, collaborating with power operators and utility grids, and establishing transformation stations to ensure sufficient power supply. The design and master planning of the facility, combined with interconnection and access to extensive data resources through our service exchange, create an optimal environment for artificial intelligence.

Our designs have been continuously evolving to accommodate the growing demand for raw compute power. The workload growth and demand driven by the cloud have not shown any signs of slowing down. Additionally, AI has become a crucial feature that major SaaS application providers need to incorporate to remain relevant. This rapidly evolving landscape requires substantial compute power for various models, not solely limited to large language models.

Contrary to the belief that these models will eventually reach a peak and stop training, we anticipate their continuous need for training. While certain use cases and applications may reach completion, others, including Stable Diffusion and Hugging Face, continue to require significant compute resources. We foresee the emergence of unique datasets in the market that will be AI-enabled and accessible behind paywalls, necessitating subscription-based access and GPU farms for processing. And building and managing such farms is not easy to do.

After we did that presentation many years ago, we actively started working with NVIDIA. We were one of the first to collaborate with them on getting DGX pre-certified, because we knew the power density in that environment required something different. We also knew that our designs were uniquely tailored to support it, so we went out and got a lot of our facilities DGX pre-certified.

During a recent visit to Japan, I had the opportunity to meet with an individual from an analytics company focused on the automotive industry. She was in search of a colocation data centre vendor capable of supporting high power density for deploying a GPU farm. Among the numerous vendors she considered, only two came close to meeting her requirements. This capability has long been inherent in our modularity: the ability to support a wide range of workloads, which we have been doing successfully for many years.

You guys are currently building some clean energy data centres down in Australia. Have you observed an increase in demand for data centres in Asia over the past few years, or has growth been consistent without any notable turning point?

I think there’s a lot of demand here. We’ve been closely monitoring the critical markets we’ve invested in, and the demand shows no signs of slowing down. When it comes to AI workloads, I see a level playing field between the United States and Asia. However, my gut feeling is that the APAC region will experience even higher demand for AI in the future. Asia is a critical market for us, as many multinational companies are eyeing Singapore; it’s high on their list to make sure they can operate here and generate revenue.

What are some exciting things currently in development at your labs?

The next phase of AI and what it’s going to be able to do is scary but also extremely exciting. During the holidays, I created an avatar in my likeness with my voice, which I used for our sales kickoff and in some meetings. It’s fascinating how I can quickly generate a script on ChatGPT, incorporate it into the avatar, render a video, and share it. I just absolutely love what it’s doing. But it’s not just about creating avatars or voice recognition; what’s really exciting me now is Auto-GPT. It has the potential to act as an agent, writing and manipulating code on your behalf. This, coupled with our efforts to remove technical barriers, will empower non-technical individuals to have a significant impact and unleash their creativity in developing applications and datasets.

Everything is being built off data centres, which have become an integral part of not just the digital economy, but every economy. It’s a great place to be, given the many applications and appliances emerging from data centres. I take a lot of pride in my work and the trust placed in us by our 4,000-plus customers to enable their success and facilitate this transition. We’re very excited about the opportunities ahead.