Simplifying cloud connectivity for the modern enterprise

This article is sponsored by Equinix.

Cloud strategies are evolving to balance cost, compliance, and AI-driven innovation for enterprises. Image created by DALL·E 3.

Cloud connectivity has been characterised by agility, scalability, and convenience, but it also introduces cost, complexity, and data sovereignty challenges. Many enterprises continue to wrestle with infrastructure challenges, with some still uncertain which data and workloads are best suited to which platform.

Across industries, a hybrid multi-cloud approach has become increasingly popular, enabling organisations to explore AI use cases. Yet, this approach remains complex.

To address ongoing cloud challenges and explore potential solutions, senior IT executives gathered for a roundtable titled “Simplifying Cloud Connectivity,” organised by Jicara Media, and hosted by Equinix.

The cost of cloud

While cloud technology provides organisations with new capabilities, many are alarmed by escalating costs. Often, the final bill surprises, or even shocks, businesses.

“With all the scaling up happening, a major issue is the cost model, because there are many hidden fees. The cost of using these kinds of services is often unclear,” remarked Geoff Wade, Co-Founder and CTO of dating service MatchMde.

Wade noted that his company uses several APIs from a particular cloud provider and is charged for each piece of data they access.

“Let’s say you need information about a single place. In our case, this includes dating venues, restaurants, and the like. The provider charges us once when someone accesses information about the restaurant, and again for accessing contact details. We don’t see these charges immediately, as they’re broken down into small increments, until the end-of-month bill arrives — sometimes reaching several hundred dollars just for retrieving information about places,” he explained.
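To illustrate how such per-field charges can accumulate, here is a minimal sketch with entirely hypothetical rates and volumes; the actual SKUs and prices depend on the provider and are billed per request:

```python
# A minimal sketch with hypothetical per-request rates; real provider SKUs and prices vary.
PRICE_PER_REQUEST = {
    "place_details": 0.017,    # assumed US$ per lookup of basic place information
    "contact_details": 0.003,  # assumed US$ per lookup of contact fields
}

def monthly_bill(requests_per_day: dict[str, int], days: int = 30) -> float:
    """Small per-field charges compound quietly until the end-of-month invoice."""
    return sum(
        PRICE_PER_REQUEST[sku] * count * days
        for sku, count in requests_per_day.items()
    )

# Example: a modest 500 place lookups a day, each also pulling contact details
print(round(monthly_bill({"place_details": 500, "contact_details": 500}), 2))  # ~300.0
```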

Although cloud providers often offer free credits for using their services, Wade finds the pricing model still overly complex.

“You get free credits, for instance, but you can’t always use these credits for certain cloud services. I think it’s partly a strategy to obfuscate cloud service pricing. We use about five cloud services, so I can’t imagine a company using 40 services and still keeping track of costs,” he said.

For an IT regional head at an investment firm, the cost dilemma takes on another dimension.

“When I joined the company about a year ago, we were largely on-premises. It wasn’t the most efficient set-up, hence the cloud push. Now, about 70% of our workload resides on the public cloud. The rationale is simple: I don’t want to maintain five data centres in the region. There’s a good amount of transparency regarding costs, but it’s still not very clear how much of the cloud services we actually consume,” they said.

At clothing retailer Love, Bonito, the transition from a local cloud provider to a hyperscaler brought valuable enhancements, such as disaster recovery, easier scalability, and greater control over scaling resources to match demand. While this shift initially came with higher costs, the company has been optimising its usage across various components and anticipates better ROI in the long term.

“This journey has been an opportunity for us to enhance our infrastructure resilience while continuing to seek efficiencies for the future,” shared Aman Agarwal, the retailer’s Senior Director of Technology.

Navigating regulatory waters

For enterprises in tightly regulated sectors, such as financial services and healthcare, managing data regulations can be challenging. However, meticulous planning and preparation can mitigate many potential obstacles.

“When we moved to the cloud, the first thing we did was to demarcate our data, identifying which data could be migrated and which needed further consideration,” said the IT regional head at the investment firm.

Before fully committing, the regional head’s IT team asked themselves a crucial question daily: Could they safeguard their data better than a hyperscaler with a significantly larger, dedicated team?

“I have a small team overseeing our entire IT landscape, and we ultimately concluded that nine people couldn’t secure our data as effectively as a larger external team,” the regional head added.

For the CIO of a large architecture firm, the Singapore government’s recent data security mandates underscored the need for strict compliance with standards like ISO 27001 and SOC 2.

“Achieving SOC 2 within an internal data centre is next to impossible. Hence, we’re looking externally to companies like SingTel and Google, who can facilitate this. That’s why we’re moving down this path. Currently, we’re exploring a complete migration from on-premises to the cloud. With a collection of offices worldwide and much of our data shared across these locations, cloud adoption aligns with our global data-sharing needs,” observed the CIO.

Preparing for AI

The architecture firm’s cloud strategy also forms the basis for deploying AI initiatives. According to its CIO, the organisation leverages AI to simplify and enhance the work of its architects.

“You wouldn’t expect design consultants to become software developers, but we have numerous pain points in our field where, honestly, AI has been a bit of a godsend. It elevated our productivity, and we’ve turned young domain experts into software developers,” noted the CIO.

One AI tool developed by the architecture firm was inspired by challenges in data centre design. Creating test fits — floor plans assessing the feasibility of physical spaces — is among the firm’s most challenging tasks because of the manual processes involved.

“If we rely solely on manual efforts, completing three or four test fits can take us five to six weeks. So, we developed a generative AI tool, incorporating carbon calculations and computational fluid dynamics (CFD) modelling. Now, it takes us about two hours to generate around 10,000 iterations,” the CIO revealed.
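As a rough illustration of the generate-and-score pattern such a tool might follow, here is a minimal sketch; the layout generator and the scoring function (standing in for the carbon and CFD calculations the CIO mentions) are entirely hypothetical:

```python
import random

def generate_layout(seed: int) -> dict:
    """Hypothetical: propose one candidate test-fit layout (rack rows, aisle widths)."""
    rng = random.Random(seed)
    return {"rack_rows": rng.randint(8, 24), "aisle_width_m": round(rng.uniform(1.2, 2.4), 1)}

def score_layout(layout: dict) -> float:
    """Hypothetical proxy score; a real tool would call carbon and CFD models here."""
    carbon = layout["rack_rows"] * 0.8         # placeholder carbon estimate
    cooling = 1.0 / layout["aisle_width_m"]    # placeholder airflow penalty
    return carbon + cooling

# Generate many candidate layouts and keep the best few, the same
# generate-and-score pattern described above at a much larger scale.
candidates = [generate_layout(i) for i in range(10_000)]
best = sorted(candidates, key=score_layout)[:10]
print(best[0])
```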

For Love, Bonito, the journey to implementing AI centres on the decision of whether to build or buy solutions that enhance the consumer experience. As a medium-sized company, the retailer focuses on practical applications of AI that deliver value to its customers.

“When deciding on AI solutions, we consider factors like the speed of data processing and the overall user experience,” Agarwal explained. “If creating our model ensures data is delivered in milliseconds, it adds value. However, ensuring relevance is key — if a customer searches for ‘red jacket’ and gets no results, the system falls short. It’s about delivering both speed and meaningful outcomes to enhance the end-to-end shopping journey.”
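As a rough illustration of the relevance safeguard Agarwal describes, here is a minimal sketch assuming a hypothetical search_fn; a production system would rely on the retailer’s actual search engine and ranking signals:

```python
from typing import Callable, List

def search_with_fallback(query: str, search_fn: Callable[[str], List[dict]]) -> List[dict]:
    """Try the full query first; if nothing matches, progressively relax it
    so the shopper sees related items instead of an empty page."""
    results = search_fn(query)
    if results:
        return results
    tokens = query.split()
    for i in range(len(tokens)):
        # Drop one token at a time, e.g. 'red jacket' -> 'jacket', then 'red'
        relaxed = " ".join(tokens[:i] + tokens[i + 1:])
        if relaxed and (results := search_fn(relaxed)):
            return results
    return []

# Toy catalogue and a naive exact-match search, used only for demonstration
catalogue = [{"name": "denim jacket"}, {"name": "red dress"}]

def naive_search(q: str) -> List[dict]:
    return [item for item in catalogue if all(t in item["name"] for t in q.split())]

print(search_with_fallback("red jacket", naive_search))  # falls back to 'jacket' matches
```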

Value over cost

Addressing enterprise concerns, Hari Srinivasan, Principal, Technical Marketing at Equinix, believes that, over time, the benefits of cloud will outweigh the costs.

“We see our customers beginning to operate their enterprise applications in a hybrid multi-cloud architecture. With the advent of AI applications in the enterprise landscape, hybrid multi-cloud architectures can deliver significant benefits as well. You’re not just using a publicly available model; you’re augmenting it with your own data. For this, you need high-speed connectivity between where the data resides and where it’s used to improve the model. Then, you deploy the model in multiple locations, allowing inferencing to occur locally. That’s where we bring value in AI. It’s what we call private AI or hybrid AI, which we see as the future and where we add significant value,” he said.
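As a rough illustration of the local-inferencing idea, here is a minimal sketch that routes each request to the lowest-latency regional deployment; the endpoints and latency figures are hypothetical, and this is not an Equinix product or API:

```python
# Hypothetical regional inference deployments; in practice these would be the
# locations where the model has been replicated close to users and data.
REGIONAL_ENDPOINTS = {
    "singapore": "https://inference.sg.example.internal",
    "frankfurt": "https://inference.fra.example.internal",
    "ashburn": "https://inference.iad.example.internal",
}

def pick_endpoint(measured_latency_ms: dict[str, float]) -> str:
    """Route a request to the lowest-latency regional deployment so that
    inferencing happens close to where the data and the users sit."""
    region = min(measured_latency_ms, key=measured_latency_ms.get)
    return REGIONAL_ENDPOINTS[region]

# Example: latency probes (in milliseconds) gathered by a client or edge gateway
print(pick_endpoint({"singapore": 4.2, "frankfurt": 180.0, "ashburn": 230.0}))
```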

While not an AI provider itself, Equinix supports organisations through infrastructure management for AI.

“By leveraging our partnerships with companies like Nvidia or WWT, for example, we help customers adopt best practices as recommended by the principal vendor for building AI applications and also help them deploy in our global fleet of data centres,” Srinivasan highlighted.

He clarified that Equinix does not offer a private AI of its own but instead provides foundational infrastructure to help enterprises build new AI use cases.

“The analogy I use is this: You can build a beautiful house, but if you choose your location wrongly, do not build your foundations properly, or do not get your plumbing right before you move in, you are bound to have a poor experience living there. Similarly, you need to choose the right locations to host data, choose the right data centre in each location, one that offers a large number of connectivity options, including low-latency connections to public clouds in an agile manner, and tap into market leaders to build your AI applications. This would help you deliver a superlative user experience. This is the area we shine in with traditional enterprise applications, and now we are applying the same expertise to helping our customers deploy their AI applications,” he concluded.