Many AI deployments fail for a variety of reasons, among them poor planning, poor data quality, and infrastructure challenges. Worse, some enterprises lack a clear direction for their AI projects.
To avoid excessive costs when embarking on an AI project, enterprises must address multiple factors, with data being just one of them.
During the “Preparing the Enterprise for AI” panel at the Cloud & IT Infrastructure Frontiers 2024 conference, organised by Jicara Media, senior IT experts shared successful strategies for developing a robust AI plan.
In the beginning
Logically, businesses must first determine the purpose of their AI initiatives; otherwise, they risk wasting time, energy, and resources.
Next, a decision needs to be made about where to run the AI, whether on-premises or in the cloud.
“Enterprises must figure out if they want to develop this expertise in-house or run it privately. Do they need partners to help them? That’s where the challenge lies, because they are unsure of the direction they should take,” remarked Daniel Ong, Director, Solutions Architecture, Asia Pacific, Digital Realty.
Christophe Ozer, Head of Digital Infrastructure, APAC, Orange Business, observed that while the cloud is often preferred for piloting use cases, 82% of organisations choose to go on-premises for production.
“The problem now is the infrastructure required for this, particularly in terms of networking. You need a secure network with high resiliency, low latency, and significant bandwidth. Then you need to have a data centre ready, and I can tell you that in Asia, there are only a few data centres equipped to manage H100 servers,” Ozer said.
Nearly there
After addressing infrastructure concerns, a crucial stage before AI deployment is designing the data strategy. According to Simon Long, Senior Director – Client Technology Consulting at CBRE, many organisations are so eager to implement AI that they often overlook data security unless their CISO intervenes.
“One of the challenges is that we haven’t historically built the data strategies that would have allowed for us to deploy AI easily, so the data strategies that we have now really need to change,” Long said.
Long cited an HR use case as an example, in which a simple framework for classifying HR data serves as a first step.
“You draw a vertical line down a piece of paper, and write your internal services on the left and your external services and systems on the right. Then, you draw a horizontal line through the middle, putting the secure services on top and the less secure ones at the bottom,” he explained.
In the bottom-left box, you will likely find processes such as holiday requests or instructions on how to use the meeting room: these require little data security and carry low risk, making them suitable for immediate input into an AI system, Long noted. Client-facing data and systems, by contrast, are sensitive and often high-risk by nature.
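Long's two-line framework amounts to a 2×2 grid, and it can be sketched in a few lines of code. The following is a hypothetical illustration only; the service names, risk tags, and quadrant labels are assumptions, not part of Long's example beyond what he described.

```python
# A minimal sketch of the 2x2 data-classification framework Long describes:
# the vertical axis splits internal from external services, the horizontal
# axis splits secure (high-sensitivity) from less secure services.

def classify(external: bool, secure: bool) -> str:
    """Place a service in one of the four quadrants of the framework."""
    column = "external" if external else "internal"
    row = "secure" if secure else "less secure"
    return f"{column} / {row}"

# Hypothetical services, tagged (external?, needs high security?).
services = [
    ("holiday requests", False, False),          # bottom-left: low risk, AI-ready
    ("meeting-room instructions", False, False), # bottom-left: low risk, AI-ready
    ("payroll records", False, True),            # top-left: internal but sensitive
    ("client-facing CRM", True, True),           # top-right: sensitive, high-risk
]

for name, external, secure in services:
    print(f"{name}: {classify(external, secure)}")
```

The bottom-left quadrant ("internal / less secure") is the set of processes Long suggests feeding to an AI system first.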
Beyond HR data, which is relatively easier to manage, a major challenge for AI is the proliferation of unstructured data, making traceability and accountability difficult. For Long, it all comes down to careful supervision.
“You need a team monitoring your large language models, and you need to be in a position to audit how an outcome was achieved. Again, those kinds of processes need to be built and tested in a proof of concept that will then help you scale and resolve some of these issues. Garbage in and garbage out is a big problem, and now that we’ve got AI, it’s being highlighted even more,” he added.
Another challenge for many businesses is the existence of data silos, which, if left unresolved, will cause any AI project to fail.
“When silos exist within different organisations in a large enterprise, nobody has full visibility of where the data is. If that is the case, it becomes difficult to upload data into your AI clusters and get meaningful outcomes from it,” Digital Realty’s Ong said.
Ong has observed that small digital banks are already capitalising on AI, leveraging knowledge from their customer processes. On the other hand, most large enterprise customers still don’t have a clear view of where their data resides.
Similarly, Orange Business’ Christophe Ozer highlighted that many businesses are stuck on data governance, primarily due to a common misunderstanding of the concept.
“Companies think that data governance is a one-time undertaking and that it’s an IT problem. On the contrary, it’s an ongoing process and an organisation-wide issue,” he emphasised.
Finish line
Despite the huge potential of AI to disrupt industries, and the sizable investments by enterprises, progress towards ROI has been slow. According to Long, there’s a clear explanation for this.
“We’ve touched on the reasons why, like data governance. What have we exactly disrupted right now? Very little, actually, in the real estate industry. As much as we have rolled out a lot of things, the ones we’ve implemented are the low-hanging fruits or the low-impact areas, because we’ve all been playing with AI to get the data and process right. We’re doing this before pushing for the strong ROI targets in those sensitive, high-risk systems,” he said.
Long explained that significant progress in real estate AI could be achieved if someone focused on developing a valuation model specifically designed for the industry. This model would be powered by a highly focused AI capable of scraping published data from the entire industry on every listed price and sale price, and then comparing this to geographic market conditions in real time. By leveraging this data, the model could provide highly accurate valuations, demonstrating the potential for AI to have a substantial impact on targeted areas within real estate.
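The core of the valuation model Long sketches is a comparison of published listing prices against realised sale prices, adjusted for local market conditions. A rough outline might look like the following; the data, field names, and the ratio-based adjustment are all assumptions made for illustration, not a description of any actual system.

```python
from statistics import median

# Hypothetical records of the kind Long describes: published listing and
# sale prices, tagged by geographic market.
records = [
    {"area": "CBD", "listed": 1_200_000, "sold": 1_150_000},
    {"area": "CBD", "listed": 980_000, "sold": 1_000_000},
    {"area": "Suburbs", "listed": 750_000, "sold": 700_000},
]

def market_ratio(area: str) -> float:
    """Median sold-to-listed ratio for an area, a crude proxy for local conditions."""
    ratios = [r["sold"] / r["listed"] for r in records if r["area"] == area]
    return median(ratios)

def estimate_value(area: str, asking_price: float) -> float:
    """Adjust an asking price by the local sold-to-listed ratio."""
    return asking_price * market_ratio(area)
```

In practice the "records" would come from continuous scraping of industry-wide listings, and the adjustment would incorporate real-time geographic market signals rather than a single median ratio, but the shape of the comparison is the same.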
From Ong’s perspective, AI will disrupt those who do not adopt it. One practical use of AI at Digital Realty is in their data centres.
“We adopted AI software to manage cooling in our data centres, and one of the byproducts is increased efficiency and sustainability,” he shared.
Another AI concern, which will manifest within the next two to three years, is network congestion, Ozer predicted.
“The network will quickly come back into focus, because as soon as it reaches city-scale demand, you will need to have guaranteed bandwidth. Therefore, we will need to ensure that bandwidth is secured, as internet hubs will be completely saturated. With the explosion of data from LLMs and platforms like ChatGPT, we will need to cater to that increasing demand,” he said.
For now, enterprises must keep the following foundational elements in mind — a clear objective, a solid data strategy, and a robust but flexible infrastructure design — to get started on their AI journey. As the saying goes, “failure to prepare is preparing to fail.”