NUHS Group CTO talks about AI strategy

Image courtesy of NUHS

Artificial intelligence (AI) is transforming healthcare by performing tasks normally done by humans, but more efficiently and at considerably lower cost and in far less time. Global consulting firm Accenture predicts that the healthcare AI sector – which had a market size of US$600 million in 2014 – will be worth about US$150 billion by 2026.

On the Southeast Asian corner of the globe, the National University Health System (NUHS) – a group of healthcare institutions in Singapore – built its own Endeavour AI platform, which is said to be capable of streaming data in real time. It gathers massive amounts of anonymised patient data and provides aggregated predictions and visualisations of insights.

Frontier Enterprise recently spoke with Dr Ngiam Kee Yuan, Group Chief Technology Officer at NUHS, to discuss the health system’s digitalisation journey, how it productionised its AI tools, and NUHS’s upcoming plans from a technology and infrastructure perspective.

Onboarding Endeavour AI

According to Dr Ngiam, NUHS’s journey to Endeavour AI started in 2018. Since then, the group has also operated Discovery AI, a research and development sandbox environment that has been “around for a while” and served as the base of NUHS’s AI development efforts – many of the tools later deployed on Endeavour were built there.

“The key thing about Discovery is that it was able to do what we call near-production testing, so we are able to test our AI tools, somewhat with the EMR (electronic medical record) system, but not at the production level,” Dr Ngiam explained. “In the last few years, we tried to look for a solution that allowed us to run our AI tools in real time and in production. Because of that, we’ve decided to use Endeavour, which is truly a production platform that is integrated with the EMR system,” he said.

Endeavour, said Dr Ngiam, features several modern IT architectural constructs such as a stream-capable platform, as well as the ability to run microservices. “With the typical products, we’re able to do almost an infinite number of automation steps based on the data that is being presented to the platform from the EMR system,” he shared. “That’s what we wanted to build, because we wanted to productionise our AI tools that we’ve been building for so many years, and we wanted to be able to signal that to our patients almost immediately. That is what we mean by production deployment.”
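The automation Dr Ngiam describes is event-driven: data arriving from the EMR system triggers downstream steps. A minimal sketch of one such step is below – the event fields, test name, and alert threshold are illustrative assumptions, not NUHS’s actual rules or schema.

```python
from typing import Optional

def automation_step(event: dict) -> Optional[str]:
    """Return an action name if the incoming EMR event warrants one, else None.

    This mimics one rule in an event-driven pipeline: a potassium (K+)
    lab result above a hypothetical threshold triggers a notification.
    """
    if event.get("event") == "lab_result" and event.get("test") == "K+":
        if event["value"] > 5.5:  # hypothetical alert threshold
            return "notify-clinician"
    return None

# Two sample events as they might arrive from the EMR feed.
stream = [
    {"event": "lab_result", "test": "K+", "value": 6.0, "patient_id": 7},
    {"event": "lab_result", "test": "K+", "value": 4.1, "patient_id": 8},
]
actions = [automation_step(e) for e in stream]
print(actions)  # → ['notify-clinician', None]
```

In a real deployment, each rule like this would run as its own microservice subscribed to the stream, which is what makes the number of automation steps effectively open-ended.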

Productionising AI

To address those issues, NUHS went through the process of looking at different technologies and vendors. “Finally, we chose TIBCO because their product was able to meet all our architectural design specifications,” said Dr Ngiam. “In particular, we wanted to look for an enterprise-grade environment, build workflows, and found quite a number of them. TIBCO has a unique feature of being able to convert some of the data that we have into Kafka streams and Kafka messages, so this is actually quite a critical requirement that we specified as part of this project,” he explained. 
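Converting EMR data into Kafka messages, as Dr Ngiam describes, amounts to shaping each record into a keyed, serialised message. The sketch below shows only that shaping step with made-up field names; a real pipeline would hand each message to a Kafka producer (for example, `confluent_kafka.Producer.produce`), which requires a running broker and is omitted here.

```python
import json
from typing import Iterator

def emr_events_to_kafka_messages(events: Iterator[dict], topic: str) -> Iterator[dict]:
    """Shape raw EMR events into Kafka-style key/value messages.

    The patient ID becomes the message key, so all events for one patient
    land on the same partition and stay ordered relative to each other.
    """
    for event in events:
        yield {
            "topic": topic,
            "key": str(event["patient_id"]).encode("utf-8"),
            "value": json.dumps(event).encode("utf-8"),
        }

# Two hypothetical lab-result events from the EMR feed.
events = [
    {"patient_id": 101, "event": "lab_result", "test": "HbA1c", "value": 6.1},
    {"patient_id": 102, "event": "lab_result", "test": "K+", "value": 4.2},
]
messages = list(emr_events_to_kafka_messages(events, topic="emr.lab-results"))
# In production: producer.produce(m["topic"], key=m["key"], value=m["value"])
```

Keying by patient ID is one common convention for clinical streams, since per-key ordering is what lets downstream AI tools see a patient’s events in sequence.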

When it comes to productionising AI, Dr Ngiam said NUHS has been working on multiple projects, seven of which have recently been built and validated. “These are AI tools – modelled and validated – and we are just testing them out in order to deploy production-level software. All these tools were developed in-house, working closely with computer scientists and data scientists.”

Dr Ngiam noted that there are two kinds of AI software: “One is the medical-type AI software, which makes some kind of medical advisory, and these tools would then have to undergo some form of clinical validation trial. We have done something called ‘Allergen Validation Trials’, based on that information.”

The other type, said Dr Ngiam, is a device that does not make medical predictions. “It does operational or automation functions, so it does not require medical registration or regulatory processes. Rather, it is purely for support operations, as well as to look at improving operational efficiencies,” he pointed out.

He also mentioned that NUHS is working with TIBCO to use several technologies such as TIBCO BusinessWorks, TIBCO Streaming, TIBCO Messaging, and TIBCO Spotfire in Endeavour, and that there are many clinicians that are closely involved in the building of their AI tools.

Using data sets

Dr Ngiam Kee Yuan, Group Chief Technology Officer at NUHS. Image courtesy of NUHS.

When asked if NUHS utilises any open-source code libraries or engines, Dr Ngiam said they work with a variety of them, and there are several original pieces of code that NUHS staff have created.

“They’ve written some code that allows it to function in a certain way. There will, of course, be tools that have been trained on combinations of open-source tools – that is the basis for deep neural networks, on which so many of the AI tools are being built.”

The data that these tools use belongs to NUHS, said Dr Ngiam. “When you use AI models, they are just models, but they learn from real-world data. That’s where the data sets within Discovery AI come in – these are the large data sets that the models are built upon,” he clarified.

He added that NUHS has two kinds of AI models: the first predicts diagnoses from free text, while the second comprises administrative tools useful for hospital operations, such as assessing surge capacity or how staff can be redeployed.

Connecting to the EMR

According to Dr Ngiam, NUHS has been using its EMR system for the past seven years. To structure and index the system properly, they need to run features such as data de-identification to ensure that the information there is ready to be used. To recognise and extract the data, NUHS uses advanced natural language processing (NLP).
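De-identification of clinical text can be sketched at its simplest as pattern-based scrubbing. The example below is a toy rule-based version with assumed patterns (Singapore NRIC/FIN format, 8-digit local phone numbers, dates); a production system like the one described here would rely on trained NLP models to catch names, addresses, and other free-text identifiers that regexes miss.

```python
import re

# Minimal illustrative patterns; real de-identifiers use NLP models
# on top of rules to catch identifiers that do not follow a fixed format.
PATTERNS = {
    "NRIC": re.compile(r"\b[STFG]\d{7}[A-Z]\b"),   # Singapore NRIC/FIN format
    "PHONE": re.compile(r"\b[689]\d{7}\b"),        # 8-digit SG phone numbers
    "DATE": re.compile(r"\b\d{2}/\d{2}/\d{4}\b"),
}

def deidentify(note: str) -> str:
    """Replace identifying tokens in a clinical note with typed placeholders."""
    for label, pattern in PATTERNS.items():
        note = pattern.sub(f"[{label}]", note)
    return note

note = "Patient S1234567D (contact 91234567) reviewed on 03/05/2022."
print(deidentify(note))
# → Patient [NRIC] (contact [PHONE]) reviewed on [DATE].
```

Typed placeholders (rather than blank redactions) keep the note useful for downstream models, which is the point of de-identifying rather than simply deleting.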

When asked about the process of connecting the AI and streaming tools to the EMR system, Dr Ngiam revealed that they operate on a different architecture. “The Endeavour AI platform is distinct from the EMR system, but they communicate with each other,” he clarified. “You’re able to send data back to the EMR system, but Endeavour is distinct from the EMR system, and there’s a reason for that – we don’t want EMR functions to be overlapping onto diverse predictive and modelling functions.”

“Conversely, EMR systems are not designed to hold this scale of AI and machine learning (ML) tools, because we’re talking about ML tools that run 100 times per second,” observed Dr Ngiam. “That level of throughput and density is not what most EMR systems are designed for. For example, within Endeavour AI, we run GPUs (graphics processing units). Very few EHR (electronic health record) systems run GPUs because you don’t need to run inference, whereas AI tools need to run inference. The most efficient way to run inference is to use a GPU,” he said.
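The throughput argument can be illustrated with a toy sketch: accelerators such as GPUs pay a fixed per-call overhead (kernel launch, data transfer), so batching requests is what sustains rates like 100 inferences per second. The “model” below is a stand-in that simulates that overhead with a sleep – it is not any NUHS tool, and the numbers are illustrative.

```python
import time

def infer_batch(batch):
    """Stand-in for a model forward pass: a fixed per-call overhead plus
    a trivial per-item computation, mimicking how a GPU amortises
    launch/transfer cost across a batch."""
    time.sleep(0.001)              # fixed overhead per call
    return [x * 2 for x in batch]  # trivial per-item "inference"

requests = list(range(200))

# One call per request: pays the fixed overhead 200 times.
start = time.perf_counter()
singles = [infer_batch([r])[0] for r in requests]
t_single = time.perf_counter() - start

# Batched: pays the fixed overhead once per batch of 50.
start = time.perf_counter()
batched = []
for i in range(0, len(requests), 50):
    batched.extend(infer_batch(requests[i:i + 50]))
t_batched = time.perf_counter() - start

print(f"per-request: {t_single:.3f}s, batched: {t_batched:.3f}s")
```

Same results, far less wall-clock time for the batched path – which is why inference-heavy platforms are built around GPUs and batching while transactional EMR systems are not.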

The future of NUHS medicine

AI is one of the technologies that NUHS has been testing for a while now, and it has been their biggest area of investment. To illustrate this point, Dr Ngiam shared that NUHS developed an NLP tool for AI. “We can use the same NLP for chatbots as they are complementary, right? The chatbot allows us to do a great number of things, everything from front-end services to nudging and reminding patients (which helps their coaches). This chatbot technology is something that we have spent quite a lot of time on, and it’s an offshoot from building our NLP AI,” he said.

The chatbot is both customer- and staff-facing, said Dr Ngiam. He added that the same software robot can be used for AI tools, be it text or imaging AI, where the same kind of network construct can be deployed by the robot.

As for NUHS’s technology plans, Dr Ngiam shared that they have been lining up technology investments which use AI as part of their core strategy. One part of these plans is the use of holomedicine or holograms. Earlier this year, NUHS initiated a study on using mixed reality (a combination of virtual reality and augmented reality) in clinical care. Their research evaluated the feasibility of using holographic technology to spatially locate brain tumours when operating on patients.

“In the study, we used Microsoft’s HoloLens 2 (a type of mixed-reality smart glasses) product to look at how we can use holograms in medicine,” Dr Ngiam said. “We have a whole Holo method programme, and there are quite a number of domains where we have invested. We hope that in the next few years, these domains would translate to actual clinical care, like what AI is translating today,” he concluded.