3 in 5 Singapore workers fear losing control of AI due to dubious data

AI has a trust problem: nearly half (48%) of workers in Singapore say it is difficult to get what they want out of artificial intelligence, and 40% do not trust the data used to train AI systems.

These findings come from a report released by Salesforce, based on a survey of 545 full-time workers in Singapore and nearly 6,000 globally.

The study also found that more than half (58%) of workers in Singapore fear humans will lose control of AI, and 94% do not currently trust AI to operate without human oversight.

The trust gap in AI is hindering adoption. Of those who do not trust AI, 95% say they are hesitant to adopt it.

Without accurate data input and humans at the helm, this trust gap will continue growing, preventing businesses from reaping the full benefits of AI.

Findings also show that workers find it hard to trust AI even as businesses race to invest in and capitalise on its benefits, and this trust gap presents a significant barrier for businesses looking to adopt and use AI effectively.

Data, in particular, will make or break that trust: two in three (66%) of workers who don't trust the data used to train AI systems are hesitant to adopt the technology.

According to Salesforce’s AI Trust Quotient, AI also has a data problem. Without the right data, businesses risk exacerbating the trust gap between workers and AI.

Results show that 70% of workers who don't trust AI say it lacks the information needed to be useful. Almost two-thirds (65%) of workers in Singapore say out-of-date public data would break their trust in AI, while 61% say the same of incomplete customer or company data.

To overcome this trust gap, workers in Singapore ranked the following as most important: accurate data (the AI tool uses accurate data), cited by 84%; secure data (the AI tool does not put confidential data at risk), 82%; and holistic, complete data (the AI tool uses all possible, relevant data), 79%.

Four in five (80%) workers in Singapore say that AI needs to consistently produce accurate outputs for them to trust it, higher than the global average of 68%. In Australia and India, the figure stands at 73% and 71%, respectively.

To unlock the full potential of AI, workers agree that humans must be in the driver's seat. Combining the best of human and machine intelligence creates more productive businesses, empowered employees, and trustworthy AI.

Workers in Singapore stress the need for humans at the helm: 94% do not trust AI to operate without human oversight, slightly higher than the global average of 90%. The figure stands at 89% in Australia and 80% in India.

In Singapore, 90% of workers do not trust AI to keep data safe on its own, but 59% trust AI and humans to keep data safe together.

“AI is only as good as the data powering it, and new research shows that data makes or breaks the workforce’s trust in AI. Businesses need to unify their data across systems for AI to deliver useful, accurate outputs that workers trust,” said Sujith Abraham, SVP and general manager, Salesforce ASEAN. 

“This has to be supported by keeping humans in the driver’s seat of AI, empowering them to focus on the most important outcomes as we enter a new era of AI innovation,” said Abraham. “Only then can businesses achieve value from AI through better adoption.”