
Six out of 10 employees distrust artificial intelligence


Tools such as ChatGPT are exposing more workers to AI, but concerns linger over the lack of regulation.

By Josh Needs

Almost 60 per cent of Australians distrust the use of AI at work through tools such as ChatGPT – a sentiment mirrored in other western countries such as the UK and Canada, according to a KPMG survey. 

KPMG chair for organisational trust at the University of Queensland School of Business, Professor Nicole Gillespie, said the global study found employees accepted AI only for a limited range of tasks.

“Most people are comfortable with AI use for the purpose of augmenting employee performance and decision-making, for example by providing feedback on how to improve performance and providing input for making decisions,” said Ms Gillespie. 

“However, people are notably less comfortable with AI use for human resource management, such as to monitor and evaluate employees, collect employee performance data, and support recruitment and selection processes.” 

The report found that of the 60 per cent of people wary of AI systems, about half were ambivalent while the rest were somewhat or completely unwilling to trust the technology.

KPMG’s research also found 71 per cent of Australians agreed with the statements: “The impact of AI is unpredictable”, “The long-term impact of AI on society is uncertain” and “There is a lot of uncertainty around AI” – a figure in line with sentiment in the US.


Only 35 per cent of Australians believed there were enough safeguards to make AI safe, the study found, with many saying regulation had failed to keep pace with the technology over the past two years.

James Mabbott, lead partner at KPMG Futures, said a lack of trust in organisations and government was also influential.

“A key challenge is that a third of people have low confidence in government, technology and commercial organisations to develop, use and govern AI in society’s best interest,” said Mr Mabbott. 

“Organisations can build trust in their use of AI by putting in place mechanisms that demonstrate responsible use, such as regularly monitoring accuracy and reliability, implementing AI codes of conduct, independent AI ethics reviews and certifications, and adhering to emerging international standards.” 

Ms Gillespie said widespread AI take-up would require greater trust in both the technology and those who controlled it, with more visible regulation the key.

“The findings highlight the importance of developing adequate governance and regulatory mechanisms that safeguard people from the risks associated with AI use and public confidence in entities to enact these safeguards, as well as ensuring AI is designed and used in a human-centric way to benefit people,” she said. 

KPMG also found a generational divide over AI. Its study revealed 65 per cent of Gen X and millennials trusted AI at work compared to only 39 per cent of older Australians. 

Education was also a factor, with 56 per cent of university-educated respondents trusting AI compared with only 40 per cent of those without a degree.

 

Josh Needs

AUTHOR

Josh Needs is a journalist at Accountants Daily and SMSF Adviser, which are the leading sources of news, strategy, and educational content for professionals in the accounting and SMSF sectors.

Josh studied journalism at the University of NSW and previously wrote news, feature articles and video reviews for Unsealed 4x4, a specialist offroad motoring website. Since joining the Momentum Media Team in 2022, Josh has written for Accountants Daily and SMSF Adviser.

