Advisers warned on outsourcing investment decisions to AI


21 October 2024
By Laura Dew

Under new guidance, financial advisers who opt to use artificial intelligence (AI) to provide advice must be able to demonstrate to clients how they reached their decisions and why that option was taken. 

The guidance from the Office of the Australian Information Commissioner (OAIC) outlined how AI can be used and how customer data can be used to train it. 

Adviser Ratings has previously described AI as becoming a “staple tool” for financial advisers to streamline business operations and reduce costs. In terms of how advice practices are adopting AI, 47 per cent are using it for client engagement, such as newsletter production, followed by marketing at 43 per cent, statement of advice or record of advice production at 41 per cent, and portfolio management at 12 per cent.

Meanwhile, firms such as Morgan Stanley, BlackRock, JP Morgan and Vanguard have already been vocal about their use of AI across wealth management and asset management. In the case of Morgan Stanley, 90 per cent of its advisers use AI on a monthly basis.

While many advisers have indicated their preference to use AI with clients, the OAIC acknowledged it is a “high privacy risk activity”. 

The Guidance on privacy and the use of commercially available AI products stated AI may be used to assist in the decision-making process, but care should be taken regarding its accuracy and appropriateness.

“If your organisation is using an AI product to assist in the decision-making process, you must understand how the product is producing its outputs so that you can ensure the accuracy of the decision. As a matter of best practice, you should be able to provide a meaningful explanation to your client.

“In particular, if the behaviour of the AI system cannot be understood or explained clearly by your entity, it may not be appropriate to use for these purposes.”

It was important, therefore, for the AI to be overseen by a human who is ultimately responsible for the investment decision, as AI outputs can be inaccurate. Firms should also conduct regular due diligence on their AI products and provide staff training to ensure the tool remains fit for purpose and appropriate for use with clients.

“You should ensure that a human user within your entity is responsible for verifying the accuracy of these outputs and can overturn any decisions made.”

The guidance recommended advisers consider questions such as:

  • Is the personal information being input into the product accurate and of a high quality?
  • Do your records clearly indicate where information is the product of an AI output and therefore a probabilistic assessment rather than fact?
  • Do you have processes in place to ensure appropriate human oversight of AI outputs, with a human user responsible for verifying the accuracy of any personal information obtained through AI?
  • Does your organisation ensure that individuals are made aware of when AI is used in ways that may materially affect them?

Any use of AI must be disclosed to clients in the firm’s privacy policy.

“It is critical to ensure that you can provide the client with a sufficient explanation about how the decision was reached and the role that the AI product played in this process. This will enable the client to be comfortable that the decision was made appropriately and on the basis of accurate information.”

A recent research paper found that women were more likely than men to consider investment advice from AI, partly because it lacks the gender biases displayed by some human advisers.

Two researchers presented 1,800 US participants with documents based on Goldman Sachs stock market outlook wording. One was described as written by analysts, one as written by analysts incorporating an advanced AI model, and one as written by an advanced AI model. All estimates and text remained the same; only the purported source differed.

The overall finding was that investors tended to be less responsive to a forecast when an analyst incorporated AI, stemming from the lower perceived credibility of AI-generated forecasts.

Interestingly, the report noted women were more responsive than men to AI forecasts and posited one reason for this could be the gender biases they experience when visiting a human financial adviser.
