Advisers warned on outsourcing investment decisions to AI
Under new guidance, financial advisers who opt to use artificial intelligence (AI) to provide advice must be able to demonstrate to clients how they reached their decisions and why that option was taken.
The guidance from the Office of the Australian Information Commissioner (OAIC) outlined how AI can be used and how customer data can be used to train it.
Adviser Ratings has previously described AI as an increasingly "staple tool" for financial advisers to streamline business operations and reduce costs. In terms of how advice practices are adopting AI, 47 per cent are using it for client engagement, such as newsletter production, followed by marketing at 43 per cent, statement of advice or record of advice production at 41 per cent, and portfolio management at 12 per cent.
Meanwhile, firms such as Morgan Stanley, BlackRock, JP Morgan and Vanguard have already been vocal in their use of AI across wealth management and asset management. In the case of Morgan Stanley, 90 per cent of its advisers are using AI on a monthly basis.
While many advisers have indicated their preference to use AI with clients, the OAIC acknowledged it is a “high privacy risk activity”.
The Guidance on privacy and the use of commercially available AI products stated AI may be used to assist in the decision-making process, but care should be taken regarding the accuracy and appropriateness of its outputs.
“If your organisation is using an AI product to assist in the decision-making process, you must understand how the product is producing its outputs so that you can ensure the accuracy of the decision. As a matter of best practice, you should be able to provide a meaningful explanation to your client.
“In particular, if the behaviour of the AI system cannot be understood or explained clearly by your entity, it may not be appropriate to use for these purposes.”
It was important, therefore, for AI use to be overseen by a human who is ultimately responsible for the investment decision, as AI outputs can be inaccurate. Firms should also conduct regular due diligence on AI products, alongside staff training, to ensure the tool remains fit for purpose and appropriate for use with clients.
“You should ensure that a human user within your entity is responsible for verifying the accuracy of these outputs and can overturn any decisions made.”
The guidance recommends advisers consider questions such as:
- Is the personal information being input into the product accurate and of a high quality?
- Do your records clearly indicate where information is the product of an AI output and therefore a probabilistic assessment rather than fact?
- Do you have processes in place to ensure appropriate human oversight of AI outputs, with a human user responsible for verifying the accuracy of any personal information obtained through AI?
- Does your organisation ensure that individuals are made aware of when AI is used in ways that may materially affect them?
Any use of AI must be disclosed to clients in the firm’s privacy policy.
“It is critical to ensure that you can provide the client with a sufficient explanation about how the decision was reached and the role that the AI product played in this process. This will enable the client to be comfortable that the decision was made appropriately and on the basis of accurate information.”
A recent research paper found that women were more likely than men to consider investment advice from AI, partly because it lacks the gender biases sometimes exhibited by human advisers.
Two researchers examined 1,800 US participants who were presented with documents based on Goldman Sachs stock market outlook wording. One was described as written by analysts, one as written by analysts incorporating an advanced AI model, and one as written by an advanced AI model alone. All estimates and text remained the same; only the purported source differed.
The overall finding was that investors tended to be less responsive to forecasts when an analyst incorporated AI, a result attributed to the lower perceived credibility of AI-generated forecasts.
Interestingly, the report noted women were more responsive than men to AI forecasts and posited one reason for this could be the gender biases they experience when visiting a human financial adviser.