Advisers warned on outsourcing investment decisions to AI
Under new guidance, financial advisers who opt to use artificial intelligence (AI) to provide advice must be able to demonstrate to clients how they reached their decisions and why that option was taken.
The guidance from the Office of the Australian Information Commissioner (OAIC) outlined how advisers can use AI and how customer data may be used to train it.
Adviser Ratings has previously described AI as an increasingly “staple tool” for financial advisers to streamline business operations and reduce costs. In terms of how advice practices are adopting AI, 47 per cent are using it for client engagement, such as newsletter production, followed by marketing at 43 per cent, statement of advice or record of advice production at 41 per cent, and portfolio management at 12 per cent.
Meanwhile, firms such as Morgan Stanley, BlackRock, JP Morgan and Vanguard have already been vocal about their use of AI across wealth management and asset management. In the case of Morgan Stanley, 90 per cent of its advisers use AI on a monthly basis.
While many advisers have indicated their preference to use AI with clients, the OAIC acknowledged it is a “high privacy risk activity”.
The Guidance on privacy and the use of commercially available AI products stated AI may be used to assist in the decision-making process, but care should be taken regarding accuracy and appropriateness.
“If your organisation is using an AI product to assist in the decision-making process, you must understand how the product is producing its outputs so that you can ensure the accuracy of the decision. As a matter of best practice, you should be able to provide a meaningful explanation to your client.
“In particular, if the behaviour of the AI system cannot be understood or explained clearly by your entity, it may not be appropriate to use for these purposes.”
It was important, therefore, for the AI to be overseen by a human who is ultimately responsible for the investment decision, as AI outputs can be inaccurate. Firms should also conduct regular due diligence on their AI products and provide staff training to ensure the tool remains fit for purpose and appropriate for use with clients.
“You should ensure that a human user within your entity is responsible for verifying the accuracy of these outputs and can overturn any decisions made.”
It recommended that advisers consider questions such as:
- Is the personal information being input into the product accurate and of a high quality?
- Do your records clearly indicate where information is the product of an AI output and therefore a probabilistic assessment rather than fact?
- Do you have processes in place to ensure appropriate human oversight of AI outputs, with a human user responsible for verifying the accuracy of any personal information obtained through AI?
- Does your organisation ensure that individuals are made aware of when AI is used in ways that may materially affect them?
Any use of AI must be disclosed to clients in the firm’s privacy policy.
“It is critical to ensure that you can provide the client with a sufficient explanation about how the decision was reached and the role that the AI product played in this process. This will enable the client to be comfortable that the decision was made appropriately and on the basis of accurate information.”
A recent research paper found that women were more likely than men to consider investment advice from AI, partly because it lacks the gender biases exhibited by human advisers.
Two researchers examined 1,800 US participants who were presented with documents using Goldman Sachs stock market outlook wording. One was described as written by analysts, one as written by analysts incorporating an advanced AI model, and one as written by an advanced AI model. All estimates and text remained the same; only the purported source differed.
The overall finding was that investors tended to be less responsive when an analyst incorporated AI, stemming from a lower perceived credibility of AI-generated forecasts.
Interestingly, the report noted women were more responsive than men to AI forecasts and posited one reason for this could be the gender biases they experience when visiting a human financial adviser.