Government seeks to implement mandatory AI guardrails
The federal government is looking to introduce mandatory guardrails for the use of artificial intelligence (AI) to ensure safe and responsible practices.
Responses to the safe and responsible AI consultation have made clear that Australians want stronger protections in place to manage the risks of AI.
The interim report, released on 17 January, highlighted that sector-specific regulation is already in place for financial services. It also noted that the use of a financial services sandbox by ASIC has allowed a limited form of experimentation with AI-powered technology.
The report said: “While Australia already has some safeguards in place for AI and the responses to AI are at an early stage globally, it is not alone in weighing whether further regulatory and governance mechanisms are required to mitigate emerging risks.
“There are strong foundations for Australia to be a leader in responsible AI.”
The use of a sandbox was also noted by the Financial Services Council (FSC) as a recommendation in its submission to the consultation.
It said: “The government could encourage innovation of trusted and safe AI through the provision of a voluntary regulatory sandbox in which organisations could test their products in a safe way before going to market. This would provide certainty that the product met the required regulatory standards and build trust amongst consumers that the product has been appropriately tested.”
Money Management previously wrote about the use of AI in financial advice and paraplanning.
Guideway Financial Services announced a new AI avatar paraplanning service called FinTalk, while Raiz Invest launched an AI-powered service called Your Beautiful Life, which allows advisers to produce meeting notes, review a client’s finances and produce a statement of advice.
Meanwhile, Netwealth highlighted financial advice as an early beneficiary of AI and is using Microsoft Copilot in its own business.
Ed Husic, Minister for Industry and Science, said: “Australians understand the value of artificial intelligence, but they want to see the risks identified and tackled. We have heard loud and clear that Australians want stronger guardrails to manage higher-risk AI.
“The government’s response is targeted towards the use of AI in high-risk settings, where harms could be difficult to reverse, while ensuring that the vast majority of low-risk AI use continues to flourish largely unimpeded.
“The government is now considering mandatory guardrails for AI development and deployment in high-risk settings, whether through changes to existing laws or the creation of new AI-specific laws.”
Immediate actions being taken include:
- working with industry to develop a voluntary AI safety standard.
- working with industry to develop options for voluntary labelling and watermarking of AI-generated materials.
- establishing an expert advisory group to support the development of options for mandatory guardrails.
Mandatory guardrails to promote the safe design, development and deployment of AI systems will be considered, including possible requirements relating to:
- Testing – testing of products to ensure safety before and after release.
- Transparency – transparency regarding model design and data underpinning AI applications; labelling of AI systems in use and/or watermarking of AI-generated content.
- Accountability – training for developers and deployers of AI systems, possible forms of certification, and clearer expectations of accountability for organisations developing, deploying and relying on AI systems.
Outside of financial advice, the Minister for Financial Services, Stephen Jones, has also warned about the use of AI in scams, such as AI voice scams and QR code phishing.