
New AI Management System standard discussed in recent live event

April 11, 2024


AS ISO/IEC 42001:2023 sets a precedent for the responsible use of Artificial Intelligence (AI), offering guidelines for organisations to establish, implement, maintain, and continually improve their AI management systems.

Standards Australia recently hosted an online event focused on AS ISO/IEC 42001:2023, Information technology - Artificial intelligence - Management system. This new standard aims to change the way AI management is approached, providing a clear framework for the ethical and consistent use of AI technologies.

The webinar, led by Harm Ellens, a committee member of Australian Technical Committee IT-43 Artificial Intelligence and Director at Virtual Ink Australia, explored the details of the new standard. The event served as a valuable resource for technology professionals and organisations, offering insights into how the standard can help them manage their AI systems effectively and ethically.

The event also included a live Q&A session, allowing attendees to ask Mr. Ellens questions about the standard and how it applies to their work.

Read on for the answers to all the questions asked by attendees during the event, or scroll to the bottom of this page to view the video recording.

Q&A

Has the banking industry adopted the standard and implemented it?

Yes, several financial organisations in Australia are actively working towards implementing the standard. A Canadian bank has successfully conducted a test run of the AI management system (AIMS) certification process using the draft version of the standard, highlighting the standard's broad applicability in the financial sector.

What are the risks of not implementing an AI management system standard?

Organisations risk falling behind in compliance, especially in regions like Europe where the European AI Act mandates adherence to standards like AS ISO/IEC 42001:2023 for high-risk AI use cases. Non-implementation can lead to significant legal and operational challenges.

Why should companies, especially SMEs, consider implementing the AS ISO/IEC 42001:2023 standard?

Implementing the standard could help enhance or restore a company's reputation in relation to AI, particularly in sectors like retail where AI use has raised privacy concerns. A handbook is being developed to help SMEs with the implementation process, keeping it manageable and not resource intensive.

Is the standard applicable to both AI developers and procurers?

Yes, the standard provides guidance for both developers and procurers, addressing their unique risks and objectives in managing AI systems effectively.

How will the AS ISO/IEC 42001:2023 standard stay current with advances in AI technology?

The standard will undergo review every five years, with potential updates every two years or as needed, to address rapid developments in AI technology, including generative AI.

Can government departments and agencies use the AS ISO/IEC 42001:2023 standard?

Absolutely. The principles of the standard are highly applicable and beneficial for government settings, facilitating responsible AI use and procurement.

What distinguishes the AS ISO/IEC 42001:2023 standard from other AI standards?

AS ISO/IEC 42001:2023 stands out as the first internationally recognised standard for AI management, constructed independently of any specific legal code. Its global reach sets it apart from regional standards, offering a comprehensive framework for AI management across borders.

What percentage or proportion of Australian businesses are ready to adopt AS ISO/IEC 42001:2023?

Based on anecdotal evidence, organisations that are likely to be affected by regulation in the US and EU are starting to prepare for AS ISO/IEC 42001:2023. Their number is currently in the low single-digit percentage points and is expected to increase as Australian regulation takes shape.

Does 'procured AI' refer to the base LLM, such as GPT-3.5? If we are building on top of that, are we using a procured AI?

It applies to both. The standard identifies a variety of stakeholder roles, ranging from AI producer to AI service provider and AI user. The producer could be required to conform with the standard, as could the user; however, each would demonstrate that conformance in a different way.

Is this a stand-alone standard, and will it need to be certified by a Certification Body, as is the case for AS/NZS ISO 9001:2016 or AS ISO/IEC 27001:2022?

The AS ISO/IEC 42001:2023 standard is specific to AI systems whereas, say, AS ISO/IEC 27001:2022 is specific to cybersecurity, so there will be a requirement for both. However, these management system standards can reference each other, so being in conformance with AS/NZS ISO 9001:2016 and AS ISO/IEC 27001:2022 reduces the overall effort of establishing and maintaining AS ISO/IEC 42001:2023.

Is this standard enough as an organisation's internal AI governance framework if implemented completely?

AI governance is informed by the AI policy and the objectives for AI as described in ISO/IEC 38507. For most organisations AS ISO/IEC 42001:2023 may well prove sufficient. But for others, especially those that manufacture devices and products with embedded AI, or that automate financial recommendations, conformity to other standards may be required. This is also separate from any legislative and regulatory requirements, some of which will appear over the course of 2024-25 around the world, including in Australia.

What do you think will be the main challenges companies may face when implementing the standard in their organisation?

The first step is often the hardest. Creating an inventory of the AI already deployed and determining the scope of the AI management system requires getting a lot of people involved. From there, the standard pretty much outlines the process that you will have to complete.

Have you seen any pharmaceutical companies adopting this standard?

Not yet. However, there are medical device companies already in the process of implementing the standard. Diagnostic software is now considered a 'device', so if a pharmaceutical company develops and publishes a diagnostic app, it is very likely they will have to adopt AS ISO/IEC 42001:2023 at some stage in the future.

Was AS ISO/IEC 42001:2023 developed with GenAI and LLM implications in mind, given their different risk profile compared to other AI systems?

AS ISO/IEC 42001:2023 can be applied to any AI, including non-machine-learning AI as well as recent machine learning approaches and developments. ISO/IEC 23894 can help with performing the risk assessment for your organisation and your GenAI and LLM use cases.

Are there any parts of the standard that are most useful for tech leads? More broadly, could you please let us know if you have any advice for the tech leads?

The most useful part for tech leads is the reference to another standard, ISO/IEC 5338, AI system life cycle processes. It maps how the AI system lifecycle differs from the traditional software lifecycle. Many parts of AS ISO/IEC 42001:2023 reference ISO/IEC 5338, and as a tech lead you will be required to demonstrate how you are managing the AI system lifecycle. That standard provides a process framework you can use to review and update your organisation's development practices.

Live event recording:

AS ISO/IEC 42001:2023 is available via the Standards Australia Store and our distribution partners.

NOTE: This webinar and other information on this page contain general information and are not formal advice. Users must make their own assessment as to the suitability of this material and the standards referred to herein for their specific business needs.

New online training

For a more in-depth understanding of AS ISO/IEC 42001:2023, enrol now in the online training course, delivered in partnership with the Australian National University (ANU): 'Understanding AS ISO/IEC 42001:2023, Information Technology - Artificial Intelligence - Management System'.

Contact
Communications Department
communications@standards.org.au
Adam Stingemore
Chief Development Officer
+61 2 9237 6086
Jess Dunne
Communications Manager
+61 2 9237 6381