Microsoft has achieved ISO/IEC 42001:2023 certification, a globally recognized standard for Artificial Intelligence Management Systems (AIMS), for both Azure AI Foundry Models and Microsoft Security Copilot. This certification underscores Microsoft's commitment to building and operating AI systems responsibly, securely, and transparently. As responsible AI rapidly becomes a business and regulatory imperative, this certification reflects how Microsoft enables customers to innovate with confidence.
Raising the bar for responsible AI with ISO/IEC 42001
ISO/IEC 42001, developed by the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC), establishes a globally recognized framework for the management of AI systems. It addresses a broad range of requirements, from risk management and bias mitigation to transparency, human oversight, and organizational accountability. This international standard provides a certifiable framework for establishing, implementing, maintaining, and improving an AI management system, supporting organizations in addressing risks and opportunities throughout the AI lifecycle.
By achieving this certification, Microsoft demonstrates that Azure AI Foundry Models, including Azure OpenAI models, and Microsoft Security Copilot prioritize responsible innovation and are validated by an independent third party. It provides our customers with added assurance that Azure AI Foundry Models and Microsoft Security Copilot are developed and operated in alignment with Microsoft's Responsible AI Standard, backed by Microsoft Azure's robust governance, risk management, and compliance practices.
Supporting customers across industries
Whether you are deploying AI in regulated industries, embedding generative AI into products, or exploring new AI use cases, this certification helps customers:
- Accelerate their own compliance journey by leveraging certified AI services and inheriting governance controls aligned with emerging regulations.
- Build trust with their own users, partners, and regulators through transparent, auditable governance evidenced by the AIMS certification for these services.
- Gain transparency into how Microsoft manages AI risks and governs responsible AI development, giving users greater confidence in the services they build on.
Engineering trust and responsible AI into the Azure platform
Microsoft's Responsible AI (RAI) program is the backbone of our approach to trustworthy AI and includes four core pillars (Govern, Map, Measure, and Manage) that guide how we design, customize, and manage AI applications and agents. These principles are embedded into both Azure AI Foundry Models and Microsoft Security Copilot, resulting in services designed to be innovative, safe, and responsible.
We are committed to delivering on our Responsible AI promise and continue to build on our existing work, which includes:
- Our AI Customer Commitments to support our customers on their responsible AI journey.
- Our inaugural Responsible AI Transparency Report, which enables us to record and share our maturing practices, reflect on what we have learned, chart our goals, hold ourselves accountable, and earn the public's trust.
- Our Transparency Notes for Azure AI Foundry Models and Microsoft Security Copilot, which help customers understand how our AI technology works, its capabilities and limitations, and the choices system owners can make that influence system performance and behavior.
- Our Responsible AI resources site, which provides tools, practices, templates, and information we believe will help many of our customers establish their responsible AI practices.
Supporting your responsible AI journey with trust
We recognize that responsible AI requires more than technology; it requires operational processes, risk management, and clear accountability. Microsoft supports customers in these efforts by providing both the platform and the expertise to operationalize trust and compliance. Microsoft remains steadfast in our commitment to the following:
- Continually improving our AI management system.
- Understanding the needs and expectations of our customers.
- Building on the Microsoft RAI program and AI risk management.
- Identifying and acting upon opportunities that allow us to build and maintain trust in our AI products and services.
- Collaborating with the growing community of responsible AI practitioners, regulators, and researchers to advance our responsible AI approach.
ISO/IEC 42001:2023 joins Microsoft's extensive portfolio of compliance certifications, reflecting our commitment to operational rigor and transparency and helping customers build responsibly on a cloud platform designed for trust. From a healthcare organization striving for fairness, to a financial institution overseeing AI risk, to a government agency advancing ethical AI practices, Microsoft's certifications enable the adoption of AI at scale while aligning compliance with evolving global standards for security, privacy, and responsible AI governance.
Microsoft's foundation in security and data privacy, together with our investments in operational resilience and responsible AI, shows our commitment to earning and keeping trust at every layer. Azure is engineered for trust, powering innovation on a secure, resilient, and transparent foundation that gives customers the confidence to scale AI responsibly, navigate evolving compliance needs, and stay in control of their data and operations.
Learn more with Microsoft
As AI regulations and expectations continue to evolve, Microsoft remains focused on delivering a trusted platform for AI innovation, built with resiliency, security, and transparency at its core. ISO/IEC 42001:2023 certification is an important step on that path, and Microsoft will continue investing in exceeding global standards and driving responsible innovation to help customers stay ahead: securely, ethically, and at scale.
Explore how we put trust at the core of cloud innovation with our approach to security, privacy, and compliance at the Microsoft Trust Center. View this certification and report, as well as other compliance documents, on the Microsoft Service Trust Portal.
The ISO/IEC 42001:2023 certification for Azure AI Foundry: Azure AI Foundry Models and Microsoft Security Copilot was issued by Mastermind, a certification body accredited by the International Accreditation Service (IAS).