What’s Covered?
The Professionalizing Organizational AI Governance Report dives into how companies are responding to the rapid rise of AI adoption by treating governance not as an afterthought, but as a core organizational need. Based on a global survey of over 500 professionals across 50+ countries, the report examines six key areas:
1. Use of AI in Organizations
AI is becoming mainstream, used not just in isolated innovation units but across departments. However, 56% of respondents say their organization doesn't fully understand the risks and benefits of deploying AI.
2. AI Governance as a Strategic Priority
In 2022, AI governance ranked ninth in strategic priorities for privacy functions. By 2023, it shot up to second place, with 57% of privacy functions taking on AI-related tasks. That shift is driven by the growing realization that AI impacts data, ethics, compliance, and trust.
3. The AI Governance Function
Nearly 60% of organizations have already established or plan to establish a dedicated AI governance function. These functions often emerge out of privacy teams, using familiar processes like privacy impact assessments (PIAs) but extending them to tackle unique AI risks—opacity, bias, autonomy, and unpredictable outcomes.
4. AI-Enabled Compliance Benefits
Organizations with formal AI governance programs report benefits such as better coordination across departments, improved alignment with upcoming regulations, and increased trust from users and clients.
5. Implementation Challenges
Despite the momentum, most organizations face two major roadblocks:
- Lack of professional training or certifications for AI governance roles (33%)
- Shortage of qualified professionals (31%)

These gaps echo the early years of privacy compliance and suggest a similar trajectory of specialization and certification ahead.
6. Looking Forward
The report sees AI governance following the path of privacy: evolving from an add-on to a profession. This includes role-specific training, structured assessment frameworks, and integration into enterprise risk and compliance management.
💡 Why It Matters
AI governance has moved beyond theory: it is now part of strategic planning in most major organizations. But governance isn't just a technical challenge; it's a staffing and expertise problem. This report makes the case for building AI governance into your org chart now, not later.
What’s Missing?
This is a strong industry snapshot, but a few areas get less attention than they deserve:
- Public sector readiness is not discussed, even though many AI governance issues (like accountability and transparency) are especially sensitive in public administration.
- There’s limited detail on specific governance models, such as centralized vs. federated oversight or multi-stakeholder advisory panels.
- The report doesn’t dig into non-Western or SME perspectives, which could reveal unique regulatory or resourcing challenges.
- There’s little coverage of external transparency—how organizations communicate AI risks to customers, regulators, or the public.
Best For:
Privacy officers, general counsels, data governance leads, and AI policy professionals trying to operationalize responsible AI within a corporate setting. Also helpful for HR and L&D teams developing internal training on AI ethics and compliance.
Source Details:
Title: Professionalizing Organizational AI Governance
Authors:
- Joe Jones, Director of Research and Insights, IAPP
- Angela Saverice-Rohan, Global Privacy Leader, EY
Organizations:
- International Association of Privacy Professionals (IAPP) – the world’s largest privacy professional body.
- EY (Ernst & Young) – one of the Big Four professional services firms, with a growing AI governance and compliance practice.
Published: November 2023 (as a supplement to the IAPP-EY Privacy Governance Report 2023)
Data Basis: 500+ respondents from 50+ countries, surveyed May–July 2023
Key Terms: AI governance, privacy assessments, internal controls, risk mitigation, professionalization