Confidential AI Governance Scorecard

Irish Credit Unions and Building Societies

Irish credit unions and building societies face unique AI governance challenges. With regulatory deadlines fast approaching under the EU AI Act, and heightened Central Bank of Ireland expectations for individual accountability under the IAF and SEAR, demonstrating robust governance over AI-driven platforms is a critical priority.

This interactive scorecard provides a confidential, high-level assessment of your organisation's position. Answer the 25 questions below to generate your preliminary score.

Why this matters now (January 2026): The Central Bank of Ireland has the power to take direct enforcement action against individual PCF holders under the IAF and SEAR frameworks. With the EU AI Act's high-risk obligations applying from August 2026 and CPC 2025 requirements taking effect in March 2026, demonstrating governance over AI systems is no longer optional. DORA operational resilience requirements will apply to credit unions from January 2028.

Section 1: AI System Identification and Classification

Q1. Has your credit union or building society conducted a formal inventory of all AI systems currently in use across operations (including member services, lending, fraud detection, and back-office functions)?
Q2. Have you classified each AI system according to the EU AI Act risk categories (Unacceptable/Prohibited, High Risk, Limited Risk, Minimal Risk)?
Q3. ICT System Classification (DORA Preparation): Have you identified which AI platforms would constitute critical or important ICT services under DORA's operational resilience requirements (effective January 2028 for credit unions)?

Section 2: IAF and SEAR Accountability Framework

Q4. Have you assigned clear AI governance responsibilities within your Statements of Responsibilities (SoR) for relevant PCF holders?
Q5. Has your board formally approved an AI governance framework that addresses EU AI Act and Central Bank requirements?
Q6. Do you have documented evidence that PCF holders understand their personal accountability for AI governance failures?
Q7. Have you established regular board-level reporting on AI system performance, risks, and compliance?

Section 3: Staff AI Literacy and Competence

Q8. Have board members received training on AI governance, EU AI Act requirements, and their oversight responsibilities?
Q9. Have PCF holders and senior management received AI literacy training appropriate to their roles?
Q10. Have operational staff who interact with AI systems received training on their use, limitations, and when to escalate concerns?

Section 4: Risk Assessment and Human Oversight

Q11. Have you conducted formal risk assessments for each high-risk AI system under the EU AI Act framework?
Q12. Have you established human oversight protocols for AI-driven decisions affecting members (lending, credit scoring, fraud detection)?
Q13. Do you have processes to detect and mitigate bias in AI systems, particularly regarding protected characteristics under Irish equality legislation?
Q14. Have you established monitoring procedures to track AI system performance, accuracy, and compliance on an ongoing basis?

Section 5: Third-Party and Vendor Management

Q15. Have you reviewed all AI vendor contracts to ensure they address EU AI Act compliance obligations?
Q16. Do you have documented evidence of AI vendor due diligence, including their governance frameworks and compliance status?
Q17. Have you established a Register of Information for ICT third-party arrangements (DORA preparation for January 2028)?
Q18. Do you have processes to assess and monitor vendor AI system changes, updates, and performance?

Section 6: Documentation, Transparency, and Member Rights

Q19. Have you developed technical documentation for each AI system covering purpose, functionality, data sources, and decision logic?
Q20. Have you updated member disclosures and privacy notices to explain AI use in decision-making?
Q21. Have you established processes for members to request explanations of AI-driven decisions affecting them?
Q22. Do you have procedures for members to challenge or appeal AI-driven decisions?
Q23. Have you documented your AI governance framework in a format accessible to regulators during inspections?
Q24. Do you maintain audit logs of AI system decisions for regulatory review and member queries?
Q25. Have you established incident response procedures for AI system failures, bias discoveries, or regulatory breaches?

Your AI Governance Scorecard Results

Score Breakdown by Section

Section 1: AI System Identification and Classification
Section 2: IAF and SEAR Accountability Framework
Section 3: Staff AI Literacy and Competence
Section 4: Risk Assessment and Human Oversight
Section 5: Third-Party and Vendor Management
Section 6: Documentation, Transparency, and Member Rights
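For teams building or auditing a scorecard like this internally, the section breakdown above can be reproduced with a short script. The page does not specify its scoring formula, so the following is a minimal sketch under stated assumptions: each "yes" answer scores one point, a section's score is the percentage of "yes" answers among its questions, and the overall score is the percentage of "yes" answers across all 25 questions. The question-to-section mapping follows the groupings listed above.

```python
# Sketch of a possible scoring model for the 25-question scorecard.
# ASSUMPTIONS (not specified on this page): one point per "yes" answer;
# section score = percentage of "yes" answers within that section.

# Question ranges per section, taken from the groupings above.
SECTIONS = {
    "Section 1: AI System Identification and Classification": range(1, 4),       # Q1-Q3
    "Section 2: IAF and SEAR Accountability Framework": range(4, 8),             # Q4-Q7
    "Section 3: Staff AI Literacy and Competence": range(8, 11),                 # Q8-Q10
    "Section 4: Risk Assessment and Human Oversight": range(11, 15),             # Q11-Q14
    "Section 5: Third-Party and Vendor Management": range(15, 19),               # Q15-Q18
    "Section 6: Documentation, Transparency, and Member Rights": range(19, 26),  # Q19-Q25
}

def score_breakdown(answers: dict[int, bool]) -> dict[str, float]:
    """Return each section's score as the percentage of 'yes' answers."""
    breakdown = {}
    for section, questions in SECTIONS.items():
        yes = sum(1 for q in questions if answers.get(q, False))
        breakdown[section] = round(100 * yes / len(questions), 1)
    return breakdown

def overall_score(answers: dict[int, bool]) -> float:
    """Overall preliminary score across all 25 questions."""
    yes = sum(1 for q in range(1, 26) if answers.get(q, False))
    return round(100 * yes / 25, 1)
```

For example, answering "yes" to Q1 through Q20 and "no" to the rest gives an overall score of 80.0, with Section 1 at 100.0 and Section 6 at 28.6. A real implementation might weight high-risk sections more heavily; that choice is not indicated here.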

Next Steps

For a confidential discussion about your results and how to address identified gaps, book a call or contact:

Lynda Connikie
CLARENDON
Email: lynda.connikie@clarendon.ie
Telephone: +44 (0)203 576 1834

Book a Call