Teaming up with... AVIVA

Welcome to the UKGI weekly regulation update service for Aviva ABC brokers

We hope you find the Updates useful. If you are interested in subscribing to our affordable ABC compliance support package, please email us at ABC@ukgigroup.com or call UKGI on our dedicated ABC contact line, 01925 767893.

CII says institutions must be held accountable for AI outcomes

Link(s): CII says institutions must be held accountable for AI outcomes

Context

In recommendations submitted to the Treasury Select Committee (TSC), the Chartered Insurance Institute (CII) says institutions and individuals must be held accountable for decisions made using AI, and advocates a ‘skills strategy’ to support the use of AI within financial services. The CII states that professionals should “always be prepared to take accountability for the outcomes created by AI, either through design or monitoring”, and that “all professionals should be educated on the potential harm that can come from mis-managing AI”.

Key points to note and next actions

  • In its submission to the TSC, the CII recommends that institutions should be held responsible for decisions made by the algorithms that they use.
  • This accountability should be backed up with validation and testing, especially for discriminatory harm, and institutions should make the results of these tests public.
  • The CII says that a ‘proportionate regulatory approach’ to the use of AI in financial services would include implementing a sector-wide skills strategy, under which all employees of firms receive education on both the potential and the risks of AI, in order to strike the right balance between optimising its use and protecting consumers.
  • The submission draws on CII consumer research carried out over several years as part of its Public Trust Index, and highlights the potential for AI to support key areas that consumers and SMEs are seen to value in insurance, including ‘cost’, ‘protection’, ‘ease of use’, and ‘confidence’. 
  • In advocating for a focus on governance of AI within firms, the CII points to its publications, the Digital Companion to the Code of Ethics and Addressing Gender Bias in Artificial Intelligence, which set out practical steps that individuals and firms can take to use AI in a responsible way.