
Introduction

A board of a financial services company in Asia faces many challenges, such as the typical concerns about the economy, staff and the allocation of resources. Additionally, technology-related issues such as cybersecurity, privacy and AI are high on the agenda [1]. Globally, most boards struggle to fully understand how they should utilise AI and govern it [2]. Moving too fast without clarity may increase both traditional risks and the new ones Gen AI introduces, while moving too slowly may render the organisation obsolete. Leadership is needed during this transition, particularly from those who can deeply influence strategic decisions. It is important to get those strategic decisions right and also to lead the financial organisation through a digital transformation.

Additional Risks for Boards

There may, however, be additional liabilities for boards if decisions based on AI go wrong, which adds further complexity to risk management. Shareholders can claim that a board knew, or should have known, the dangers involved in a decision. Board members may have to consider whether their insurance covers such potential claims. These changes to the board’s role, and the uncertainty they may bring, must not prevent boards from instilling trust in the process of digital transformation with AI.

Trust Risk

As the role of AI increases, in most cases the role of people decreases and the dependence on AI grows. This can bring a feeling of losing control and a potential increase in risk, or at least in perceived risk. In the past the risk was typically about a process failing, for example a money transfer not being completed successfully, but with AI the risks are broader. With AI, the risk goes beyond whether a process is completed to why the system has done something. For example, an insurer using traditional automation without AI would understand why a customer was rejected, but this is not always clear with AI. AI can therefore be a legal and regulatory risk, as well as a risk to reputation, finances and ethical behaviour.

There are many tensions in AI adoption that the board needs to contend with. Moving quickly may give a competitive advantage but may reduce trust, because there is less time to build it and because a rushed implementation may cause problems. Using AI more broadly may increase the reward but also the risk.

Given the complex and delicate balance an organisation must strike when using AI, this is a process in which a wise and experienced board can, and must, take an active leadership role. What is gained by AI is important, but what is lost by automating and replacing people may not be reversible: skills, knowledge and wisdom may be lost for good.

How to Build Trust in AI

Our research interviewed 21 Malaysian Fintech board members to identify the most effective ways for a board to build trust with shareholders, staff and customers [3].

Tables 1, 2 and 3 illustrate how a board should build trust in AI. The findings show a significant overlap between building trust and good overall implementation and governance of AI. Several of the issues identified, however, are specifically about communicating how AI is used in order to build trust. The findings also show that some ways of applying Gen AI are more conducive to trust, even if they are more restrained and limited and some of Gen AI’s performance is sacrificed. There are, therefore, trade-offs between unleashing Gen AI in all its capacity and a more constrained, transparent and predictable application that builds trust among customers, staff and shareholders. This balancing act, between a fast adoption of Gen AI and a more cautious adoption of a more controlled AI, is at the heart of the challenge the board faces.

Table 1. How leaders can build trust in AI with shareholders

Implementation

  1. Use AI in a way that does not increase financial or other risks.
  2. Build in-house expertise; do not rely on a single consultant or technology provider.
  3. Create a new committee focused on the governance of AI and data. Accurately evaluate new risks (compliance etc.).
  4. Develop an AI risk framework that the board will use to evaluate and communicate risks from AI implementations. Management should regularly update the framework.
  5. Renew and diversify the board composition, bring in more technical knowledge and have sufficient competence in AI. Keep up with technological developments. Ensure all board members understand how Gen AI and traditional AI work.
  6. Make the right strategic decisions and collaborations to secure the necessary technology and data (e.g., through APIs).

Communication

  1. Communicate a clear vision of AI use. Demonstrate sound business judgement. Showcase the organisation’s AI talent.
  2. Clear boundaries on what AI does and does not do. Show willingness to enforce these.
  3. Demonstrate an ability to follow developments: show similar cases of AI use from competitors or from companies in other sectors.
  4. If trust was placed in specific leaders whose influence is reduced by AI, the trust lost must be rebuilt.
  5. Be transparent about AI risks so shareholders can also evaluate them as accurately as possible.


Table 2. How leaders can build trust in AI with staff

Implementation

  1. Show long-term financial commitment to AI initiatives.
  2. Encourage a mindset of experimentation, but with an awareness of risks relating to privacy, data protection laws and ethical behaviour.
  3. Involve staff in the digital transformation process. Share new progress and new insights gained to illuminate the way forward.
  4. Create an AI ethics committee with staff of various levels of seniority.
  5. Give existing staff the necessary skills to utilise Gen AI effectively, rather than hiring new people with technological knowledge who do not know the business. Educate staff on when not to follow AI findings and how to challenge them.
  6. Key performance indicators (KPIs) need to be adjusted. While some tasks become easier, the process of digital transformation is time-consuming.

Communication

  1. Communicate a clear, coherent, long-term vision with a clear role for staff. The steps towards that vision should reflect the technological changes, business model changes and changes in staff roles.
  2. Be open and supportive towards staff who report problems, so that issues are raised internally rather than through whistleblowing.


Table 3. How leaders can build trust in AI with customers

Implementation

  1. Avoid using unsupervised Gen AI to complete tasks on its own.
  2. Only use AI to complete tasks on its own where the processes are clear and transparent and the outcomes predictable.
  3. Have clear guidelines on how staff can utilise Gen AI, covering what manual checks they should make.
  4. Monitor competition and don’t fall behind in how trust in AI is built.

Communication

  1. Explain where Gen AI and other AI are used and how.
  2. Emphasise the values and ethics of the organisation and how they still apply when Gen AI or other AI is used.


The directors interviewed indicated that boards need more technical knowledge of the technologies involved. This finding supports previous research on boards in China [4] and suggests it also applies to Malaysia. The message that comes out strongly is that board members, with their experience of making strategic decisions with long-term implications, remain critical, but new knowledge needs to be gained.

Building Evidence-Based Trust and Supporting Healthy Distrust in AI 

Blind trust in AI, and the assumption that it works like other technologies that operate in a predictable way, such as the statistical analysis of investments provided by a financial investment app, is also unhealthy. Leaders, and boards in particular, must therefore build trust by providing a suitable strategy and an effective implementation, while avoiding crushing a healthy level of distrust that is based on an understanding of AI’s limitations. A blind distrust must not be replaced by a blind trust. A level of trust aligned as closely as possible with the abilities and limitations of the chosen AI is ideal, as it will be more stable and sustainable. This delicate balance depends on good judgement, which is where a board, with its experience, often excels. This is one more occasion on which boards must rise to the challenge.

References

  1. EY (2024), Asia-Pacific Board Priorities: Balancing Growth with Economic Volatility.
  2. van Giffen, B. and Ludwig, H. (2023), “How Boards of Directors Govern Artificial Intelligence”, MIS Quarterly Executive, Vol. 22 No. 4, pp. 251–272, doi: 10.17705/2msqe.00085.
  3. Zarifis, A. and Yarovaya, L. (2025), “How can leadership in organizations in finance build trust in AI: The case of board of directors in Fintech in Malaysia”, in Zarifis, A. and Cheng, X. (Eds), Fintech and the Emerging Ecosystems around Centralised and Decentralised Financial Technologies, Springer Nature, Cham. (Forthcoming)
  4. Li, J., Li, M., Wang, X. and Thatcher, J.B. (2021), “Strategic directions for AI: The role of CIOs and boards of directors”, MIS Quarterly, Vol. 45 No. 3, pp. 1603–1643, doi: 10.25300/MISQ/2021/16523.

This article was written by Dr Alex Zarifis and Dr Larisa Yarovaya from the University of Southampton, UK.

Photo by Andy Kelly on Unsplash.
