FIN.

BoE looks at AI and financial stability

Jonathan Hall of the Bank of England's Financial Policy Committee (FPC) has spoken on whether the increasing use of AI models could have a negative impact on financial stability. He sees two key risks:

  • that deep trading agents could lead to a brittle and highly correlated financial market; and
  • that the incentives of deep trading agents could become misaligned with those of regulators and the public good.

Deep trading agents stem from deep learning, a form of machine learning in which neural networks are trained on vast amounts of data. An artificial neural network processes inputs and ultimately produces outputs, which can be either information or actions; for a deep trading algorithm, the output could be an electronically generated, tradeable order. Failure or misspecification of the model can be a significant problem. He considered how machines can, or will increasingly be able to, replicate human analysis, but may also make mistakes or amplify shocks.
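
By way of illustration only, the following minimal Python sketch shows the input-to-output flow described above: a small feed-forward network takes hypothetical market features and emits a trade decision. The architecture, feature names and untrained random weights are assumptions made for this sketch, not a description of any actual trading system.

import numpy as np

# Minimal feed-forward network: market features in, trade decision out.
# In a real deep trading agent the weights would be learned from vast
# amounts of historical data; here they are random, purely to illustrate
# how inputs are processed into an output.
rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

class TinyTradingNet:
    def __init__(self, n_features=4, n_hidden=8):
        self.w1 = rng.normal(size=(n_features, n_hidden))
        self.w2 = rng.normal(size=(n_hidden, 3))  # scores for buy / hold / sell

    def forward(self, features):
        hidden = relu(features @ self.w1)
        return hidden @ self.w2

    def order(self, features):
        # The network's output becomes an electronically generated,
        # tradeable order: the action with the highest score.
        actions = ["BUY", "HOLD", "SELL"]
        return actions[int(np.argmax(self.forward(features)))]

# Hypothetical example inputs: price change, volume change, spread, volatility.
net = TinyTradingNet()
print(net.order(np.array([0.02, -0.10, 0.001, 0.15])))

In practice the weights would be learned from large volumes of market data, which is where the risks of misspecification and unexpected behaviour described in the speech arise.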

He concluded that:

  • any deep trading algorithms will need to be trained, tested and constrained by careful monitoring;
  • the algorithms will need to be trained to ensure their behaviour complies with regulations; and
  • a variety of stress-testing techniques will be needed to understand the algorithms’ reaction function and to ensure that function does not change because of forgetting or opponent shaping (a simple illustration of probing a reaction function follows this list).
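
As a purely hypothetical illustration of that last point, the sketch below reuses the TinyTradingNet from the earlier sketch: it records the model's orders across a handful of stressed market scenarios and flags any scenario where the reaction later differs from the approved baseline. The scenarios and the simple comparison are assumptions for illustration, not a prescribed testing methodology.

import numpy as np

def reaction_function(model, scenarios):
    # Record the model's order for each stressed market scenario.
    return [model.order(s) for s in scenarios]

# Hypothetical stress scenarios using the same four features as above
# (price change, volume change, spread, volatility).
scenarios = [
    np.array([0.00, 0.00, 0.001, 0.10]),    # calm market
    np.array([-0.10, 0.50, 0.010, 0.60]),   # sharp sell-off, high volatility
    np.array([0.05, -0.30, 0.005, 0.30]),   # rally on thin volume
]

net = TinyTradingNet()  # defined in the earlier sketch
baseline = reaction_function(net, scenarios)

# ... later, after retraining or further learning in live markets ...
retrained = reaction_function(net, scenarios)

# Flag any scenario where the reaction has drifted from the baseline,
# which could indicate forgetting or adaptation to other agents' behaviour.
drift = [i for i, (b, r) in enumerate(zip(baseline, retrained)) if b != r]
print("Scenarios with changed reactions:", drift)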

Emma Radmore