We hosted a roundtable on the future of operational risk quantification, a great opportunity to discuss how the industry can move beyond RCSA, address the challenges CROs face, and explore how exposure-based models can reshape decision-making. A warm thank you to Olivier Vigneron (CRO, Barclays Bank Plc), Efe Cummings (Global Head of Operational Risk, Nomura), and Luke Carrivick (Executive Director, ORX) for sharing their insights and contributing to a rich discussion.
Risk Appetite - where strategy meets OpRisk
Banks have a limited appetite for stress losses, and part of the capital held against them is allocated to OpRisk.
Knowing whether that allocation is in the right ballpark links risk management to risk measurement, and guides where to invest in controls.
When zero appetite blocks quantification
We often read statements such as “We have no appetite for money laundering.”
Declaring zero appetite either contradicts continuing the business or amounts to giving up on measurement, because in many activities (mis-selling of complex products, money-laundering exposure in correspondent banking, errors in trading workflows) residual exposure is simply unavoidable.
The strategic choice becomes: invest in controls, or exit the business.
RCSA - necessary, but not designed for quantification
RCSAs are essential for understanding controls, ownership, and vulnerabilities across the organisation.
But they reflect subjective opinions, sometimes influenced by budget incentives, rather than actual exposure.
For risks that are frequent but usually minor, with rare severe events (such as cyber), single-point frequency/severity scoring cannot work: the frequent events are small and the severe events are rare, so scoring the risk as both high frequency and high severity overstates it.
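A minimal Monte Carlo sketch in Python makes the point; all parameters are invented for illustration, not taken from the discussion. A compound frequency/severity simulation separates the typical year from the tail year, whereas a single "high frequency x high severity" point multiplies two extremes that rarely coincide:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative, made-up parameters for a cyber-like risk profile:
# many small events per year, with a heavy-tailed severity.
LAMBDA = 50           # expected number of events per year
MU, SIGMA = 8.0, 2.0  # lognormal severity parameters

def simulate_annual_losses(n_years=50_000):
    """Compound Poisson / lognormal simulation of total annual loss."""
    counts = rng.poisson(LAMBDA, size=n_years)
    return np.array([rng.lognormal(MU, SIGMA, size=n).sum() for n in counts])

annual = simulate_annual_losses()
print(f"median annual loss : {np.median(annual):>15,.0f}")
print(f"99.9% annual loss  : {np.quantile(annual, 0.999):>15,.0f}")

# Single-point score: treat the risk as 'high frequency' AND 'high severity'
# by pairing the event rate with a near-worst-case single loss.
severe_loss = np.quantile(rng.lognormal(MU, SIGMA, size=1_000_000), 0.9999)
print(f"single-point score : {LAMBDA * severe_loss:>15,.0f}")
```

Under these assumed parameters the single-point score lands an order of magnitude above even the 1-in-1,000-year simulated loss, which is the distortion the scoring grid builds in.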
Loss generation mechanisms - the foundation of structured risk assessment
Losses emerge from how mechanisms interact, not from abstract risk categories:
• For payments: exposure comes from the distribution of payment sizes. Simulation shows how specific (and sometimes costly) controls, such as stopping STP or requiring MD approval above a threshold, reduce risk (see the sketch below).
• For rogue trading: controls can be bypassed, but margin calls reveal anomalous behaviour regardless of the trader’s technique.
This requires mechanism-based analysis, not “tick-the-box” workshops.
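To make the payments example concrete, here is a hedged Python sketch. The payment-size distribution, error rates, recovery assumption, and the 250k approval threshold are all hypothetical; the point is only to show how exposure follows the size distribution and how a threshold control reshapes it:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical payment book: heavy-tailed distribution of payment sizes.
N_PAYMENTS = 200_000
payment_sizes = rng.lognormal(mean=9.0, sigma=1.8, size=N_PAYMENTS)

BASE_ERROR_RATE = 2e-4  # assumed chance a payment is misdirected or duplicated
NON_RECOVERY = 0.3      # assumed share of an erroneous payment never recovered

def one_year_loss(approval_threshold=None, reviewed_error_rate=2e-5):
    """Simulate one year of losses from payment errors.

    If approval_threshold is set, payments above it leave the STP flow and
    require MD approval, which (by assumption) lowers their error rate."""
    error_rate = np.full(N_PAYMENTS, BASE_ERROR_RATE)
    if approval_threshold is not None:
        error_rate[payment_sizes > approval_threshold] = reviewed_error_rate
    errors = rng.random(N_PAYMENTS) < error_rate
    return (payment_sizes[errors] * NON_RECOVERY).sum()

no_control = np.array([one_year_loss() for _ in range(500)])
with_control = np.array([one_year_loss(approval_threshold=250_000) for _ in range(500)])

for label, losses in [("no threshold", no_control), ("MD approval > 250k", with_control)]:
    print(f"{label:>20}: mean {losses.mean():,.0f}  99% {np.quantile(losses, 0.99):,.0f}")
```

Moving the threshold trades control cost (more manual approvals) against tail reduction, which is exactly the decision this kind of simulation is meant to inform.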
Combining the two - granular for controls, holistic for risk
Assessing cyber risk at the workstation level or natural-disaster risk at the desk level leads to conceptual dead ends.
In practice you need both views:
• Bottom-up controls to identify, for instance, obsolete applications with higher defect rates,
• Top-down risk to estimate how reducing obsolescence (e.g., from 20% to 10%) changes the firm’s tail-loss profile (sketched below).
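As an illustration of that top-down link, a short Python sketch; the incident rates, severity parameters, and the 4x multiplier for obsolete applications are assumptions chosen for illustration, not calibrated figures:

```python
import numpy as np

rng = np.random.default_rng(1)

N_APPS = 400      # applications in the estate (hypothetical)
N_YEARS = 50_000  # simulated years

def tail_loss(obsolete_share, q=0.999):
    """Tail quantile (default 99.9%) of annual loss, driven by obsolescence.

    Assumed drivers: obsolete applications generate incidents at 4x the
    rate of maintained ones; incident severity is heavy-tailed lognormal."""
    n_obsolete = int(N_APPS * obsolete_share)
    lam = n_obsolete * 0.20 + (N_APPS - n_obsolete) * 0.05  # incidents / year
    counts = rng.poisson(lam, size=N_YEARS)
    losses = np.array([rng.lognormal(10.0, 2.2, size=n).sum() for n in counts])
    return np.quantile(losses, q)

print(f"99.9% annual loss at 20% obsolescence: {tail_loss(0.20):,.0f}")
print(f"99.9% annual loss at 10% obsolescence: {tail_loss(0.10):,.0f}")
```

The bottom-up view supplies the driver (how many applications are obsolete); the top-down view turns a change in that driver into a change in a tail-loss metric the board can act on.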
Building trust in structured risk assessment
Structured risk assessments can function as real models, not narratives.
When mechanisms and drivers are explicit and measurable, they become:
• Defensible
• Challengeable
• Aligned with model-validation expectations
Leading institutions are already moving in this direction, toward more coherent, transparent, and credible OpRisk estimation.

