PRMIA Intelligent Risk - October, 2018


INTELLIGENT RISK knowledge for the PRMIA community

October 2018 ©2018 - All Rights Reserved Professional Risk Managers’ International Association



editors

Steve Lindo - Principal, SRL Advisory Services and Lecturer at Columbia University
Dr. David Veen - Director, School of Business, Hallmark University
Nagaraja Kumar Deevi - Managing Partner | Senior Advisor, DEEVI Advisory Services | Research Studies, Finance | Risk | Regulations | Analytics

contents

Editor's introduction
The continuing journey towards bank capital management resilience by Sharon Hufnagel
PRMIA member profile - Terri Duhon
The impact of the Current Expected Credit Loss (CECL) framework for the provisioning of credit losses on financial institutions by Michael Jacobs Jr.
Women in risk spotlight - with Donna Howe
PII data breach: who should own the incident response plan? by Nagaraja Kumar Deevi & Thomas Lee
Dangerous adaptation - by David M. Rowe
A perspective on machine learning in credit risk by Danny Haydon & Moody Hadi
An inverted yield curve induces equity market volatility by John Galakis & Jean Paul van Straalen
Managing rapidly changing cyber risks - by Paul Sand
Definition - conflict of interest - by Rory Flynn
Do interest rate markets face a hard transition from LIBOR / Euribor to RFR-based benchmarks? by Christian Behm, Peter Woeste Christensen & Andreas Hock
The FRTB: concepts, implications and implementation by Sanjay Sharma & John Beckwith
PRMIA and RIM - Together, we shape the landscape of risk management
Bringing stakeholders together to discuss financial sector regulation
Canadian risk forum 2018
Calendar of events

SPECIAL THANKS

Thanks to our sponsors, the exclusive content of Intelligent Risk is freely distributed worldwide. If you would like more information about sponsorship opportunities contact




Intelligent Risk - October 2018

editor introduction

Steve Lindo Editor, PRMIA

Dr. David Veen Editor, PRMIA

Nagaraja Kumar Deevi Editor, PRMIA

Our chosen theme for this edition of Intelligent Risk, "The Changing Landscape of Risk," presented a broad canvas for PRMIA members to fill. The contributions we received comprise an interesting mix of topics that fall into three main categories: markets, technology and financial regulation.

In the markets category, our authors provide insights into the risks associated with the transition to new benchmark interest rates and the inter-dependence of interest rate and equity markets. In the technology category, we received thoughtful articles on the risks and responses posed by fast-moving data and systems vulnerability. In the regulatory category, our authors shared their views on changing bank capital requirements and loan loss forecasting methods, in the latter case specifically relating to the adoption of CECL. Lastly, in an era where the spotlight is increasingly being shone on business conduct, we also selected an article on the topic of conflict of interest.

Change in these and other domains is getting faster and more unpredictable. For risk managers, this means often being caught in a race to stabilize one set of methods and practices before change makes them obsolete. We hope that this issue of Intelligent Risk provides PRMIA risk managers with some ideas and guidance on how to tackle these challenges.



the continuing journey towards bank capital management resilience

by Sharon Hufnagel

As US bank holding companies hastened to close out Q2-2018 by wrapping up their DFAST annual report submissions, they were also anxiously awaiting formal notice from the Federal Reserve and other Agencies about their regulatory reporting going forward, as some of them might no longer have to be concerned about supervisory stress testing. On May 24, 2018, President Trump signed into law S.2155, the Economic Growth, Regulatory Relief, and Consumer Protection Act (EGRRCPA), which amended provisions in the Dodd-Frank Wall Street Reform and Consumer Protection Act (Dodd-Frank Act) as well as other statutes administered by the Board of Governors of the Federal Reserve System (Board), the Federal Deposit Insurance Corporation (FDIC), and the Office of the Comptroller of the Currency (OCC).

Dodd-Frank act modification and financial regulatory relief

This new, bipartisan law represents the culmination of a multi-year effort aimed at creating regulation that is tiered to the size, risk, complexity and business model of individual financial institutions. Major areas addressed by this law include capital management and stress testing standards. With the enhanced prudential standards threshold raised from $50 billion to $250 billion for bank holding companies: 1) financial companies with total consolidated assets of less than $250 billion that are not bank holding companies (BHCs) will, eighteen months after EGRRCPA's enactment, no longer be subject to the company-run stress testing requirements in section 165(i)(2) of the Dodd-Frank Act; 2) BHCs with total consolidated assets between $50 billion and $100 billion are exempt from enhanced prudential standards immediately; and 3) BHCs with total consolidated assets between $100 billion and $250 billion will become exempt in November 2019.
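The asset-size tiering above can be sketched as a simple classification. The thresholds come from S.2155 as described in this article; the function name and tier labels are our own illustrative shorthand, not regulatory terms.

```python
# Hypothetical sketch of EGRRCPA tiering by total consolidated assets.
# Thresholds ($10bn, $100bn, $250bn) follow the article; labels are ours.

def egrrcpa_tier(total_assets_bn: float) -> str:
    """Classify a banking organization by total consolidated assets ($bn)."""
    if total_assets_bn < 10:
        return "community bank: lower administrative burdens and costs"
    if total_assets_bn < 100:
        return "exempt from enhanced prudential standards (immediate)"
    if total_assets_bn < 250:
        return "exempt from enhanced prudential standards (November 2019)"
    return "enhanced prudential standards continue to apply"

print(egrrcpa_tier(60))    # a $60bn BHC
print(egrrcpa_tier(300))   # a $300bn BHC
```

A real determination depends on more than asset size (e.g., foreign banking organizations and risk-based indicators), so this is only the headline logic.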

anticipated changes across the banking community

The passing of this law generated different reactions across the banking industry concerning the EGRRCPA's impact. Community banks (banking organizations with less than $10 billion in assets) will benefit the most, bearing lower administrative burdens and costs and becoming more effective competitors; mid-sized banks, many of which for years made every attempt to avoid crossing the $10 billion and $50 billion thresholds, can now breathe a sigh of relief at no longer being subject to enhanced supervision; whereas for large banking organizations with $250 billion or more in consolidated assets, there is little relief in the EGRRCPA. See Appendix 1 for details of S.2155's impacts.

building capital robustness - equity capital management

The coming into law of S.2155 represents the biggest legislative change since the financial crisis. Although signed by a Republican president, it has come a long way and is by no means a casual amendment. Since the 2008 financial crisis, global regulators have implemented much tougher capital standards for banks to better prepare them for another economic crisis. An important component in building banking industry resilience is the battle to increase individual firms' equity capital and to limit the industry's reliance on external funding.

The Fed has relied on two main approaches to ensure banks' equity capital adequacy: the establishment of minimum capital thresholds, and annual stress tests to see whether those thresholds would be breached in conditions of crisis. Banking officials, on the other hand, often argue that banks benefit from magnified returns in good economic climates by leveraging (taking on debt rather than boosting equity). Since excessive regulation can curtail lending, eventually holding back job creation and stifling economic growth, the Fed has indicated as a concession to banks that it is prepared to ease some aspects of its stress tests by changing assumptions it has made about how firms would behave in another crisis.

In 2017, under a Presidential Executive Order on Core Principles for Regulating the United States Financial System, the Treasury Department released a report assessing the depository system (banks and credit unions), which served as a foundation for the current changes.
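The Fed's two-pronged approach (a minimum capital threshold plus a stress test asking whether it would be breached in a crisis) can be sketched in a few lines. All inputs are illustrative; only the 4.5% Basel III CET1 minimum is a real regulatory figure.

```python
# Minimal sketch of a supervisory-style capital check: stressed losses
# reduce capital while risk-weighted assets (RWAs) grow in a downturn.
# All dollar figures below are made-up illustrations.

def stressed_cet1_ratio(cet1: float, rwa: float,
                        stress_loss: float, rwa_inflation: float) -> float:
    """Post-stress CET1 ratio: losses hit capital, RWAs inflate under stress."""
    return (cet1 - stress_loss) / (rwa * (1 + rwa_inflation))

MIN_CET1 = 0.045  # Basel III minimum CET1 ratio

ratio = stressed_cet1_ratio(cet1=12.0, rwa=100.0,
                            stress_loss=4.0, rwa_inflation=0.10)
print(f"stressed CET1 ratio: {ratio:.1%}")          # 7.3%
print("breach" if ratio < MIN_CET1 else "passes")   # passes
```

Actual supervisory stress tests project full balance sheets and income over nine quarters; this only captures the pass/fail intuition.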

forward focus

Despite the easing of the administrative burden on banking institutions in areas of regulatory reporting, the Agencies will continue to supervise and regulate financial institutions within their jurisdictions, and the risk management practices of these institutions will continue to be reviewed through the regular supervisory process. As a result, financial institutions of all sizes need to carry on the important mission of capital management. There will be an increased need for capital management and policies to be aligned with firm-specific strategic goals.

In the area of stress testing, focus will need to be given to the transparency of banks' aggregation models, key assumptions, and management overlay methodology, emphasizing idiosyncratic scenarios and sensitivity analysis around the key assumptions the banks use for their internal stress testing. In addition, quality of data and sophisticated technology infrastructure remain a big challenge for most institutions. The impacts of ancillary regulatory changes also need to be taken into consideration. The journey towards banking industry resilience remains long and hard.



Appendix 1: Final Changes to Enhanced Prudential Standards

Source: Deloitte Reg Pulse Blog

author: Sharon Hufnagel

Sharon Hufnagel is Managing Partner of eLambda LLC, a financial risk management consulting company. Passionate about thought leadership, Sharon is a firm believer that good risk management is well balanced between qualitative and quantitative analyses, business and methodology models, and finance and technology. Sharon's work focuses on market, credit and liquidity stress testing for BHCs and CCPs, as well as systemic risk and its interconnectedness for SIFIs. She has spoken and written on various occasions on topics including market/liquidity risk, FRTB, DFAST and CCAR. She is also an active member of PRMIA, holding a seat on its Global Council of Regional Directors.



PRMIA Sustaining Members Have Complimentary Access to the Journal. BECOME A MEMBER TODAY AT WWW.PRMIA.ORG


PRMIA member profile - Terri Duhon

by Adam Lindquist, Director of Membership, PRMIA

Terri Duhon has seen her share of change in her 24-year career in finance. After college, she was at the forefront of the development of credit derivatives at JP Morgan, where she was an exotic credit derivative trader and structurer. This experience, along with an entrepreneurial spirit, led to a consulting firm where she and her team became critical to helping firms and regulators understand the impact of the financial crisis of 2008. Her team spent countless hours unravelling the derivative products institutions bought, or thought they bought, and reinforced the importance of understanding risk. Today, she is a Non-Executive Board Director and Chair of the Risk Committee for Morgan Stanley International and Rathbone Brothers.


Adam: Thank you for sharing your thoughts in Intelligent Risk. Let's start with an important question. How did you choose a career in risk?

Terri: I'm not entirely sure that I chose a risk career path. When I graduated, there was huge interest in getting very analytical degrees onto Wall Street. So with a math degree, I started as an interest rate swap trader. But I didn't call myself a trader; I was at JP Morgan, where we were called "risk managers." Early on, I remember going out with interbank brokers and meeting other "traders" who would swagger about. Naively, I thought initially that we simply couldn't be doing the same job. Today, with hindsight, I think there's a lot in the name, and my role was to manage risks. I know that isn't what most traders saw as their job 20 years ago. I've eventually ended up firmly in the risk space. And yes, it's a little geeky, but it's super interesting.


Adam: What do you feel was the most significant change that took place in risk in the past? What was the triggering incident, if any, that caused the shift to the importance of measuring and managing risk?

Terri: There were a few big developments, for example VaR and the exponential growth of the derivative markets, which were significant. But I think the role of risk has really developed and matured as a result of the financial crisis, now 10 years ago, which highlighted that risk culture was generally very poor. Before the crisis, risk was often considered middle office and as a result didn't have the authority to challenge front office decision making. So risk departments were often not appropriately engaged in the development of, or the investment in, complex products and risks. That sort of behavior today would raise red flags all over the place. Today risk is a clear oversight function which requires critical thinking and an understanding of financial markets. And sometimes risk is even a trusted partner of the business.




Adam: Where are the opportunities in risk, or how do you see risk changing in the future? What are the areas you feel will most impact the future?

Terri: I think there are many years of interesting change and challenges ahead. For example, there is a real drive right now in the industry to push fixed income to more of an electronically traded space. This is a huge shift and could eventually change how the market works, e.g. from quote to order perhaps, or it may bring in different market participants or create different market dynamics. This will require real thought around how to manage, quantify and control those risks. I also think operational risk has more development to come. How do you quantify operational risks or think about limits? We've all gotten comfortable, and there are numerous market standards that you can use to quantify market and credit risk. Operational risk is less straightforward, so there's no standardization. I think that's a good thing because it's forcing people to think about it very carefully. Another is counterparty credit risk, which is still very non-standard from a modeling perspective, and again I actually think there's real value in having something non-standard. It means you're not relying upon a market standard in the industry. You're actually having to think about it, really challenge what you know, what's out there, what you're using, and what your risks could be, because they're complex. These are just some of the areas in risk that I think are fun and interesting.


Adam: Do you think the CEO role will be a place for risk in the future?

Terri: Actually, what I think might be a more natural next step for a risk person is sitting on boards and acting as a non-executive board member. If I have to describe one of my key roles as a board member, it is simply to challenge the executive. That means I'm actively challenging the executive when they present strategy, when they present results, when they present risks etc. To do this I have to be a critical thinker and I have to be able to effectively and constructively communicate. Those two skillsets, which I would also say are pretty key in a risk team, are very valuable at the board level. So that is definitely a path to keep in mind.

Adam: Thank you for your wonderful insights.

interviewee: Terri Duhon, NED and Chair of the Risk Committee, Morgan Stanley International



the impact of the Current Expected Credit Loss (CECL) framework for the provisioning of credit losses on financial institutions

by Michael Jacobs, Jr.1

In the United States, the Financial Accounting Standards Board ("FASB") issues the set of standards known as Generally Accepted Accounting Principles ("U.S. GAAP"), a common set of guidelines for the accounting and reporting of financial results. In this paper we focus on the guidance governing the Allowance for Loan and Lease Losses ("ALLL"), the financial reserves that firms set aside for possible credit losses on financial instruments. The recent revision to these standards, the current expected credit loss ("CECL"; FASB, 2016) standard, is expected to substantially alter the management, measurement and reporting of loan loss provisions.

The prevailing ALLL standard for the U.S. has been based on the principle of incurred loss, wherein credit losses are recognized only when it is likely that a loss has materialized. This is a calculation as of the financial reporting date, and future events are not to be considered, which impairs the capability of building reserves prior to a period of economic downturn. This deferral implies that provisions are likely to be volatile and subject to the phenomenon of procyclicality: provisions rise and regulatory capital ratios decrease exactly in the periods where we would prefer the opposite.

In Figure 1 we illustrate the procyclicality of credit loss reserves under the incurred loss standard. We plot net charge-off rates ("NCORs"), provisions for loan and lease losses ("PLLL") and the ALLL for all insured depository institutions in the U.S., sourced from the FDIC Call Reports for the period 4Q01 to 4Q17. NCORs began their ascent at the start of the Great Recession in 2007, and PLLLs exhibit a nearly coincident rise, while the ALLL continues to rise well after the economic downturn and peaks in 2010, nearly a year into the economic recovery. This coincided with a deterioration in bank capital ratios, which added to the stress on bank earnings, impairing the ability of institutions to provide sorely needed loans and contributing to the sluggishness of the recovery in the early part of the decade.
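The NCOR series plotted in Figure 1 is a simple ratio. A minimal sketch, using hypothetical aggregates rather than the actual FDIC schedule fields:

```python
# Sketch of a net charge-off rate (NCOR) computation from stylized
# call-report-style aggregates. All numbers are illustrative, not data
# from the FDIC reports underlying Figure 1.

def net_charge_off_rate(charge_offs: float, recoveries: float,
                        avg_loans: float) -> float:
    """NCOR: net charge-offs as a fraction of average loans outstanding."""
    return (charge_offs - recoveries) / avg_loans

# Stylized downturn path: charge-offs spike, then recede in recovery.
observations = [("4Q07", 8, 1, 1000), ("4Q08", 25, 2, 980),
                ("4Q09", 30, 3, 950), ("4Q10", 18, 4, 960)]
for quarter, co, rec, loans in observations:
    print(f"{quarter}: NCOR {net_charge_off_rate(co, rec, loans):.2%}")
```

The article's point is that the ALLL lags this series under incurred loss accounting, so reserves peak after losses do.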

1 / Corresponding author: Michael Jacobs, Jr., Ph.D., CFA, Lead Quantitative Analytics & Modeling Expert, PNC Financial Services Group – APM Model Development, 340 Madison Avenue, New York, N.Y., 10022, 917-324-2098. All views expressed herein are those of the author and do not necessarily represent an official position of PNC Financial Services Group.



Figure 1: Net Charge-off Rates, Provisions as a Percent of Total Assets and the ALLL – All Insured Depository Institutions in the U.S. (Federal Deposit Insurance Corporation Statistics on Depository Institutions Report – Schedule FR Y-9C)

In the remainder of this article we discuss some of the practical challenges facing institutions in implementing CECL frameworks. In Figure 2 we depict the regulatory timeline for the evolution of the CECL standard. In the midst of the financial crisis in 2008, when the problem of procyclicality of loan loss provisions came to the fore, the FASB and the IASB established the Financial Crisis Advisory Group to advise on improvements in financial reporting. This was followed in early 2011 by the accounting bodies' communication of a common solution for impairment reporting. In late 2012, the FASB issued a proposed change to the accounting standards governing credit loss provisioning (FASB, 2012), which was finalized after a period of public comment in mid-2016 (FASB, 2016); meanwhile, the IASB issued its final IFRS 9 accounting standard in mid-2014 (IASB, 2014). The IFRS 9 standard was effective as of January 2018, while CECL is effective in the U.S. for SEC registrants in January 2020 and for non-SEC registrants in January 2021; however, for banks that are not considered Public Business Entities (PBEs), the effective date is December 31, 2021.



Figure 2: The Accounting Supervisory Timeline for CECL and IFRS 9 Implementation

Figure 3: The CECL Accounting Standard – Regulatory Overview



Figure 4: Key Business Impacts of the CECL Accounting Standard

In Figure 3 we present a high-level overview of the regulatory standards and expectations in CECL. The first major element, which has no analogue in the legacy ALLL framework, is that there must be a clear segmentation of financial assets into groupings that align with portfolio management and that also exhibit homogeneity in credit risk. This practice is part of traditional credit risk modeling, as has been the practice in Basel and CCAR applications, but it represents a fundamental paradigm shift for provisioning processes. Second, there are changes to the framework for measuring impairment and credit losses on financial instruments, which has several elements. One key aspect is enhanced data requirements for items such as troubled debt restructurings ("TDRs") on distressed assets, and lifetime loss modeling for performing assets. This will require a definition of model granularity based on existing model inventories (i.e., for Basel and CCAR), data availability and a target level of accuracy. Moreover, this process will involve the adoption of new modeling frameworks for provision modeling. Finally, institutions will face a multitude of challenges around implementation and disclosures. This involves an enhanced implementation platform for models and reporting (e.g., dashboards), as well as revised accounting policies for loans and receivables, foreclosed and repossessed assets, and fair value.



Figure 5: Best Industry Practices in Implementing the CECL Accounting Standard

Figure 6: Data Considerations and Challenges in Implementing the CECL Accounting Standard



Figure 7: Modeling Considerations and Challenges in Implementing the CECL Accounting Standard

Figure 8: Implementation Considerations and Challenges in the CECL Accounting Standard



Figure 9: Roadmap Forward for an Efficient Implementation of the CECL Accounting Standard

The new CECL standard is expected to have a significant business impact on the accounting organizations of financial institutions by increasing the allowance, as well as operational and technological impacts due to the augmented complexity of compliance and reporting processes, as summarized in Figure 4. Institutions will need to develop an acumen of best practice activities: project experience in supporting CECL programs, and investment in assets and tools that enable the institution to accelerate the model development, implementation and reporting work required in CECL initiatives; we depict these best industry practices in Figure 5. As shown in Figure 6, the CECL effort requires an additional reconciliation of regulatory modeling data with business (General Ledger) and other sources to ensure the consistency and transparency of the results. CECL guidance provides a high-level expectation for modeling the "life of loan" loss, which needs to be accurately interpreted and captured in the model development process, as we enumerate in Figure 7. An efficient and automated system is key to a successful CECL rollout across the bank, creating distinct implementation considerations and challenges, which we show in Figure 8. Finally, Figure 9 presents a roadmap forward for an efficient implementation.
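The "life of loan" expectation can be illustrated with a minimal discounted expected-loss calculation. The PD x LGD x EAD decomposition is one common modeling choice, not something the CECL guidance prescribes, and every parameter value below is an illustrative assumption.

```python
# Minimal sketch of a lifetime expected credit loss (ECL): the
# discounted sum of marginal default probability x loss-given-default
# x exposure over the remaining term. All inputs are illustrative.

def lifetime_ecl(marginal_pd, lgd, ead, rate):
    """Lifetime ECL for one instrument, discounted at the effective rate."""
    return sum(pd_t * lgd * ead_t / (1 + rate) ** (t + 1)
               for t, (pd_t, ead_t) in enumerate(zip(marginal_pd, ead)))

# Hypothetical 3-year amortizing loan: annual marginal PDs, 40% LGD,
# declining exposure, 5% effective interest rate.
ecl = lifetime_ecl(marginal_pd=[0.02, 0.015, 0.01],
                   lgd=0.4,
                   ead=[100.0, 70.0, 40.0],
                   rate=0.05)
print(f"lifetime ECL: {ecl:.2f}")  # 1.28
```

In practice the marginal PDs would themselves be conditioned on reasonable and supportable macroeconomic forecasts, which is where the modeling challenges of Figure 7 arise.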



author Mike Jacobs Mike Jacobs is a lead model development and analytics expert across a range of risk and product types, having a focus on wholesale credit risk methodology, regulatory solutions and model validation. Mike has 25 years of experience in financial risk modeling and analytics, having worked 5 years at Accenture and Big 4 consulting as a Director in the risk modeling and analytics practice, with a focus on regulatory solutions; 7 years as a Senior Economist and Lead Modeling Expert at the OCC, focusing on ERM and Model Risk; and 8 years in banking as a Senior Vice-President at JPMC and SMBC, developing wholesale credit risk and economic capital models. Skills include model development & validation for CCAR, PPNR, CECL, credit / market / operational risk; Basel and ICAAP; model risk management; financial regulation; advanced statistical and optimization methodologies. Mike holds a doctorate in Mathematical Finance from the City University of New York – Zicklin School of Business and is a Chartered Financial Analyst.



women in risk spotlight

Interviewed by Cindy Williams, Principal, CWFG LLC

with Donna Howe

Cindy: How did you get into the field of Risk Management?

Donna: I started in risk management at a time when it was much more ambiguous. There were not independent risk managers in the way we think of them today. In what has evolved into a modus operandi for me, my first official role was due to someone else's problem. Basically, a foreign bank in trouble with the Federal Reserve Bank (FRB) needed a risk manager badly at a time when there weren't many around. But further, they wanted and needed somebody who was not afraid of risk-taking. They needed someone who understood structuring and trading risk AND who also understood regulatory constraints AND who had experience with FRB Examiners. This was a pretty scarce skill set. At the time, I was a proprietary trader at a large international bank. In those days, regulatory risk was a first-line responsibility. So, I was offered the position. It seemed interesting. It seemed different. It seemed possible to grow in the position. (My mother was worried that it would be a short-term business opportunity - LOL.) I've been in risk management ever since.


Cindy: What do you enjoy most about what you do?

Donna: I love the ever-changing nature of the position. I love the fact that I can help solve problems. At heart, I'm an analyst and a problem solver. A problem solver is an analyst who provides quantitative and actionable responses to questions. The problems just get weirder and harder every year, and so it's really fun.

Cindy: The theme of this edition is "The Changing Landscape of Risk." What do you see as future trends in the field? What changes do you foresee?

Donna: The most material changes that I foresee are in operational risk. At present, operational risk is viewed as the risk arising from people, processes, systems, and external events. These are all treated as being independent, and I think that approach is naive. The reason is that technology (systems) is the enabler in today's world. Systems take data and apply algorithms to turn the data into information. Systems are the way that information is viewed and distributed. Technology is the leverage to the entire banking system. People use systems and act ethically or not, so risks from cyber, fraud, and AML are all entwined. So, we need a better way to look at operational risks: frameworks for discovering the emerging risks, assessing the possible magnitudes of cost, understanding the drivers, measuring, and then being able to take steps to mitigate those risks.




Cindy: What are the biggest challenges faced by someone in your role?

Donna: Personally, I think that time management is the top challenge, and managing the always-changing priorities comes a close second.

Cindy: How do you feel about the opportunities for women in risk management?

Donna: I believe there are a number of opportunities for women in risk management; however, the biggest constraint is that math is not sexy. While top positions appear to be administrative, you need to have a strong understanding of the way things work quantitatively, and understand the algorithms used in calculating P&L and exposures. You also need to have a good understanding of technology and operational process, but that's an area where one can source a strong female candidate. Female candidates with master's degrees in math, econ, or statistics are rare. You really need a strong if/then capability to manage all the variables in the daily routine of issues. Alas, this type of strong candidate does not seem to be increasing in number over time either. Risk is a very integrative role, so as long as women have an appropriate background, there are lots of opportunities for visibility. But so long as women don't go into math, I think that's going to be a limiting factor.

Cindy: What is your advice for women just entering risk management careers?

Donna: Ask lots of questions! And take career risk. By that I mean try for opportunities that are a bit scary; don't limit yourself to what you know you can do with no problem. Some of the most interesting problems currently have no established optimal solution. So why wait? Give it a try.

interviewee: Donna Howe, CEO & Founder, Windbeam Risk Analytics LLC, an affiliate of TechparGroup



PII data breach: who should own the incident response plan?

by Nagaraja Kumar Deevi & Thomas Lee, PhD

Where does cybersecurity leadership fit into the organization? One piece, often overlooked and under-resourced, is the Incident Response Plan, which prepares the company for the inevitable. We propose that financial institutions can substantially reduce cyber risk, cyber insurance costs and, potentially, loss reserves if ownership of the Incident Response Plan is moved from the cybersecurity group to the CFO, the CRO, the Legal department or board members with cybersecurity oversight responsibility. These changes make sense now that rigorous regression models have become available for cyber risk.

Even though a data breach of Personally Identifiable Information (PII) is a rare event, cyber risk can still be viewed in the same way as credit risk: as the product of probability and financial impact. Reducing either probability or financial impact reduces risk, but most cybersecurity resources are focused on reducing the probability of a data breach. There is a diminishing return as ever more is spent reducing probability, to the point that reducing the financial impact is a more cost-effective way to further reduce risk. Also, regulators are paying more attention to these kinds of idiosyncratic events, and reducing financial impact also reduces insurance needs and, potentially, loss reserves.

In many organizations, the Incident Response Plan is owned by the cybersecurity team. The plan is even part of security standards such as NIST's. But the cybersecurity group is incentivized to focus on reducing probability, not financial impact. The expectation is that they will be fired if there is a major data breach, and the expectation is probably correct. But new Data Breach Impact Models have become available that accurately estimate the severity of a PII data breach in dollars. These models are different from the AMA models used in stress testing, in that they characterize only the financial impact, independent of probability.
These models are developed on the same data used by the insurance industry and therefore give corporations the same advantage as the insurance industry in understanding risk. They can be characterized for accuracy, are SR 11-7 compliant and can be incorporated into Model Risk Management frameworks. These new models reveal that 1) the cost increases with the square root of the number of people affected, 2) one lawsuit can nearly double the cost, and 3) data breaches caused by malicious outsiders are about five times costlier than those from any other cause, other causes including accidents, lost or stolen devices and malicious insiders. These models also reveal that the cost of investigation is a primary cost. This makes sense, since a PII data breach caused by a malicious outsider is the most complicated data breach to investigate: how did it happen, where did the intruders go, what data was exposed and did they leave malware behind?
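The three regression findings above imply a simple multiplicative cost structure, which can be sketched as follows. The base coefficient is a made-up illustrative number, not a fitted parameter from the models the article describes.

```python
# Sketch of the multiplicative structure the article reports:
# cost scales with sqrt(people affected), a lawsuit roughly doubles it,
# and a malicious-outsider cause is ~5x other causes. The base
# coefficient (3.0 dollars per sqrt-person) is purely illustrative.

import math

def breach_cost_estimate(people_affected: int, lawsuit: bool,
                         malicious_outsider: bool,
                         base_cost_per_sqrt_person: float = 3.0) -> float:
    """Point estimate of PII data breach cost, in dollars."""
    cost = base_cost_per_sqrt_person * math.sqrt(people_affected)
    if lawsuit:
        cost *= 2.0   # one lawsuit can nearly double the cost
    if malicious_outsider:
        cost *= 5.0   # ~5x costlier than any other cause
    return cost

small = breach_cost_estimate(10_000, lawsuit=False, malicious_outsider=False)
large = breach_cost_estimate(1_000_000, lawsuit=True, malicious_outsider=True)
print(f"${small:,.0f} vs ${large:,.0f}")
```

The square-root scaling is the interesting feature: a breach 100x larger is only about 10x costlier, all else equal, which is why per-record cost figures overstate the cost of very large breaches.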



Preparation for the investigation phase of a data breach falls solidly within the Incident Response Plan. Best practice for investigation readiness is to ensure access logs are turned on and saved in a read-only manner, with a uniform format if possible. There should be 1) written policies and procedures, 2) management support and 3) evidence that access logs are on. Turning on access logs does come with a cost: managing disk space, with potentially lower performance.

Data Breach Impact Models also reveal that for the same breach size, cause and data type, costs can vary over a large range, with an 80% confidence interval many times the median cost (see graph). But costs can be managed; in particular, investigation costs can be reduced by following best practice (research presented at a Federal Reserve conference in Richmond, 2018).

Breach Cost

Impact of a PII data breach affecting 1 million people, caused by a malicious outsider: the median cost is forecast to be just $6 million, while the 80% confidence interval (the point below which 80% of data breaches of this size and cause will fall) is $32 million. The large difference between the median and the 80% confidence interval is largely a function of the Incident Response Plan, and in particular of how the cost of investigation is managed.

It therefore makes sense to move the Incident Response Plan from the cybersecurity group to a group with interests better aligned with the near-term value the plan delivers: reduced cyber insurance costs, reduced loss reserves, and a stronger risk management culture presented to regulators. We propose three groups with interests better aligned with these near-term benefits: the office of the CFO, the CRO and the Legal department.

Argument for CFO: The CFO is often tasked with insurance adequacy, and quantifying the financial impact of a data breach and procuring enough insurance go hand in hand. Assessing investigation readiness is best performed by a third party with experience investigating large data breaches and familiarity with industry peers. Engaging a third party and negotiating investigation costs before a data breach occurs is an activity the CFO can manage well.
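The gap between the $6 million median and the $32 million 80% confidence level quoted above is what a heavily right-skewed severity distribution produces. As a sketch only – assuming a lognormal severity, a common modeling choice that the article does not itself specify – the two quoted points pin down the whole distribution:

```python
import math
from statistics import NormalDist

median, p80 = 6.0, 32.0               # $MM, the two figures quoted above
z80 = NormalDist().inv_cdf(0.80)      # ~0.8416
mu = math.log(median)                 # lognormal: median = exp(mu)
sigma = (math.log(p80) - mu) / z80    # solve exp(mu + sigma * z80) = p80
mean = math.exp(mu + sigma ** 2 / 2)  # mean far exceeds median when sigma is large
print(round(sigma, 2), round(mean, 1))
```

Under this assumption the implied mean cost is several times the median, which is one reason managing the tail through the Incident Response Plan matters far more than the median figure suggests.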



Argument for CRO: Managing risk, and understanding the value of investments that reduce impact and their relationship to loss reserves and insurance adequacy, are concepts the chief risk officer understands. The CRO also appreciates the value of demonstrating a strong risk management culture through objectively obtained estimates in idiosyncratic scenarios. Again, assessing investigation readiness using an experienced third party is an easy way to manage a very technical activity. The CRO can also best leverage the combination of a validated model-based forecast and an incident response plan to argue for a lower loss reserve.

Argument for Legal: The legal department is deeply involved in a large data breach, since one of the major costs is notification: the cost of notifying various government agencies. Also, when a data breach becomes large enough, the probability of a lawsuit becomes significant, and regression analysis of historical data breaches teaches us that one lawsuit nearly doubles the cost of a data breach. Making Legal responsible for controlling costs and reducing the probability of a lawsuit may be an effective way to manage major drivers of the cost of a data breach.

Finally, insurance brokers and underwriters have an inherent conflict of interest when recommending the right amount of insurance. The decision on insurance adequacy, which may override recommendations by insurance agents, may need to sit at board level, with evidence and recommendations provided by the owner of the Incident Response Plan. The owner of the Incident Response Plan should also own the Data Breach Severity Models and supply the expert judgment needed for data breach idiosyncratic scenarios in stress tests.

authors

Nagaraja Kumar Deevi

Nagaraja Kumar Deevi is a senior strategic executive with over two decades of leadership experience in finance, risk, regulatory, analytics and technology-enabled solutions, working with global banking and financial institutions. He is currently Managing Partner & Senior Advisor at DEEVI Advisory & Research Studies. NAG specializes in banking regulations, regulatory policy and affairs, and enterprise-wide strategic risk initiatives. He has designed and developed enterprise risk governance frameworks aligned with firm-wide corporate strategy, covering high-level regulatory policy, risk appetite statements, recovery and resolution planning (RRP)/living wills, and culture, conduct and reputational risk, with effective use of tools and techniques for risk assessment, identification, measurement, prioritization, mitigation and response. NAG works closely with academia and research studies on risk, analytics and AI-based startup companies through knowledge sharing, solution approach and go-to-market strategy, and has completed advanced management studies at Columbia, NYU, Kellogg and MIT.



Dr. Thomas Lee

Dr. Thomas Lee is the chief executive at VivoSecurity, with decades of experience pioneering methods in statistical analysis, image processing and digital signal processing for science, industry and cyber risk. Thomas has degrees in physics and electrical engineering and a PhD in biophysics from the University of Chicago. He holds multiple patents, has published papers in peer-reviewed journals, and is an expert in software, operating systems, hardware vulnerabilities and enterprise operations. He is a recognized expert in quantifying operational risk, including fraud and cyber risk, and a frequent speaker at events such as PRMIA, Op Risk North America and the Federal Reserve's research conferences.



dangerous adaptation1

by David M. Rowe, PhD

Adaptation is one of the most powerful phenomena in nature. It is the means by which species survive changes, often major changes, in their environments. In this sense we are conditioned to look upon adaptation as a favorable characteristic. It is important to remember, however, that effective adaptation has its dark side as well, because it can also be a source of strength and resilience for dangerous threats.

The Global Financial Crisis prompted considerable thought about what finance can learn about systemic risk from other disciplines, including epidemiology. The appearance of new and previously unknown viruses is a recurring challenge. These are almost always similar to known viruses but have developed a mutation that makes them resistant to existing forms of prevention and treatment. Risk managers need to recognize that, in terms of adaptation, the problems we face are often similar to those confronting epidemiologists. We are not external observers of a distinct and independent system; risk management is an integral part of the system whose risk we seek to control. Among other things, this points out the ultimate futility of trying to control financial institutions by detailed microregulations. This is particularly relevant when such regulations evolve not over weeks or months but over years. The underlying institutions and systems adapt much faster than such rules and regulations can possibly be updated.

Goodhart's Law states that "Whenever a reliable indicator becomes a target (of social, economic or organizational policy) it ceases to be a reliable indicator".2 One example of this relates to VaR. When VaR became the standard metric for measuring and monitoring the limits on market risk taken by traders, both individually and collectively, they had no choice but to comply. Traders who repeatedly and willfully exceed their institutionally established limits will ultimately be fired.
Nevertheless, traders still wanted to make their returns. It is hardly a big leap to realize that one way of doing this is to pile on risk far into the tail of the loss distribution. One obvious way to do this is to write well out-of-the-money puts and calls. Such trades expose a firm to low-probability, high-impact events. Because the probability of an occurrence falls well below the usual 1% VaR threshold, however, such positions have little or no initial impact on a VaR-based risk measure. The corrosive feedback effect is that the widespread use of VaR as a control metric encourages exactly the type of risk-taking that VaR fails to measure, namely exposure to extreme events.
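A toy Monte Carlo makes the blind spot concrete. Every parameter here (volatility, crash size and probability, strike, premium) is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
s0, strike, premium = 100.0, 70.0, 0.15    # deep out-of-the-money short put
ret = rng.normal(0.0, 0.02, n)             # ordinary days: 2% daily vol
crash = rng.random(n) < 0.003              # rare crash, P = 0.3% -- below the 1% threshold
ret = np.where(crash, ret - 0.50, ret)
s1 = s0 * np.exp(ret)
pnl = premium - np.maximum(strike - s1, 0.0)   # short-put P&L at expiry

var99 = -np.quantile(pnl, 0.01)            # 99% VaR sees only the premium income
es99 = -np.sort(pnl)[: n // 100].mean()    # mean of the worst 1% sees the crash
print(var99 < 0, es99 > 0)                 # VaR reports a "profit"; the tail holds a real loss
```

Because the crash probability sits below the 1% threshold, the 99% VaR of the short put is negative (the position looks like guaranteed premium income), while the average of the worst 1% of outcomes is a substantial loss: exactly the exposure the text describes VaR failing to measure.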

1 / This is a slightly edited excerpt from the author’s forthcoming book: An Insider’s Guide to Risk Management – Relearning the Lessons of the Global Financial Crisis 2 / The law was named for Charles Goodhart, a former advisor to the Bank of England and Emeritus Professor at the London School of Economics.



Hence VaR doesn't just fail to address the most extreme losses; it actually encourages behavior that increases their magnitude. Other measurement techniques are necessary to monitor the types of risk created by such adaptive behavior.

Another example of dangerous adaptation relates to rating agency assessments of mortgage-backed securities. Once the agencies published their methodologies, the market began to game them in every way possible. This gaming undermined what limited reliability the ratings had initially. To meet our responsibility, risk managers must always be aware of how adaptation may be operating to undermine the reliability of the risk indicators we use.

no final victories

While politicians and regulators are busy fighting the last war, it is essential for risk managers to remain alert to how markets and institutions are adapting their products and strategies. Ours is a profession in which there are no final victories. Adaptation is the fundamental reason that any claims that regulation can assure "This will never happen again" cannot and should not be taken seriously. Human beings are too ingenious, and too much a part of the highly adaptive biological system, for such claims to be sustained. As Shakespeare has Cassius say, "The fault … is not in our stars … but in ourselves". Only constant vigilance, with special attention to the risk inherent in the adaptive changes taking place around us, will allow organizations to avoid the worst consequences when the next crisis occurs, as it inevitably will.

author

David M. Rowe, PhD

David M. Rowe wrote the monthly Risk Analysis column in Risk magazine from 1999 through late 2015. He has over 40 years of experience at the interface of economic forecasting, finance and risk management with the rapidly changing world of information technology. His professional career included years at Wharton Econometric Forecasting Associates, Townsend-Greenspan & Co., Security Pacific Bank, Bank of America, SunGard and Misys, as well as his own small consulting firm. Dr. Rowe is also a former board member of PRMIA.



a perspective on machine learning in credit risk

by Danny Haydon & Moody Hadi

Machine Learning (ML) has seen major advances in recent years, driven by a set of industry forces that have transformed how these techniques are used in risk management and beyond. In this primer we cover the key transformational drivers behind these high adoption rates, some of the techniques, and how to assess their utility within credit risk.

drivers

Firstly, data in general has expanded along several dimensions: size, velocity and variety. Simultaneously, the abilities to record, store, combine and then process large datasets from many disparate sources have improved wholesale. This is not limited to traditional sources; alternative data has also fueled the need to extract information value from new sources. A side effect of this expansion, however, is an elevated level of data pollution that must be contended with: noisy, conflicting and difficult-to-link datasets.

Secondly, easy access to enhanced computational efficiency – through hardware that can run specialized operations at large scale, and through language enhancements that have moved towards functional programming – has transformed the game for integrating Machine Learning techniques. Languages such as R have become hubs for numerical computing using functional programming, leveraging a lengthy history of numerical interfaces to computing libraries. Supervised and unsupervised algorithms allow data scientists to process these datasets into actionable insights with relative ease, on cheaply available hardware.

Thirdly, reproducible research and analysis has been widely adopted by the data science community. This is a set of principles for quantitative, data-driven analysis under which the data and code that lead to a decision or conclusion should be replicable in an efficient and clear way.

Finally, the pervasiveness of open-source libraries, packages and toolkits has opened doors for the community to contribute via teams of specialists, sharing code bases and packaging them into easy, modular functions.



ML techniques in risk and considerations in their application

The typical phases of applying ML within a risk context form the following pipeline:

Figure 1: Generalized machine learning pipeline

Assessing which ML techniques to use, and when, is an important step that needs to be done thoughtfully with the target context in mind. There is no prescriptive method tied purely to a particular class of algorithms; the risk context always needs to be kept in mind in order to assess the tradeoffs. A simple example to consider is the bias-variance tradeoff. Variance reflects the instability of the model: if small changes to the data result in big changes to the model, the technique has high variance. Bias reflects the model's failure to show fidelity to the underlying pattern. See Figure 2 for a simple example.



Figure 2: Demonstration of model fit comparison visualization

In the above figure we see that the Random Forest exhibits low bias but high variance on this dataset, while the quadratic regression exhibits low variance but high bias. Nonlinear regression, in this trivial example with an ex-ante known data-generating process, achieves both low bias and low variance and provides an appropriate fit. In the real world, however, finding the sweet spot between over-fitting and under-fitting is less trivial and requires an appropriate definition of model selection criteria and exploration of different levels of model complexity.

The key takeaway is that none of these techniques is categorically wrong; it depends on what tradeoffs we must make to get as close to low bias and low variance as possible. We need the model to adapt as the real world adapts and, ideally, to contend with polluted information with minimal supervision, while remaining as transparent as possible. These are competing objectives and need to be accounted for within the applied risk domain. Within a risk scoring context, a simple way to communicate the supervision and complexity tradeoff to the business is shown below.
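The under-/over-fitting pattern of Figure 2 can be reproduced in a few lines. The data-generating process below is invented for illustration (a sine curve plus noise): degree 1 underfits (high bias), a moderate degree fits well, and a very high degree chases the noise (high variance).

```python
import numpy as np

rng = np.random.default_rng(1)

def sample(n=40):
    """Noisy observations from a known process (invented for illustration)."""
    x = rng.uniform(-1.0, 1.0, n)
    y = np.sin(3.0 * x) + rng.normal(0.0, 0.2, n)
    return x, y

x_tr, y_tr = sample()   # training set
x_te, y_te = sample()   # held-out test set

results = {}
for degree in (1, 4, 15):
    coefs = np.polyfit(x_tr, y_tr, degree)   # least-squares polynomial fit
    mse_tr = float(np.mean((np.polyval(coefs, x_tr) - y_tr) ** 2))
    mse_te = float(np.mean((np.polyval(coefs, x_te) - y_te) ** 2))
    results[degree] = (mse_tr, mse_te)
    print(f"degree {degree:2d}: train MSE {mse_tr:.3f}, test MSE {mse_te:.3f}")
```

The high-degree fit always shows the lowest training error, which is precisely why training error alone cannot be the model selection criterion; held-out error is what exposes the variance side of the tradeoff.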



Figure 3: Supervision and complexity trade-offs

Here we see that, given the characteristics of the dataset, there is a trade-off between how tightly the model is coupled to the data and the transparency of the ultimate model.

Another application of ML in credit risk is sentiment analysis. A generalized sentiment analysis pipeline is provided below:

Figure 4: Generalized sentiment analysis pipeline



Sentiment analysis methods can generally be split into deterministic models that rely on a dictionary (bag of words) and neural network models that typically involve deep learning. Sentiment analysis can be further divided into 'classification' and 'attribution': given a target variable, classification assigns a sentiment polarity label to a whole article, while attribution segments out those portions of an article that are actually relevant to, and would impact, the target variable.

Figure 5: Usage in Sentiment Analysis
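A deterministic, dictionary-based classifier of the kind just described can be sketched in a few lines. The word lists here are hypothetical and far smaller than a real sentiment lexicon:

```python
# Minimal bag-of-words polarity classifier. Word lists are illustrative only.
POSITIVE = {"upgrade", "beat", "growth", "strong", "profit"}
NEGATIVE = {"default", "downgrade", "miss", "lawsuit", "weak"}

def polarity(article: str) -> str:
    tokens = article.lower().split()
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(polarity("Strong profit growth despite one lawsuit"))  # → positive
```

This sits at the transparent, low-complexity end of the spectrum: fully auditable and cheap, but blind to negation and context, which is what the neural, deep-learning approaches buy at the cost of transparency.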

Once again we see the considerable tradeoffs between supervision and complexity. Depending on the risk context, any of these techniques could be applicable.

We have covered the key drivers of the adoption of ML within a credit risk context and shown a few simple examples of its uses. It is important to consider the tradeoffs, which depend largely on the final application. ML techniques are a complementary class of tools, not a panacea for every use case within credit risk. Ultimately, being able to communicate to the business audience their value and why they are being used in a given context is of critical importance.



authors

Danny Haydon

Head of Relationship Management, Americas / Risk Services / S&P Global Market Intelligence

Danny is the Head of Risk Services Relationship Management for the Americas, focusing on key market stakeholders across financial services. Previously, he covered the Americas for S&P Global Market Intelligence's Portfolio Risk (R2) products. Danny joined from MSCI RiskMetrics, where he focused on sales and client coverage for its flagship market risk management platform and on the buy side, with deep experience with hedge funds and investors. He has 12 years of sales and relationship management experience, including time as a structured credit derivatives broker for Creditex/ICE Exchange. Danny holds a BSc 1st Class (Hons) in Psychology and Sport & Exercise Science from the University of Southampton, United Kingdom, and an MBA (Distinction) from the University of Southampton Business School.

Moody Hadi

Senior Director – Innovation & Product Research / Risk Services / S&P Global Market Intelligence

Moody is a Senior Financial Engineer at S&P Global Market Intelligence Risk Services. As a group manager in the Innovation & Product Research group within Risk Services, he leads a team applying modelling techniques such as machine learning and data science to extract information value for risk management. Previously, he was Co-Head of Research and Development at Credit Market Analysis (CMA), where he led model development and research on Credit Default Swaps pricing and risk management. Prior to CMA, Moody was a Senior Quantitative Analyst at the Chicago Mercantile Exchange (CME) Group, where he worked on Over-The-Counter (OTC) clearing of interest rate and credit derivatives and the SPAN margining algorithm. Before that he held several senior roles in analytical and technical consulting, spanning areas from Asset-Liability Management (ALM) to Business Intelligence (BI). Moody holds a Bachelor of Science in Computer Science from Georgia Institute of Technology, a Master of Science in Operations Research from Columbia University and an MBA from the University of Chicago Booth School of Business.

All figures are for illustrative purposes only. Source: S&P Global Market Intelligence as of July 2018. Content including credit-related and other analyses are statements of opinion as of the date they are expressed and are not statements of fact, investment recommendations or investment advice. S&P Global Market Intelligence and its affiliates assume no obligation to update the content following publication in any form or format. The authors would like to thank Max Kuhn and Jonathan Regenstein from RStudio, who provided their expertise and input into the article contents. RStudio is not affiliated with S&P Global or its divisions.


an inverted yield curve induces equity market volatility

by John Galakis & Jean Paul van Straalen

The term structure has long been regarded as one of the most reliable signals of an upcoming economic recession. Economists and financial analysts have monitored the slope of the yield curve to assess the probability of an economic downturn, as the curve inverted ahead of each of the last seven U.S. recessions and produced a single false signal in the past 60 years.1 More specifically, since the late 1960s every U.S. recession has been preceded by a period during which the term spread was negative.

While the U.S. yield curve has flattened to levels not seen since the second half of 2007, amid a remarkably healthy and robust growth environment, it is far from inverted: the term spread is still positive, at 84 and 24 basis points for the 10y-3m and 10y-2y spreads respectively (as of 19 July 2018). The prevailing macroeconomic backdrop of expanding economic growth and rising inflation favors a gradual acceleration of the Federal Reserve's policy normalization, i.e. a more aggressive tightening cycle than previously expected. As a consequence, numerous analysts expect the yield curve to flatten even further and, potentially, turn negative. Even though this is not a consensus view – various analysts claim that the possibility of an inversion is more remote than usual, due to the protracted period of ultra-accommodative monetary policy and low rates that have distorted risk premia in particular – it seems highly likely that the curve will continue to have a bear-flattening bias. As the term spread is one of the most reliable and consistent predictors of economic activity, it is emitting a clear warning signal for the U.S. economy's projected performance.
Investors seem nervous about the current curve positioning because, historically, the decline in the term spread has been driven primarily by a pronounced rise in short-term rates relative to long-dated ones, which increased more gradually. As the projected path of U.S. monetary policy could still surprise on the upside, especially if the macroeconomic data comes in better than expected, investors fear a more aggressive rise in short rates, even though the Federal Open Market Committee has been adopting a more 'gradualist' approach.

The probability of a future recession based on the current value of the term spread is rather limited, but has been rising consistently: based on our Probit model, the probability that the U.S. economy will be in recession 12 months from now is 13.88% (Graph 1), while the probability based on the New York Federal Reserve's model is 12.51% (June 2019). If one excludes the period of quantitative easing from the estimation, to account for possible distortions, the probability rises to 14.72%. Although still contained, the rising recession probability contradicts other widely followed U.S. leading economic indicators, such as the Institute for Supply Management Purchasing Managers Index (ISM Index), which point to a continuation of the positive momentum.
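A probit recession model of this kind maps the term spread to a probability through the normal CDF. The coefficients below are invented, loosely calibrated so the sketch lands near the probabilities quoted above; they are not the authors' fitted values:

```python
from statistics import NormalDist

phi = NormalDist().cdf

# P(recession in 12 months) = Phi(alpha + beta * spread)
# alpha and beta are hypothetical illustration values, not fitted coefficients.
ALPHA, BETA = -0.50, -0.70

def recession_prob(term_spread_pct: float) -> float:
    return phi(ALPHA + BETA * term_spread_pct)

print(round(recession_prob(0.84), 3))    # 10y-3m spread of 84bp, as quoted in the text
print(round(recession_prob(-0.25), 3))   # an inverted curve implies a higher probability
```

Because beta is negative, a narrowing (and especially an inverting) spread mechanically raises the model's recession probability, which is why the flattening curve is read as a warning signal.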



Most analysts focus primarily on the relation between the term spread and macroeconomic performance, not on the spread's impact on financial market performance and volatility. To gain better insight into the relationship between the term spread and equity market volatility, we looked into their past behavior.2 Table 1 shows the movement in the term spread, as well as that of the CBOE VIX Index, since the beginning of the 1990s. As the curve steepens, implied equity volatility tends to fall – an almost six-and-a-half percentage point drop in the VIX Index. The opposite holds for periods when the curve has been flattening: implied volatility has exhibited a rising bias.

The result sounds reasonable: in general, periods during which the curve is steep signal better days ahead for both the economy and riskier financial assets, and investor fear is lower as risk aversion declines. By contrast, an inverted curve signals trouble. Table 2 provides more insight into the periods between the curve's inversion and the outbreak of recession.

There have been three recessions since the beginning of the 90s: 1990, 2001, and 2007. Volatility tends to rise from the curve’s inversion, as well as its lowest point, to the beginning of every recession. The most sizeable movements have occurred during the last two recessions, probably due to high investor complacency and consistent underestimation of risk.



In the past, the market has relied on the curve's leading nature and consistent forecasting ability and anticipated the projected macroeconomic and financial stress. It is thus quite crucial to monitor the curve's positioning, even though some analysts have questioned its effectiveness, as an inversion could stir up market volatility that could in turn lead to a correction in equity markets, even if it does not eventually lead to a recession.

Last but not least, it remains to be seen whether the Federal Reserve will decide to take action in order to prevent the curve from inverting; the President of the Federal Reserve Bank of St. Louis, James Bullard, has been advocating a slower pace of policy normalization to avert inversion.

references

1. The forecasting ability of the spread is pretty robust, as it has produced only a single false signal, in 1967, when the economy experienced a 'credit crunch' that the NBER did not classify as a recession, despite a notable decline in activity indicators.

2. More specifically, as we are interested in conditioning the VIX on the yield curve, we classify the sample into periods when the curve had a steepening bias, from a local minimum (trough) to a local maximum (peak), and vice versa when the curve experienced a flattening bias.



authors

John Galakis

Iniohos Advisory Services

John Galakis is Managing Partner and Chief Investment Strategist at Iniohos Advisory Services, an investment consulting and investment research boutique. John has extensive experience in the areas of quantitative portfolio management, investment strategy, asset allocation and investment research.

Jean Paul van Straalen, CAIA, CMT, FRM, PRM

Iniohos Advisory Services

Jean Paul is Partner and Senior Risk Specialist at Iniohos Advisory Services. Jean Paul has extensive experience in the areas of (quantitative) portfolio management, portfolio risk management, Asset & Liability Management and Technical Analysis.



managing rapidly changing cyber risks

by Paul Sand

Small and medium-sized financial services enterprises face a daunting challenge as they seek to optimize the management of cyber risk. These enterprises have limited staff and budget and must carefully choose where to invest their resources to maximize cyber risk reduction, while facing a risk landscape that changes with every tick of the clock. Today a significant "quantum mass" of resources is required for an enterprise to understand and react to that rapidly changing cyber threat environment – a level of resources usually well beyond what these enterprises can afford. As a result, they operate with higher residual risk than could likely be achieved with better-prioritized efforts. And, because of the interrelationships between these enterprises and other key parts of their economic sectors, entire sectors are potentially exposed to elevated residual risk.

understanding cyber risk drivers

To understand the challenge and then find a solution, we will first parse out what factors influence cyber risk at an enterprise. Then we will consider which factors the enterprise can influence in order to control cyber risk. Finally, we will develop an understanding of how quickly those factors tend to change. Knowing these things leads to important insights into the best approaches to managing cyber risk.

Risk is generally expressed with the following equation: Risk = Threat * Likelihood * Impact. To apply this to cyber risk we need a deeper understanding of the subfactors of a cyber threat: Intent, Capability and Vulnerability. Intent is required for a Threat to exist, because if an actor is not looking to harm your enterprise or steal the types of assets your enterprise owns, that actor poses no threat. Even given Intent, a Threat does not exist unless the actor also has the Capability to carry out an attack. And given an actor's Intent and Capability, a Threat still does not exist unless the enterprise is Vulnerable to an attack within the actor's Capability. So our equation for cyber risk is best restated as: Risk = Intent * Capability * Vulnerability * Likelihood * Impact.
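The five-factor expression can be sketched numerically. The scores below are hypothetical 0-1 ratings, with impact in dollars:

```python
def cyber_risk(intent, capability, vulnerability, likelihood, impact_usd):
    """Risk = Intent * Capability * Vulnerability * Likelihood * Impact."""
    return intent * capability * vulnerability * likelihood * impact_usd

# Hypothetical scores; halving any single factor halves the multiplicative risk.
baseline = cyber_risk(0.8, 0.6, 0.5, 0.3, 2_000_000)
patched = cyber_risk(0.8, 0.6, 0.25, 0.3, 2_000_000)  # vulnerability halved by patching
print(baseline, patched)
```

The multiplicative form is the point: a proportional reduction in any one factor reduces risk by the same proportion, so the question becomes which factor the enterprise can actually move.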



The five factors identified above are all influenced by drivers that affect the amount of risk present. Intent is affected by several drivers, three of which are a threat actor's desire to enrich themselves by stealing from or extorting the enterprise, a desire to embarrass the enterprise by disrupting operations, and a desire to make a political or social statement. The threat actor's Capability is influenced by the availability of tools, the existence of successful techniques and tactics, and the time and money the actor has to conduct attacks. The Vulnerability of the enterprise to attack changes with increases or decreases in the number of assets the enterprise needs to protect, in the types of assets it must protect, and in the number of vendors it uses to deliver its goods and services. The Likelihood that an attack may occur varies with the number of potential attackers and the complexity of the attack; for example, the existence of easily staged attacks that an attacker can use without significant motivation to target the enterprise raises the likelihood of an attack. The Impact of a successful attack grows as the enterprise becomes more successful: as the enterprise acquires more assets, holds more personally identifiable information (PII) and flows more money with its business partners, the value of a successful attack against it increases. This can then attract new and more sophisticated threat actors.

Now that we have developed an understanding of the five factors that influence cyber risk and what drives the risk behind them, let us consider how often the factors change and how much influence the enterprise has on the risk due to each.

The drivers that influence Intent (the desire to self-enrich, disrupt or make a statement) rarely change, if ever, and are wholly outside the control of the enterprise. It generally takes a little time for threat actors to acquire new capabilities by developing new tactics and techniques and acquiring new attack tools, but it is not unusual for threat actors' capabilities to evolve within a year's time; the enterprise has no way to directly control this development. Vulnerability, by contrast, changes rapidly: each month new vulnerabilities are reported, and virtually every day the enterprise is exposed to additional latent vulnerabilities. Enterprises have considerable control here, patching and configuring systems to remediate the vulnerabilities. The Likelihood of an attack can also change rapidly as more vulnerabilities are discovered, tactics and techniques evolve, and attack tools are developed and made available – but these drivers are well outside the control of the enterprise. The Impact of an attack generally changes slowly for most enterprises, because the pace of business growth does not materially raise the cost of a successful attack year over year.



optimized cyber risk management

In the universe of cyber risk drivers described above, there is one place where rapid change (driving the need to continuously gather intelligence) converges with direct enterprise control over the identified risks: vulnerability management. It is here that enterprises can improve prioritization, by focusing time and attention on answering one intelligence question – "What vulnerabilities are being actively attacked outside my enterprise?" – and then remediating first those commonly attacked vulnerabilities known to exist within the enterprise. In this manner, the enterprise reduces the greatest amount of risk with the resources it has at hand. The key to executing this approach successfully is finding an intelligence source that can answer that question. Enterprises should turn to their respective Information Sharing and Analysis Centers (ISACs) and commercial threat intelligence vendors to acquire timely reports of commonly attacked vulnerabilities. Armed with that information, the enterprise can apply its resources to achieve the greatest possible risk reduction quickly.
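The prioritization described above reduces to a simple intersection: take the feed of actively attacked vulnerabilities, keep those present in the enterprise's own inventory, and patch in descending order of observed attack volume. The CVE identifiers and counts below are fabricated placeholders, not real vulnerability records:

```python
# External intelligence feed: vulnerabilities seen under active attack
# (placeholder CVE -> observed attack count; all values are made up).
actively_exploited = {"CVE-0000-0001": 120, "CVE-0000-0002": 45, "CVE-0000-0003": 300}

# Internal scan results: vulnerabilities known to exist in the enterprise.
internal_inventory = {"CVE-0000-0002", "CVE-0000-0003", "CVE-0000-0004"}

patch_queue = sorted(
    actively_exploited.keys() & internal_inventory,  # only what is both attacked and present
    key=actively_exploited.get,
    reverse=True,                                    # most-attacked first
)
print(patch_queue)  # → ['CVE-0000-0003', 'CVE-0000-0002']
```

Vulnerabilities attacked elsewhere but absent internally, and internal vulnerabilities not under active attack, both drop out of the queue: limited patching resources go where they reduce the most risk.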

author Paul Sand Paul Sand is Vice President, Independent Security Officer, at the Federal Home Loan Bank of Chicago, where he is responsible for oversight of the Bank’s security programs. FHLBC is a $100 billion wholesale bank that provides liquidity to retail financial institutions involved in home-based lending. He is a named inventor on 21 US patents spanning the disciplines of cyber security, physical security, public safety, voice communications, and data communications. Paul holds a Master of Science degree in Computer Science from Northwestern University and a Bachelor of Arts degree in Business, History, and Computer Science from the University of Jamestown.



definition – conflict of interest

by Rory Flynn

In a previous life, I was a fund manager. Clients wanted upside and out-performance. The name for the stuff they didn’t want was risk. Conflict of interest management is a bit like fund performance: organisations may want to exploit employee relationships, and when things don’t work out those relationships are “conflicts of interest.”

The traditional attitude is to regard conflict of interest as a problem. An early management primer advised that “no man can serve two masters” (Matthew 6:24). That view persists; the Irish financial regulator (“Themed Inspection highlights Conflict of Interest risks in investment firms”, 29 February 2016) advises that “COI (conflict of interest) should be avoided...”. According to the Oxford Dictionary, a conflict of interest is “a situation in which the concerns or aims of two different parties are incompatible”, with a primary example being “the conflict of interest between elected officials and corporate lobbyists”. Merriam-Webster defines it as “a conflict between the private interests and the official responsibilities of a person in a position of trust”.

These traditional definitions characterise conflicts of interest as problematic. This seems to resonate with government ethics advisors; their base strategy is thus to eliminate all relationship risk. In the private sector, however, relationships are important and can be constructive. The corporate risk manager needs tools which help identify the factors that make some relationships constructive and others destructive.

In the modern economy, more decisions are out-sourced and automated. Out-sourcing creates new conflicts of interest. Automation means that decisions are made once and not revisited. This creates greater pressure to think clearly about the fewer, bigger decisions that are made. Risk management is about understanding the environment in which decisions are made and thinking about ways the distribution of outcomes can be described or quantified, and perhaps altered.
Good conflict of interest management can be a tool to improve governance around these fewer, bigger decisions. My goal is to develop better analysis methods which would help that process. A risk management definition of conflicted decision making could help us think through the forces at work. Any definition of “conflict of interest” should include references to:

• The decision maker. Conflict of interest policies focus on people, but the thing generating the decision could be a business, a business unit, or a role. Modern regulation has the idea of a responsible person or control function. Internal audit and control functions are mechanisms for isolating and highlighting individuals who do serve two masters, with a view to setting a hierarchy. Risk managers might occasionally have the power to create these focus roles.


• More often, the risk manager will be thinking along the lines of “a decision is occurring; it would be helpful to understand whether the forces around this decision are conflicted”. Is the identity of the decision maker irrelevant? No; in any situation a specific individual can increase or decrease the quantity of conflict.

• The idea that the conflicted party is part of two or more entities or roles. Conflict can arise when a person is part of competing departments, subsidiaries, families, family businesses, non-profits, or one-person consultancies. Analysis is easier when these roles have names and income streams.

• Ongoing relationships. A one-off, isolated transaction is rare and has no conflict of interest potential. A relationship which persists over time allows conflict of interest to emerge.

• Motive. It is usually money, but it could be anything of value; it can be negative or positive.

• Finally, fixed and variable outcomes. Binary, zero-sum outcomes are easiest; normal probability distributions are manageable. Risk management becomes messy when the distribution of outcomes is skewed, with asymmetric up (or down)-side, or a mix of fixed and variable outcomes.

To sum it up:

• A decision-maker faces a conflict of interest when they can influence the allocation of inputs and/or outputs across multiple ongoing structures. Colloquially: “we are in this together, but different”.

Much of the literature on conflicts of interest is focused on identifying quite subtle conflicts. Risk managers, myself included, can experience difficulty explaining to well-meaning professionals why apparently subtle pressures can create potential for conflict. The proposed definition has the potential to help risk managers identify conflicts of interest in a consistent way.
Further work will allow us to analyse why some conflicts and situations are more or less troublesome, with the ultimate aim of ranking and managing conflicts of interest in a consistent fashion.
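As a first, very rough illustration of what consistent ranking might look like, the elements of the proposed definition can be turned into a checklist and scored. Everything here is a hypothetical sketch: the field names, weights, and the `conflict_score` function are illustrative choices, not an established methodology.

```python
# Toy checklist scoring a decision against the elements of the proposed
# definition: multiple roles, persistence, motive, and outcome asymmetry.
from dataclasses import dataclass

@dataclass
class Decision:
    roles_held: int            # entities/roles the decision maker belongs to
    ongoing: bool              # relationship persists over time
    motive_value: float        # value at stake (can be negative or positive)
    asymmetric_outcomes: bool  # skewed or mixed fixed-and-variable outcomes

def conflict_score(d: Decision) -> int:
    """Crude ranking: each element of the definition adds to the score."""
    score = 0
    if d.roles_held >= 2:        # part of two or more structures
        score += 2
    if d.ongoing:                # ongoing relationship
        score += 1
    if abs(d.motive_value) > 0:  # something of value at stake
        score += 1
    if d.asymmetric_outcomes:    # messy outcome distribution
        score += 2
    return score

one_off = Decision(roles_held=1, ongoing=False, motive_value=0.0,
                   asymmetric_outcomes=False)
dual_role = Decision(roles_held=2, ongoing=True, motive_value=10_000.0,
                     asymmetric_outcomes=True)
print(conflict_score(one_off), conflict_score(dual_role))  # 0 6
```

The point is not the particular weights but the discipline: scoring the same elements the same way for every decision is what makes ranking consistent.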

author Rory Flynn Rory Flynn, director and consultant, has a background in investment management as an analyst, portfolio manager, CIO, and executive board member. He was part of a successful, award-winning team managing European and global equity portfolios. More recently he has been active as an independent director of a local credit union. He is currently carrying out research on governance and risk management. He is a CFA charterholder and recently earned the PRM designation. Rory is based in Dublin, Ireland, where he lives with his family. He enjoys cycling, though he has not yet managed to win a race.



do interest rate markets face a hard transition from LIBOR / Euribor to RFR-based benchmarks?

by Christian Behm, Peter Woeste Christensen & Andreas Hock

Today’s system of interest rate benchmarks is about to change fundamentally. Risk-free rates (RFRs) are expected to replace LIBOR as the benchmark for all types of products. These changes are associated with both systemic risk and risks faced by individual market participants. Many banks have followed a “wait-and-see” approach. However, the pace of change in market practices is increasing, for example in the clearing area. It is therefore time to prepare for the upcoming transformation.

I hope it is already clear that the discontinuation of LIBOR should not be considered a remote probability ‘black swan’ event. Firms should treat it as something that will happen and which they must be prepared for. […] Firms that we supervise will need to be able to demonstrate to FCA supervisors and their PRA counterparts that they have plans in place to mitigate the risks, and to reduce dependencies on LIBOR.


Andrew Bailey, Chief Executive of the FCA (July 12, 2018)

the market and the benchmark regulation

In the wake of the LIBOR scandal, IOSCO1 published its 2013 ruleset “Principles for Financial Benchmarks”. In turn, the EU implemented IOSCO’s principles as part of the EU Benchmark Regulation (EU BMR or EU 2016/1011). The EU BMR has been in effect since January 1, 2018. After the two-year initial transition period, all financial benchmarks must be EU BMR-compliant from January 1, 2020. The aim of the EU BMR is to create a regulatory framework with suitable governance and control requirements to reduce the vulnerability of benchmarks, especially to manipulation. A benchmark anchored in actual transactions is the best safeguard against manipulation, and as such is one of the central principles of the regulation.

1 / International Organization of Securities Commissions



The problem with the IBORs (Interbank Offered Rates) is that the benchmarks are trying to capture a market that hardly exists any more. Unsecured wholesale bank term funding has almost vanished. ICE/IBA (the administrator of LIBOR) confirmed this in an April 2018 paper.2

There are now real concerns about the sustainability of certain IBORs due to a significant decline in activity in the unsecured bank funding market that they are supposed to represent. Given the limited number of actual transactions, and with banks reluctant to provide submissions based on judgement, the viability of certain IBORs is now in doubt.


IBOR Global Benchmark Transition Report, June 2018 – ISDA, AFME, ICMA, SIFMA, SIFMA AMG

Due to the fragile nature of the IBORs, a new system of RFR-based benchmarks is being developed for the major3 currencies. The establishment of the new benchmark families follows similar but individual paths in each currency.

managing complexity and risks

The analysis starts with splitting risks into systemic risks (market availability and stability) and risks faced by individual market participants (legal, operational, financial, etc.). Identifying and mitigating individual risks requires an understanding of the overall complexity of the task, which arises from several aspects. First, fragmented developments across markets: currencies, benchmarks, products, legal frameworks, jurisdictions and client segments. Second, the very high number of affected counterparties and the notional volumes involved. Third, uncertainty regarding further developments and the risk of unexpected scenarios with a substantial impact on markets.

2 / ICE LIBOR Evolution 3 / EUR: USD: GBP: CHF:



The future of Libor beyond 2021 remains an uncertainty, and in light of the waterfall results published by ICE it seems particularly uncertain for CHF.


Minutes from the meeting of the National Working Group on CHF Reference Interest Rates (June 4, 2018)

key risk scenarios

Eonia (Euro OverNight Index Average) is the only critical benchmark not pursuing EU BMR compliance; its administrator, EMMI, announced this in February 20184. Eonia can therefore not be referenced in any new financial contract after January 1, 2020. ESTER has been announced as the Eonia successor rate. The impact should not be underestimated, since Eonia is the valuation basis even for EUR products that do not reference the rate directly: Eonia OIS prices are used to value all EUR derivatives, which increases the challenge at hand. A seamless transition to ESTER is not possible. In addition, there is a significant spread between Eonia and ESTER. Currently communicated plans state that ESTER will be available from October 2019. Risks include:

• Regulators do not authorize Eonia for legacy business after January 1, 2020.
◦ Unlikely, due to the risk of disruption and market instability.
◦ A black swan type of event.

• Markets in products referencing the successor rates are non-existent or illiquid on January 1, 2020.
◦ The probability is difficult to assess; the tight time schedule increases the risk.
◦ Potentially major impact, with systemic risk for markets.

• Basis risk from hedging Eonia legacy positions with ESTER results in inefficient risk management and hedge accounting.
◦ High probability of occurrence.
◦ It will be difficult to value legacy positions due to the lack of observable ESTER curves. The impact depends on the specific organization.
◦ We assume the market will attempt a rapid migration to the successor benchmark.

• Highly complex and risky operational transition, both external (negotiations and repapering of contracts) and internal (valuation of positions and funds transfer pricing).
◦ High probability of occurrence.
◦ If no industry-wide standard procedure can be agreed upon, counterparties will need to negotiate the exact transition conditions for each portfolio. Market participants should assess the impact on their organization individually.

4 /



LIBOR & Euribor: the situation differs by currency. Common risk issues are:

• Regulators do not authorize the use of the benchmark for legacy business after January 1, 2020.
◦ Unlikely, due to the risk of disruption and market instability.
◦ A black swan type of event.

• Regulators do not authorize the use of the benchmark for new transactions after January 1, 2020.
◦ The probability is difficult to assess.
◦ Major systemic risk impact.

• Banks withdraw from the panels.
◦ A possible scenario after 2022.
◦ Banks would need to adapt to the post-IBOR environment, with significant impact.

• Liquidity moves to RFR markets.
◦ A possible scenario after 2020; a gradual event as the RFR markets gain momentum.
◦ A smooth transition that limits the impact. However, the parallel phase of different benchmarks in one currency comes with significant challenges for systems and processes.

call for action

The developments affecting the current system of interest rate benchmarks are fundamental. Changes are underway and will have a significant impact on organizations. The many known unknowns and unknown unknowns should not keep anyone from starting their preparations. On the contrary, the complexity of the matter and the different risk scenarios require an early start to the planning process. The impact will affect multiple systems, processes, products and clients. We recommend managing the overall program based on risk scenarios. This allows an organization to react effectively should risks materialize during the transition.5
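A scenario-based program can start from something as simple as a ranked risk register. The sketch below maps the qualitative likelihood and impact ratings used in the scenarios above onto a numeric scale so workstreams can be ordered consistently; the scale, the scores assigned, and the scenario names are all illustrative assumptions, not assessments from the article.

```python
# Minimal scenario-based risk register (hypothetical scales and ratings):
# score = likelihood x impact, used to order transition workstreams.

LEVEL = {"low": 1, "medium": 2, "high": 3}

scenarios = [
    {"name": "Benchmark banned for legacy business", "likelihood": "low",    "impact": "high"},
    {"name": "No liquid successor market by 2020",   "likelihood": "medium", "impact": "high"},
    {"name": "Eonia/ESTER hedge basis risk",         "likelihood": "high",   "impact": "medium"},
    {"name": "Contract repapering fails to scale",   "likelihood": "high",   "impact": "high"},
]

def score(s):
    return LEVEL[s["likelihood"]] * LEVEL[s["impact"]]

for s in sorted(scenarios, key=score, reverse=True):
    print(f'{score(s)}  {s["name"]}')
```

Even a crude ordering like this gives the program a consistent basis for deciding which mitigation plans to build first and which scenarios merely need monitoring triggers.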

5 / To stay up to date see also or connect to



It is important to keep an eye on the quantitative effects of the transition when converting portfolios and negotiating with counterparties, in order to avoid adverse P/L effects. These must already be considered when agreeing on fallback language.

conclusion

The change will bring significant challenges over the coming years. Careful planning of both approach and milestones is required. A new system of interest rate benchmarks also means a shift in the allocation of banks‘ refinancing risk. Today, liquidity and credit risk are components of the IBORs, so IBOR-referencing products carry those risk components as well. This is about to change in an RFR-based system. In addition, basis risk between different tenors, such as between 3-month LIBOR and 6-month LIBOR, will diminish. In the long term, this offers the potential of a simpler interest rate framework with fewer curves involved.

authors Christian Behm Christian Behm has more than 15 years of consulting experience with a focus on the capital markets business. At LPA he is responsible for Risk & Quant Consulting and leads the IBOR Transition Practice Group.

Peter Woeste Christensen Peter Woeste Christensen has more than 25 years of capital markets experience, with a deep understanding of products, processes and technology. As a director within the Risk & Quant Consulting practice at LPA, he plays a pivotal role in the IBOR Transition Practice Group.

Andreas Hock Andreas Hock has been with LPA for more than three years and is part of LPA’s IBOR Transition Practice Group. As part of LPA’s Risk & Quant Consulting team, he focuses on regulatory topics for market and counterparty risk as well as the valuation of financial instruments.



the FRTB: concepts, implications and implementation

by Sanjay Sharma & John Beckwith Sanjay Sharma and John Beckwith’s new book, The FRTB: Concepts, Implications, and Implementation, explores the Fundamental Review of the Trading Book (FRTB) regulations and their consequences and takes a look at the principal components of the new guidelines. We sat down with Sanjay and John to discuss their book.


What was your motivation for writing this book? Why is the timing of the book so important?

Sanjay and John In our roles as risk and credit executives at RBC, with six decades of collective experience at other banks, we helped navigate the bank through the 2008 global financial crisis in better shape than most by identifying and hedging risk factors before spreads made such hedges uneconomic. We were able to identify these risks ahead of the curve by spotting slowing trading patterns in instruments before prices reflected them in the market. Subsequently, we worked through the inevitable global regulatory response to the crisis and the practicalities of implementing those regulations on a robust global trading platform: Basel 2.5, the Liquidity Coverage Ratio, CCAR/stress tests and others. In our view, none of these addressed the fundamental need for changes to trading book management at the instrument/risk factor observability level that enabled us to identify and mitigate market risk in 2006. That task is accomplished by the design and implementation of the FRTB. The FRTB is nearly 10 years in the making and will have as great an impact on trading books as Basel 2 had on banking books. It is the first regulatory framework to consider the liquidity and observability of risk factors, creating a market-driven, self-correcting mechanism for the capital attribution of trading desks when markets slow. As we mark the 10th anniversary of Lehman’s collapse, the Basel Committee (BCBS) is making its final adjustments to the framework, with expectations of global implementation over the near term. Banks will need to invest in and align new platforms, new calculation engines, and new corporate governance frameworks in order to meet the revised regulatory deadlines.

What impact will FRTB have on banks?

Sanjay and John Although recalibration will soften the initial capital impact, initial estimates suggest that the increase in capital charge for banks’ trading books from FRTB adoption will be considerably higher than current Basel 2.5 capital requirements for trading desks/banks. The impact of the capital increase will vary by jurisdiction.



In Europe, a “phase-in” provision in the draft CRR proposes that a multiplier of 65% be applied for three years, a period which could be extended after review by the EBA. Further, banks adopting the standardized approach (SA) are expected to face capital charges 40% higher than desks or banks adopting the internal models approach (IMA). The sheer increase in capital charges has attracted the attention of banks’ senior management. In addition, the wide gap between the standardized and internal models approaches will set the stage for competitive rebalancing across the industry. Banks that invest in and obtain regulatory approval for the IMA may need to allocate only half or less of the capital required under the SA – giving them, all other factors being equal, twice the return on capital of SA banks.
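The return-on-capital point is simple arithmetic and can be made concrete with a back-of-the-envelope example. The numbers below are purely illustrative, not FRTB calibration output: they just show that, for equal revenue, return on capital scales inversely with the capital charge.

```python
# Illustration: halving the capital charge doubles return on capital,
# all other factors (revenue, costs) held equal.

def return_on_capital(revenue, capital):
    return revenue / capital

revenue = 100.0                  # illustrative desk revenue
sa_capital = 1000.0              # illustrative SA capital charge
ima_capital = 0.5 * sa_capital   # IMA at half the SA charge (assumed)

print(return_on_capital(revenue, sa_capital))   # 0.1
print(return_on_capital(revenue, ima_capital))  # 0.2 -> twice the SA return
```

The same inverse relationship explains why the SA/IMA gap creates such a strong competitive incentive to invest in model approval, even net of the implementation cost.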

What are the key things banks need to know as they prepare for FRTB?

Sanjay and John

There are three key things:

1. The FRTB is coming, despite the skeptics.
2. The changes will have a deep impact.
3. There is not nearly as much time to implement as many banks currently believe.

For what audience is the book intended? Who will benefit the most from reading it?

Sanjay and John The FRTB is not simply a new capital calculation engine. It will create an entirely new way to manage, monitor and report trading activities across banks around the world. As such, this book is intended to be useful to bank practitioners at multiple levels.

Bank Executive Management – must plan before implementation. Moving portfolios and changing strategies will be costlier, more difficult, or even impossible in some cases, so advance planning will be imperative.

Chief Financial Officer – will have to define a process early for cost and capital allocation by regulatory trading desk (RTD).
• Some lines of business (LOBs) will become unprofitable. Define a process for identifying issues and alternatives early.
• Cost allocation to individual RTDs will be critical to LOB analysis.
• Consider dependencies and alternatives if unprofitable businesses support profitable ones.
• Consider how to allocate regulatory equity capital across LOBs, paying particular attention to LOBs with related clients on either side of the boundary.



Business Head – Business Heads managing global trading strategies will need to consider whether and where to establish global RTDs and where to establish regional RTDs. Factors to consider include:
• Differences in regulatory interpretation across jurisdictions
• The capital benefit of separating NMRF and long-liquidity-horizon assets
• The incremental operating, reporting and management costs associated with each RTD
• Developing a coherent RTD optimization strategy

Treasurer – must be an integral part of the planning process, early and at all levels.
• Treasurers will need to ensure that regional and global ALCOs are aligned to RTD strategies.
• Individual RTD strategies may have large impacts on regional liquidity.
• Create a consistent global process to provide feedback regionally.

Chief Risk Officer – the fundamental challenge is to provide resources for a more complex world.
• CROs will need to establish more granular governance protocols for RTD monitoring and reporting, driven by a common data set.
• The CRO will also need to establish new policies and procedures regarding movements across the boundary.

Chief Operating Officer, Chief Technology Officer and Chief Transformation Officer
• COOs, working with CTOs, will need to develop a common data infrastructure and protocol to feed inputs to all trading models in a uniform manner.
• The COO will also need to ensure that RTD strategies have common benchmarks for addressing operational risks within their desks.

Bank Analyst – should not underestimate the cost of, or distraction from, the FRTB.
• With a 2019 implementation date, most banks will only begin to focus earnestly in 2017–18.
• Do not expect an early dividend from regulatory easement.



Regulator or Supervisor – must be clear about guidance and wary of jurisdictional conflict.
• Regulators will need to articulate clearly the conditions necessary to approve an RTD strategy, and under what circumstances, if any, they will allow operating sub-desks under RTDs supervised by other jurisdictions.
• Staff appropriately: greater granularity will require more analysis resources.
• Communicate requirements early.
◦ Banks will need time to invest in technology.
◦ Reconcile interpretational differences between jurisdictions early to avoid regulatory arbitrage.

What do you hope people come away with after reading the book?

Sanjay and John Readers should come away with a robust understanding of why the FRTB is so critical, the impact it will have on their organization and, most importantly, how to implement it within their organization in an efficient and effective manner. The book addresses the following fundamental questions:
• Which model approach and framework captures risk better vis-à-vis the capital charge?
• Which approach implies appropriate capital vis-à-vis revenue and risk?
• Which framework – SA, IMA, or a combination thereof – is practicable?
• Are the risk and capital measures computable in the current infrastructure?
• If not, is the transformation optimal from an investment and capital perspective?
• Which businesses will be affected, sunset, or invested in and scaled?
• What process and cultural changes have to be made across the institution?

authors Sanjay Sharma Founder and chairman of GreenPoint Global.

John “Jeb” Beckwith Managing director and head of financial institutions at GreenPoint Financial.



PRMIA and RIM Together, we shape the landscape of risk management

by Kristin Lucas, PRMIA Managing Director of Operations

PRMIA is excited to announce its recent acquisition of the Risk Management Initiative in Microfinance (RIM). With an ethos very similar to that of PRMIA, RIM began as a grass-roots effort by industry professionals to set a standard for risk management practice in the microfinance industry. Both PRMIA and RIM are focused on supporting the industry through education, training, networking, and providing an open forum for the advancement of risk management practices.

For PRMIA, this acquisition brings a whole new arena in which to connect with risk management professionals. The microfinance industry is growing and developing at a rapid pace, and the setting of risk management frameworks and standards is critical to that growth. As inclusive lending and microfinance offerings grow and overlap more and more with traditional banking streams, we are excited to have well-rounded discussions and cross-pollination of ideas between the traditional banking/investment and microfinance industries.

Begun in 2013, RIM has developed a pathways-based best practice standard for risk management in the microfinance sector: the RIM Graduation Model. The process gives microfinance institutions the tools to assess their risk management structures, processes, and adherence to risk management standards. Institutions can then determine a strategic improvement pathway to bring their risk management practices in line with the Graduation Model.

Together, we can learn from the best practices of both industry sectors and grow from the exposure to each sector’s particular challenges and strengths. Watch for continued training on and promotion of the RIM Graduation Model, a certificate program in risk management in microfinance, and a growing library of content on microfinance topics, including member webinars.
As we welcome RIM members into the PRMIA fold, please watch for opportunities to meet at chapter events and through PRMIA online communities.



bringing stakeholders together to discuss financial sector regulation

by Steve Lindo, Principal, SRL Advisory Services

On November 28, the PRMIA Washington DC and New York chapters are again collaborating to hold a forum on financial services regulation. This year’s event marks the 7th in a series dating back to 2010, in which industry, government and academic experts exchange views on evolving markets, regulations and risks in the financial services sector.

Two features make this event series unique. The first is that the opportunity for experts from these different domains to discuss and debate far-reaching regulatory issues does not occur naturally; it takes the coordinated efforts of PRMIA volunteers and staff to set up a forum where this can happen. To further encourage this invaluable dialog, a pre-conference lunchtime discussion is also held for the speakers and other noteworthy experts. The second feature is that the member profiles of the PRMIA DC and New York chapters span both the industry and regulatory communities, but hardly overlap. It is only through collaboration between the two chapter steering committees that these communities can be brought together at the same time and in the same place.

This year’s forum will open with a keynote review of “The Financial Crisis – Ten Years After” by two distinguished speakers, followed by panel discussions on “What Regulation Isn’t Working and Needs to Be Scrapped?”, “What Regulation Is Working and How Can It Be Improved?” and “What Roadmap Is Needed for the Regulation of Digital Finance?” Full details of the agenda, location and registration information can be found here. We greatly appreciate the support for this year’s event from our sponsor Kroll Bond Ratings and venue host Steptoe & Johnson.



canadian risk forum 2018

The Changing Landscape of Risk

PRMIA Toronto is proud to be hosting the 6th Annual Canadian Risk Forum, November 12–14. The global financial crisis brought an increased focus on traditional risk management practice at financial institutions. Risk management showed signs of weakness from basic retail mortgages to exotic credit derivatives, and the weaknesses spread well beyond banks to rating agencies, insurance companies, central banks and regulators. The post-crisis era led to massive deleveraging and de-risking of institutions and a period of increasing regulation. Budgets, headcounts and technology usage in risk management increased to compensate. Ten years on from the crisis, there is a growing perception within institutions that they have focused too much on the well-understood and well-measured categories of credit and market risk, and a growing concern that they have not paid enough attention to non-financial risks, in particular some risks that are poorly understood and very difficult to measure.


Donald Rumsfeld famously either confused or enlightened his audience at a news briefing on this topic: “Reports that say that something hasn’t happened are always interesting to me, because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns – the ones we don’t know we don’t know. [I]t is the latter category that tend to be the difficult ones.”

The “unknown unknowns” of risk management are what keep CROs and risk managers up at night, and some can even be broadly identified and named. We have always been aware of conduct risk, but in today’s world of rapid information flow the impact can be disproportionate to the incident: the conduct of a single individual at a branch can spread to social media, then to traditional media shortly after, and then impact shareholder value. The biggest unknown unknown of them all is cyber risk, where a single point of failure can lead not only to very large direct losses, but to unimaginably large potential losses of reputation. The speed at which information travels makes it very difficult for any business to control its reputation on social media; for financial institutions, who rely on their reputation, the losses can be devastating. The theme of this year’s Canadian Risk Forum is the Changing Landscape of Risk Management, and our broad discussions will cover:



• The changing roles of financial risk management: Market and credit risk management groups are in a post-expansion phase, facing pressure on headcount but at the same time extensive regulatory expectations, the most intense being the “Basel IV” banking reforms.

• Emerging risks: Conduct and culture. Cybersecurity. Vendor risk. Environmental and social. Non-financial risks have become an increasing focus of financial institutions. How are financial institutions managing the expanded risk management mandate? How are they coming to terms with these risks?

• The role of technology and data: For risk managers, new technologies offer the possibility to do more with fewer resources. Robotic process automation can already alleviate some routine tasks, while machine learning holds out the promise of gaining new insights from data. How are tools evolving for risk managers?

The two-day event will be preceded by a special training workshop on Big Data, Machine Learning and Artificial Intelligence in Financial Risk Management. Learners with little or no previous knowledge in this space will come to understand the fundamentals of this new science and how it is applied in finance, and in risk management in particular. A hands-on session using Microsoft Azure will demonstrate the implementation of ML algorithms.

authors

Andreea Amariei

Andreea Amariei, PRMIA Toronto Co-Regional Director, is an Associate Vice President, TD Securities Change Delivery. In her current role, Andreea is the Delivery Lead for the LIBOR Transition Program for TDS. Andreea joined TD Securities in 2005 with TD Commodity and Energy Trading, where she performed various roles in market risk. In 2010, Andreea moved to Toronto to lead the development of middle office processes for the launch of TD’s Global Precious Metals trading business, and afterwards she provided market risk oversight for the Canadian Interest Rate Trading business. Before TD, she worked in the public sector with Alberta Education in the Budget and Fiscal Analysis Branch, where she began her career in 2000.

David Milne

David Milne, PRMIA Toronto Co-Regional Director, is an Associate Partner in the Financial Services Risk Management practice of Ernst & Young LLP, where he leads the Quantitative Advisory Services group. In a prior role, David was Vice-President at CIBC Capital Markets Risk Management, where he led the team of quants reporting to the Capital Markets CRO (EVP), responsible for methodology, development, and governance of capital markets trading risk. David has 17 years of experience in capital markets modelling and quantitative risk management, including experience at TD, CIBC and Algorithmics/IBM.



calendar of events Please join us for an upcoming training course, regional event, or chapter event, offered in locations around the world or virtually for your convenience.

PRM™ SCHEDULING WINDOW September 15 – December 21

ASSOCIATE PRM CERTIFICATE VIRTUAL TRAINING Sessions released Mondays, October 1 – December 3

OPERATIONAL RISK MANAGER VIRTUAL TRAINING Sessions released Mondays, October 1 – November 26







MODEL RISK MANAGEMENT VIRTUAL TRAINING Sessions released Tuesdays, November 6 – December 11



CANADIAN RISK FORUM November 12 – 14 in Toronto

EMEA RISK LEADER SUMMIT November 14 – 15 in London


PRM TESTING WINDOW November 19 – December 21


2019 PRMIA RISK MANAGEMENT CHALLENGE January 21 - February 3



