Risk Review


RISK REVIEW: AN INSIGHT INTO THE INDUSTRY

INTERNATIONAL

CONTENTS

RISK REGULATION
04 A centralised and consolidated approach to initial margin
06 How to reduce the complexity in risk aggregation and regulatory reporting effectively
08 Expert Opinions

CREDIT AND MARKET RISK
09 The credit markets: Is it a bubble?
11 Risk IT in a brave new world

OPERATIONAL RISK
13 What does good risk actually look like?
15 Back to the future of risk
17 Expert Opinions

RISK MODELLING
18 To model or not to model?
20 The future for modelling and complexity
22 The missing piece of the risk management puzzle

FUTURE OF RISK MANAGEMENT
24 Securing business benefits while complying with liquidity regulations
26 The power of data visualization
28 Expert Opinions


REWRITING THE BOOK OF RISK
By George Stylianides, PwC

Has risk management changed in the last 10 years? It’s certainly taken a battering. The storms which have rocked the industry have sent risk management into crisis mode. It’s retrenched to focus on regulation and compliance. And it’s become reactive.

In that spirit, I think that in five years successful firms will be the ones telling RiskMinds how they’ve evolved by:

1. Keeping risk strategy switched on. Risk changes. So does business. So risk management frameworks need to adapt too. That way they keep businesses compliant but also ready to take opportunities.

2. Making risk management a partner, not just a gatekeeper. The closer the risk function is to strategic and day-to-day decision-making, the more value it adds.

3. Getting the operating model right. The clearer businesses are about who owns risk and who controls it, the better they are at managing it efficiently.

4. Taking a more connected view. Risk works beyond the traditional silos of market, credit, liquidity and operational risk. Joining up risk strategy, processes and teams across businesses can help them absorb shocks better and recover faster.

5. Being fast on their feet. Business happens in real time. Much of it is automated. Risk management has to keep up by harnessing technology to spot risk coming and manage it early.

But as the economy brightens, the industry needs more. It needs risk management to embrace change. As Shakespeare’s Prospero finds after a storm of his own making, the best way to adapt is to evolve: ‘I’ll break my staff…and deeper than did ever plummet sound I’ll drown my book.’

RiskMinds 2015 covered these issues, from the macro-economic to the day-to-day. It’s a great forum for understanding how we can move forward as an industry. Not by drowning our books like Prospero, perhaps, but editing them and adding extra, more purposeful chapters.



RISK REGULATION

By RiskMinds

The impending regulatory changes, and what they mean for risk management within financial institutions, were among the most discussed topics at RiskMinds International in Amsterdam in December 2015. Most agreed that the Basel Committee was “going backwards to come forwards” – in other words, repealing some of the reporting requirements it had instigated in the last decade as a result of the financial crisis, in order to simplify the reporting process. While most welcomed measures that would bring simplification and clarity, what effect this would have on the risk management framework within institutions was the subject of much debate.

One thing was not in any doubt: 2016 will be another busy year for anyone involved in risk management. Will future regulations indeed be simplified? Or will the current pace of change mean that regulatory reporting continues to be a huge consumer of banking resources? Whilst the end may be in sight in terms of output from the Basel Committee, 2016 is a year for the implementation of those changes.

Few would disagree with the overall premise of the latest regulatory reform, whose ultimate aim is to restore confidence in the banking community and bring about a more stable financial sector. However, questions remain as to whether banks can now settle into a routine that involves a greater focus on credit, market and operational risk, rather than simply satisfying the demands of the regulators.

Risk Review: Sponsored Article

A CENTRALISED & CONSOLIDATED APPROACH TO INITIAL MARGIN

The regulatory response to the 2008 financial crisis introduced a set of new rules which has significantly altered the non-cleared OTC derivatives landscape. The rules, which include recently implemented and forthcoming regulations, have the goal of driving transparency and stability in the markets. They include the broadening of mandatory clearing, updates to capital treatment and the introduction of additional bilateral collateral safeguards. The collective impact is forcing financial institutions to adjust business strategies to compensate for changes to capital ratios, cost of credit and liquidity.

One of these regulations addresses the requirement for mandatory initial margin (IM) for non-cleared OTC derivatives, as expressed under rule BCBS-261 pursuant to the Pittsburgh G20 accord. This regulation mandates financial institutions to post collateral to their counterparties to cover related exposures arising from their own potential default (defaulter pays). The implementation of this regulation has far-reaching implications for the operational processes and infrastructure underpinning banks’ collateral management. The regulation will be phased in from September 2016 and expands to a majority of OTC derivative transactors over the following four years; this is just the beginning.

As institutions look to implement mandatory IM bilaterally, they seek to do so in tandem with other regulations, requiring an optimised and ordered approach. Given the number of bilateral permutations between the institutions and portfolios involved, a scalable approach is required to ensure that the flow of margin is calculated, agreed, posted and safeguarded consistently to meet the principles of BCBS-261.

As explored at industry forums, there are a number of obstacles to producing a scalable solution for the market. Dependencies raised include the conclusion of trade data standards, cross-border regulation implementation timings and final regional rule interpretation. In addition to matters outside the industry’s control, such as the publication of final margin rules in key jurisdictions and the final determination of data standards, market players also face significant challenges in developing the operational and technological systems to manage all steps of the IM process with their many derivatives counterparties. There is also the task of bringing sufficient standardisation to margin models to reduce disputes to a manageable level.

Many of these problems could to a large extent be addressed if the process of calculating, delivering and holding collateral could be consolidated through a centralised infrastructure. With these challenges in mind, NetOTC is developing NetOTC Bilateral, a sophisticated IM solution that goes beyond the calculation and calling of standard two-way margining. For more information, visit netotc.com




HOW TO REDUCE THE COMPLEXITY IN RISK AGGREGATION AND REGULATORY REPORTING EFFECTIVELY

By Olaf Badstübner, global head and thought leader of Risk, Compliance & Security for Financial Services at Atos

When it comes to regulatory reporting, most banks still face substantial issues. Moreover, the process for a globally active bank of providing consistent reports in each jurisdiction and across all lines of business is a complex one when banks have to cope with up to 2,000 individual reports.

Risk data aggregation and regulatory reporting are primarily stringent data management processes.


The underestimated factor

Regulatory reporting is mostly led by the finance department as part of the disclosure process, and it is associated specifically with accountancy practice. The Basel Committee on Banking Supervision’s 2013 conclusion very clearly identified that banks’ information technology was inadequate to support risk data aggregation and reporting practices:

“One of the most significant lessons learned from the global financial crisis that began in 2007 was that banks’ information technology (IT) and data architectures were inadequate to support the broad management of financial risks. Many banks lacked the ability to aggregate risk exposures and identify concentrations quickly and accurately at the bank group level, across business lines and between legal entities. Some banks were unable to manage their risks properly because of weak risk data aggregation capabilities and risk reporting practices. This had severe consequences to the banks themselves and to the stability of the financial system as a whole.”¹

This is no surprise, given that the data collection and reporting process involves many isolated source systems, as well as moving data sources and process and media breaks. This makes regulatory reporting complex, and as a result reporting processes are time-consuming, ineffective, inconsistent and difficult to evidence for most financial institutions. At the same time, financial institutions face constantly evolving and more granular reporting requirements such as AnaCredit, AIFMD, MiFID II, CCAR, BCBS 239 and new calculations for IFRS 9, to name just a few.

Assess your current situation

Do a simple test by considering the following statements to determine the maturity of the regulatory reporting process in your financial institution.

1. Our financial institution uses several different processes and platforms across the group for regulatory reporting and risk data aggregation.
2. Data gathering, validation and report creation is a time-consuming process.
3. We are facing issues in data consistency and quality.
4. Excel spreadsheets are still a significant part of our entire regulatory reporting process.
5. We are not able to create on-demand reports within a restricted time frame.
6. We are not able to break down aggregated and reported figures immediately into original source data on a granular level.
7. Changes in calculation and workflow rules are difficult to prove by versioning.
8. We are not able to implement new required reports easily.
9. We have inconsistencies between internal reporting (MI) and external reporting.

If you agree with at least half of the above statements, you should reconsider your regulatory reporting strategy and environment. Addressing a low level of compliance reporting maturity should be a key Executive Board priority.

The Atos Approach

The Atos position is to put a unified and seamless reporting data workflow in place across the entire financial institution. Such a data- and workflow-driven platform gives financial institutions the flexibility to meet existing and evolving reporting requirements anywhere in the world, while delivering a strong foundation to address tomorrow’s regulatory and risk requirements.

• One common framework and platform for all lines of business, jurisdictions and users.
• No fixed data model – data is sourced only once, in its given structure, from all your source systems, without any data mapping process in between. Most reporting systems are based on a fixed data model, making the data mapping time-consuming to adapt when source data changes or new reports are required.
• Data quality and data validation through hierarchical data ownership involvement and sign-off steps.
• Visualised workflow capabilities, which allow the workflow, as well as validation and calculation rules, to be adapted quickly.
• Versioning of the entire process, including data validation, calculation rules, aggregation and sign-off; any change in the workflow leads to a new version that can immediately be compared against previous versions to highlight the changes. This allows your organisation to remain compliant in the longer term and to prove it effectively.
• Simple integration of new reports – a simple update of the template or calculation rules, with no changes to the platform – allows you to adapt to regulatory changes in half the time.
• Full data traceability and analytics, with transparency from sourcing to submission and back again – so you can drill down in your report and analyse at any point where your aggregated data came from.
• Drill-back functionality showing, for each reported figure, the underlying original source data at a granular level, such as transactional data.
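As a rough illustration, the self-test above can be mechanised as a simple scoring routine. This is only a sketch, not part of any Atos product; the statement texts are abbreviated, and the one rule taken from the article is the “agree with at least half” threshold.

```python
# Illustrative scoring of the nine-statement maturity self-test described above.
# Statement texts are abbreviated; only the "at least half" threshold is from
# the article.
STATEMENTS = [
    "Several different reporting processes and platforms across the group",
    "Data gathering, validation and report creation is time-consuming",
    "Issues in data consistency and quality",
    "Excel spreadsheets are still a significant part of reporting",
    "Unable to create on-demand reports within a restricted time frame",
    "Unable to break aggregated figures down to granular source data",
    "Changes in calculation and workflow rules are hard to prove by versioning",
    "Unable to implement newly required reports easily",
    "Inconsistencies between internal (MI) and external reporting",
]

def maturity_verdict(agreements):
    """agreements: one bool per statement; True means 'this applies to us'."""
    if len(agreements) != len(STATEMENTS):
        raise ValueError("expected one answer per statement")
    score = sum(agreements)
    # "If you agree with at least half of the above statements..."
    if 2 * score >= len(STATEMENTS):
        return "reconsider your regulatory reporting strategy"
    return "no immediate action indicated"
```

With nine statements, agreeing with five or more crosses the threshold, since five is the smallest count that is at least half of nine.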

¹ Basel Committee on Banking Supervision, “Principles for effective risk data aggregation and risk reporting”, 2013.




EXPERT OPINIONS

The year ahead for the Single Resolution Board
Elke König, Chair of the Single Resolution Board, explains what’s in store for the Single Resolution Board in 2016.

Has regulation made finance more stable?
Ioannis Akkizidis, Global Product Manager of Financial Risk Management Systems at Wolters Kluwer Financial Services, speaks of the vast amount of work that has been done and the effects it has had on performance.

Where next for financial services regulation?
“Looking in a crystal ball is always difficult,” says Mattia Rattaggi, Head of Group Regulatory Strategy and Advisory at UBS, on the future of financial services regulation. While his views are not necessarily those of UBS, he describes the journey of implementing regulatory reform.

How to avoid regulatory reporting burnout
Charles Richard III, Senior Vice President and co-founder of Quantitative Risk Management, makes light of ‘regulatory reporting burnout’ and suggests solutions.

AMA: Yay or nay?
A hot topic at RiskMinds International 2015: the experts let us in on their views on the controversial subject of the advanced measurement approach (AMA).



CREDIT & MARKET RISK

By RiskMinds

The current state of the credit markets was a key subject for discussion at RiskMinds International 2015: is there a bubble building, and will it burst? If so, when? Will 2016 see a rise in corporate debt defaults, and what effect will this have on the global economy? Will the prolonged period of low interest rates continue, or will central banks turn a corner in 2016? Most agreed that all eyes will be on the Chinese and emerging-market economies in 2016, and the effect these will have on US and European economies’ growth paths. However, it is often the unforeseen events – the ones that simply cannot be accounted for in advance – that have the most impact on markets.

THE CREDIT MARKETS: IS IT A BUBBLE? Having a sound risk management structure remains a key element of being able to keep a handle on credit and market risk, and much of that lies within a sound IT infrastructure that can both manage and assess data. Banks are hopeful of reduced complexity within the regulatory framework, freeing up time and resources to focus on technological innovation that can help them to manage credit and market risk more effectively going forward.




“If the credit cycle were a baseball game, we are in the 8th inning,” explains Edward Altman, Max L. Heine Professor of Finance at the Stern School of Business and guest speaker at RiskMinds International 2015 in Amsterdam. The credit markets are currently in a benign period – with interest rates low, default rates lower than average, and liquidity high. But according to Mr Altman, who has been studying the subject for almost 50 years, “the bubble is building.”

“There are many bubbles that we hear about to do with asset prices,” he said. “With respect to credit, I define a bubble as when the default rate on the assets you are concerned with – in this case, corporate debt – goes above the historic average. The bubble will burst,” he added, “when we get to two-standard-deviation years.”

Benign credit cycles tend to last between four and seven years, or an average of 5.5 years. “We are currently in year six,” he stated. “Currently, we are at a 2.6% rate of defaults, the highest level since 2009, and my prediction is that it will rise to 4.3% next year.” Looking at today’s high yield versus treasury market, spreads are 130bps higher than the historic average of 520bps: “The market is definitely anticipating higher than average default rates,” he added. “Contrarians would say this is not a bubble, this is just opportunistic debt financing, but companies can no longer shift their capital structure according to market conditions as much as they’d like to; when the equity markets go up, risky debt goes up.”

The relationship between default rates in the high yield market and economic cycles is key. Whilst the professor admitted that most macroeconomists would say that it is the economic cycle that leads defaults, he begs to differ, pointing out that, particularly in the last three credit cycles, the default rate started to go up between one and three years before the recession hit.


2015 saw at least US$40bn of defaults, over half of which were in the energy sector. “And it’s only going to get worse,” he predicted. “Keep your eye on the retail space.” Whilst predicting that the credit cycle will end soon, he added the caveat that the actions of the central banks – in terms of interest rate and liquidity decisions – were key to extending the cycle. “The game is likely to end really soon, in my opinion. This time next year, the default rate will rise above the historic average. We may not reach a rate of 10%, but I’m actually convinced that we’re coming to the end of the credit cycle.”
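Altman’s rule of thumb lends itself to a small numeric sketch: a bubble is building once the default rate exceeds its historic average, and “bursting” territory begins roughly two standard deviations above that average. The historic mean and standard deviation used below are hypothetical placeholders for illustration, not Altman’s published statistics.

```python
# Hedged sketch of the bubble heuristic described above. The historic mean and
# standard deviation passed in are placeholders, not Altman's actual figures.
def credit_cycle_signal(default_rate, hist_mean, hist_std):
    z = (default_rate - hist_mean) / hist_std
    if z >= 2.0:
        return "bursting"        # a "two-standard-deviation year"
    if default_rate > hist_mean:
        return "bubble building" # default rate above the historic average
    return "benign"
```

With a hypothetical 3.4% historic average and 2.9% standard deviation, the 2.6% rate cited for 2015 still reads as benign, while the predicted 4.3% for 2016 would mark a building bubble.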




RISK IT IN A BRAVE NEW WORLD

Risk information technology (Risk IT) has become a key cog in 21st-century banking, regardless of geographic location or institutional size. The pressures on banks’ Risk IT teams are tremendous as they deal with fast-changing regulatory and business demands. It is critical that financial institutions have the necessary risk infrastructure in place to accommodate their business strategy. Industry veteran Filipe Teixeira draws on years of experience in the risk infrastructure field. He is now Head of Financial Risk Factory for Unicredit Business Integrated Solutions (UBIS), where he is responsible for market risk, counterparty credit risk, stress testing IT tools and IT solutions. Previously, he was a consultant in this area at Avantage Reply. In this interview, Filipe shares how he is facing up to the current challenges and seizing new opportunities on the horizon.

What do you think are the key challenges that Risk IT has to respond to?

Banks in general are currently facing a number of simultaneous challenges that risk systems need to address. The first is the very market in which we operate. The low interest rate environment affects both the industry and the real economy, not least in terms of profitability.

The other big issue, which should be familiar to your readers, is the ‘continuous wave’ of regulation. New regulatory requirements are being driven at every level – by the Basel Committee internationally, and by the European Commission, the European Banking Authority and the European Central Bank in Europe. Meanwhile, at the national level, domestic regulators are continuing their supervisory roles in the areas of prudence and conduct. The demands of these regulators are also not stable. As we have seen in some of the regulatory reporting requirements – such as stress testing – the specifics can fluctuate and become increasingly demanding.

The third big challenge, in many ways specific to our sector, is the expectation from the industry to cover a greater range of asset classes. Overall, this trio of challenges is putting immense demands on the Risk IT function.




How have you dealt with these challenges at Unicredit?

Like the rest of the industry, Unicredit has invested significantly in our risk infrastructure capabilities over the last four to five years – this includes a significant budget and human resource effort. At this juncture, the industry as a whole is trying to find the equilibrium between two equally important objectives. The first is a faster response time of solution delivery to business requests – we are trying to improve the ‘time to market’ from a Risk IT point of view. Secondly, we are trying to ensure that our solutions support the bank’s overall business strategy, while also remaining cost effective.

In my view, the best way to achieve this equilibrium is to work closely with both the business and the risk sections of the Bank. We need to define our IT strategy to be highly responsive to the needs of both functions. Being a part of these strategic decision-making conversations reduces the likelihood that we will have to ‘firefight’ unexpected situations later on. It enables us to be well prepared and a lot more agile and responsive. We are now better able to deliver cost-efficient solutions, and to be an integral part of the macro vision of the bank.

This key involvement of the Risk IT function in the central decision-making process represents a sea change in our organisational standing. We have become key partners in the shaping of business and risk strategy. This is emblematic of a greater maturity within the organisation and signifies the value we bring both to the business and to risk.

Unicredit has been lauded for its adoption of cutting edge technologies, including in areas like high performance computing and the management of Big Data. Could you comment on the opportunities that exist in this space?


I will answer your question in two parts. Firstly, on high performance computing technology – it is now an essential capability in Risk IT. In this way, we are greatly supported by the solutions that providers have brought to us. Some of these solutions were not designed with the banking industry in mind, but the providers are adapting them for our needs while also taking industry-specific legal constraints into account. Cloud computing is one such example. There are certainly limitations in these technologies – such as legal constraints – but overall we can leverage them to achieve better performance in a more cost-efficient manner.

Secondly, on the use of Big Data. We are increasingly using Big Data technologies to handle some of our market risk and counterparty credit risk-related stress-testing challenges. We are not working with Big Data per se, but we are using related technologies for our own purposes amid the ever-rising amount of data that the business and regulators expect us to handle.

Filipe Teixeira Head of Financial Risk Factory, Unicredit Business Integrated Solutions Filipe Teixeira is Head of Financial Risk Factory for Unicredit Business Integrated Solutions (UBIS). In this role, Filipe is responsible for financial risk information technology (IT) solutions for the Unicredit Group. He has nine years of experience in risk management and Risk IT solutions in British and continental European financial markets. In addition to these areas, Filipe specialises in Risk IT architecture, high performance computing solutions, risk management, capital management as well as the implementation of regulatory processes.

Article first appeared in CRO: Insights Journal, Reply Avantage


OPERATIONAL RISK

By RiskMinds

Conduct and culture came to the fore at the start of the financial crisis, and more recent events have ensured that they remain a hot topic today. What does a sound culture look like? RiskMinds International heard that firms need to take a more disaggregated approach to conduct and culture, bringing it into the heart of the business model in order to ensure the implementation of a sound risk management strategy in this area.

How to deal with non-financial risk will become increasingly important as cyber security, financial crime and geopolitical risk – all of which are difficult to measure – are set to take up more time in the day-to-day responsibilities of a CRO. Fintech also looks to be one of the biggest disrupters of banking going forward. As the digitisation of banking continues apace, banks will need to stay ahead of the game in understanding what this means for their business model.

The ever-changing landscape of operational risk, and the ever-growing list of risk factors that come under this umbrella, mean that now more than ever risk managers need to be at the heart of business decisions and to have an innovative technological and risk framework to work with.

WHAT DOES GOOD RISK ACTUALLY LOOK LIKE? Whilst the scope of risk culture and risk conduct has grown at a rapid rate in the last decade or so, there has been an explosion of the topic of good conduct and good culture in the last couple of years in particular. But what does good risk culture actually look like? In simple terms, according to Rafael Gomes from Accenture, who gave an insightful presentation on the subject at RiskMinds Amsterdam, if you want to know what good risk culture and conduct looks like you have to practice a disaggregated approach. “There is no single risk culture within an organisation,” he stated. As an example, when looking at vulnerable customers, or as US banks call them “protected classes” – customers who at one point in their lives have required special provisions: payment holidays, certain kinds of information and so on – the kind of activities which define good customer outcome and which require certain types of behaviours “don’t sit easily in risk function or risk framework,” explained Rafael.




“As senior managers have grappled with this, they have increasingly come up with a number of crutches to help them that don’t actually work,” he added, before going on to list four main myths of embedding a good risk culture:

Myth number one – that if you hire good people, good behaviours will follow (when in fact individual behaviours are shaped by culture).

Myth number two – that it’s all down to a few bad apples and the barrel is sound (many mis-selling practices were actually an issue of the business model, such as the over-reliance on sophisticated maths).

Myth number three – that incentives are essential for promoting the right behaviours (it’s really the non-financial incentives, particularly among high earners, that must complement them).

Myth number four – the idea that conduct and culture are really trading-floor issues (when in fact many belong to retail).

But perhaps the biggest myth of all, stated Rafael, “is the notion that culture is something intangible and fluffy.” Giving whistleblowing as an example: in order to get the framework right, you need the right technology and tools in place to encourage people to speak up. As Rafael explained: “You need to integrate different phone lines, different channels of escalation, different databases, in order to provide anonymity, to ensure good due diligence and to ensure traceability and an audit trail. This is an example of how good technology can encourage people to speak up.”

Good culture also depends very much on how an institution interprets the rules it has been given. There are two approaches, according to Rafael: one that follows the letter of the law and uses it as a tick-box exercise for compliance, and one that implements the spirit of the law. “If you follow it in spirit you can set out more sustainable behaviours and that can be a strategic differentiator,” stated Rafael. “That begins to break down the game of cat and mouse between regulators and industry.”

The key to building the framework of a good culture, according to Rafael, particularly where customers are concerned, is to identify key touchpoints – whether that’s the marketing message, the decisions made within a business unit, or the employee selling the product. The customer won’t be the only beneficiary of this approach: “Proactive behaviour from banks will ward off regulatory action,” he said.

Whilst using data and KPIs is key to keeping track, he warned against the use of too many (“around 50-100 KPIs is about right”) and emphasised that they are only useful if subjected to qualitative analysis: “KPIs are just data; for it to be useful and for it to get buy-in and to be used sustainably, that data needs to be moderated by qualitative judgement,” he said.

“The key to seeing what good culture looks like in your own organisation,” he concluded, “is to get the right information to the right people and empower them to make better decisions.”



BACK TO THE FUTURE OF RISK “Risk managers today can’t do their jobs without a careful eye on the regulators,” said Michael Alix, Principal, Financial Services Risk Practice at PwC, at RiskMinds International 2015 in Amsterdam. There has of course been an avalanche of regulatory reform since the financial crisis, but in satisfying regulatory demands, risk managers have been unable to spend enough time looking at what future risks might look like. “We know that regulatory compliance is a full time day job, so given that, I suppose you are managing risk at evenings and weekends,” joked Alix.

But with the upcoming changes from the Basel Committee this is set to change: “The positive changes that are coming will free risk managers and risk modellers to manage and model risk rather than model and manage regulatory issues,” he said.

The issue of internal models

Internal modelling became the focus of huge investment in resources as models became more finely tuned and more sophisticated, but it also became an issue for regulators: “Much of the vaunted reduction in RWA came not from changes in risk positions but rather from ‘improvements’ of internal models,” said Alix. “This led to troubling inconsistencies.”

“Sophisticated internal models consumed time and constrained resources,” he added. “This fine-tuning process, which was an industry in itself, became a distraction to risk managers, often to the detriment of focusing on innovation.”




Hence the return to more standardised models will mean that those involved in risk management can get back to their day job – managing risk. “In the future this decoupling of internal risk modelling from regulatory compliance is inevitable, and it can be good. Regulators will still care about internal models but they won’t depend on them,” explained Alix. “Regulatory capital will be what it will be; changes in RWA will require real changes in positions, and risk managers can return their attention to forward-looking analysis of risks.”

That’s not to say that internal modelling is obsolete. Rather, it will need to be repurposed for a greater role in predicting future risk. “Having spent resources building complex models, institutions should repurpose those to improve their risk management capabilities,” explained Alix. “Regulations won’t drive the models; firms need to know how to manage them on their own.

“In short, as regulatory changes take a step backwards, risk managers can move to a new future of risk management and also approach the real unpredictability of the external environment.”


The destination is the same

Whilst the next set of Basel regulations will represent a fork in the road, Alix emphasised that the destination is ultimately the same: “We’re all hoping to achieve effective risk management. The roads to success will be very different. We will use data, technology and models to find new routes, we’re not going to just fill potholes, and we will make better decisions with far greater success.” “The future of risk is forward-looking,” warned Alix. “Organisations that don’t adapt will be outperformed by those that do.”



EXPERT OPINIONS

The biggest risks in 2016 “Cyber risk and criminality are probably the biggest issue facing risk management,” says Richard Bernst, Director of Risk Management at Export Development Canada. But what does he see for 2016?

How will Basel III change banking? “There will be much more certainty and clarity,” says Neil Esho, Deputy Secretary General of the Basel Committee on Banking Supervision. But how else will this important reform affect the industry?

What does 2016 hold? Fang Du, Adviser in the Division of Banking Supervision and Regulation, Board of Governors of the Federal Reserve System, thinks that, while a lot has been done in risk management, there is still a lot to learn from data.

The FCA’s goals Mary Starks, Director of Competition at the Financial Conduct Authority explains the FCA’s three main goals to oversee the risk industry, maintaining its successful practices and supporting its progression.



RISK MODELLING

By RiskMinds

The future of models was thrown into the spotlight given the impending changes to the regulatory framework. Having spent massive amounts of investment and resources on building internal models, the question was, with the reduced reliance on these by regulators, what would the future hold for internal models? Some argued that this was the perfect opportunity to modify those models and reframe them to work within an internal risk management strategy, rather than using them as an exercise to fulfil regulatory requirements. However, the question remained as to whether, with the reduced reliance on internal models by regulators, banks would continue to be able to justify further investment in what is a hugely expensive and time-consuming process. Panels also asked whether internal models were inherently flawed, and whether they were a distraction from looking at risk as a bigger picture. Others argued that, with or without a regulatory focus, they now constituted a crucial element of being able to make sound business decisions. One thing was not in any doubt – the conversations were interesting and would continue apace in 2016.


TO MODEL OR NOT TO MODEL? Bearing in mind that the prevailing capital regime has a strong influence on how firms take and manage risk, what effect will the proposed regulatory changes have? “People shouldn’t be calling it Basel IV,” stated Giulio Mignola, Group Head of Enterprise Risk Management at Intesa Sanpaolo, “because they are going the exact opposite way. The big shift in the regulatory community’s approach to models will be to actually remove the option of internal models from the spectrum,” he said. This was not necessarily a good thing, as he went on to explain: “What the regulators are proposing is determining somehow the business models of banks with respect to operational risk, and introducing, instead of removing, systemic risks in the system. Every bank is optimising its own business model to try to fit with the standardised capital requirements and not with its real risk profile. We may have made all the banks sit on the same side of the boat.” Fellow panel member James Dennison, Vice President, Corporate Operational Risk Management at Bank of Montreal, agreed with Giulio’s boat analogy: “You basically have an oar on either side: on the one side is the AMA guidance and on the other side are the principles for sound management of operational risk, and those two oars have to work together. If we rewind and live through a world where we didn’t have the quantitative AMA piece of it, our world would be very different.” The focus on models led to bigger and better discussions within businesses – it brought operational risk into the boardroom – and helped with the design and implementation of operational risk practices within banks: “I think that if we hadn’t had that focus,” said James, “we would be in a very different place to where we are today.”
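The standardised capital requirements the panel contrasts with internal models can be illustrated with Basel II's Basic Indicator Approach for operational risk – a minimal sketch, assuming the standard 15% alpha factor; the income figures are invented for illustration:

```python
# Sketch of Basel II's Basic Indicator Approach (BIA): operational risk
# capital is 15% (alpha) of the average of the last three years of
# positive annual gross income. Figures below are illustrative only.

ALPHA = 0.15  # regulatory alpha factor under the BIA

def bia_capital(gross_income_3y):
    """Average only the years with positive gross income, then apply alpha."""
    positive = [gi for gi in gross_income_3y if gi > 0]
    if not positive:
        return 0.0
    return ALPHA * sum(positive) / len(positive)

# Hypothetical gross income (millions) for the last three years
print(bia_capital([120.0, 90.0, -10.0]))  # 0.15 * (120 + 90) / 2 = 15.75
```

The appeal of such a formula is exactly what the panel debates: it is trivially comparable across banks, but it is insensitive to the bank's actual risk profile.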



So what went wrong?

There was not enough guidance, agreed both panel members. “Clearly there was no best experience out in the market, very few banks were using models for operational risk,” explained Giulio. “So the committee decided to draft guidelines that were very open to innovation, very open to invention by banks, and see if they could pick up at the end the best ones and best practices over time. It was supposed to be the garden where a thousand flowers bloom, but instead a million bloomed!” But why throw it away? asked Giulio: “The attitude from regulators is a bit extreme. We know the disease – we have to cure it, not kill the patient. The agenda of the regulators is to finish everything by 2016, so they are killing the thing that is not working because they don’t have the time to solve it.”

The effect on banks

Resources within banks will undoubtedly change as a result of the changed guidance. “If you look at the resources that are assigned today, it’s a qual and quant focus,” said James. “If that quant approach goes away, in my mind it might not be a net neutral impact; we might need to see more supervisory focus and effort within our banks on the qual aspect. There is a potential risk that businesses might say, given there’s a capital charge for me, why am I going to spend more time on the qual aspect? That needs to be managed carefully by risk groups and supervisors.” “The risks are big for the discipline as a whole,” added Giulio. “You will not just hamper measurement of operational risk, you will hamper management.” Directing his comments directly to regulators, Giulio urged that they listen to industry: “If you remove internal modelling it would take ten years at least before those models can be brought back in. Don’t be driven by a shiny future of high cap numbers.” “If this is the path we go down, my request would be that there would be much more guidance on the qual, forward-looking measure, because that’s where we need to focus,” said James.




THE FUTURE FOR MODELLING AND COMPLEXITY As another fundamental overhaul of banking capital requirements looms on the horizon, some have questioned whether this heralds the beginning of the end for the use of internal models in regulatory capital. This was the subject of discussion for a distinguished panel at RiskMinds International 2015 in Amsterdam.

“I would not know how we could do day-to-day business without models and how we could do day-to-day supervision without models,” said Maarten Gelderman, Division Director at De Nederlandsche Bank.

Are internal models dead?

“I think the question should actually be, should regulators determine whether you use models?” began Maarten Gelderman, Division Director at De Nederlandsche Bank.

“In the last couple of years we have moved to a world where if you have an incentive to use models, you are turning industry on its head. You are moving to a world where you say my job is to comply with regulations in order to ensure that I’m not to blame, and apart from that I’m going to ensure that I generate return on equity. Then you enter a world where every failure will receive an answer in the form of additional regulations.” Nassim Taleb was more forceful in his answer: “When I was a trader and you wanted to make money you went and found a market with a lot of regulation,” he said, giving as an example selling a fixed income note that depended economically on equity to a trader in Japan who was not allowed to have equity on his books. “Models are never a reality; if you give a trader a model he very quickly, within a day or two, will figure out where the model does or doesn’t represent reality. Whenever there is a difference between the model and the market price, it’s the model that’s wrong. The market is smarter than the people who build the models, otherwise they would all be billionaires. You have to listen to the market, not the model.” “You can always slap the mathematicians and say it’s far from practice,” stated distinguished mathematics professor Paul Embrechts. “It’s an issue of trying to understand the model landscape, the model portfolio. I don’t think that’s really been done and that’s an important aspect.” Paul Embrechts went on to remind the audience that models came about as a result of questions from industry: “I’m responsible for copula, not for the concept, but for having reintroduced it. Why did we introduce it? Other people say this is really creating models that are too complicated, but it came about as a question from industry. Many of the reasons for our research are because of questions from the market.”

Nassim Taleb had a different view: “Using a model by definition for risk management has a severe flaw, because you end up with the regulation causing risk rather than mitigating risk,” he said. “The point is that it is a procedure, not a result, and you cannot use it for anything other than a vague indication.” “One of the lessons of the crisis is that you cannot manage model-based measures alone,” said Maarten Gelderman. “Industry should be encouraged to say no to certain products.” “The question we should be asking,” concluded Paul Embrechts, “is how can we develop models that really benefit society?”

Is there a future for models?

“Yes, definitely,” said Maarten Gelderman. “I would not know how we could do day-to-day business without models and how we could do day-to-day supervision without models.”




RiskMinds Report

THE MISSING PIECE OF THE RISK MANAGEMENT PUZZLE By RiskMinds

What’s the industry standard definition of liquidity? This is the question that instigated Stefano Pasquali’s research at Bloomberg five years ago. The answer was, after five years of crunching numbers and reading papers, that there isn’t one.

“What we try to do is build a market impact model. Bid/ask spread is an important component but not enough, particularly if you are in a stressed market condition.”

“Liquidity was the missing piece of the puzzle in terms of risk management,” said Stefano in his fast-paced presentation on A Machine Learning Approach In The New Regulatory Landscape at RiskMinds 2015 in Amsterdam. “There was no industry standard, and a disconnect between what people do on the academic, financial institution and regulatory side.” A lack of data, and a lack of data accuracy, are just two of the elements that make estimating liquidity difficult. Yet it is something that regulators are increasingly focused on – specifically, how liquidity is measured and managed. Bid/ask spread, an approach relied upon by academia and by the market, is inadequate. Pasquali embarked on developing a holistic application that could be used across every asset class. “Liquidity is a multi-faceted beast,” he said. “The traditional quantitative approach will never work. There are too many variables, that’s why we introduced machine learning.” Bloomberg’s liquidity assessment tool combines rich Bloomberg financial data with a machine learning engine and a market impact model that accounts for all the relevant factors that can influence liquidity. It does not rely solely on trading data, but employs cluster analysis to identify comparable assets and provides a dynamic list of these securities. This is how it addresses the problem of a lack of data.



Risk Review: Sponsored Article INTERNATIONAL

“This is where machine learning comes into play. Instead of defining the benchmarking system we let the machine do that, and around the bond we build a cluster of comparable bonds so that for every single asset we have a group of 2, 5, 20 or whatever similar bonds or assets or equities where we can borrow information from,” explained Stefano. Bloomberg’s liquidity assessment tool provides information such as the probability of selling a specific volume at a specific price, the expected cost of liquidation and expected maximum volume and expected days to liquidate a specific volume (given a maximum market impact). The tool also provides the level of uncertainty for each of these returns. “What we try to do is build a market impact model. Bid/ask spread is an important component but not enough, particularly if you are in a stressed market condition.”
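The cluster-of-comparables idea Stefano describes can be sketched as a simple nearest-neighbour search over bond characteristics – a minimal illustration in which the features, figures and distance metric are all invented, not Bloomberg's actual methodology:

```python
import numpy as np

# Minimal sketch of the "comparable assets" idea: for a thinly traded
# bond, find its nearest neighbours in a normalised feature space and
# borrow their observations. Features and data are invented; the real
# liquidity engine described in the article is far richer.

def comparable_assets(features, target_idx, k=3):
    """Return indices of the k bonds closest to the target in feature space."""
    X = np.asarray(features, dtype=float)
    # Put each feature on a comparable scale (z-scores) before measuring distance
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    dists = np.linalg.norm(Z - Z[target_idx], axis=1)
    order = np.argsort(dists)
    return [int(i) for i in order if i != target_idx][:k]

# Columns: coupon (%), years to maturity, log amount outstanding
bonds = [
    [5.0, 10.0, 9.2],   # 0: target, rarely traded
    [4.8, 9.5, 9.0],    # 1
    [1.2, 2.0, 7.5],    # 2
    [5.1, 11.0, 9.3],   # 3
    [0.9, 1.5, 7.0],    # 4
]
print(sorted(comparable_assets(bonds, target_idx=0, k=2)))  # [1, 3]
```

With a cluster in hand, trade observations from bonds 1 and 3 can stand in for the missing history of bond 0 – the "borrow information" step in the quote above.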

Stress testing

Data within the application can be overridden. Bid/ask spread or a reference price can be changed, for example, which opens the door to another big use of the application – stress testing. There is direct application of Bloomberg’s liquidity assessment tool in prudent valuation, Basel III LCR/LRM, SEC Rule 22e-4, recovery and resolution planning, Volcker (RENTD) and MiFID II, and indirect application in collateral management, best execution and LCR optimisation. To date, Bloomberg’s tool covers more than 130,000 financial instruments, including corporate and government bonds worldwide. www.bloomberg.com



FUTURE OF RISK MANAGEMENT

RiskMinds Report

By RiskMinds

Every year the scope of the conversation at RiskMinds International changes. This year, the focus of discussion was very much on the impending regulatory changes and how banks could and should adapt to them. What the future of risk management entailed in a continually changing landscape was the question that many tried to answer. Doubtless much of the discussion at 2016’s event will centre on how banks have gone about implementing the revised Basel rules. Will CROs be reporting widespread relief that their attention can be turned slightly away from ticking regulators’ boxes, thanks to decreased complexity and enhanced clarity of their requirements, or will the implementation of the new rules have caused further issues?

One key element of the discussion was that risk management needs to continue to form an integral part of business management as a whole. Regulators have brought risk management into the boardroom, and there was much consensus that this is where it should stay. In terms of day-to-day risk management, big data is set to loom large in 2016 as regulators amend reporting requirements and as banks themselves learn how to qualitatively analyse the wealth of data that technology now allows them to accumulate. This would present a challenge, but those that stepped up to it would be the ones to remain ahead. Those that also turned their attention to the growing risks in cybersecurity and financial crime would find themselves better placed to manage an increasingly complex risk landscape.

SECURING BUSINESS BENEFITS WHILE COMPLYING WITH LIQUIDITY REGULATIONS Banks are seeking an integrated regulatory platform that will support compliance with Basel III liquidity requirements and also deliver a competitive advantage.


Risk Review: Sponsored Article INTERNATIONAL

Liquidity risk management and regulation are at the forefront of challenges facing the global financial industry today, as regulators have responded to systemic liquidity risk by introducing formal liquidity metrics into the reporting framework. Increased scrutiny by the authorities, combined with the changing nature of funding markets, has forced banks to rethink their approach to liquidity risk management and re-prioritize capital management. The Basel III regulations that followed the financial crisis require banks to maintain a minimum amount of unencumbered high-quality liquid assets (HQLAs), at least equal to estimated net cash outflows over a 30-day standardized liquidity stress scenario. In addition to calculating and reporting this short-term Liquidity Coverage Ratio (LCR), banks must also report their long-term Net Stable Funding Ratio (NSFR) – a measure of available stable funding in relation to required stable funding. Globally, the requirements of Basel III present financial institutions with substantial challenges, ranging from simply keeping up with and interpreting the emerging regulations to gathering and managing large volumes of data to meet tighter deadlines. However, regulation is not a stand-alone issue. Banks also want to monitor liquidity for their own business ends, by identifying funding gaps, avoiding concentration risk and optimizing the deployment of their assets. Banks face strategic issues related to the impact of new liquidity requirements on profitability, and require risk analytics that go beyond simple compliance. These banks want the ability to aggregate and analyze liquidity not only for the new rounds of reporting, but also to identify the contributors and consumers of liquidity within their own organizations and client bases. The systems that banks have traditionally used for liquidity management and reporting were not designed to meet these new demands. AxiomSL offers a common data and computation platform that can be integrated across the entire enterprise, enabling firms to leverage their existing data and risk management infrastructure to ensure comprehensive regulatory compliance while maximizing business benefits. The platform supports a global approach to liquidity risk management as well as addressing the specific requirements of individual jurisdictions. Click here to download the full white paper.
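At their core, the two Basel III ratios described above are simple quotients; a sketch with invented figures (real calculations apply detailed haircuts, run-off rates and ASF/RSF factors per asset and liability class):

```python
# Sketch of the two Basel III liquidity ratios described above.
# All figures are invented; compliance requires each ratio >= 100%.

def lcr(hqla, net_outflows_30d):
    """Liquidity Coverage Ratio: high-quality liquid assets over
    net cash outflows in a 30-day standardized stress scenario."""
    return hqla / net_outflows_30d

def nsfr(available_stable_funding, required_stable_funding):
    """Net Stable Funding Ratio: available stable funding over
    required stable funding, over a one-year horizon."""
    return available_stable_funding / required_stable_funding

print(f"LCR:  {lcr(125.0, 100.0):.0%}")   # 125%
print(f"NSFR: {nsfr(480.0, 400.0):.0%}")  # 120%
```

The hard part in practice is not the division but the data: classifying every asset and cash flow into the right regulatory bucket, every day, across the whole group – which is the gap the platform described here aims to fill.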

In these times of constantly changing regulations, with increasing data analysis and reporting requirements, financial institutions need to reassess their technology. By taking a strategic approach to compliance using an integrated platform, they can not only meet new regulatory requirements but also manage their liquidity more effectively to gain a competitive advantage.

Authors: John Heaps, Senior Vice President of Products and Pre-Sales, AxiomSL APAC; Edward Probst, Senior Vice President and regulatory reporting subject matter expert, AxiomSL; Ed Royan, Chief Operating Officer, AxiomSL



FUTURE OF RISK MANAGEMENT

RiskMinds Report

THE POWER OF DATA VISUALIZATION

Legacy environment

Silos are the enemies of consistency in data sources, financial libraries and risk methodologies; and unfortunately, most large financial institutions still contain a lot of silos. This is not surprising: banks and insurers have developed through mergers and acquisitions, meaning individual business units have their own systems for position keeping, risk analysis and reporting, and yet the output somehow still needs to be consolidated into a single statement of the truth every month, as fast as possible.

In most cases, that means large teams of experts must look through huge amounts of data to build a comprehensive report that normally takes the form of complex spreadsheets, often only intelligible to those with experience and expert knowledge. For a C-level executive, who must take personal accountability for the accuracy of their company’s reporting, this is far from ideal. Risk reports today are by definition historical in nature (because it takes time to put them together) and still require detective work before executives can get to the bottom of any causes for concern.

This issue is becoming increasingly important. Of the 11 principles set out by the Basel Committee on Banking Supervision (in their BCBS 239 document), five relate to accurate and timely reporting, and few banks or insurers are fully ready to comply with these requirements. No one is happy with this situation, and many institutions have tried to tackle it as part of wider projects to break down silos and create a single, always current, view of the truth. Projects of this kind, however, tend to be very costly, very complex and very time consuming. Fortunately, there is a better way to boost these projects at a fraction of the time and cost. It’s called Advanced Data Visualization (ADV).

A new approach to managing risk

ADV is complementary to risk data warehousing projects leveraging Business Intelligence (BI) solutions. BI platforms seek to combine all relevant (or possibly relevant) data in a single repository and then mine the data to look for patterns. ADV, by contrast, begins at the executive dashboard itself, defining the parameters and insights that an accountable executive needs to see and presenting a simple, consolidated view of this information only, making areas of interest or concern immediately visible. Luxoft’s Horizon is the most advanced and widely accepted of all ADV platforms in the market today. It is structured like this:

• The dashboard comes first. Its landing page highlights the business areas that have the most immediate and relevant impact on corporate risk. These can include liquidity, tax compliance, counterparty risk, market risk, VaR (potential loss based on market trends) and other areas of interest.

• Through dedicated technology (APIs, connectors), risk data are then retrieved from the golden sources and connected to the ADV star schema.

• For each risk area, KPIs and KRIs are defined and associated with specific widgets, relevant and contextualised to the data and patterns to be displayed.

• Orchestration techniques are then used to automate and sort data flows in order to provide clear insights on the dashboard.
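The flow in the bullet points above can be sketched as a toy status roll-up – all indicator names and thresholds here are invented for illustration, and a platform like the one described is of course far more elaborate:

```python
# Toy sketch of the dashboard flow described above: metrics are pulled
# from source systems, compared against KPI/KRI thresholds, and rolled
# up into the traffic-light statuses a landing page would display.
# All names and thresholds are invented.

THRESHOLDS = {                 # (warn, breach) limits per indicator
    "lcr": (1.10, 1.00),       # falling below these levels is a concern
    "var_usd_m": (40.0, 50.0), # rising above these levels is a concern
}

def status(name, value):
    """Map a metric value to GREEN/AMBER/RED against its thresholds."""
    warn, breach = THRESHOLDS[name]
    if name == "lcr":  # lower is worse for coverage ratios
        return "RED" if value < breach else "AMBER" if value < warn else "GREEN"
    return "RED" if value > breach else "AMBER" if value > warn else "GREEN"

# Metrics as retrieved from the "golden sources"
metrics = {"lcr": 1.07, "var_usd_m": 33.0}
dashboard = {name: status(name, value) for name, value in metrics.items()}
print(dashboard)  # {'lcr': 'AMBER', 'var_usd_m': 'GREEN'}
```

The point of the pattern is that the executive sees only the rolled-up statuses, with the underlying metrics one drill-down away.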


Risk Review: Sponsored Article INTERNATIONAL

With advanced, real-time communication enabling the entire solution, the result is a fast and efficient way of putting actionable intelligence onto the desks of accountable executives.

From reactive to interactive and proactive

ADV offers the prospect of transformational change in risk management. Because it is dynamic, it moves away from “a posteriori” reporting (seeing what went wrong only after it’s happened) and towards interactive management, which includes proactive scenario modelling with insightful, visual analysis of base-case scenarios. Senior executives, who are accountable both to shareholders and regulators for the safe management of their corporations, can now watch the evolution of key indicators as it happens, drill down into any identified anomaly, and navigate metrics to the level of granularity that allows them to understand and take action. This makes it possible for them to intervene at the point where effective and positive change can be made (before a potential concern turns into a real-world problem). From there, ADV’s collaborative module allows them to comment or request further information, annotate, and sign off any dashboard.


The emphasis therefore moves from reporting on the business to steering the business. Senior executives can now see the underlying truths and emerging trends beneath the statistics. They can also establish a “single version of the truth” across their organization. Leading organizations are even planning to use ADV to digitize their risk management processes and move away from paper (report print-outs).

Securing the benefits

Risk management is a key discipline for any financial institution, with ever-increasing complexity and volumes of data. ADV is making it possible for senior executives to exercise greater control over risk management and to use it as an active tool for improved day-to-day management of their businesses. That has positive benefits going beyond risk reduction. Early adopters of Horizon ADV have won strong approval from regulatory bodies, which has not simply improved their brand strength but has fed through into more accurate judgements about future investments, greater agility in making and executing key business decisions, and a growing level of competitive advantage. The entire financial marketplace is moving towards greater and better use of relevant risk data in key business areas. Near-real-time risk management is logical and desirable, and Luxoft is making it available now to financial institutions large and small with Horizon, its ADV platform.



FUTURE OF RISK MANAGEMENT

EXPERT OPINIONS

Innovations in risk management Andrew Aziz, Director of Strategy, Research & Quantitative Finance at IBM Risk Analytics, believes that it’s an exciting time to be in risk management. He explains why.

The key trends in risk management Natalia Migal, Director of Corporate Risk Management at LeasePlan Corporation, believes that the trends revolve around data and regulation. What effects will they have in 2016?

The biggest risk management challenge Chris Matten, Partner, Singapore Risk Assurance Practice at PwC explains which risk he thinks will pose the industry the most issues in the coming year.

Where next for risk management? Gerbert van Grootheest, Partner at Zanders B.V., discusses which regulations and guidelines will affect risk management over the next 12 months.


