OPINION: Efficient Risk Management - A Unique Opportunity

By Paul Franks, Director, Financial Services, SAS

Following the release in 2012 of the BCBS 239 paper, 'Principles for effective risk data aggregation and risk reporting', increasing attention has been given to the level of process, organisational and technology change required to meet the new standard. The 14 principles of BCBS 239 can be summarised as data, speed and integration, with the main focus on establishing sound principles of risk assessment so that banks better understand their risk positions. In addressing these requirements, SAS considers there to be five key action areas for banks.

Harmonising risk and finance

Harmonisation and integration of risk and finance have been discussed for many years. The complexity of existing structures and the huge number of regulations involved make changes difficult to implement, while organisational, process-related or technology reasons often block the route to a solution. Extended communication and collective action are leading to increased transparency and a common understanding of requirements. This, together with comprehensive documentation of processes covering risks and other control-relevant information, means that common data management tools and standardised methodologies are now being realised.

Any critical examination of the tools used must not be limited to data integration; it must cover the transparency and flexibility of the entire process of creating knowledge in risk and finance. Solutions must be sufficiently large in scope and easily scalable so that performance can be adapted quickly in crisis situations.

Uniform risk data management

A single point of truth as the basis for consistent risk control remains a key goal for banks, yet data management across all aspects of banking is an ongoing issue. Most banks can report having implemented an enterprise-wide data warehouse or master data project, yet these projects often assume that their repositories represent 100 per cent of the managed data. Achieving a uniform, consistent risk view is a challenge, and banks should not make the mistake of trying to represent everything in a single data model. SAS recommends adopting the 80:20 rule and concentrating on the basic results data required. As a starting point, a standard data model is recommended, making it possible to focus on the bank-specific challenges.

Automated and flexible risk reporting

Static reports are not adequate for the flexible requirements of technical departments such as risk. Reports derived from OLAP cubes provide a more flexible view of the data from different angles, but they do not support easy aggregation of non-additive performance figures. Either the different fixed hierarchy paths have to be pre-calculated, or ad-hoc queries have to aggregate the data and may return only partial results. Such methods risk missing important sources of information. To highlight specific situations, all individual business data must be accessible for flexible exploration.
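
To illustrate why non-additive figures resist cube-style roll-ups, the following minimal Python sketch, which is purely illustrative and not SAS code, compares the 99 per cent value-at-risk obtained by summing pre-aggregated desk figures with the figure recomputed from position-level data; the desk names and distributions are hypothetical.

    # Illustrative only, not SAS code: non-additive figures such as value-at-risk
    # cannot be rolled up by summing pre-aggregated desk-level cells; they must be
    # recomputed from the underlying scenario-level data.
    import numpy as np

    rng = np.random.default_rng(seed=1)
    scenarios = 10_000

    # Hypothetical scenario-level P&L for two desks with partly offsetting exposures
    desk_a = rng.normal(0.0, 1.0, scenarios)
    desk_b = -0.6 * desk_a + rng.normal(0.0, 0.8, scenarios)

    def var_99(pnl):
        """99 per cent value-at-risk: the loss exceeded in only 1 per cent of scenarios."""
        return -np.percentile(pnl, 1)

    sum_of_desk_vars = var_99(desk_a) + var_99(desk_b)  # naive cube-style roll-up
    portfolio_var = var_99(desk_a + desk_b)             # recomputed from the detail

    print(f"Sum of desk VaRs: {sum_of_desk_vars:.2f}")
    print(f"Portfolio VaR:    {portfolio_var:.2f}")  # typically lower, reflecting diversification

Because the two desks partly offset each other, the recomputed portfolio figure comes out well below the naive sum, which is exactly the detail a pre-aggregated cube cannot reveal.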

Easy-to-use statistical methods and visualisations of technical relationships should be available to build an understanding of the interrelationships. For risk management to be effective, it must be possible to integrate additional data flexibly and traceably and to respond rapidly to new information. Spreadsheets have continued to perform this function, but at a significant cost: traceability and lineage are lost, so manual validation and assurance are required before use to reduce the potential for error. Reporting functionality should be used for more than daily reporting and should permit risk managers to explore, assess and comment on risks both qualitatively and quantitatively.

Data governance

Data quality is an ever-present topic for banks, although its priority varies. Data quality assurance must occur directly at the source where data is collected; in principle, that source is the operational core system. This is a good approach in theory, but it cannot be implemented directly in practice, because data is not only entered and changed in operational systems but is also constantly modified and adapted in the course of processing.

Errors can occur along the entire process chain. In establishing consistent data quality, data quality dashboards and data quality seals have proven successful: data quality indicators are recorded that reveal the correctness, completeness and consistency of the data. The decisive step comes when the technical department can independently, quickly and easily use profiling to gain an overview of field characteristics and so identify incorrect values and error patterns. A business metadata concept, and not merely a technical one, is essential.
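
As a purely illustrative sketch of the kind of profiling a technical department might run, the short Python example below computes completeness and distinct-value counts per field and flags two simple error patterns; the column names, sample values and checks are invented for the example and do not represent any SAS tool.

    # Illustrative only, not a SAS tool: simple per-field profiling to surface
    # completeness gaps and error patterns in a small, invented trade extract.
    import pandas as pd

    trades = pd.DataFrame({
        "counterparty_id": ["C001", "C002", None, "C004", "C004"],
        "notional":        [1_000_000, -250_000, 500_000, None, 750_000],
        "currency":        ["EUR", "EUR", "eur", "USD", "EUR"],
    })

    profile = pd.DataFrame({
        "non_null":     trades.notna().sum(),            # populated values per field
        "distinct":     trades.nunique(),                # distinct values per field
        "completeness": trades.notna().mean().round(2),  # share of populated values
    })
    print(profile)

    # Simple consistency checks suggested by the profile
    print("Negative notionals:", int((trades["notional"] < 0).sum()))
    print("Non-uppercase currency codes:", int((~trades["currency"].dropna().str.isupper()).sum()))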

Real-time simulation

To take account of the uncertainty surrounding future value fluctuations in transactions and changes in cash flows, banks must be able to run possible future development scenarios. The decisive added value of new knowledge-generation processes lies in their ability to validate decisions in advance, through simulations that depict, ex ante, the repercussions of a decision on the risk of the overall portfolio. Rapid data integration processes play just as important a role as rapid valuations and visualisations. Simulation results should be available quickly enough to actually influence the decision-making process. If a bank wishes to implement such a system, then in addition to adopting high-performance technologies it must make central simulation environments available to every individual user.
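
As a minimal illustration of this ex-ante idea, and assuming nothing about SAS's own simulation environment, the Python sketch below re-runs a set of hypothetical portfolio scenarios with and without a candidate transaction and compares the expected shortfall before and after; all positions and distributions are invented for the example.

    # Illustrative only: ex-ante evaluation of a candidate transaction by re-running
    # hypothetical portfolio scenarios with and without it.
    import numpy as np

    rng = np.random.default_rng(seed=7)
    scenarios = 50_000

    # Scenario-level P&L of the current portfolio and of a proposed new deal
    current_portfolio = rng.normal(0.2, 2.0, scenarios)
    candidate_deal = 0.5 * current_portfolio + rng.normal(0.1, 1.0, scenarios)

    def expected_shortfall(pnl, level=0.975):
        """Average loss in the worst (1 - level) share of scenarios."""
        cutoff = np.quantile(pnl, 1 - level)
        return -pnl[pnl <= cutoff].mean()

    before = expected_shortfall(current_portfolio)
    after = expected_shortfall(current_portfolio + candidate_deal)

    print(f"Expected shortfall before the deal: {before:.2f}")
    print(f"Expected shortfall after the deal:  {after:.2f}")
    print(f"Marginal risk impact of the deal:   {after - before:.2f}")

The point of such a comparison is speed: the marginal risk figure is only useful if it is available before the decision is taken, which is why rapid data integration, valuation and visualisation matter as much as the simulation itself.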

Better and faster risk processes have become necessary, and not only because of regulatory requirements; the market situation is also forcing banks to think about industrialisation and about accelerating their information-gathering processes. The decisive step is to use new technologies to create fundamentally superior processes. BCBS 239 provides banks with a powerful means of achieving higher efficiency. What will your response be to this unique opportunity?

Categories: Banking
Tags: BCBS 239 paper, risk data aggregation and risk reporting, SAS, Paul Franks, principles of risk assessment
Author: AB+F Online
Article posted: March 01, 2014
