Getting control over your data and the benefits of doing so
Risk & Compliance Magazine talks with Lisa Roitman, Bloomberg’s Business Strategist for Regulatory Compliance Technology; Janos Renz Hotz, Bloomberg’s Best Execution Compliance Strategist, BTCA; Mike Tirello, Global Compliance Product Manager, SSEOMS; Bob Shea, Bloomberg’s Global Compliance Product Manager, AIM; and Paul Lanois, Credit Suisse AG’s Vice President, General Counsel.
Shea: In today’s regulatory environment, an investment firm needs to collect, validate and normalise data from multiple sources to satisfy compliance and transaction reporting obligations. Firms are expected to deliver timely reports using industry-standard identifiers, such as legal entity identifiers (LEIs), for the securities, counterparties, decision makers and accounts related to the transaction.
Often, a heavy technical investment is required to integrate the multiple systems that contain the data necessary to comply with regulation. In addition to trade reporting obligations, firms are expected to implement a process to ensure that pre- and post-trade compliance checks are evaluated against timely data. A sophisticated design is needed so that security reference data can be acquired in near real time, enabling compliance checks to run as new orders or trades are sent out for execution.
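As a rough illustration of the point about timely reference data, the following sketch (in Python, using hypothetical field names such as 'as_of' and a purely illustrative staleness tolerance) shows how a pre-trade compliance check might refuse to run against stale security reference data.

```python
from datetime import datetime, timedelta, timezone

# Illustrative tolerance only; acceptable staleness is a matter of firm policy.
MAX_STALENESS = timedelta(minutes=15)

def reference_data_is_fresh(security: dict) -> bool:
    """Return True if the cached security reference record is recent enough to rely on."""
    as_of = security.get("as_of")  # hypothetical timestamp carried on the cached record
    return as_of is not None and datetime.now(timezone.utc) - as_of <= MAX_STALENESS

def pre_trade_check(order: dict, security: dict) -> str:
    """Evaluate compliance rules only when the reference data backing them is current."""
    if not reference_data_is_fresh(security):
        return "HOLD: refresh security reference data before evaluating compliance rules"
    # ... evaluate restricted-list, concentration and other rules against the order here ...
    return "PASS"
```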
Lanois: As strange as it may seem, one of the biggest data-related challenges companies face is the hype surrounding new technologies. We have all heard, many times, how a new technology is poised to be the next big thing. That hype has often pushed organisations into blind adoption for the sake of appearing trendy and not losing out to competitors, only for those tools or technologies to be dropped shortly thereafter, at great expense, because they do not really meet the company’s needs. There is nothing wrong with being an early adopter or with adopting new technologies, which in many cases can greatly benefit your business. The benefits can include reduced costs, as well as increased speed, efficiency or reliability of data. Nevertheless, adoption should always be carefully considered and must fit the organisation, not the other way around. Adopting the latest and greatest trend just because it is popular at the moment can be a huge mistake.
Lanois: It is necessary to strive for the best possible data quality, since business decisions can only be as good as the data used to reach them. It can be tempting to collect as much data as possible, but if the data is not reliable, it may be useless or even harmful to base decisions on it.
There is also a temptation to grab data as it comes, run an analysis, produce statistics and reach a conclusion without spending the time to actually understand the data and where it originated.
Shea: Companies that invest resources into improving data quality will increase their chances of success. This can be achieved by implementing technical solutions to validate and scrub the data, coupled with good process management tools to monitor the workflow.
Applying solid business intelligence to the data management process is critical to achieving quality data. Business intelligence can predict expected values, which helps identify and resolve data exceptions and gives data-driven solutions an edge. When a process that is critical to the overall data management workflow fails, automated recovery procedures should be in place to address the issue.
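As a minimal sketch of what Shea describes, the snippet below (assuming a hypothetical predict_expected function supplied by a business-intelligence layer and an illustrative 5 percent tolerance) flags records whose reported values stray too far from their predicted values, so they can be routed into an exception-handling workflow rather than loaded silently.

```python
from typing import Callable, Iterable

def flag_exceptions(records: Iterable[dict],
                    predict_expected: Callable[[dict], float],
                    tolerance: float = 0.05) -> list[dict]:
    """Return records whose reported value deviates from its predicted value by more than `tolerance`."""
    exceptions = []
    for rec in records:
        predicted = predict_expected(rec)  # e.g. a forecast produced by the BI layer
        if predicted and abs(rec["value"] - predicted) / abs(predicted) > tolerance:
            exceptions.append(rec)
    return exceptions
```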
Tirello: It is extremely important to be timely and accurate when responding to a regulator. Depending on your firm’s regulatory obligations, there are certain timelines within which you must respond, or face further scrutiny of systems, policies and procedures, and possible fines. Firms should conduct regular reviews of their business lines to ensure that they are capturing all relevant data under their regulatory guidelines, and then run frequent checks, or internal audits, to verify that the data has actually been captured and is retrievable. If business lines change, new reviews should be performed to ensure all new data is captured.
Hiring external auditors at least once a year to review and verify your firm’s policies and procedures is a must in today’s ever-changing regulatory environment, and it signals to regulators that your firm is serious about compliance.
Lanois: Currently, the reporting process at many financial institutions is fragmented into reporting silos, each with its own database and tools to produce a particular regulatory report. For example, you would have a regulatory report for credit risk, liquidity risk, stress testing and so on. Each would be produced by a specific department using its own tools and systems, some of which may be old and not necessarily compatible with other systems.
To cope, organisations may apply adjustments and reconciliation patches in order to respond to regulatory requests; however, such legacy systems are likely to create further issues down the line as regulators increasingly expect organisations to provide more comprehensive reporting within shorter timeframes.
Lanois: A dataset is only as good as the quality of the data it holds. It is therefore essential that all data used to feed a central database is validated as it is imported, to ensure that there are no errors, missing values or inconsistencies, and that the quality of the data, such as its age, meets the reporting requirements the organisation is subject to.
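A minimal sketch of that kind of import-time validation, assuming hypothetical field names and a purely illustrative age limit, might look like the following; a real implementation would encode the firm’s actual reporting requirements.

```python
from datetime import datetime, timedelta, timezone

REQUIRED_FIELDS = {"trade_id", "isin", "counterparty_lei", "quantity", "price", "as_of"}
MAX_AGE = timedelta(days=1)  # illustrative only; permitted data age depends on the report

def validate_record(rec: dict) -> list[str]:
    """Return the problems found in a record; an empty list means it may be loaded."""
    errors = [f"missing field: {field}" for field in REQUIRED_FIELDS.difference(rec)]
    if "quantity" in rec and rec["quantity"] <= 0:
        errors.append("quantity must be positive")
    if "as_of" in rec and datetime.now(timezone.utc) - rec["as_of"] > MAX_AGE:
        errors.append("record is older than the permitted reporting window")
    return errors
```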
In addition, even though the industry is gradually moving toward harmonisation of reporting templates, organisations with international operations still face the requirement to populate different reporting templates for the different countries in which they operate.
Roitman: LEIs are simply one example of how important it is to have good data. LEIs are required as a prerequisite for trading. An LEI is simple to apply for and easy to look up, as LEIs are available through public databases.
However, as simple as LEIs are, they have caused all sorts of difficulties for financial market participants. What makes the process so hard?
If you do not know the entity you are doing business with, then using the right LEI number is impossible. We see this as a common challenge among firms. In connection with the Markets in Financial Instruments Directive II (MiFID II), for example, buy-side and sell-side firms have reporting obligations which require them to be able to identify a counterparty by its LEI number, but if you have never taken the time to make sure you have a clean and correct list of your counterparties, how can you possibly find and report the correct LEI numbers?
For buy-side firms, this has become a serious pain point as they historically have not drilled down to specifically identify which of their financial institutions’ trading subsidiaries or affiliates trade, clear and settle each type of financial product. The result is that there is no way to determine the LEI to use for reporting purposes, and therefore there is no way to fully comply with MiFID II requirements. Similarly, financial institutions need to reach out to thousands of their clients to collect regulatory documentation or affirmations in conjunction with evolving regulations. Without the most basic of clean data, specifically an up-to-date client list, it is impossible to automate processes or workflows. Too much time and money is spent remediating a firm’s problems with basic data. To save themselves from future operational pain, firms should focus on data quality and completeness from the start, and of course, on data integrity through the trade lifecycle.
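Even basic LEI hygiene can be automated. The sketch below checks a candidate code against the 20-character ISO 17442 format and its ISO 7064 MOD 97-10 check digits; passing it only means the code is well formed, not that it identifies the intended counterparty, so it complements rather than replaces a clean, validated client list.

```python
import re

# ISO 17442: 18 alphanumeric characters followed by two numeric check digits.
_LEI_RE = re.compile(r"^[A-Z0-9]{18}[0-9]{2}$")

def is_well_formed_lei(lei: str) -> bool:
    """Check the ISO 17442 format and the ISO 7064 MOD 97-10 check digits of an LEI."""
    lei = lei.strip().upper()
    if not _LEI_RE.match(lei):
        return False
    # Map digits to themselves and letters A-Z to 10-35, then the whole
    # number modulo 97 must equal 1.
    numeric = "".join(str(int(ch, 36)) for ch in lei)
    return int(numeric) % 97 == 1
```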
Roitman: In an ideal world, professionals in trading, legal, compliance and operations departments would work together to build thoughtful processes so that the right information is collected from the start of any trading relationship.
For mature businesses, now is the time to focus on data. Evolving regulations are only going to require firms to know more and to access what they know more quickly. The amount of data firms collect, use, house and engineer is simply going to continue to grow, and the only way to cope with these ever-increasing demands will be to intelligently automate processes. Therefore, you have to take the time to think carefully about your data today. Work to make sure it is clean and complete, and then think about its integrity: how will you secure it, share it and keep it up to date?
Lanois: In order to fix their data problems, organisations will have to fix the skills shortage, as there are only a handful of highly skilled analytics professionals. They will also need to upgrade legacy systems in order to resolve the silo issue.
Finally, organisations often hold lots of data without knowing what they actually have. Much of it is scattered across legacy databases located on premises, cloud-based sharing services, Big Data platforms and so on. A lot of data is also uncategorised and could be redundant, obsolete or trivial.
Lanois: Having better data does not mean having more data; rather, it means having data which has been validated. When a business can make decisions on the basis of data whose quality has been validated, it can make strategic decisions more efficiently and confidently, without the need for rework. Unreliable data, by contrast, leads to decisions that often result in missteps and additional time spent reconciling the data. In addition, difficulty in understanding current performance and trends may hinder the organisation’s ability to identify and exploit new opportunities. In many cases, focusing on the quantity of the data can be counterproductive.
Hotz: Better data means better infrastructure for making more informed and defensible choices. With better data come more confidence in the process, benchmarks that better represent the intentions of portfolio managers (PMs) and funds, and closer alignment between the goals of PMs and traders. This provides a better framework for discussing and reporting on a fund’s costs, as well as for managing processes to reduce slippage and, consequently, improve alpha. The issue to guard against here is introducing too much complexity, with benchmarks so bespoke that they cannot be compared across funds and clients.
Better and more pervasive data also allows for much better trade surveillance by enabling more granular tests. Contrary to initial instincts, more granular checks, if performed within a repeatable, workflow-based process, lead to fewer false positives. The danger to guard against is undertaking tests that are too complicated to explain or repeat.
Data collected systematically through best execution and compliance processes allows those processes to be evaluated and evolved further. A ‘data culture’ in best execution and compliance delivers much more than just more meaningful alpha capture and better risk management in trade surveillance; it also creates the foundations for future-proofing your organisation and competing better in a world where data is increasingly the number one commodity.
Tirello: There will be a higher demand for data quality and accuracy, which will then drive larger initiatives. Data-driven machine intelligence allows trends and bad actors to be identified where they were previously missed. Firms will be able to look for trends across ‘siloed’ parts of the business that were typically kept apart. With transaction reporting already underway in Europe and the consolidated audit trail (CAT) coming soon in the US, firms need to make sure they understand their business workflows and that these are properly reflected in their books and records.
If they do not spend the time to review, the ever-evolving tools at regulators’ disposal, and the fines that may follow, could not only damage a firm’s reputation but possibly put it out of business. Contracting with a leading vendor in the space to manage and store your firm’s data is one option; it allows a firm to leverage the audit experience and best practices of many other firms, to which it would typically not otherwise have exposure.
Lanois: We generate more data today than we ever have, and that will probably continue in the years to come as we become an even more data-driven society. Data streams will increasingly be combined instead of sitting in isolation, meaning that legacy systems will need to be upgraded or replaced to allow further communication between systems, whether through application programming interfaces (APIs) or apps. Smart algorithms and new technologies will enable more automation, less human interaction and a better understanding of the data.