CECL: one less place for weak data to hide


Thursday, August 2, 2018 | By Rajesh Kolluri

If you’re reading this, you know that changes in business models, customer expectations and competitive pressures are forcing you to do more with your data. What might be less obvious is a larger, longer-term change, in which regulators and investors focus more and more on data as data. I’m not referring to call reports and public disclosures, but to loan and credit data at a detailed level.

In some lending segments, document-based or sample-based QA review is already obsolete, pushed aside by full-scale data reviews. Leading practitioners say the end game is real-time, step-by-step data assurance, before a loan is even a loan.

The demands on loan data were already escalating, and then along came CECL. It’s easy to park CECL in the “accounting garage” because it is, strictly speaking, an accounting standard. However, it has implications for risk, liquidity, capital and, arguably, the competitive position your firm will find itself in over the next decade. It’s not surprising that in our most recent survey about CECL challenges, 34% of respondents said that data was their number one gap.

By choosing a non-prescriptive approach to the new reserving standard, the authorities put every firm on the hook for deciding how to estimate their reserves. Since there’s no prescriptive method to conform to, firms need to be prepared to document, describe and defend not just the estimate itself, but also the process, models and data that went into it.

In other words, weak data has one less place to hide.

Many firms still rely on data stored in multiple systems, each with its own data profiles, standards and technology. We hear from many clients who struggle to get the right data for CECL computations, even clients who have already made significant investments in a sound enterprise data approach. One of the common questions raised about loan data for CECL is “how good is good enough?”

Those of us in technology tend to see this problem as one of access or aggregation. But for our partners in credit risk and accounting, assembling the data doesn’t get them over the finish line. With CECL, it’s the normalization and quality of the data that will have the greatest bearing on how smoothly your firm transitions to the new standard.

No doubt many of you already realize this, and many firms have data management projects in flight, if not CECL-specific, then at the enterprise level. But CECL deadlines are sneaking up, and 18 months is not a long project horizon for something this complex. If your data isn’t CECL-ready, you may need to narrow the scope from “enterprise data” to what’s required to make CECL happen. We can break the process down into some basic priorities:

  1. Isolate the right data. As noted, sourcing the right data is the first step. With a large influx of structured and unstructured data flowing in from internal and external sources, banks juggle multiple silos. Some systems were built to outdated requirements; others were inherited through M&A. It’s not uncommon for both conditions to apply to the same system, and in some cases the M&A legacy leaves gaping holes in the data you need to calculate even historical losses precisely, much less make forward-looking estimates.
  2. Establish a common data model and standards. Only the most diligent (and fortunate) firms have had the resources and commitment to assemble a viable, reliable, effective model for enterprise-wide credit data. Without such a model, many banks resort to manual processes and multiple silos.
  3. Reduce reliance on manual data processes. The systems most likely to contain the important loan loss data for CECL are notorious for manual workarounds, not just for moving data between silos, but for reviewing and validating it within a silo. As a technologist, I’m amazed that these systems are still doing what they were designed to do 30 years ago. Loan servicing is not a profit center, so these systems are often managed for cost, not quality. That’s not an issue as long as you generate accurate statements, but CECL will ask more of this data, and the limitations will begin to show.
  4. Build adequate data lineage. We often hear that, due to the factors above and others, many firms struggle to get their critical loan loss data to a point where they can really trust it. This is echoed by the 55% of survey respondents who said that data is their number one control concern when it comes to CECL. (A minimal sketch of a normalized, lineage-tagged loan record follows this list.)
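
To make items 2 and 4 concrete, here is a minimal sketch, in Python, of what a common loan record with lineage tags might look like. The field names, source systems and code tables here are hypothetical illustrations, not a prescribed model:

```python
from dataclasses import dataclass, field

@dataclass
class Lineage:
    """Where a field's value came from and how it was derived."""
    source_system: str   # e.g., a legacy servicing platform
    source_field: str    # field name in the source extract
    transformation: str  # rule applied to normalize the value

@dataclass
class LoanRecord:
    """A loan normalized to a common, CECL-ready model."""
    loan_id: str
    origination_date: str        # ISO 8601
    current_balance: float
    days_past_due: int
    lineage: dict = field(default_factory=dict)  # field name -> Lineage

def normalize_delinquency(raw_code: str, system: str) -> tuple[int, Lineage]:
    """Map a system-specific delinquency code to days past due."""
    # Hypothetical code tables for two silos inherited through M&A.
    code_tables = {
        "SERVICER_A": {"C": 0, "1": 30, "2": 60, "3": 90},
        "SERVICER_B": {"CURR": 0, "DLQ30": 30, "DLQ60": 60, "DLQ90": 90},
    }
    dpd = code_tables[system][raw_code]
    return dpd, Lineage(system, "delinq_code", f"mapped {raw_code!r} -> {dpd}")

dpd, lin = normalize_delinquency("DLQ60", "SERVICER_B")
loan = LoanRecord("L-0001", "2015-06-30", 182_450.00, dpd,
                  lineage={"days_past_due": lin})
```

The specific fields don’t matter; what matters is that every normalized value carries a record of where it came from and what rule produced it, which is exactly what an examiner or auditor will ask about.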

Examiners and auditors expect to see that the reserve was estimated using sound quantitative analysis, and as the cliché goes, garbage in, garbage out. The consequences of shaky CECL data will almost certainly include significant operational costs, and could be as severe as a material weakness finding in your audit. It’s not an exaggeration to say that shaky CECL data could hurt the firm at a strategic level.

No doubt some firms will take the path of least resistance and treat CECL like just another compliance ticket to punch before a deadline. Many of them will regret it. CECL isn’t just another disclosure; it has real implications, and to get CECL right, you have to get the data right.

The problem is compounded to some degree by software vendors and consultants that take an overly simplistic view. They all have good intentions: they want to solve for CECL with a point solution that’s easy to deploy.

At a recent conference, we spoke with one firm whose answer for CECL is to aggregate call report data and run estimates from it. A consulting firm at the same conference tried to show a room full of accounting and finance professionals how CECL can be done with spreadsheets.

Getting CECL right requires loan-level data. There are circumstances in which a discounted cash flow analysis is the appropriate method, and midsize firms will clearly want to treat at least some loans individually, pulling them in and out of pools as required. The “punch the ticket” approach to CECL may be fine as long as the favorable credit environment holds, but when it turns, “data holes” will appear.
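
To make “loan-level” concrete, here is a simplified sketch of a DCF-style expected loss for a single loan: the allowance is the amortized cost less the present value of expected cash flows, with each month’s cash flow weighted by survival and default probabilities. The numbers are illustrative, and the model deliberately ignores prepayment, recovery lags and forecast reversion:

```python
def dcf_expected_loss(balance: float, rate: float, term_months: int,
                      monthly_pd: float, lgd: float) -> float:
    """Allowance = amortized cost - PV of expected cash flows.

    Simplified: level-payment amortization, constant monthly default
    probability, recovery of (1 - lgd) of the scheduled balance on
    default, discounting at the loan's effective rate. Illustrative only.
    """
    r = rate / 12.0
    payment = balance * r / (1 - (1 + r) ** -term_months)
    survival, remaining, pv = 1.0, balance, 0.0
    for t in range(1, term_months + 1):
        defaults = survival * monthly_pd           # prob. of default in month t
        recovery = remaining * (1 - lgd)           # cash recovered on default
        pv += (survival * (1 - monthly_pd) * payment
               + defaults * recovery) / (1 + r) ** t
        survival *= (1 - monthly_pd)
        remaining = remaining * (1 + r) - payment  # scheduled amortization
    return balance - pv

# e.g., a $200k, 5% loan with 30 years remaining, 0.02% monthly PD, 40% LGD
print(round(dcf_expected_loss(200_000, 0.05, 360, 0.0002, 0.40), 2))
```

A pooled loss-rate method trades this granularity for simplicity, but either way the inputs are loan-level attributes, not call report aggregates.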

Is it possible to get CECL right, with a strong, appropriate data set, without being distracted by a long, complex enterprise data project that could put CECL deadlines at risk?

For many firms, the answer might be a purpose-built CECL data mart. It matches the scale of the solution to the scale of the problem, and it can fit any firm, regardless of where it is in its enterprise data strategy.

If a firm is already working an enterprise approach, whether that means an internal solution or an external vendor, a CECL data mart fits within its warehouse, or draws from a lake. For firms that haven’t started the journey toward an enterprise credit data model, a CECL data mart is a step in the right direction, and might even provide the basis of the fuller solution.
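
As a rough illustration of what “drawing from a lake” can look like in practice (the table, column and system names here are hypothetical), the mart-building step is essentially a conform-and-join job over loan-level extracts, with quality gates before the result is treated as CECL-ready:

```python
import pandas as pd

# Hypothetical monthly extracts landed in the lake from two source systems.
servicing = pd.DataFrame({
    "loan_id": ["L-0001", "L-0002"],
    "bal": [182_450.00, 94_200.00],
    "rate_pct": [5.00, 4.25],          # this system quotes rates in percent
})
origination = pd.DataFrame({
    "loan_id": ["L-0001", "L-0002"],
    "orig_date": ["2015-06-30", "2019-03-15"],
    "orig_fico": [712, 748],
})

# Conform names and units to the mart's common model, then join.
mart = (servicing
        .assign(note_rate=servicing["rate_pct"] / 100.0)   # percent -> decimal
        .rename(columns={"bal": "current_balance"})
        .drop(columns="rate_pct")
        .merge(origination, on="loan_id", how="left", validate="one_to_one"))

# Basic quality gates before the table is considered CECL-ready.
assert mart["current_balance"].ge(0).all()
assert mart["orig_date"].notna().all(), "loans missing origination data"
```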

Innovation over the last several years has made some generation-one data architecture approaches obsolete, or nearly so. Keeping a warehouse organized and updated becomes difficult at scale, and often the firm’s technology group doesn’t have, or can’t get, the subject matter expertise to help the data evolve over time.

It’s critical with CECL to avoid that trap and use technology that’s sustainable and forward-looking. The space is crowded and it would be irresponsible to suggest a one-size-fits-all approach. When planning for a CECL data mart, institutions need to weigh a number of strategic considerations:

  1. Scope. As a first step, understand from credit risk, finance and accounting what the minimum viable data set is for how they manage reserving. If a scalpel meets the need, avoid the sledgehammer.
  2. Auditability. Business logic and rules within the solution should be applied in a clear, transparent manner that complies with organizational expectations. A key set of expectations is driven by your auditors. You don’t need to cover every detail, but ask your accounting team for a sense of what “auditability” means to the audit firm your institution works with. (A sketch of one such pattern follows this list.)
  3. Domain Knowledge. Use case expertise and domain awareness are critical. As noted, the CECL challenge isn’t about architecture; it’s about normalizing an absurdly complicated set of data points. Even if you wanted to staff up to manage the rules, you’d struggle to find the talent. Better to buy than build.  
  4. Customizability. Preset input requirements may or may not align to an organization's needs; further customization may not be possible or may be cost prohibitive. If you’ve already got enterprise data architecture components in place, a CECL data mart needs to play nice with them. If the CECL data mart is your first step in enterprise data, you need it to be extensible in a cost-reasonable way.
  5. Commitment. As the world and regulatory environment change, so do the needs of stakeholders. It's important to partner where there's a commitment to maintaining data, adapting as needs shift, and bringing new innovations to customers over time. The partnership shouldn't end when the implementation is complete.
  6. Total cost of ownership, including opportunity cost. Purchase price is obviously an issue, but maintenance effort should be considered too. You need something that supports agile, iterative methodologies that speed time to deployment; you need to be able to adapt. It’s also important to understand the implications of choosing not to create a CECL data mart: the short-sighted approach to CECL will start to show weaknesses, perhaps as soon as you start working toward the new standard, but certainly when the credit environment changes.
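
On the auditability point above, one pattern worth considering is expressing business rules as data rather than burying them in code, so the rule table itself can be reviewed and every classification traced back to a named rule. A minimal sketch, with hypothetical rule names and thresholds:

```python
from datetime import date

# Rules as data: each pooling rule is named, documented, and reviewable.
POOLING_RULES = [
    # (rule_id, description, predicate)
    ("R1", "individually assess loans 90+ days past due",
     lambda loan: loan["days_past_due"] >= 90),
    ("R2", "individually assess loans modified as TDRs",
     lambda loan: loan.get("tdr_flag", False)),
]

def classify(loan: dict) -> tuple[str, str]:
    """Return (treatment, rule_id) so every decision is traceable."""
    for rule_id, _desc, predicate in POOLING_RULES:
        if predicate(loan):
            return "individual", rule_id
    return "pooled", "default"

loan = {"loan_id": "L-0001", "days_past_due": 60, "as_of": date(2018, 6, 30)}
treatment, rule = classify(loan)
print(treatment, rule)  # -> pooled default; the rule hit is stored with the result
```

Because the rules live in one reviewable table and every output carries the ID of the rule that produced it, the question “why is this loan individually assessed?” has a one-line answer.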

Today's lending businesses are built upon a foundation of credit data. Short-sighted firms will compete purely on rate. Firms that want to win the long game are building and nurturing a competence in data management, with the goal of permanent solutions built around automation, integration, consolidation, complexity reduction and speed-to-value. Wherever you are in your evolution, the data architecture for CECL is worth getting right.


