
Deutsche Bank: IT becomes a job for finance

Rolf Roewekamp | March 3, 2015
In one of the Bank's largest projects, finance and IT have joined forces to create a central data warehouse for all risk and finance data. The giant project was on the verge of failure in 2013. CFO Joachim Müller redefined his role.

With this goal in mind, finance and IT began establishing a new infrastructure and process architecture for reporting by launching a large-scale project called StRIDe (Strategic Redesign of Information Delivery). "StRIDe is the largest change project the finance department at Deutsche Bank has ever seen," Müller emphasises. On the technical side, finance and IT are expanding and improving the central financial data warehouse (FDW). Data from three existing systems is to converge on the strategic FDW platform: regulatory information, accounting data and local regulatory data from various countries. For employees, this reduces the largely manual reconciliation work between the systems.

Information comes from a large number of the Bank's legal units, with around 1,000 data streams and more than 100 transaction systems contributing to the FDW. "The challenge was, and remains, converging the data from around 1,000 data streams. In order to accomplish this, we have to remove detours and data adjustments while making data refinement stricter and more consistent," Müller explains.

A client loan, for example, affects several levels in reporting. Equity backing has to be calculated for regulatory purposes, a risk management value must be determined to evaluate the credit default risk and the payment flows have to be recorded for accounting. In other words, different data is used in various parts of the company. Ideally, the data is provided from a uniform source and is subject to strict processes in order to ensure quality and consistency.
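The idea of one record feeding several reporting views can be sketched in a few lines. This is purely illustrative: the field names, risk weight, and formulas below are hypothetical stand-ins, not Deutsche Bank's actual data model or regulatory calculations.

```python
# Illustrative only: one loan record projected into three reporting views.
# All field names and formulas are hypothetical, not the Bank's actual model.

loan = {
    "principal": 1_000_000.0,   # outstanding amount in EUR
    "risk_weight": 0.75,        # hypothetical regulatory risk weight
    "default_prob": 0.02,       # hypothetical probability of default
    "annual_interest": 0.04,    # hypothetical interest rate
}

# Regulatory view: equity backing (8% of risk-weighted exposure, Basel-style)
equity_backing = loan["principal"] * loan["risk_weight"] * 0.08

# Risk-management view: expected loss from the credit default risk
expected_loss = loan["principal"] * loan["default_prob"]

# Accounting view: the yearly interest payment flow to be recorded
interest_flow = loan["principal"] * loan["annual_interest"]

print(equity_backing, expected_loss, interest_flow)
```

The point of a uniform source is that all three views derive from the same record, so they cannot drift apart and need no manual reconciliation afterwards.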

Here, too, the Bank defined a uniform data standard, establishing how specialist departments have to load the data into the systems. "The original data quality at the source is the key, since any manual post-processing costs money and increases the risk of errors. Picture it as having to install a filter in a cloudy stream to clean the water, which is expensive and slows the flow," Müller explains.
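"Quality at the source" amounts to rejecting bad records at load time rather than filtering them downstream. A minimal sketch of such a load-time check might look as follows; the schema, field names, and rules are hypothetical, not the Bank's actual data standard:

```python
# Illustrative sketch of validating records at the source instead of
# cleaning them downstream. Schema and rules are hypothetical.

REQUIRED_FIELDS = {"trade_id", "legal_entity", "amount", "currency"}

def validate_record(record: dict) -> list:
    """Return a list of validation errors; an empty list means the record is clean."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    if "amount" in record and not isinstance(record["amount"], (int, float)):
        errors.append("amount must be numeric")
    if "currency" in record and record["currency"] not in {"EUR", "USD", "GBP"}:
        errors.append("unknown currency")
    return errors

# A record with a textual amount is rejected at the source,
# before it ever reaches the warehouse.
record = {"trade_id": "T1", "legal_entity": "DB-DE",
          "amount": "1000", "currency": "EUR"}
print(validate_record(record))
```

Catching the non-numeric amount here is the cheap equivalent of Müller's filter: the stream stays clean upstream instead of being filtered at every consumer.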

But even the best-laid plans go awry. StRIDe initially aimed to organise finance data and to make processes better, more efficient and less costly, bringing the entire finance architecture to a higher level of quality. Yet as time went on, the new regulatory requirements imposed from outside grew noticeably stricter. The Bank, for example, now had to provide information that it had either not collected before or that did not yet exist at all. What's more, the regulatory authorities wanted to receive the data faster and more frequently. "The scope developed so massively that regulatory requirements had a critical impact on milestone planning for the project," Müller notes.

Still, this was not the only reason the Bank initially failed to push the project with the kind of determination that the project name StRIDe might suggest. Things became more focused in October 2013. "We had put a lot of energy into the concept, but there was upside potential when it came to the implementation and realisation," reports Stefan Sutter, who, as managing director, accompanied the project on the IT side. "You can put a lot on PowerPoint slides, but ultimately it's the implementation that counts."

 

