
The writing’s on the wall for the bank ‘batch run’


John Schlesinger, Chief Enterprise Architect

When I was living in the States, shortly after the turn of the Millennium, I applied for a mortgage. After completing the paperwork, approval was timely enough. What shocked me was that the new account took six months to show up where I could see it. That’s because the data passed from one local centre to another at the month’s end, until it was finally processed by my local hub and became accessible online.

Little has changed in transactional processes at many banks since the 1970s. The “batch run” is alive and kicking. In the UK, banks pool and process transactions from branch clients in a single job, usually after the close of business. The results then appear to customers online the next day or (in the case of my mortgage) further out.
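For readers who want to see the mechanics, here is a deliberately simplified sketch (illustrative Python, not any bank's actual code) of what an end-of-day batch run amounts to: transactions are queued all day and only applied by a single job after close of business, which is why new balances appear the next morning at the earliest.

```python
# Toy illustration of a batch run: capture all day, post once overnight.
from dataclasses import dataclass, field
from datetime import date


@dataclass
class Account:
    number: str
    posted_balance: float = 0.0                    # what the customer sees online
    pending: list = field(default_factory=list)    # today's unposted transactions


def capture(account: Account, amount: float) -> None:
    """During the day, transactions are only queued, not applied."""
    account.pending.append(amount)


def end_of_day_batch(accounts: list[Account], business_date: date) -> None:
    """One job, run after close of business, posts everything at once."""
    for account in accounts:
        for amount in account.pending:
            account.posted_balance += amount
        account.pending.clear()
    print(f"Batch for {business_date} complete; balances visible tomorrow.")


acct = Account("12345678")
capture(acct, -50.0)          # card payment
capture(acct, 1200.0)         # salary credit
print(acct.posted_balance)    # still 0.0 until the overnight run
end_of_day_batch([acct], date.today())
print(acct.posted_balance)    # 1150.0 the next morning
```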

The costs of antiquated batch processing are mounting. Last month, we learned that "an issue with overnight processing" meant 600,000 payments at a large British lender disappeared overnight.

We know that maintaining legacy systems consumes 76% of banks' IT budgets, leaving little to invest in meaningful improvements for customers. In principle, the mainframe is not the problem. Many banks run their core systems on upgraded IBM hardware, using the latest application servers on state-of-the-art chips.

The real problem is dated architecture and leadership inertia. By architecture, we mean the overall approach to the systems built to keep the bank's records, and the people who run them. System upgrades have not kept pace with changing customer habits.

In the 1960s, retail banking was exclusive to branches. The manager would send transactional data to HQ at the end of the day to generate new balances the next morning (adjusted for tax, fees, payments and interest). All transactions went through the branch, so branch accounting made sense.

With the arrival of online processing in the 1970s, transactions were entered into a central online teller system, logged and added to the overnight batch, which eliminated the need to send branch transactions to head office by hand.

From the mid-1980s on, banking started migrating from the branch to ATMs, online banking, point of sale, call centres and card transactions. The concept of branch accounting died, but banks still offered no real-time view of the customer or their products.

Meanwhile, technology groups such as Google and Amazon developed modern architectures, operating middle offices that track risk in real time; transactions are cleared digitally, separating the risks of the front office (orders, customers, channels) from those of the back office (shipment of goods).

Real-time processing at banks integrates checking and deposit information with consumer and lending capabilities and other platforms. Each service gets the same information immediately. The customer view becomes more complete, and banks can focus on profit and service.
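The toy sketch below (illustrative only; it is not the Temenos Transact API or any vendor's actual platform) shows the idea: one posting updates the balance the moment it happens and pushes the same event to every interested service, so the mobile app and the lending engine see an identical view at the same instant.

```python
# Toy illustration of real-time posting with a simple publish/subscribe fan-out.
from typing import Callable

Listener = Callable[[str, float, float], None]


class RealTimeLedger:
    def __init__(self) -> None:
        self.balances: dict[str, float] = {}
        self.listeners: list[Listener] = []

    def subscribe(self, listener: Listener) -> None:
        self.listeners.append(listener)

    def post(self, account: str, amount: float) -> None:
        """Apply the transaction immediately and notify every subscriber."""
        new_balance = self.balances.get(account, 0.0) + amount
        self.balances[account] = new_balance
        for listener in self.listeners:
            listener(account, amount, new_balance)


ledger = RealTimeLedger()
ledger.subscribe(lambda acct, amt, bal: print(f"mobile app: {acct} balance now {bal}"))
ledger.subscribe(lambda acct, amt, bal: print(f"lending: affordability check sees {bal}"))
ledger.post("12345678", -50.0)   # both services see the change at once
```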

So why, then, has there been no quantum leap from batch to real time at most large banks? It’s a combination of money, time and effort. Sticking with existing systems delays upheaval, which no one likes, least of all IT teams.

There are several possible paths to take. The first is "update in place," where a bank retains its core system and overlays new digital channels linking front and back offices. Building the interface between the two is complex and time-consuming, and the operation tends to collapse under its own weight.

The second is a new "greenfield" core system. Such projects typically take six years and often end up cancelled under the weight of cost and complexity.

The final option is "build-and-migrate," where a new core system is developed and sub-sets of the business are converted book by book as business and IT opportunities allow. Gradually each unit runs on the new real-time architecture, enabling it to match rivals' low-cost, high-volume models.
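A minimal sketch of the routing idea behind build-and-migrate follows (the names are hypothetical): a thin layer sends each request to the legacy core or the new real-time core depending on which books of business have already been converted, so the bank can move one portfolio at a time.

```python
# Toy illustration of book-by-book migration: route requests by product book.
MIGRATED_BOOKS = {"home_loans"}   # grows as each book of business is converted


def new_core_handle(request: dict) -> str:
    return f"new core processed {request['id']} in real time"


def legacy_core_handle(request: dict) -> str:
    return f"legacy core queued {request['id']} for the overnight batch"


def route(book: str, request: dict) -> str:
    if book in MIGRATED_BOOKS:
        return new_core_handle(request)      # real-time platform
    return legacy_core_handle(request)       # existing batch-based core


print(route("home_loans", {"id": "TX1"}))
print(route("credit_cards", {"id": "TX2"}))
```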

In Australia, ME Bank took this route and promises a new service approach. The project was built around a Temenos Transact back office platform, with the first data migration last July; home loan origination started on the new system in March.

ME will benefit from a “single customer view,” greater flexibility in product design and pricing and a shift from manual to automated processes in areas such as home loans. ME now processes around 50% of the group’s mortgages through its new system.

Temenos produces a report every year with Deloitte comparing KPIs of banks running modern core systems with those on legacy systems. It shows a significant and sustained performance divergence (a 6.5 percentage point lower cost-to-income ratio on average over five years).
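To give a feel for what 6.5 percentage points means, with purely illustrative numbers: a bank earning £10bn of income against £6.0bn of costs runs a 60% cost-to-income ratio; a ratio 6.5 points lower means £5.35bn of costs for the same income, roughly £650m a year freed up.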

Banks can no longer afford to view internet banking as a new channel; it is their brand essence. Their systems need to offer the same customer care as the competition. Profit will follow.
