Integrating Reconciliation Automation with Temenos T24 / Transact
Temenos Transact (formerly T24) is the undisputed giant of the core banking world, powering over 3,000 financial institutions. However, its immense power comes with significant complexity, particularly when it comes to data extraction for reconciliation. The system's unique architecture—often based on Multi-Value databases like jBASE or XML-heavy schemas in Oracle—means that standard SQL integration often fails. This guide provides a technical roadmap for connecting modern reconciliation tools like Reconwizz to the T24 environment without compromising the critical "Close of Business" (COB) process.
The Data Extraction Challenge
Unlike typical relational databases, where a simple "SELECT * FROM Transactions" returns clean rows, T24 stores records as multi-value dynamic arrays: related attributes, values, and subvalues are packed into a single record, separated by delimiter characters. Direct queries against this structure can be slow and resource-intensive, risking system lag during banking hours.
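To make the shape of that data concrete, here is a minimal sketch of unpacking a Pick/jBASE-style dynamic array, where fields, values, and subvalues are separated by the marker characters 254, 253, and 252. The record content is purely illustrative and not an actual T24 schema.

```python
# Minimal sketch: unpacking a Pick/jBASE-style dynamic array into nested lists.
# The record layout below is illustrative only, not a real T24 schema.

FM = chr(254)  # field (attribute) mark
VM = chr(253)  # value mark
SM = chr(252)  # subvalue mark

def parse_dynamic_array(record: str) -> list[list[list[str]]]:
    """Split a dynamic array into fields -> values -> subvalues."""
    return [
        [value.split(SM) for value in field.split(VM)]
        for field in record.split(FM)
    ]

# A hypothetical customer record: name, then two phone numbers in one field, then a country code.
raw = "ACME LTD" + FM + "0712000001" + VM + "0712000002" + FM + "KE"
fields = parse_dynamic_array(raw)
print(fields[1])  # [['0712000001'], ['0712000002']] -- one field holding two values
```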
The "Composite Entry" Problem: T24 optimizes for storage by aggregating multiple similar transactions into a single General Ledger (GL) entry. For example, 500 mobile banking fees might appear as one line item: "Sundry Income - $500." To reconcile this against the 500 individual lines from the Mobile Money provider, your software must be able to "explode" the GL entry back into its constituent parts (often found in the `STMT.ENTRY` table).
Integration Methods: Choosing Your Path
There are three primary ways to get data out of T24 for reconciliation:
1. Temenos Integration Framework (IF)
Best For: Real-time reconciliation.
The IF enables an event-driven architecture: whenever a transaction is committed in T24, the framework publishes an event (XML or JSON) to a message queue. Reconwizz can subscribe to this queue and match the transaction within seconds. This is ideal for high-priority Nostro accounts.
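A rough sketch of the consuming side is shown below. A standard-library queue stands in for the real message broker, and the JSON payload fields are assumptions rather than the actual IF event schema; the point is simply the pattern of matching each event against a pre-loaded counterparty statement as it arrives.

```python
# Minimal sketch of event-driven matching. queue.Queue stands in for the real
# broker the Integration Framework publishes to; the payload fields are
# assumptions, not the actual IF event schema.
import json
import queue

events = queue.Queue()          # stand-in for the broker subscription
pending_nostro = {              # keyed by reference, loaded from the counterparty statement
    "FT2412300001": {"amount": "150000.00", "currency": "USD"},
}

def on_event(raw: str) -> None:
    event = json.loads(raw)
    ref = event.get("transactionReference")
    side = pending_nostro.get(ref)
    if side and side["amount"] == event.get("amount"):
        pending_nostro.pop(ref)
        print(f"Matched {ref} in near real time")
    else:
        print(f"No match yet for {ref}; queued for investigation")

# Simulate one T24 event landing on the queue.
events.put(json.dumps({"transactionReference": "FT2412300001", "amount": "150000.00"}))
while not events.empty():
    on_event(events.get())
```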
2. Data Extraction Service (DES) / Data Lake
Best For: High-volume batch processing.
Instead of querying T24 directly, many banks stream T24 data into an operational data store (ODS) or data lake (using tools such as Temenos Data Lake or Kafka). The reconciliation software then reads from this secondary source, keeping the reporting and matching workload off the core banking system.
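If the streaming layer happens to be Kafka, the batch pickup can look something like the sketch below, which uses the kafka-python client. The topic name, broker address, and message fields are placeholders for whatever the bank's replication pipeline actually publishes.

```python
# Sketch of reading replicated T24 entries from a Kafka topic instead of the
# core itself. Requires the kafka-python package; topic name, broker address,
# and message fields are assumptions.
import json
from kafka import KafkaConsumer

def run_matching(entries: list) -> None:
    """Stand-in for the real matching engine."""
    print(f"Matching {len(entries)} replicated entries")

consumer = KafkaConsumer(
    "t24.stmt.entry",                       # assumed topic name
    bootstrap_servers=["ods-broker:9092"],  # assumed broker address
    group_id="reconciliation-batch",
    auto_offset_reset="earliest",
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)

batch = []
for message in consumer:
    batch.append(message.value)
    if len(batch) >= 10_000:   # hand off to the matching engine in chunks
        run_matching(batch)
        batch.clear()
```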
3. Direct Database Access (Read-Only Replica)
Best For: Legacy setups (TAFC).
For banks running T24 on jBASE or older architectures, a nightly extract script (jQL) is often used to dump CSV files to a secure folder. Reconwizz's file watcher picks these up automatically.
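The pickup side of this pattern can be as simple as polling the drop folder. The sketch below uses only the standard library; the folder path and polling interval are assumptions, and a production watcher (including Reconwizz's built-in one) would also handle file locking and archiving.

```python
# Sketch of a simple pickup loop for nightly jQL extracts dropped as CSV.
# Folder path and polling interval are assumptions.
import csv
import time
from pathlib import Path

DROP_FOLDER = Path("/secure/t24_extracts")   # assumed landing folder
seen: set[Path] = set()

def load_extract(path: Path) -> list[dict]:
    with path.open(newline="") as fh:
        return list(csv.DictReader(fh))

while True:
    for csv_file in sorted(DROP_FOLDER.glob("*.csv")):
        if csv_file not in seen:
            rows = load_extract(csv_file)
            print(f"Loaded {len(rows)} rows from {csv_file.name}")
            seen.add(csv_file)
    time.sleep(60)   # poll once a minute; nightly files arrive after COB
```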
Optimizing for Close of Business (COB)
The T24 COB is a resource hog. If your reconciliation extraction runs at the same time as the interest calculation batch, the system may hang.
Best Practice: Use "Pre-COB" checkpoints. Run your reconciliations continuously throughout the day. By the time COB starts at 6 PM, 95% of your transactions should already be matched. The final batch simply processes the remaining 5%. This prevents the dreaded "COB Failure" due to data imbalances.
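Expressed as code, the checkpoint is just a gate that measures the intraday match rate and surfaces the residue before the COB window opens. The 6 PM cut-off and 95% threshold below mirror the numbers above and are illustrative, not fixed values.

```python
# Sketch of a "Pre-COB" checkpoint: report the intraday match rate and flag
# the residue before the COB window opens. Cut-off time and threshold are
# illustrative.
from datetime import datetime, time as dtime

COB_CUTOFF = dtime(18, 0)      # 6 PM
TARGET_MATCH_RATE = 0.95

def pre_cob_checkpoint(matched: int, total: int, now: datetime) -> None:
    rate = matched / total if total else 1.0
    if now.time() >= COB_CUTOFF:
        print("COB window already open; checkpoint skipped")
        return
    if rate < TARGET_MATCH_RATE:
        print(f"Warning: only {rate:.1%} matched; clear exceptions before COB")
    else:
        print(f"{rate:.1%} matched; {total - matched} items left for the final batch")

pre_cob_checkpoint(matched=9_500, total=10_000, now=datetime.now())
```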
Conclusion: Don't Fight the Core
Integrating with T24 requires respecting its unique architecture. Trying to force it to behave like a simple SQL database will lead to performance issues. By using approved pathways like the Integration Framework or ODS, and employing robust "GL Exploding" logic, you can achieve seamless, automated reconciliation that keeps your bank audit-ready without slowing down the core.