A selection of engagements across financial services firms in Australia and the UK - spanning ERP implementation, process automation, management reporting, and financial control.
A large, ASX-listed financial services group encompassing asset management, investment banking, lending, and operational businesses had a fundamental challenge: their finance team's data infrastructure had not kept pace with the group's rapid growth and complexity. With activity spread across diverse business units and multiple source systems, finance staff were spending significant time on manual data handling rather than analysis and insight.
I designed and built a finance data platform using Microsoft Azure - SQL Server, Logic Apps, Function Apps, and Data Factory - capable of ingesting, transforming, and surfacing data from across the group in a reliable, structured form. Alongside the platform, I delivered a suite of operational automations: bank data consolidation and reporting, automated journal pre-verification and booking, master data approval workflows, and an alert system for system and data anomalies.
The result was a step-change in the finance team's capability. Automated month-end processes for T&E, cash, FX revaluation, and consolidation significantly reduced manual close workload. Critically, the platform provided the infrastructure for the finance function to scale - growing to three times its original size and transitioning from a centralised team to a distributed model across multiple business units - without a proportional increase in manual data handling or coordination overhead.
A large ASX-listed financial services group was running its month-end close across approximately 30 accountants using a shared Excel spreadsheet with over 400 tasks. Simultaneous access caused editing conflicts and slowdowns, and the Group Financial Controller spent significant time manually coordinating teams and chasing progress - with no reliable real-time view of where the close actually stood. Audit sign-off evidence had to be collected and collated manually each year, a process that gave auditors limited confidence that reviews were happening in a timely way.
I designed and built a purpose-built close management system using Jira, Confluence, SQL Server, and PowerBI, integrated via API connections. Tasks were managed in Jira and automatically linked to Confluence documentation pages, created from a standard template at task setup. A PowerBI dashboard gave finance leadership a real-time view of task progress, documentation completion, and sign-off status across all teams - with enough detail to identify bottlenecks without requiring manual coordination. Preparer and reviewer sign-off was captured natively in Jira, creating a timestamped audit trail automatically.
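In outline, the task-to-documentation pairing worked by generating a Jira issue and a matching Confluence page from the same task definition. A simplified Python sketch of that step is below, using the payload shapes of the public Atlassian REST APIs - the project key, space key, labels, and template content here are illustrative, not the production configuration:

```python
def build_jira_issue(project_key, task_name, team):
    """Payload for POST /rest/api/2/issue (Jira REST API v2).
    Project key and label scheme are illustrative."""
    return {
        "fields": {
            "project": {"key": project_key},
            "summary": task_name,
            "issuetype": {"name": "Task"},
            "labels": [f"close-{team.lower()}"],
        }
    }

def build_confluence_page(space_key, task_name, template_body):
    """Payload for POST /rest/api/content (Confluence REST API).
    The template body came from a standard documentation template."""
    return {
        "type": "page",
        "title": f"Close documentation - {task_name}",
        "space": {"key": space_key},
        "body": {
            "storage": {"value": template_body, "representation": "storage"}
        },
    }
```

Posting the two payloads and storing the resulting issue/page IDs side by side is what gave every task a documentation page from the moment it was created.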
Two years on, the system was handling nearly 1,000 tasks each month - 2.5 times the original volume - as the business continued to grow. Teams operated independently, the Group Financial Controller had oversight without being across every detail, and auditors could review a full, timestamped sign-off history directly in Jira rather than relying on manually assembled evidence.
A large ASX-listed financial services group had over 40 bank accounts across seven banking relationships in three countries - and no automated way to consolidate them. Each week, a single employee spent two full days manually downloading CSV exports from each bank's proprietary portal, collating the data in Excel, and categorising transactions by hand. The result was a management report that was out of date for most of the week and disconnected from any operational process in the ERP.
Rather than pursuing time-consuming API integrations with seven institutions, I took a more pragmatic path: a pipeline that ingested each bank's existing CSV format directly, with controls to prevent duplicate ingestion and detect gaps in the transaction record. Files were simply dropped into a SharePoint folder - a task taking under a minute - and the pipeline handled the rest, consolidating data across all accounts and making it available daily for operationally critical banks.
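The two ingestion controls are simple in principle: fingerprint each file so the same export can never be loaded twice, and compare consecutive statement dates to spot missing exports. A minimal Python sketch, with an assumed gap tolerance of three days:

```python
import hashlib

def file_fingerprint(content: bytes) -> str:
    """Hash the raw export so the same file can't be ingested twice -
    fingerprints already seen are rejected before loading."""
    return hashlib.sha256(content).hexdigest()

def find_gaps(statement_dates, max_gap_days=3):
    """Return (prev, curr) date pairs whose spacing exceeds the
    tolerance - a likely sign that an export is missing."""
    ordered = sorted(statement_dates)
    return [
        (prev, curr)
        for prev, curr in zip(ordered, ordered[1:])
        if (curr - prev).days > max_gap_days
    ]
```

Because both checks run on the raw files before any transformation, a duplicate or missing export is caught at the door rather than surfacing later as a reconciliation break.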
On top of the consolidated data, I built a transaction recognition engine in SQL that matched keywords and account number combinations to automatically suggest the appropriate journal entry for each transaction. Accountants reviewed, added one-off entries where needed, and loaded the full journal into Dynamics 365 at the click of a button. The result: a process that had consumed two full days of staff time each week was reduced to minutes, daily cash visibility replaced a weekly snapshot, and bank transaction data fed directly into the ERP - eliminating a separate manual posting step entirely.
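The recognition engine itself lived in SQL, but the matching logic is easy to illustrate in Python: each rule pairs a narrative keyword with an optional bank account scope and the GL account to suggest. The rule values below are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Rule:
    keyword: str       # substring to look for in the bank narrative
    bank_account: str  # bank account the rule is scoped to ("" = any)
    gl_account: str    # GL account to suggest for the journal line

# Illustrative rules only - the real table held keyword/account
# combinations maintained by the finance team.
RULES = [
    Rule("PAYROLL", "", "6000"),
    Rule("INTEREST", "AU-001", "4100"),
]

def suggest_gl(narrative, bank_account, rules=RULES):
    """Return the first matching rule's GL account, or None when the
    transaction needs a manual one-off entry."""
    for r in rules:
        if r.keyword in narrative.upper() and r.bank_account in ("", bank_account):
            return r.gl_account
    return None
```

Unmatched transactions simply fell through to the accountant's review queue, so the automation only ever suggested entries it had a rule for.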
Three business units within a large ASX-listed financial services group needed to upload thousands of journal lines to Dynamics 365 each month-end. Using the native D365 Excel integration, each load took 8–12 hours - long enough that it had to be left running overnight on a staff member's laptop. When a load failed, month-end close was delayed, and the error feedback was so limited that diagnosing the problem added further time.
I designed and built an Azure-based journal loading pipeline to replace this process entirely. Accountants simply dropped their journal file - Excel or CSV - into a SharePoint folder. The pipeline ingested the file into SQL Server, ran it through a Python-based validation process that checked balancing, field completeness, and master data alignment, then loaded the validated journal into D365 via the Data Management framework. Throughout, the submitter was kept informed via Teams, receiving detailed, actionable feedback if any issues were found.
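The shape of the validation step is straightforward to sketch. Each journal line is checked for required fields and master data alignment, and the journal as a whole must net to zero before it is allowed through. The field names and checks below are a simplified illustration of the production rule set:

```python
from decimal import Decimal

REQUIRED_FIELDS = ("account", "amount", "description")  # illustrative subset

def validate_journal(lines, valid_accounts):
    """Return a list of human-readable errors; an empty list means
    the journal is safe to load. Errors name the offending line so
    the submitter gets actionable feedback."""
    errors = []
    for i, line in enumerate(lines, start=1):
        for field in REQUIRED_FIELDS:
            if field not in line or line[field] in (None, ""):
                errors.append(f"line {i}: missing {field}")
        account = line.get("account")
        if account and account not in valid_accounts:
            errors.append(f"line {i}: unknown account {account}")
    net = sum(Decimal(str(line.get("amount", 0) or 0)) for line in lines)
    if net != 0:
        errors.append(f"journal does not balance: net {net}")
    return errors
```

Because the error list names the specific line and field at fault, it could be sent straight to the submitter via Teams rather than surfacing as an opaque load failure hours later.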
The result was a reduction in load time from 8–12 hours to under 5 minutes - and the elimination of a recurring close risk. The validation engine was subsequently extended to cover automated journals from travel and expense systems, loan management platforms, and other group processes, giving a consistent, auditable control gate across all journal sources into the ERP.
A globally operating quantitative trading firm with over $200m in turnover and $400m in net assets needed to replace its legacy general ledger with a modern, scalable platform. Operating across multiple currencies and entities - with almost all revenue generated offshore - the firm's finance team faced a technically complex migration on a tight timeline with no appetite for disruption to its demanding close cycle.
I led the full NetSuite General Ledger implementation from business case through to go-live, including system selection, vendor negotiation, implementation, and end-to-end process reengineering. As part of the project, I conducted a thorough review of the existing chart of accounts - rationalising it from approximately 800 GL accounts down to fewer than 400, removing accumulated complexity that had made reporting, maintenance, and audit more difficult than necessary. The new GL was also integrated with the firm's existing data platform, ensuring all downstream reporting and automation processes were unaffected by the migration.
A data migration pipeline was built and fully tested in advance of go-live, so that the moment the final close was completed in the legacy system, migration into NetSuite could begin immediately - no delays, no last-minute surprises. The project was delivered in under six months with no interruption to live finance operations, the month-end close, or any dependent downstream process.
A globally operating quantitative trading firm was producing its monthly board pack in 15 days - a process that was absorbing significant capacity from a lean global finance team.
I conducted a systematic analysis of the entire reporting cycle, mapping task dependencies and identifying where work could be parallelised, reallocated, or automated. Every individual process was then targeted for streamlining, with Python and SQL used to automate the most time-consuming manual steps. A Jira-integrated tracking tool provided real-time visibility into close progress, and PowerBI dashboards connected to SQL Server replaced static spreadsheet packs for KPIs, cost variances, and headcounts.
The result was an 80% reduction in board pack production time - from 15 days to 3. Rolling forecasting replaced the annual budget cycle, anchored to the firm's existing Jira contracts database to give forecasts a direct link to committed spend. Once that contracts data had been validated, it was extended to drive the accruals process - automatically identifying variances between actual costs and contracted amounts, and suggesting the correct accruals to close any gaps. The finance team achieved all of this with a smaller headcount than before, while materially improving the quality and timeliness of reporting delivered to the board.
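The accrual suggestion step reduces to a comparison per contract: if less has been booked than the contract commits for the period, suggest accruing the shortfall. A hedged Python sketch (contract IDs and amounts are invented; the real process also handled currency and period boundaries):

```python
def suggest_accruals(contracted, booked):
    """For each contract, suggest accruing the shortfall between the
    contracted amount for the period and the costs actually booked.
    Over-billed contracts are excluded and flagged for manual review
    instead of being auto-accrued."""
    suggestions = {}
    for contract_id, expected in contracted.items():
        actual = booked.get(contract_id, 0.0)
        if actual < expected:
            suggestions[contract_id] = round(expected - actual, 2)
    return suggestions
```

Anchoring the comparison to the validated contracts database is what made the suggestions trustworthy: every accrual traced back to a committed amount rather than an estimate.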
A global high-frequency trading and market-making firm was running its finance and middle office functions with heavily manual processes. Month-end close took five days, daily P&L production consumed six hours of staff time, and quarterly BAS reporting absorbed another five days - leaving the team of seven with limited capacity for analysis or control work.
I led a systematic automation programme across the function, using Python, VBA, and QlikView to redesign and automate each major process. Month-end close was reduced from five days to two. Daily P&L production fell from six hours to one. BAS reporting dropped from five days per quarter to half a day. A daily management report was designed and automated from scratch - pulling front office P&L, balance sheet, cost data, market KPIs, and operational incident reports into a single self-updating online view.
The result was a step-change in efficiency: the same functions were managed with a team of five rather than seven, quality improved, and the finance team's focus shifted from data production to insight. Junior team members were mentored through the programme, embedding automation capability as a lasting part of the team's culture.
During a broader review of BAS processes at a global high-frequency trading firm, I identified that GST had not been correctly treated across a material portion of the firm's activities - an error that had persisted undetected across four years of returns lodged by prior finance staff.
I initiated a full reanalysis of the firm's BAS history, rebuilding the GST position from first principles across the entire four-year period. PwC were engaged to partner on the formal claim, providing technical tax expertise to validate the approach and manage the ATO submission. Together, we successfully recovered a seven-figure sum in overpaid GST.
Beyond the financial recovery, the engagement corrected the firm's ongoing GST treatment and embedded a sound methodology into the BAS process - ensuring the issue would not recur. The project was initiated proactively, based on my own review, rather than in response to an ATO query or external audit finding.
The UK branch of a major South African bank - operating across treasury, fixed income trading, and corporate advisory - undertook an implementation of Oracle EBS R12 to replace its legacy financial systems. As finance lead, I was responsible for ensuring the system was correctly configured to support the branch's financial control and management reporting requirements, and to address indirect tax obligations.
I led the finance workstream from requirements through to go-live, ensuring chart of accounts design, entity structure, close processes, and key controls were all functioning correctly before cutover. Post-implementation, Oracle Hyperion Planning was deployed to transform the budget and forecasting cycle - replacing a manual, spreadsheet-heavy process with a structured system-driven approach. Reporting was further upgraded through the introduction of OBIEE and i3BAR interactive reporting methodology.
The project delivered a modern, well-controlled financial platform for the branch, alongside a material improvement in the quality and efficiency of planning and reporting processes.
Let's talk about where your business is today and what it could look like with the right processes and tools in place.
Get in Touch