Financial Intelligence Dashboard
Built reporting infrastructure from scratch for four business units across four countries. Before: four spreadsheets, four definitions of revenue, and a two-day consolidation every month. After: one model, one report, one version of the numbers.
The problem
Four business units. Four countries. Four different source systems — and critically, four different definitions of when revenue was recognised. One unit recognised at invoice. Another at delivery. A third split the difference based on contract type. Every reporting cycle meant four separate datasets produced by four separate analysts, followed by a manual consolidation that took two days and produced disputes almost every time.
There was no single source of truth. Variances between entity reports had to be investigated by hand. Leadership was getting a monthly snapshot four days late, with no visibility into why numbers had moved. Decisions that should have taken an hour were taking a week.
This is a solvable data engineering problem. It just hadn't been treated as one.
The approach
The technical problem was straightforward. The political problem was not. Before writing a single line of Power Query, I spent three weeks getting the four entity heads to agree on a standardized chart of accounts and a single revenue recognition policy. That conversation was harder than anything in the build — it meant someone had to change how they'd been booking revenue for two years.
Once that was settled, the architecture fell into place: Power Query connectors pulling from each entity's source system on a nightly schedule, transforming to a canonical schema, and writing to a shared Excel data model that Power BI reads. The canonical schema is a single fact table for revenue transactions keyed by 12 dimensions; the core fields are entity, period, line of business, project ID, amount in local currency, FX rate, and amount in INR.
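As a rough sketch of what one fact-table row looks like, here is an illustrative Python dataclass. The field names, entity code, and sample values are hypothetical, not the actual schema; the real model lives in Excel/Power BI.

```python
from dataclasses import dataclass
from datetime import date
from decimal import Decimal

# Hypothetical sketch of a canonical revenue fact row.
# Field names and the entity code are illustrative only.
@dataclass(frozen=True)
class RevenueFact:
    entity: str            # business-unit code, e.g. "IN-01"
    period: date           # reporting period (first of month)
    line_of_business: str
    project_id: str
    amount_local: Decimal  # amount in the entity's local currency
    fx_rate: Decimal       # local-currency -> INR rate for the period
    amount_inr: Decimal    # amount_local * fx_rate, precomputed

row = RevenueFact("IN-01", date(2024, 4, 1), "Consulting", "P-1042",
                  Decimal("150000"), Decimal("1"), Decimal("150000"))
```

Keeping amounts as exact decimals rather than floats is the point of the sketch; it matters again below when FX conversion comes up.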
Validation rules run at the Power Query layer before data ever reaches the model: duplicate invoice IDs, negative revenue on non-credit lines, missing project codes, FX rates outside a ±5% band from the previous week. Anything that fails goes to a separate “quarantine” sheet with a rejection reason. This moved error discovery from month-end review to data ingestion — catching problems when they're cheap to fix, not after they've propagated into a board deck.
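The checks above can be sketched in Python to make the split explicit. This is a minimal illustrative re-implementation, not the production code (the real rules run in Power Query M); the function name, row keys, and sample rates are assumptions.

```python
from decimal import Decimal

def validate(rows, prev_fx_rates):
    """Split rows into (clean, quarantine), tagging rejects with a reason.

    Illustrative only: mirrors the four ingestion checks described in
    the text (duplicate invoice IDs, negative revenue on non-credit
    lines, missing project codes, FX rate outside a 5% band).
    """
    clean, quarantine = [], []
    seen_invoices = set()
    for r in rows:
        reason = None
        if r["invoice_id"] in seen_invoices:
            reason = "duplicate invoice ID"
        elif r["amount_local"] < 0 and not r.get("is_credit", False):
            reason = "negative revenue on non-credit line"
        elif not r.get("project_id"):
            reason = "missing project code"
        else:
            prev = prev_fx_rates.get(r["currency"])
            if prev and abs(r["fx_rate"] - prev) / prev > Decimal("0.05"):
                reason = "FX rate outside 5% band vs previous week"
        if reason:
            quarantine.append({**r, "rejection_reason": reason})
        else:
            seen_invoices.add(r["invoice_id"])
            clean.append(r)
    return clean, quarantine

# A rate move of 0.2 on a base of 83.0 is ~0.24%, inside the band:
clean, rejected = validate(
    [{"invoice_id": "A1", "amount_local": Decimal("100"),
      "project_id": "P-1", "currency": "USD", "fx_rate": Decimal("83.2")}],
    prev_fx_rates={"USD": Decimal("83.0")},
)
```

Rejected rows carrying their reason is what makes the quarantine sheet actionable rather than a dumping ground.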
A VBA bridge exports formatted snapshots in the legacy Excel format that two department heads refused to give up. It's unglamorous, but it meant zero resistance to adoption from the people most likely to reject a new tool.
Multi-entity P&L view
The result
- Reporting cycle cut by ~40% — the bulk of the remaining time is review, not assembly.
- Data errors down ~30%, caught at ingestion before they reach any report or deck.
- Cross-entity operational view in place for the first time — used weekly by four department heads and C-suite leadership.
- ~15 hours/month of manual analyst work eliminated. The analysts who built the report now do analysis instead.
- Dashboard became the single source of truth for ~₹33 Cr in strategic investment decisions.
What I'd do differently
Power Query is the wrong tool for a multi-entity, multi-currency consolidation at any serious scale. It works, but it's brittle: a column rename in a source system breaks a query silently, and the debugging experience is miserable compared to SQL. I spent several weeks tracking down rounding-error bugs that turned out to be FX conversion inconsistencies between Power Query's floating-point arithmetic and Zoho's.
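The bug class is easy to demonstrate in miniature. The snippet below is an illustrative Python example of the general floating-point problem, not the actual pipeline code: binary floats cannot represent most decimal fractions exactly, so repeated multiply-and-sum over line items drifts by tiny amounts that surface downstream as reconciliation "rounding errors".

```python
from decimal import Decimal

# Ten 0.1 line items should total exactly 1.0 — but in binary
# floating point each 0.1 is a rounded approximation, and the
# accumulated error is visible in the sum.
float_total = sum([0.1] * 10)
decimal_total = sum([Decimal("0.1")] * 10)

print(float_total)    # 0.9999999999999999, not 1.0
print(decimal_total)  # 1.0
```

Two systems doing this conversion with different intermediate precision will disagree by amounts that look like rounding noise but never reconcile — exactly the symptom described above.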
A proper warehouse layer — Postgres or BigQuery with dbt models — would eliminate that entire class of bug and make the pipeline testable. The migration is planned for this quarter. The Power Query approach was the right call given the timeline and the team's tooling familiarity; it's just clearly not the final form.