Opinions expressed by Digital Journal contributors are their own.
Modern organizations rely on data to inform their product roadmaps, customer retention, and everything in between -- but behind the glitzy dashboards and KPI reports lies a nastier problem: siloed tools, competing definitions, and data pipelines that can't always be trusted. In this world, even the most beautifully constructed report can lead to poor decisions if the data behind it is broken or unreliable.
This is the task that confronts most enterprise analytics organizations. And it is one that data leader Thilakavthi Sankaran conquered by tackling two underlying problems: consolidating disparate business intelligence systems beneath a unified architecture, and enforcing hard-nosed governance practices to establish trust in enterprise data.
Across the sector, BI tools diverge more quickly than they converge. It is not unusual for a firm to have legacy SQL-based reporting alongside Power BI dashboards, Tableau workbooks, Excel models, and in-house Python scripts. The result is all too familiar: marketing's reports differ from finance's, the operations team keeps its own measures, and inter-departmental alignment becomes a struggle.
The problem is not necessarily technical; it is usually structural. Teams develop their solutions on different timetables, solving similar problems with divergent logic. Without a shared architecture or a pattern of stewardship, definitions drift. One team's "active user" can sound identical to another's yet be fundamentally different in how it is calculated.
Rather than patching these gaps one by one, Sankaran constructed something more profound: a common language of data, powered by a centralized architecture.
The initial step was to map out the existing state: auditing data sources, pipelines, reporting tools, and stakeholders. The picture that emerged was familiar: siloed reporting stacks, inconsistent SQL logic, and redundant effort across departments.
To bring order to the chaos, the architecture was centered on a cloud-native data warehouse that served as the sole source of truth. Snowflake was the base, with dbt handling scalable data transformation and Apache Airflow handling orchestration. Data pipelines moved out of ad-hoc scripts and into version-controlled, modular workflows.
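To make that concrete, here is a minimal sketch of what one such modular dbt model might look like; the source, model, and column names are hypothetical, since the article does not show the actual project code:

```sql
-- models/staging/stg_orders.sql (hypothetical model name)
-- A version-controlled dbt transformation. Raw data comes in through
-- source(), so dbt can trace lineage from this model back to ingestion.
with raw_orders as (

    select * from {{ source('erp', 'orders') }}

),

cleaned as (

    select
        order_id,
        customer_id,
        cast(order_date as date) as order_date,
        amount_usd
    from raw_orders
    where order_id is not null

)

select * from cleaned
```

Because each model is a plain SQL file in version control, changes to pipeline logic can go through code review like any other software.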
Both Power BI and Tableau were retained but redesigned to work from the same governed datasets. Reports no longer competed with one another over whose figures were right; the business ran on a single model. Each KPI was defined once in dbt and reused across tools, so the numbers matched regardless of which dashboard one opened.
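As an illustration of "define once, reuse everywhere" (again with hypothetical names), a KPI such as monthly active users might live in a single dbt model that both Power BI and Tableau query:

```sql
-- models/marts/fct_monthly_active_users.sql (hypothetical)
-- The one governed definition of "active user": a customer with at
-- least one order in the calendar month. Both BI tools read this table.
select
    date_trunc('month', order_date) as activity_month,
    count(distinct customer_id)     as monthly_active_users
from {{ ref('stg_orders') }}
group by 1
```

If the definition ever changes, it changes here, and every downstream dashboard picks it up on the next run.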
What was different was not the toolset, but the approach. BI teams, data engineers, and business analysts could finally work under a common paradigm. Metrics were no longer hard-coded into dashboards; they were versioned, documented, and stored centrally.
This centralization provided agility. If a definition changed, say, how revenue was allocated, the change cascaded to every dashboard downstream. Reconciliation requests that once took weeks were resolved in hours. Leadership grew more confident in the data, and teams had a single point of reference for analysis.
But building a shared system is only the start. To make the data dependable, governance had to be addressed head-on, and at scale.
Governance in the majority of large organizations is reactive: something done only after a compliance audit or a violation. But in this case, governance was embedded in the data lifecycle. All dbt models included built-in null, duplicate, and referential-integrity checks. Airflow jobs included automated alerting: if a table failed validation or missed its SLA, the right team was alerted in real time.
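dbt supports these checks natively: uniqueness, non-null, and referential-integrity rules are typically declared as generic tests on a model's properties, and a custom assertion can be written as a singular test, a SQL file that fails the build if it returns any rows. A hypothetical sketch of the latter:

```sql
-- tests/assert_orders_have_valid_customers.sql (hypothetical)
-- A dbt singular test: dbt flags a failure if this query returns rows,
-- i.e., if any order references a customer that does not exist.
select o.order_id
from {{ ref('stg_orders') }} as o
left join {{ ref('stg_customers') }} as c
    on o.customer_id = c.customer_id
where c.customer_id is null
```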
Documentation was a first-class citizen. With dbt's auto-generated documentation, every field and every transformation step could be traced back to its source. Analysts could follow a metric from the dashboard back to the original ingestion point without opening a dozen tabs or Slacking a data engineer.
Security and access ran through role-based permissions, which restricted highly sensitive fields such as personally identifiable information to authorized users only. These controls applied not just at the database level but also in the BI tools, making it possible to expand self-service without increasing risk.
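The article does not detail the exact mechanism, but on Snowflake this kind of column-level control is commonly implemented with dynamic data masking. A minimal sketch, assuming a hypothetical PII_READER role and a customers table:

```sql
-- Mask email addresses for every role except those cleared to read PII.
create masking policy email_mask as (val string) returns string ->
    case
        when current_role() in ('PII_READER') then val
        else '*** masked ***'
    end;

-- Attach the policy to the sensitive column (hypothetical table).
alter table customers
    modify column email
    set masking policy email_mask;

-- Analysts keep query access but see masked values in PII columns.
grant select on table customers to role analyst;
```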
Interestingly, governance wasn't seen as an inhibitor. It was presented as a way to enable faster decision-making by guaranteeing that decisions rested on clean, precise information. Rather than creating tension, it reduced rework and guesswork.
This habit, over time, had a ripple effect. The data team wasn't just responding to dashboard requests; it was setting the standard for how the company interacted with and understood data. Metric definitions were standardized across departments. New reports were faster to build because the underlying rules had already been defined. Analysts spent more time analyzing and less time cleaning or validating.
This change did not happen overnight. It took close collaboration with subject-matter experts, gradual onboarding, and ongoing learning. But the more teams transitioned to the common architecture, the more the productivity gains compounded. Analysts across teams could stand on each other's shoulders. BI became a lingua franca.
The payoff extended well beyond dashboards. Enhanced lineage and validation allowed compliance teams to pass audits with minimal manual intervention. Engineering teams could change code with confidence, trusting that tests would catch regressions. And executive leadership could pose strategic questions without waiting weeks for fresh reports to be built.
By developing a single BI platform with governance built in, the company started thinking differently. It could extend its analytics to more users, more data, and more business questions without ever compromising on accuracy.
This was not a technical win in a vacuum; it was an operational shift. Faster decisions were made. Fights over metrics subsided. Trust in data grew across the organization. And the company shifted from a data firefighting culture to a data fluency culture.
The infrastructure was designed not only for today's problems but for tomorrow's growth. With cross-tool integration, automated pipeline monitoring, and modular dbt models, the architecture remains flexible enough to support new tools, use cases, and compliance requirements as the business evolves.
Most companies have disconnected BI environments and unstable data pipelines. What makes this case distinctive is the way those problems were solved, not with new, shiny tools or disposable dashboards, but with systematic and purposeful design.
By emphasizing consistency over customization and governance over guesswork, the team built a reproducible model. It shows that scalable analytics is not just about fast queries or big data; it is about getting tools, teams, and trust onto common ground.
And in today's world of big data, that foundation can be the most vital investment a business can make.