Migration from Google Looker to MS PowerBI / Fabric

Our Lessons Learned from Migrating Over 200 Reports: From Audit and Rationalization to Implementing Robust Data Governance and User Adoption.

Łukasz Kidoń
Published on: June 30, 2025

The decision to migrate over 200 reports from Google Looker to Microsoft Power BI and Fabric was driven by a strategic desire to unify the data ecosystem and lower the TCO. A successful migration is a business transformation, not an IT project, structured around four phases: deconstruction, translation, reconstruction, and implementation. This is a roadmap for turning a technological challenge into an opportunity to reduce technical debt, implement robust data governance, and achieve true analytics democratization.

Strategic Context: More Than Just a Tool Change

Our journey began not with the question "which tool is better?" but with a strategic decision to unify our data ecosystem. The move from Google Looker, a mature platform valued by our analysts, to Microsoft Power BI and Fabric was dictated by the goal of creating an integrated environment encompassing data engineering, analytics, and self-service BI. It was about lowering the total cost of ownership (TCO) by consolidating tools and leveraging native integrations within the tech stack we already used, like Azure. The decision to migrate is rarely dictated solely by the features of the analytical tool itself. More often, it is a symptom of a broader IT strategy, which in our case involved maximizing the return on investment in the Microsoft platform. The choice of Power BI did not stem from its abstract superiority over Looker, but from its deep synergy with the company's existing and planned infrastructure.

The Scale of the Challenge: Transformation, Not an IT Project

We faced the task of migrating over 200 key reports that were the analytical lifeblood for hundreds of users across various departments. We quickly realized this was not a simple "lift-and-shift" exercise. It was a transformation of how our organization thinks about, manages, and consumes data. The sheer scale of the undertaking immediately signaled that the biggest risk would not be technology, but managing complexity, people, and processes. The history of BI projects teaches that the biggest problems are human - lack of engagement, internal politics - and process-related, like unclear requirements or underestimating the task's scale. Therefore, from the very beginning, we viewed this migration as an organizational change program requiring strong executive-level sponsorship.

Phase I: Deconstructing the Monolith – Audit, Planning, and Rationalization

Following best practices, our first step was a deep audit of the existing environment. Instead of relying on anecdotal evidence, we dove into hard data. We analyzed usage logs in Looker to objectively understand which reports were critical to the business and which were just "informational noise." We created a comprehensive inventory of all reports, dashboards, data sources, and user groups. This foundational work allowed us to perform ruthless, data-driven prioritization and avoid the most common mistake in migration projects: moving everything "just in case."
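As an illustration of this data-driven prioritization, the queries we ran against exported Looker usage logs looked roughly like the sketch below. The table and column names (`looker_usage_history`, `dashboard_title`, `viewed_at`) are hypothetical placeholders, not Looker's actual System Activity schema, and the date function is dialect-dependent:

```sql
-- Hypothetical usage-log table exported from Looker's System Activity.
-- Counts views per dashboard over the last 6 months to flag candidates
-- for archiving (few viewers, low frequency, stale last view).
SELECT
    dashboard_title,
    COUNT(*)                AS views_last_6_months,
    COUNT(DISTINCT user_id) AS distinct_viewers,
    MAX(viewed_at)          AS last_viewed
FROM looker_usage_history
WHERE viewed_at >= DATEADD(month, -6, CURRENT_DATE)  -- dialect-dependent
GROUP BY dashboard_title
HAVING COUNT(*) < 10        -- low-usage threshold; tune per organization
ORDER BY views_last_6_months ASC;
```

The output of a query like this fed directly into the audit register described below: anything surfacing in the low-usage bucket became a candidate for the "Archive" or "Delete" decision rather than migration.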

[Image: A Power BI dashboard showing report usage statistics: a bar chart with the most used reports, a pie chart with a breakdown by department, and a trend line showing user activity over time.]

The Art of Letting Go: Rationalization and Reducing Technical Debt

The audit revealed significant redundancy and waste. We identified duplicates, slight variations of the same analyses, and reports unused for over six months. By engaging business stakeholders, we collectively decided to archive or completely eliminate nearly 30% of the existing assets. This was our first big win. The migration became a conscious opportunity to clean up, simplify, and pay off technical debt, rather than mindlessly transferring it to the new environment. Every archived report saved dozens of hours of development work.

Reverse Engineering Business Logic

The most difficult task was reverse-engineering the business logic embedded deep within the LookML models. Many key assumptions and metric definitions were not documented outside of the code. Our team conducted "data archaeology," analyzing LookML code and cross-referencing it with the knowledge of business users to understand not only "what" was being calculated, but more importantly, "why." Documenting these "unwritten rules" became the foundation for the success of the next phase, forcing the organization to confront hidden, often inconsistent definitions.
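To give a flavor of what this "data archaeology" surfaced, here is a hedged, hypothetical LookML snippet (field and filter names invented for illustration) of the kind of unwritten rule we kept finding:

```lookml
# Hypothetical example of an undocumented business rule buried in LookML:
# "revenue" silently excludes internal test orders and negative amounts.
measure: net_revenue {
  type: sum
  sql: ${order_amount} ;;
  filters: [order_type: "-internal", order_amount: ">0"]
}
```

A definition like this looks innocuous in code, but unless it is surfaced, documented, and re-confirmed with the business, the DAX re-implementation will quietly disagree with the old numbers and destroy trust in the new platform.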

Creating a Roadmap and an Audit Register

Instead of a "big bang" approach, we divided the project into iterative sprints, grouped by business domains (Sales, Marketing, Finance). Each phase ended with the delivery of a working set of reports, which allowed for early feedback collection and trust-building. The table below shows a simplified excerpt from our audit register, which became the central management tool for the project.

| Business Domain | Report/Dashboard Name | Usage Frequency (monthly) | Number of Users | Criticality (1-5) | Migration Priority (1-3) | Decision | Comment |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Sales | Quarterly Sales Pipeline | 250 | 45 | 5 | 1 | Redesign | Key report for management. Migration is an opportunity to add forecasting. |
| Marketing | Online Campaign Effectiveness | 120 | 15 | 4 | 1 | Migrate 1:1 | Report serves its purpose well. Needs to be connected to HubSpot. |
| Finance | Monthly Cost Summary | 80 | 10 | 5 | 2 | Redesign | Data from the new ERP system must be integrated and visualizations simplified. |
| Operations | Order Fulfillment Time (Version A) | 5 | 3 | 2 | 3 | Archive | Duplicate report. Functionality covered by the "Logistics Dashboard." |
| Sales | Sales Rep Activity (old) | 0 | 1 | 1 | - | Delete | Unused for 12 months. Replaced by a new dashboard in HubSpot CRM. |

Phase II: The Great Translation – From LookML to the Power Platform

The translation phase is the heart of the technical part of the migration. Looker's strength is LookML – a powerful, centralized semantic layer that guarantees metric consistency. Our goal was not to destroy this fortress but to rebuild it. The key was the consistent use of certified and promoted semantic models in the Power BI service. We created central, optimized models for each domain, which were marked as "certified" after testing, becoming the new, trusted source of data – a single source of truth realized within the Power BI ecosystem.

Concept Equivalents: The Translator's Dictionary

Translating the logic from LookML to DAX and Power Query was the biggest challenge. DAX is powerful but has a steep learning curve. Power Query turned out to be the unsung hero, allowing us to move ETL logic closer to business analysts. However, the most important architectural change was consciously shifting transformations "upstream" – to SQL views and Dataflows Gen2 in Fabric. This kept our Power BI models "thin" and optimized. The following table was our map in this process.

| Concept in Looker | Equivalent in Power BI/Fabric | Key Differences | Implementation Tip |
| --- | --- | --- | --- |
| Looker Explore | Certified Power BI semantic model | An Explore is defined in code (LookML). A PBI model is built in a graphical interface. | Use deployment pipelines and versioning of .pbip files in Git to mimic version control from Looker. |
| LookML Join | Relationships in the Power BI data model + Merge transformations in Power Query | A Join in LookML is performed at query time. In PBI, relationships are materialized. | Always strive to build a clean star schema. Move complex joins to Power Query/Dataflows. |
| Looker Table Calculation | DAX measure or calculated column | Table Calculations operate on aggregated results. DAX measures operate within a filter context. | This is the biggest trap! Always test results at different levels of aggregation to avoid errors. |
| Persistent Derived Table (PDT) | Materialized view in a data warehouse or a table from Dataflow Gen2 | PDTs are managed from Looker. Dataflows are part of the Fabric platform. | Use Dataflows Gen2 to create reusable, intermediate tables, promoting the DRY principle. |
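To make the mapping concrete, here is a hedged sketch of how a simple ratio metric might translate. The table, column, and measure names are hypothetical, not taken from our actual models:

```dax
-- LookML (Looker): average order value as a quotient of two measures
-- measure: avg_order_value {
--   type: number
--   sql: ${total_revenue} / NULLIF(${order_count}, 0) ;;
-- }

-- DAX (Power BI) equivalent as an explicit measure. DIVIDE handles
-- division by zero/blank the way NULLIF does in the SQL above.
Avg Order Value =
DIVIDE ( SUM ( Sales[Revenue] ), DISTINCTCOUNT ( Sales[OrderID] ) )
```

Note the Table Calculation trap from the table above applies here: a DAX measure like this computes the ratio of sums within the current filter context, which is not the same as averaging pre-aggregated row-level ratios, so results must be validated at every level of aggregation against the old Looker numbers.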

Phase III: Rebuilding the Façade – Visualization and User Experience

Instead of blindly recreating reports, we used the migration as an opportunity to fundamentally improve them. Every significant report went through a mini-workshop with users where we asked, "what business decisions is this report supposed to support?" We consciously used Power BI's flexibility in customizing visuals, interactivity (bookmarks, buttons), and custom visuals from AppSource to create engaging analytical tools. We accepted that not every feature could be reproduced 1:1 and that the goal was greater value for the user, not a perfect copy of the old system. This required active change management and training in new ways of working.

[Image: A business analyst conducting a training session for a group of employees in a modern conference room. An interactive Power BI dashboard is visible on the large screen.]

Phase IV: Implementation – Governance, Distribution, and Winning Users' Hearts

A technically successful migration without a solid governance strategy leads to chaos. We implemented a fundamental principle: separate workspaces for semantic models and separate ones for reports. Our certified semantic models were published in dedicated, tightly controlled workspaces. All content distribution to hundreds of users was done through Power BI Apps, which offered a clean, branded interface and the ability to create different audiences. Paradoxically, it was the well-implemented data governance that built trust in the new system, becoming a guarantor of quality, not a barrier.

The Human Factor: How to Convince the Organization to Change?

The best technology is worthless if people don't want to use it. The key to adoption was establishing a Center of Excellence (CoE) that acted as a support and evangelism group. Our training was tailor-made, role-oriented, and focused on business tasks, answering the question "what's in it for me?". We built a user community, supporting "champions" in business departments who helped their colleagues.

Summary: Was It Worth It?

Looking back, the migration was not only worth it but necessary. The biggest challenge was translating the entire data management philosophy. The most important lesson was that a migration is a unique moment to pay off technical debt and implement order from scratch. The benefits were both tangible (reduced TCO) and intangible (increased adoption of self-service BI, a fundamental increase in data trust). The move to Fabric has opened up new horizons for us in real-time analytics and AI. If you are facing a similar challenge, treat the migration not as a technical project, but as a data transformation program. Don't move the mess – clean it up.

Frequently Asked Questions (FAQ)

What drives the decision to migrate from Looker to Power BI?
The decision is rarely based on a feature-by-feature comparison of the tools themselves. In our case, it was a strategic decision to unify the data ecosystem on the Microsoft platform (Azure, Fabric) to lower the total cost of ownership (TCO) and leverage deep, native integrations, rather than because of Power BI's abstract superiority over Looker.

What was the biggest technical challenge of the migration?
Definitely translating the business logic from LookML into the DAX language in Power BI. A simple calculation in Looker often required a complex formula in DAX using the CALCULATE function and context modifiers. This required a deep understanding of the differences in the philosophy of both languages, especially how they handle filter context and aggregations.
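As a hedged illustration of the CALCULATE pattern mentioned above (with hypothetical table and column names), a metric that Looker expresses trivially, a product's share of total revenue, becomes a filter-context modification in DAX:

```dax
-- Share of total revenue: the denominator removes the Product filter
-- so the measure works at any level of a Product hierarchy.
Revenue Share % =
DIVIDE (
    SUM ( Sales[Revenue] ),
    CALCULATE ( SUM ( Sales[Revenue] ), REMOVEFILTERS ( Product ) )
)
```

The mental shift is that the formula itself does not change per aggregation level; the filter context supplied by the visual does, which is exactly the behavior that has no direct analogue in Looker's Table Calculations.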

Should you migrate all existing reports 1:1?
Absolutely not. This is one of the most common and costly mistakes. A thorough usage audit and ruthless rationalization are key. In our case, we deleted or archived almost 30% of the reports, which saved hundreds of hours of work and allowed us to focus on what was truly critical for the business.

How did you handle Looker features that have no direct Power BI equivalent?
We accepted that they couldn't be replicated 1:1. We used workarounds: creating dedicated pages with matrix visualizations, educating users on exporting data to Excel, and promoting the "Analyze in Excel" feature. The key was managing expectations and showing how to achieve the goal with new, different methods.

What is your most important advice for teams planning a similar migration?
Treat the migration not as an IT project, but as an organizational transformation program and a unique opportunity to pay off technical debt. Start with people, processes, and ruthless rationalization. Don't move the mess to a new house – clean it up. This is the only way to build a solution that will stand the test of time.

Did Direct Lake mode in Fabric live up to its promise?
Yes. For our largest datasets, it offered performance close to Import mode but without the need to copy and refresh data. This is a fundamental change that eliminates the trade-off between performance and data freshness, and it's a key argument for a strategic move to the Fabric ecosystem.

How important was involving business stakeholders in the project?
It was absolutely fundamental to success. Including them in the process of rationalizing and redesigning reports changed the perception of the project from a "forced move" to an "opportunity for improvement." This ensured that the final products met real business needs and were not just technical copies of old solutions.

Łukasz Kidoń - AI Specialist

Contact the author

If you want to automate processes in your company or have any questions, I will gladly analyze your needs and propose a dedicated solution.

Or write directly to: lukasz@kidon.pro