Decoding the Snowflake and Microsoft Partnership
Created on 2025-08-19 21:38
Published on 2025-08-20 15:58
In today’s world, data is the heartbeat of innovation, often called the oil of the 21st century. For large enterprises especially, strategic alliances between technology leaders are essential to unlock the full potential of their data estates, avoiding the trap of turning data leaders into Chief Integration Officers or forcing organizations to spend millions on complex, often pointless integrations.
In this article, I want to highlight the partnership between Snowflake and Microsoft, which in my view goes far beyond connecting technologies—it’s about transforming how organizations unify, analyze, and act on their data. Together, they’re shaping a future where insights come faster, AI becomes smarter, and business impact is immediate.
For organizations already leveraging Microsoft Azure and Snowflake, this collaboration unlocks a seamless, secure, and AI-ready data ecosystem. It empowers teams to break down silos, simplify complex architectures, and accelerate time-to-insight across analytics, machine learning, and AI applications.
At the Snowflake Summit in June 2025, I really enjoyed hearing Sridhar Ramaswamy, CEO of Snowflake, announce an exciting expansion of the Microsoft–Snowflake partnership: the integration of OpenAI models into Snowflake Cortex AI as a fully managed service. Through this collaboration, Azure AI Foundry will provide direct access to OpenAI’s advanced models within Snowflake, hosted in Microsoft Azure regions.
Snowflake’s leadership also unveiled innovations aimed at open, interoperable, AI-ready data. Highlights included Cortex Agent for Microsoft Teams — a tool I’m particularly excited about. On my end, I’m working closely with customers to explore how the Snowflake portfolio, together with Microsoft ecosystem tools like Azure OpenAI Service, Microsoft Fabric, and Microsoft Power BI (now part of Microsoft Fabric), can drive measurable business outcomes.
How Microsoft + Snowflake Work Together
Today, I want to share an overview of how Microsoft technologies integrate with Snowflake to unify data estates, drive advanced analytics, and enable innovative AI solutions. I will also highlight the growing interoperability between Microsoft Fabric and Snowflake, with real-world use cases that unlock business value.
Importantly, Microsoft is committing to full interoperability by expanding its partnership with Snowflake and adding Apache Iceberg support in OneLake, allowing customers to seamlessly work with a single copy of their data, whether in Delta or Iceberg format, saving time and powering broader analytics and application scenarios. For example, you can use any of the Microsoft Fabric engines, such as Power BI with a real-time, high-performance connection via Direct Lake mode.
Benefits for Azure + Snowflake Customers
If you’re already using Snowflake on Azure, here’s what this partnership brings to the table:
Unified Data Estate: Eliminate duplication and fragmentation with a single copy of data accessible across Snowflake and Microsoft Fabric.
AI-Ready Architecture: Seamless integration with Azure AI, including OpenAI advanced models and Azure ML, for rapid deployment of enterprise-grade AI models.
Governance & Security: Leverage Snowflake Horizon with Microsoft Entra ID and Microsoft Purview for federated identity and end-to-end data governance.
Low-Code Enablement: Empower business users with tools like Microsoft Power BI (now part of Microsoft Fabric) and Microsoft Power Apps—without compromising compliance or security.
Marketplace Incentives: Snowflake workloads on Azure qualify for MACC (Microsoft Azure Consumption Commitment) drawdowns and co-sell incentives, making it financially attractive to consolidate on Microsoft Cloud.
Why Microsoft Partnered with Snowflake — and Why CDOs Should Care
Data in enterprises is scattered across clouds and on-prem systems, creating silos and slowing innovation. By partnering with “friend-competitors” like Snowflake and Databricks, Microsoft is building interoperability by design, so CDOs can spend less time stitching data and more time unlocking value through AI, analytics, and industry solutions.
I see Microsoft Fabric acting as the data gravity center, with OneLake, mirroring, shortcuts, and open formats enabling a single, unified data estate. The result? Flexibility, governance, and innovation without compromise — making Microsoft Cloud a compelling choice for enterprise data strategy.
Deep Dive: Microsoft Fabric + Snowflake Integration
The integration between Microsoft Fabric and Snowflake is designed to give organizations flexibility, efficiency, and control over their data workflows.
These integrations enable organizations to select the optimal engine for each scenario—whether leveraging Snowflake’s high-performance compute or Microsoft Fabric’s native analytics capabilities.
Here are the five integration scenarios I believe are most critical:
1️⃣ Snowflake Data into Fabric via Mirroring
Microsoft Fabric offers a Mirroring feature that integrates existing Snowflake warehouse data with the rest of the organization’s data managed by Microsoft Fabric—without complex ETL pipelines. This allows data engineering teams to continuously replicate Snowflake data directly into Fabric OneLake, ensuring all data is accessible and unified.
Mirrored databases in Fabric Data Warehousing provide an autogenerated, read-only SQL analytics endpoint on top of Delta tables, enabling users to query, explore, and manage data with familiar T-SQL. For example, you can develop SQL views, inline table-valued functions (TVFs), and stored procedures to encapsulate business logic, manage object permissions, and even query data across other warehouses and lakehouses within the same workspace.
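As a minimal, hedged sketch (database, schema, and table names below are hypothetical), this is the kind of T-SQL a team might run against that endpoint to encapsulate business logic in a view:

```python
# Hedged sketch: T-SQL against the autogenerated, read-only SQL
# analytics endpoint of a mirrored Snowflake database in Fabric.
# Database, schema, and table names are hypothetical placeholders.

view_ddl = """
CREATE VIEW dbo.v_top_customers AS
SELECT customer_id, SUM(amount) AS total_spend
FROM [MirroredSnowflakeDB].[dbo].[orders]
GROUP BY customer_id;
"""

print(view_ddl)
```

The endpoint is read-only over the mirrored data itself, but views like this let you layer reusable logic on top without moving anything.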
Microsoft customers love that the compute used for replication comes at no cost, and storage is only billed up to their purchased capacity. In short—they’re paying for real value, not for data sitting on the sofa. That said, it’s important to remember the TCO also includes the compute used on the Snowflake side (for example, the compute consumed when reading change data for replication). While Snowflake doesn’t charge compute for activities like authoring, metadata queries, access control, viewing data changes, or DDL (Data Definition Language) queries, these actions may still generate underlying cloud service costs.
Currently, mirroring supports physical tables only, not views. If we need to work with a Snowflake view, the recommended option is to materialize it as a table in Snowflake (for example, using CREATE TABLE AS SELECT …) and then enable mirroring. Of course, we could also recreate the view by mirroring the underlying tables or use Data Factory to perform a traditional ETL process — although in that case, we’d incur additional costs for the duplicated data stored in Fabric.
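As a hedged sketch of that workaround (view and table names are hypothetical, and connection details are omitted), the materialization boils down to a single CTAS statement, which could then be executed with the official Snowflake Python connector:

```python
# Sketch: materialize a Snowflake view as a physical table so that
# Fabric Mirroring (which supports tables only) can replicate it.
# View and table names are hypothetical.

def ctas_for_view(view_name: str, table_name: str) -> str:
    """Build a CREATE TABLE AS SELECT that snapshots a view as a table."""
    return f"CREATE OR REPLACE TABLE {table_name} AS SELECT * FROM {view_name}"

sql = ctas_for_view("ANALYTICS.PUBLIC.V_DAILY_SALES",
                    "ANALYTICS.PUBLIC.DAILY_SALES")
print(sql)

# Against a real account you would run it with snowflake-connector-python:
#   import snowflake.connector
#   with snowflake.connector.connect(...) as conn:
#       conn.cursor().execute(sql)
```

Keep in mind the snapshot is point-in-time: you would schedule a refresh (for example, via a Snowflake task) if the view’s data changes.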
You can learn more about Microsoft Fabric’s Data Mirroring and how it accelerates the adoption of multi-Agent AI by leveraging data agents and the new agent capabilities available in Azure AI Foundry. Please read my previous article here.
2️⃣ Snowflake Data into Fabric via Data Factory
For organizations needing more control, another scenario is Microsoft Fabric’s Data Factory, which supports Snowflake integration through the Snowflake database connector.
This fully managed service enables:
Dataflow Gen2 sources
Pipeline activities: Copy (source/destination), Lookup, and Script
Copy jobs: full, incremental, append
Connectivity via on-premises or virtual network gateways
Snowflake or Microsoft account authentication
In this scenario, organizations pay only for resources used—true pay-as-you-go. Ideal for complex transformations, scheduled pipelines, and automation, while Snowflake handles large-scale storage, processing, ML, and analytics.
3️⃣ Microsoft Data into Snowflake via Data Factory
Moving data from Microsoft platforms into Snowflake is seamless using the Snowflake Connector for Data Factory.
Here, methods include:
Direct copy: If your data source meets Snowflake’s format requirements, you can copy it directly into Snowflake.
Staged copy to Snowflake: If the source data format isn’t compatible with Snowflake’s COPY command, you can use the staged copy method via Azure Blob Storage. The linked service converts the data into the required format, after which the COPY command transfers it into Snowflake.
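The choice between the two methods can be pictured as a simple decision rule. The sketch below is illustrative only; the connector’s exact compatibility matrix is defined by Data Factory, not by this list:

```python
# Illustrative decision rule for Data Factory's Snowflake sink:
# direct copy when the source format is compatible with Snowflake's
# COPY command, otherwise stage through Azure Blob Storage first.
# The format set below is an assumption, not the connector's real matrix.

COPY_COMPATIBLE = {"parquet", "delimitedtext", "json"}

def copy_strategy(source_format: str) -> str:
    if source_format.lower() in COPY_COMPATIBLE:
        return "direct-copy"   # source -> Snowflake COPY INTO
    return "staged-copy"       # source -> Blob staging -> COPY INTO

print(copy_strategy("parquet"))  # direct-copy
print(copy_strategy("excel"))    # staged-copy
```

Either way, the conversion and staging are handled by the linked service, so the pipeline definition stays the same shape for both paths.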
REST API: If REST APIs are available, writing data to Snowflake through them is another option.
Data Factory (either standalone as Azure Data Factory or as an engine integrated into Microsoft Fabric) enables secure, performant, and orchestrated data movement for analytics, AI, and reporting in Power BI.
4️⃣ Bidirectional Access with OneLake + Iceberg (Public Preview)
While Data Factory is a highly cost-efficient ETL solution, Snowflake on Azure customers will likely take advantage of the recent Public Preview enabling bidirectional interoperability with Microsoft Fabric via OneLake and Apache Iceberg tables. Here is why:
Single-copy storage: Snowflake writes Iceberg tables directly to OneLake.
Virtual access: Fabric engines can read the same tables, with metadata automatically converted from Iceberg to Delta Lake format—no need to rewrite Parquet files.
Cost efficiency: Eliminates the need for duplicate pipelines or separate copies.
Open standards compliance: Apache Iceberg and Parquet ensure compatibility and flexibility.
Microsoft OneLake works with Snowflake to store and access Apache Iceberg tables: Snowflake writes Iceberg tables directly to OneLake, and Apache XTable translates the metadata between Iceberg and Delta Lake, so Snowflake can read Delta tables while Microsoft Fabric compute can read Iceberg tables. As a result, data written on either platform is accessible from both, providing a seamless interoperability experience for joint customers.
To enable this, your Snowflake account on Azure must have its Entra ID identity configured to communicate with Fabric. Learn more here.
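On the Snowflake side, the setup can be sketched as two DDL statements: an external volume pointing at OneLake, and an Iceberg table that uses it. The volume name, workspace path, tenant placeholder, and exact parameters below are assumptions and may differ in the preview:

```python
# Hedged sketch of the Snowflake-side DDL for writing an Iceberg table
# into OneLake. Volume name, workspace path, and tenant are placeholders.

external_volume_ddl = """
CREATE EXTERNAL VOLUME onelake_vol
  STORAGE_LOCATIONS = ((
    NAME = 'onelake-iceberg'
    STORAGE_PROVIDER = 'AZURE'
    STORAGE_BASE_URL = 'azure://onelake.dfs.fabric.microsoft.com/MyWorkspace/MyLakehouse.Lakehouse/Files/'
    AZURE_TENANT_ID = '<tenant-id>'
  ));
"""

iceberg_table_ddl = """
CREATE ICEBERG TABLE sales_iceberg (id INT, amount NUMBER(10, 2))
  CATALOG = 'SNOWFLAKE'
  EXTERNAL_VOLUME = 'onelake_vol'
  BASE_LOCATION = 'sales_iceberg';
"""

print(external_volume_ddl, iceberg_table_ddl)
```

Once the table exists in OneLake, Fabric engines can reach it through a shortcut, with XTable handling the Iceberg-to-Delta metadata translation.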
In the following demo, the Microsoft Fabric Team showcases OneLake shortcuts to Iceberg tables—the simplest way to query Snowflake data using virtualization, with no duplication required.
5️⃣ Unidirectional Access to OneLake Data via Iceberg Table APIs (Private Preview)
To simplify data access, OneLake now provides a REST API endpoint that supports interaction with tables in Microsoft Fabric through the Apache Iceberg REST Catalog (IRC) APIs for metadata read operations.
This introduces a new, programmatic way to work with your OneLake data tables—not limited to Snowflake users. These APIs empower developers and data engineers to seamlessly integrate OneLake into their workflows, unlocking advanced automation and interoperability with open table formats.
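As a hedged sketch of how a client might address these endpoints (the base URI is a placeholder, since the feature is in Private Preview; the path shape follows the open Apache Iceberg REST Catalog specification for metadata reads):

```python
# Sketch of addressing OneLake tables through Iceberg REST Catalog (IRC)
# metadata-read endpoints. The base URI is a hypothetical placeholder;
# the path shape comes from the open IRC specification.

def list_tables_url(base: str, namespace: str) -> str:
    """IRC path for listing tables in a namespace (metadata read)."""
    return f"{base}/v1/namespaces/{namespace}/tables"

url = list_tables_url("https://onelake.example/iceberg",  # hypothetical
                      "MyLakehouse")
print(url)

# A real call would carry a Microsoft Entra ID bearer token, e.g.:
#   import requests
#   requests.get(url, headers={"Authorization": f"Bearer {token}"})
```

Because IRC is an open standard, any Iceberg-aware engine or client library that speaks it could discover and read OneLake table metadata the same way.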
Use Cases Unlocked
Real-Time Decisioning: Combine the Real-Time Intelligence (RTI) engine in Microsoft Fabric with Snowflake for fraud detection, supply chain monitoring, or customer engagement.
AI-Infused Applications: Build apps with Streamlit or Power Apps, backed by Snowflake Arctic LLM and Azure OpenAI.
Unified BI: Enable DirectQuery or Import Mode in Power BI with Snowflake data for secure, performant reporting.
Data Governance at Scale: Maintain a single copy of data with federated security and lineage across Snowflake and Microsoft Fabric.
Migration Acceleration: Move workloads from AWS or GCP to Azure with Snowflake + Microsoft Fabric as the anchor.
Strategic Takeaway for Data Leaders
This partnership is more than technical—it’s strategic. By positioning Microsoft Cloud as the premier destination for Snowflake workloads, organizations gain:
Faster time-to-value for analytics and AI initiatives
Reduced complexity and cost through a unified, governed ecosystem
Future-proof architecture aligned with open standards like Apache Iceberg and XTable
For CTOs, CDOs, and data leaders, the message is clear: embracing the Snowflake + Microsoft ecosystem is not just an IT decision—it’s a business imperative.