Microsoft Fabric Updates Blog

Announcing: Fabric Warehouse publishing full DML to Delta Lake Logs

We are excited to announce that the Data Warehouse now publishes all inserts, updates, and deletes for each table to its Delta Lake Log in OneLake!

Our vision is to break down data silos and make it easy to share data from your Data Warehouses with other teams that use different services, without having to create copies of your data in different formats.

What does this mean?

Today, teams have a wide set of skills and varying comfort levels with different tools and query languages such as Python, T-SQL, KQL and DAX. Instead of requiring copies of your data in different formats for each tool and service, Fabric leverages Delta Lake as a common format across all of its services. Keeping only one copy of your data makes it more secure and easier to manage, ensures the data is consistent across reports, and makes it faster and easier to share.

The Data Warehouse supports this by publishing Delta Lake Logs for every table that you create in your Data Warehouses. When you modify data in a Data Warehouse table, those changes are visible in the table's Delta Lake Log within one minute of the transaction being committed.
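If you are curious about what gets published, you can read the commit files directly from the table's _delta_log folder in OneLake. Below is a minimal sketch using PySpark in a Fabric notebook; the OneLake path, workspace, warehouse, schema, and table names are illustrative assumptions, not values from this post.

```python
# A minimal sketch (PySpark in a Fabric notebook, where a `spark` session already exists):
# read the Delta Lake Log commit files that the Warehouse publishes for one table.
# The OneLake path below is an assumption for illustration -- substitute your own
# workspace, warehouse, schema, and table names.
log_path = (
    "abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/"
    "MyWarehouse.Datawarehouse/Tables/dbo/FactSale/_delta_log"
)

# Each committed transaction is published as a numbered JSON commit file in _delta_log.
commits = spark.read.json(f"{log_path}/*.json")
commits.printSchema()
commits.show(truncate=False)
```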

For example, say you want to use Python to query a Data Warehouse table from a Notebook in a Lakehouse. All you need to do is create a new shortcut in the Lakehouse and point it at the Data Warehouse table. That table is now directly accessible from your Notebook, and no data has been copied or duplicated! Data Scientists and Data Engineers will love how easy it is to incorporate Data Warehouse tables into projects such as machine learning and training AI models.
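Here is a minimal sketch of what that looks like once the shortcut exists, assuming a PySpark notebook attached to the Lakehouse and a hypothetical shortcut named dim_customer with a Country column:

```python
# A minimal sketch, assuming a PySpark notebook attached to the Lakehouse and a shortcut
# named "dim_customer" pointing at the Warehouse table (all names here are hypothetical).
df = spark.read.table("dim_customer")    # the shortcut reads like any Lakehouse table

df.printSchema()
df.groupBy("Country").count().show()     # "Country" is an illustrative column name

# No data was copied: the query reads the Warehouse's Delta files in OneLake directly.
```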

To learn more about how to create shortcuts that point to Data Warehouse Tables, please see this documentation article: Create a OneLake shortcut – Microsoft Fabric | Microsoft Learn

Conclusion

You might be wondering how to enable this. The answer is that you do not have to do anything! It all happens automatically for your Data Warehouses.

Note that only tables created from this point forward will have all of their DML published. If you have an older table that you want fully published, you will need to use CTAS (CREATE TABLE AS SELECT) to create a new copy of the table with all of its data, or drop the table and reload it.
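As one possible way to script that, here is a minimal sketch that runs the CTAS over the Warehouse's SQL connection string using pyodbc; the driver, authentication settings, and table names are assumptions you would replace with your own.

```python
# A minimal sketch, assuming pyodbc, ODBC Driver 18 for SQL Server, and interactive
# Microsoft Entra sign-in. Server, database, and table names are placeholders -- copy the
# real SQL connection string from your Warehouse settings.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-warehouse-sql-connection-string>;"
    "Database=MyWarehouse;"
    "Authentication=ActiveDirectoryInteractive;"
    "Encrypt=yes;"
)
cursor = conn.cursor()

# Recreate the older table with CTAS so the new copy has all of its DML published going forward.
cursor.execute("CREATE TABLE dbo.Sales_new AS SELECT * FROM dbo.Sales;")
conn.commit()
```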

To learn more about how to leverage your Data Warehouse’s data through its published Delta Lake Logs, please see our documentation: Delta Lake logs in Warehouse – Microsoft Fabric | Microsoft Learn.
