How do I trigger a Python notebook to run from a Fabric Power Sheet update?
I want my flow to be:
Power sheet update
Pipeline trigger
Python notebook runs based on updated power sheet and pushes calculations to Lakehouse
PowerBI can use updated calculations in real time without me having to click refresh.
My current issue is that to use Power Sheets I think I need a Fabric SQL DB, and there don't seem to be any events that fire when data is mirrored from the Fabric SQL DB into the Lakehouse.
Azure Functions
-
Praveen Kumar Gudipudi • 2,275 Reputation points • Microsoft External Staff • Moderator
2026-03-31T16:31:22.9633333+00:00 Hello Bill Hodgkinson,
To better understand the scenario and suggest the most appropriate trigger mechanism, could you please clarify the following:
- Is the Power Sheet backed by Fabric SQL DB, Dataverse, or another source?
- What is the current data flow between the Power Sheet, Lakehouse, and the notebook?
- Is the Python notebook running inside Microsoft Fabric?
- Are you open to using Power Automate or Azure Functions to trigger the pipeline?
- Do you require the notebook to run immediately after updates, or would a scheduled trigger work?
- How is Power BI connected to the Lakehouse (Direct Lake, Import, or DirectQuery)?
- Is the SQL DB → Lakehouse mirroring required in your architecture?
-
Bill Hodgkinson • 0 Reputation points
2026-03-31T17:28:57.2866667+00:00 Sorry, I'm using a Power Table within the Planning tool, not a Power Sheet. It's backed by Fabric SQL DB (my research tells me this is the only way to connect a Power Table). The desired data flow is: the user updates the Power Table, the Power Table updates Fabric SQL DB, and the DB is mirrored to the Lakehouse via a Shortcut. This sets off a trigger that then runs my Python Notebook. The Notebook transforms the data and writes the output to a Lakehouse table that can be read by PowerBI.
I don't want a scheduled trigger, ideally I want it to be an automatic trigger. I would settle for a manual trigger but only via a button in the PowerBI frontend. I have just tried to implement this with a UDF but I can't use Spark inside a UDF so this isn't going to work for me either.
I don't really want to use Power Automate or Power Apps because it feels clunky and not a great UX.
Thanks for your help with this Praveen!
-
-
Praveen Kumar Gudipudi • 2,275 Reputation points • Microsoft External Staff • Moderator
2026-03-31T18:45:19.16+00:00 Hello **Bill Hodgkinson**, please try using a Function App:
- The Azure Function runs on a short timer (for example, every 30–60 seconds).
- It queries Fabric SQL DB for new or updated rows.
- If changes are detected, it calls the Fabric Pipeline REST API.
- The pipeline runs the Notebook activity.
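The polling loop above can be sketched in plain Python. This is a hedged illustration, not a finished implementation: the `LastModified` column, the placeholder IDs, and the token handling are assumptions, and the endpoint string simply follows the shape of the example API call below.

```python
import urllib.request
from datetime import datetime, timezone

# Endpoint shape taken from the example API call in this reply; IDs are placeholders.
JOBS_URL = ("https://api.fabric.microsoft.com/v1/workspaces/"
            "{workspace_id}/items/{pipeline_id}/jobs")

def changed_rows(rows, last_seen):
    """Rows whose LastModified timestamp (an assumed column) is newer than last_seen."""
    return [r for r in rows if r["LastModified"] > last_seen]

def should_trigger(rows, last_seen):
    """True when at least one row changed since the previous poll."""
    return bool(changed_rows(rows, last_seen))

def trigger_pipeline(workspace_id, pipeline_id, token):
    """POST to the Fabric Pipeline REST API (token acquisition not shown)."""
    req = urllib.request.Request(
        JOBS_URL.format(workspace_id=workspace_id, pipeline_id=pipeline_id),
        method="POST",
        headers={"Authorization": f"Bearer {token}"},
    )
    return urllib.request.urlopen(req)  # real network call; sketch only
```

In a real Function App, the timer trigger would run `should_trigger` against the query results and persist `last_seen` between runs (for example in a storage account), so the same change is not detected twice.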
Example API call:
POST https://api.fabric.microsoft.com/v1/workspaces/{workspaceId}/items/{pipelineId}/jobs
-
Bill Hodgkinson • 0 Reputation points
2026-04-02T09:22:08.9866667+00:00 I don't want to be polling or scheduling the pipeline - will this not get expensive each time we poll?
I understand what you're saying about using the REST API, but that doesn't feel like the right approach if the goal is to keep everything contained within the Fabric environment, without relying on external HTTP requests that need secret keys etc. to keep them secure.
Is there no way of creating a trigger based on the Fabric SQL DB updating?
-
Praveen Kumar Gudipudi • 2,275 Reputation points • Microsoft External Staff • Moderator
2026-04-02T13:59:30.2833333+00:00 Hello **Bill Hodgkinson**,
Use an Azure Function with an HTTP trigger that calls the Fabric Pipeline REST API whenever the data update occurs.
Azure Functions cannot directly receive a trigger when a Fabric Power Table / Power Sheet update occurs, because Fabric SQL DB currently does not emit change events or webhooks.
So the Function App cannot automatically fire at the exact moment the Power Table updates unless something explicitly calls it.
Please follow the steps below.
- Create an HTTP-triggered Azure Function.
- After the Power Table update is written to Fabric SQL DB, the upstream process (for example the application, integration layer, or any automation that manages the update) sends an HTTP POST request to the Function endpoint.
- The Function App then calls the Fabric Pipeline REST API to start the pipeline that runs your Python notebook.
- The notebook processes the updated data and writes the output to the Lakehouse, which Power BI can read.
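The core of that HTTP-triggered function can be sketched as a plain Python handler. The payload keys (`workspaceId`, `pipelineId`) are illustrative assumptions, not a documented contract, and the `post` parameter is injected so the HTTP call can be stubbed out; the URL follows the endpoint shape shown earlier in this thread.

```python
import json
import urllib.request

def handle_update(body: bytes, token: str, post=urllib.request.urlopen):
    """Parse the (hypothetical) notification payload and request a pipeline run.

    Returns (status_code, message) so the HTTP trigger can relay the result.
    """
    try:
        payload = json.loads(body)
        workspace_id = payload["workspaceId"]
        pipeline_id = payload["pipelineId"]
    except (ValueError, KeyError):
        return 400, "expected JSON with workspaceId and pipelineId"
    url = (f"https://api.fabric.microsoft.com/v1/workspaces/"
           f"{workspace_id}/items/{pipeline_id}/jobs")
    req = urllib.request.Request(
        url, method="POST",
        headers={"Authorization": f"Bearer {token}"},
    )
    post(req)  # injected for testing; the real call hits the Fabric API
    return 202, "pipeline run requested"
```

Inside an actual Function App, the HTTP trigger would pass the request body into this handler and acquire the bearer token via a managed identity or service principal rather than hard-coding a secret.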
-
Praveen Kumar Gudipudi • 2,275 Reputation points • Microsoft External Staff • Moderator
2026-04-06T18:34:01.2766667+00:00 Hello **Bill Hodgkinson**,
We haven't heard back from you on the last response, and I was just checking to see whether you have a resolution yet. If you do, please share it with the community, as it can be helpful to others. Otherwise, reply here and we will follow up with more details.
-
Bill Hodgkinson • 0 Reputation points
2026-04-07T11:45:38.26+00:00 Thank you for the response. I think this is the process that I've landed on as well for triggering the pipeline.
Do you know whether it's on the roadmap for Fabric to emit change events for the Fabric SQL DBs?
Thanks,
Bill
-
Praveen Kumar Gudipudi • 2,275 Reputation points • Microsoft External Staff • Moderator
2026-04-07T17:40:06.8333333+00:00 Hello **Bill Hodgkinson**,
At the moment there is no supported way to receive change events or webhooks directly from a Fabric SQL Database. Fabric SQL DB currently does not generate event notifications when tables are updated, which is why pipelines, notebooks, or other Fabric items cannot yet be triggered automatically from those changes.
Microsoft has been expanding event-driven capabilities within Fabric, particularly around pipeline triggers, Data Activator scenarios, and event streams. However, database-level change events for Fabric SQL DB have not yet been formally announced on the public roadmap. The best place to track upcoming features and announcements is the official Microsoft Fabric roadmap page.
In the meantime, the architecture you described—using an external trigger such as an HTTP-triggered function in Azure Functions to call the Fabric Pipeline REST API—remains the recommended approach for near real-time execution when a Power Table update occurs.
If Fabric introduces native database change events or event subscriptions in the future, that would enable a more direct pattern where the update in Fabric SQL DB could trigger a pipeline or notebook without requiring an external component.
-
Bill Hodgkinson • 0 Reputation points
2026-04-08T07:43:08.99+00:00 Thanks @Praveen Kumar Gudipudi ,
I tried to trigger a pipeline directly from an SQL trigger that's attached to the Fabric SQL DB but it wouldn't trigger using the REST API? Do you know whether that functionality is available in a Fabric SQL DB? I think it's called something like sp_invoke_external_rest_endpoint?
Thanks,
Bill
-
Praveen Kumar Gudipudi • 2,275 Reputation points • Microsoft External Staff • Moderator
2026-04-08T18:33:07.4966667+00:00 Hello **Bill Hodgkinson**,
The functionality you're referring to — sp_invoke_external_rest_endpoint — is not currently supported in a Microsoft Fabric SQL Database.
An external trigger (Azure Function, app layer, or automation) is required to call the Fabric Pipeline API.