Reputation: 21
Our company is in the process of switching from using PBIX files as 'warehouses' to having an actual lakehouse in Microsoft Fabric. I am a big fan of the option to build multiple semantic models on the same lakehouse data, thereby avoiding duplication of data.
In our current setup, we have split the data+model from the reports: one PBIX holds our data and model, and several reports connect live to that model. This means we only have to refresh the data once, and all changes show up almost instantly in every report.
I am hoping to replicate this behaviour in our new setup. Using the DirectQuery and Direct Lake connections is simply too slow: instead of waiting at most 2-3 seconds for a query to complete, it takes anywhere between 5 and 35 seconds. I realize it is possible to import data from the lakehouse, but this would mean that we either have to:
While step 2 would work, it would mean importing the existing lakehouse data and publishing it again, which seems like a waste. I was hoping to be able to do this directly in the service, similar to how the Direct Lake/DirectQuery semantic models are created.
Is it possible in any way? Thanks in advance!
I tried changing the lakehouse semantic model's data connection to Import instead of Direct Lake or DirectQuery, but could not find an Import option.
Upvotes: 2
Views: 2955
Reputation: 11
The default semantic model, and any other semantic model created for a Lakehouse through the Fabric web interface, is Direct Lake and cannot be changed to Import. To build an import-mode model over the same data, connect to the SQL analytics endpoint of the Lakehouse instead (for example from Power BI Desktop), bring in the tables and other objects you need, and continue from there.
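If you want to confirm what the endpoint exposes before building the import model, it behaves like an ordinary SQL Server endpoint. Here is a minimal sketch in Python, assuming pyodbc and the ODBC Driver 18 for SQL Server are installed; the server and database names are placeholders you would replace with the connection details shown on the Lakehouse's SQL analytics endpoint settings page:

```python
import pyodbc

# Placeholder values: copy the real SQL connection string from the
# Lakehouse's SQL analytics endpoint settings page in Fabric.
server = "yourendpoint.datawarehouse.fabric.microsoft.com"
database = "YourLakehouse"

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    f"Server={server};"
    f"Database={database};"
    "Authentication=ActiveDirectoryInteractive;"  # interactive Entra ID sign-in
    "Encrypt=yes;"
)

# List the tables the endpoint exposes; these are the same objects you
# would pull into an import-mode semantic model from Power BI Desktop.
cursor = conn.cursor()
cursor.execute(
    "SELECT TABLE_SCHEMA, TABLE_NAME "
    "FROM INFORMATION_SCHEMA.TABLES "
    "ORDER BY TABLE_SCHEMA, TABLE_NAME"
)
for schema, name in cursor.fetchall():
    print(f"{schema}.{name}")

conn.close()
```

The actual import-mode model is then built in Power BI Desktop by choosing Get Data and pointing at that same endpoint, selecting Import as the connectivity mode, and publishing the result back to the workspace.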
Upvotes: 1