Pressing 'Calculate Table' on Measures table in Direct Lake mode completely floods F8 Capacity #1477
KoningLeon asked this question in 🙋‍♀️ TE3 Q&A (unanswered, 1 reply).

My team is migrating our semantic models from Import mode on a Synapse DWH to Direct Lake mode using Fabric Lakehouses.

I first did a small POC with one of our simpler models, and once that looked good we moved on to our biggest model. This semantic model has about 50 tables with 900+ columns and is 600 MB on disk. The reason we are starting with this model is that it is too big to refresh in the Power BI Service: in memory it exceeds 3 GB, the limit for an F8 capacity.

The problem arose while a team member was building the model. He had connected TE3 to the Fabric workspace where the Direct Lake version of the model is deployed. The model has a Measures table with one dummy record to hold all our measures, as is common practice; it holds about 114 measures in total.

At one point he pressed 'Calculate Table' on the Measures table (not sure why, but so be it). This sent our F8 capacity spiking to its maximum burst capacity of 400%, after which the capacity was unusable for at least a couple of hours because of smoothing.

The Fabric Capacity Report shows the semantic model was maxed out by XMLA Read Operations. I feel the 'Calculate Table' button should do nothing when a table contains only measures, since such a table stores no data; doubly so in Direct Lake mode.

Now we are left with some questions:

1. I understand the model is pretty big, but it should not be this problematic, right? The capacity is barely taxed when we run the same model in Import mode, and I feel models should be able to be a lot bigger than this. We ran this model on a B2 AAS server two years ago, and last year under a PBI Pro license, without any issues, so I didn't expect Fabric to have so much trouble with it.

Reply:

I'm not in the office, so this will just be an initial brief reply, but to answer question no. 1: Tabular Editor doesn't distinguish between tables or their metadata content when running a refresh; pressing 'Calculate Table' is equivalent to executing a TMSL refresh of type "calculate" on the table. So if you see the capacity maxing out because of that, I'm 99% sure that's a bug on Microsoft's side, and you'd be better off reaching out to Microsoft support (assuming you can reproduce the issue by executing a TMSL calculate refresh through SSMS).
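For reference, here is a minimal sketch of such a TMSL calculate refresh, as it could be executed from an XMLA query window in SSMS connected to the workspace's XMLA endpoint. The database name "MyDirectLakeModel" is a placeholder for your deployed semantic model, and "Measures" matches the table name from the post:

```json
{
  "refresh": {
    "type": "calculate",
    "objects": [
      {
        "database": "MyDirectLakeModel",
        "table": "Measures"
      }
    ]
  }
}
```

If executing this directly also spikes the capacity, that reproduces the issue without Tabular Editor involved and strengthens the case for a Microsoft support ticket.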