Reputation: 43
I'm working on a business intelligence project for banking transactions. After completing the ETL phase, my supervisor asked me to research the difference between the tabular and multidimensional models and decide which one better fits our needs. After choosing to work with the tabular model, I noticed that I have to choose between Import and Live Connection to connect Power BI to our model. So here are the questions that have come to my mind:

- How and when does the tabular model use memory?
- How and when does Power BI Import mode use memory?
- What exactly should I import into Power BI from my tabular model?
- Does Import mode import the model that is already cached in memory, or something else?
- How much memory do I need if the size of my data warehouse database is approximately 7 GB?
NB: I'm still not very familiar with Power BI, so I may be asking these questions in the wrong context.
I would be grateful if anyone could help me with this.
I tried to use Import mode to import my whole model, but there is always a memory problem. Should I use Live Connection instead?
Upvotes: 3
Views: 948
Reputation: 4967
Your question isn't clear, so here are a few options for you. SSAS Tabular, Azure Analysis Services (AAS) and Power BI all use the same underlying engine for the tabular model: the VertiPaq engine. Power BI is a superset of SSAS Tabular and currently has more focus from the internal product team; Microsoft is currently trying to move customers from AAS to Power BI. See here.
my Data Warehouse DB is approximately 7GB
Importing the data will create a copy of the data from the data source and hold it in memory. The dataset will not have a 1:1 relationship in size with the source, as the VertiPaq engine will compress the data. So you will have to test this.
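The actual compression you get can only be confirmed by testing against your own data, but a back-of-envelope sizing exercise helps frame the 7 GB question. The compression ratios below are illustrative assumptions, not guarantees:

```python
# Rough estimate of the in-memory size of an imported dataset after
# VertiPaq compression. The ratios used are hypothetical: actual
# compression depends heavily on column cardinality and data types.

def estimate_dataset_size_gb(source_size_gb: float, compression_ratio: float) -> float:
    """Estimated in-memory size = source size / compression ratio."""
    return source_size_gb / compression_ratio

source_gb = 7.0  # the ~7 GB data warehouse from the question
for ratio in (2, 5, 10):
    est = estimate_dataset_size_gb(source_gb, ratio)
    print(f"assumed {ratio}x compression -> ~{est:.1f} GB in memory")
```

Note that even the optimistic end of this range exceeds the Power BI Pro dataset limit, which is why testing with your real data matters.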
However, you don't just have to plan for sufficient memory to hold the dataset; you have to remember that memory will also be used when querying the data. For example, the FILTER function basically returns a table, and that intermediate table is held in memory until the result of the measure is computed and returned. Memory is also used while dataflows are being processed, even though they write to blob storage rather than being held in memory. Power BI Pro has a data model size limit of 1 GB; the limit is larger for Power BI Premium.
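To make the FILTER point concrete, here is a minimal DAX sketch; the 'Transactions' table and [Amount] column are hypothetical names for illustration, not from your model:

```dax
-- FILTER materialises an intermediate table of qualifying rows;
-- that table is held in memory until SUMX finishes iterating it.
Large Transaction Total =
SUMX (
    FILTER ( 'Transactions', 'Transactions'[Amount] > 1000 ),
    'Transactions'[Amount]
)
```

The larger the filtered row set, the more transient memory the query consumes on top of the compressed model itself.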
Direct Query and Live Connection have a far lower memory overhead than importing, as they do not hold the full data model, just the result set generated and returned by the data source. In most cases this will be quite small, but if you are returning detailed data it will take up more memory. In Direct Query mode you can also use aggregations to store a subset of pre-aggregated data in Power BI, rather than querying the data source every time.
If you are using SSAS Tabular/AAS, you should not really use Import mode in Power BI, as you would be building the measures and data model twice. If you use SSAS Tabular/AAS, use Live Connection. If you wish to use Power BI on its own, then use Direct Query; however, you have to ensure that your data source can respond quickly to the queries generated by Power BI, so it should be in a star schema, properly indexed, and scaled well enough to handle queries quickly.
Upvotes: 3