1. Model Optimization
a) Remove unnecessary columns and tables
- Use DAX Studio's VertiPaq Analyzer to identify the largest tables and columns in your dataset.
- Remove columns that are duplicated in other parts of your dataset.
- Use the Measure Killer tool to find columns and tables that no report uses, so they can be deleted from the model.
b) Use correct data types
- Use the smallest data type that fits the data (e.g., fixed decimal instead of decimal); you can check column sizes with VertiPaq Analyzer.
- Reduce precision where it is not needed (e.g., zero decimal places instead of two).
c) Use Star schemas
- Use dimensional modelling (fact and dimension tables) instead of a snowflake schema or a single flat table.
- Each additional join between a dimension and the fact table slows down filtering through that dimension.
- Use low-cardinality columns in relationships to minimize lookup work.
d) Reduce cardinality
- Avoid unique strings (e.g., GUIDs, transaction IDs) in visuals or joins (although sometimes unavoidable).
- Split or group detailed columns (e.g., store the date instead of a full timestamp; see the sketch below).
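As an illustration of the split, a minimal DAX sketch of a calculated column, assuming a hypothetical 'Sales'[OrderTimestamp] column (in practice this is better done upstream in Power Query or the source, so the full timestamp never reaches the model):

    -- Calculated column on the Sales table: keep only the date part,
    -- dropping the time component that makes almost every value unique
    Order Date =
    DATE (
        YEAR ( Sales[OrderTimestamp] ),
        MONTH ( Sales[OrderTimestamp] ),
        DAY ( Sales[OrderTimestamp] )
    )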
e) Disable auto date/time tables. Turn this feature off in Options, as it bloats the model with hidden date tables. Use a central date table instead, as sketched below.
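A minimal sketch of a central date table as a DAX calculated table (the date range here is an assumption; adjust it to your data, then mark the table as a date table):

    Date =
    ADDCOLUMNS (
        CALENDAR ( DATE ( 2020, 1, 1 ), DATE ( 2030, 12, 31 ) ),
        "Year", YEAR ( [Date] ),
        "Month Number", MONTH ( [Date] ),
        "Month", FORMAT ( [Date], "MMM YYYY" )
    )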
f) Aggregation tables
- Create summary tables (Import mode) over fine-grained DirectQuery data. Avoid transaction-level tables in the Import model; if transaction-level data is required, consider keeping it in DirectQuery so the model size does not grow (although those reports will be slower).
- Power BI automatically uses them when queries match the aggregation grain.
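To illustrate the grain of such a summary table, a hedged sketch as a DAX calculated table, assuming a hypothetical 'Sales' fact with [ProductKey], [DateKey], and [SalesAmount] (in production the aggregation table is usually built in the source or Power Query and then mapped with 'Manage aggregations'):

    -- Summary table at product/date grain, far smaller than the fact table
    Sales Agg =
    SUMMARIZECOLUMNS (
        Sales[ProductKey],
        Sales[DateKey],
        "Sales Amount", SUM ( Sales[SalesAmount] ),
        "Row Count", COUNTROWS ( Sales )
    )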
g) Incremental refresh to reduce the dataset refresh time.
- Set up incremental refresh on large tables where historical data does not change; where it does change, use a last-updated date field in your range parameters so that historical changes are also picked up (instead of refreshing only, e.g., the last day/week).
- Although incremental refresh may still spike memory during the refresh window, the shorter refresh generally frees up memory for report consumption.
h) Check model relationships are optimized.
- As a rule, keep as many relationships as possible one-to-many.
- Avoid bi-directional cross-filtering, which can create ambiguous filter paths and hurt performance; use it only where necessary (see the CROSSFILTER sketch below).
- Avoid many-to-many relationships, which slow down the model.
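Where bi-directional filtering is genuinely needed, one option is to scope it to a single measure with CROSSFILTER instead of setting the relationship itself to both directions. A sketch, assuming hypothetical 'Sales' and 'Customer' tables already related on [CustomerKey]:

    -- Counts only customers that appear in the (filtered) Sales table,
    -- without making the model-level relationship bi-directional
    Customers With Sales =
    CALCULATE (
        COUNTROWS ( Customer ),
        CROSSFILTER ( Sales[CustomerKey], Customer[CustomerKey], BOTH )
    )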
2. DAX measure optimization
- Avoid row-by-row iteration unless necessary. Iterators such as SUMX and FILTER can be heavy; use SUM, AVERAGE, or COUNTROWS where possible.
- Use variables (VAR) in DAX measures so an expression is evaluated once and reused (see the sketches after this list).
- Consider IF(ISINSCOPE(...), ...) to use different logic for totals than for detail rows. More complex, but worth testing.
- Avoid nested IF, SWITCH, or FILTER chains on large tables.
- Avoid DISTINCTCOUNT on large tables; you can try SUMX(VALUES('Table'[Column]), 1) instead.
- Pre-calculate in ETL (views, staging tables, etc.) rather than putting the load in DAX measures.
- Use KEEPFILTERS() and REMOVEFILTERS() carefully, as each adds cost.
- Test multiple variations of complex measures; for example, try SELECTCOLUMNS() instead of SUMMARIZE() or GROUPBY().
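A few hedged sketches of the patterns above, assuming a hypothetical model with a 'Sales' fact ([Amount], [CustomerKey]), a 'Product' dimension ([Category]), and a marked 'Date' table; [Complex Category Measure] and the column names are placeholders:

    -- Variables: evaluate each expression once, then reuse it
    Sales YoY % =
    VAR CurrentSales = SUM ( Sales[Amount] )
    VAR PriorSales =
        CALCULATE ( SUM ( Sales[Amount] ), DATEADD ( 'Date'[Date], -1, YEAR ) )
    RETURN
        DIVIDE ( CurrentSales - PriorSales, PriorSales )

    -- ISINSCOPE: expensive logic only at detail level, cheaper at the total
    Category Sales =
    IF (
        ISINSCOPE ( Product[Category] ),
        [Complex Category Measure],  -- hypothetical detail-level measure
        SUM ( Sales[Amount] )        -- simpler grand-total logic
    )

    -- DISTINCTCOUNT alternative from the bullet above
    Customer Count =
    SUMX ( VALUES ( Sales[CustomerKey] ), 1 )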
3. Data Source / ETL Optimization
- Filter early in Power Query (or at the table source).
- Push filters down to the source so the database does the work.
- Materialize complex transformations in the source. Avoid heavy joins, grouping, and parsing in Power Query; push them back into SQL views, staging tables, or earlier in the pipeline.
- Compress categorical data – replace text fields with dimension keys (e.g., ProductID instead of ProductName).
- Optimize refresh timing where possible; schedule refreshes at the best time of day, before peak report usage.
- Use Incremental refresh to reduce refresh load and refresh times.
4. Report Layer Optimization
- Limit visuals per page. Each visual issues one or more DAX queries; aim for around eight visuals per page.
- Use ‘Apply all filters’ to prevent visuals from refreshing on each slicer change; queries are issued only once the user has finished slicing, avoiding repetitive queries.
- Turn off unnecessary interactions between visuals using ‘Edit Interactions’.
- Pre-calculate totals in DAX or SQL. Cards with complex measures can be replaced by base measures or pre-aggregated values.
- Use bookmarks or buttons to toggle between visuals, to reduce the number of charts showing at one time (reduces the number of DAX queries being issued).
- Avoid visual-level calculations such as TOP N filters; move the logic into measures or calculated tables (see the sketch below).
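As an example of moving TOP N logic out of the visual, a sketch of a measure-based approach, assuming a hypothetical [Total Sales] base measure and a 'Product' dimension (ties can return more than ten products):

    Top 10 Product Sales =
    VAR TopProducts =
        TOPN ( 10, VALUES ( Product[ProductKey] ), [Total Sales], DESC )
    RETURN
        CALCULATE ( [Total Sales], KEEPFILTERS ( TopProducts ) )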
5. Service Level Optimization – for Power BI Premium / Fabric capacities
- Enable the large dataset storage format. It reduces memory fragmentation and improves refresh concurrency.
- Use Aggregations + DirectQuery to split heavy workloads. Import summaries, query details on demand.
- Enable query caching in capacity settings. It caches query results across users for faster loading. Note that the cache is cleared after a dataset refresh and may also be invalidated roughly every hour, when the service checks for changes.
- Use the XMLA endpoint to connect client tools such as SQL Server Profiler to the model and analyze refresh operations (the endpoint address pattern is shown after this list).
- Consider upgrading to a higher capacity if the model is optimized but still slow.
- Note that with Premium capacity, you can monitor CPU and memory usage using the Power BI Premium capacity metrics app.
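For reference, the XMLA endpoint address mentioned above follows this pattern (the workspace name is a placeholder):

    powerbi://api.powerbi.com/v1.0/myorg/YourWorkspaceName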