My Favorite Topics from Power BI Summit 2023
Power BI development lifecycles can be time-consuming and frustrating for all involved parties. Business intelligence is caught between two different groups in the organization: IT and Business, two departments that work almost completely opposite of one another. IT is structured while business is ad hoc. IT has a deployment calendar while business needs it now. IT is familiar with tech while business avoids it whenever possible. The BI professional's role is to act as intermediary between the two and develop analytical solutions that satisfy the business while working within the constraints of IT.
The Power BI Summit offered solutions to the challenges BI professionals face during development. To improve relations with the business, the BI department must engage stakeholders in a positive way: make requests easy to submit, clearly communicate project priorities, and be transparent in prioritization decisions. To improve relations with IT, the BI department must adhere to the processes IT sets, which means having a clear change-management strategy, prioritizing ahead of time so only the most important requests are worked, and having a clear plan for execution. Several presenters at the conference offered advice on how to juggle the needs of both parties.
Start small and iterate – it’s easier to enhance than it is to subtract
Clearly communicate statuses – set expectations and deliver
Stay focused – be clear on the business objective and build to it
More details are better – more upfront detail reduces disappointment later
Support the users – training and documentation are critical to adoption
Another workshop, focused on gathering clear requirements, suggested involving requesters early and often during the design process. Making the development process as simple as possible for business stakeholders reduces the fear of working with the BI department. Give stakeholders a standardized process for submitting requests, and acknowledge each request when it arrives. Meet with stakeholders to explain how the process works, and be clear on commitments and key project dates.
During early development discussions, involve all stakeholders with an ability to impact the project
deliverable. Collect plenty of documentation early in a project so all desires, constraints, and commitments are clearly stated and understood. Use multiple approaches to collect user requirements (user interviews, ride-alongs, wireframing, brainstorming) and review them with requesters for project alignment. Build demos quickly to test with potential users, and use the feedback to iterate. Continue the cycle until a final version is agreed upon. Get signoff from project owners to confirm the project was delivered to expectations.
Building on these recommendations, the Summit suggested working out of Azure DevOps to support the BI development lifecycle. A DataOps approach is a standardized set of technical practices, project processes, and streamlined architecture for delivering high-quality BI solutions quickly. BI product quality and performance should have defined standards and be monitored continuously against them. Some tools of the trade that help deliver high-quality BI products quickly are
Tabular Editor – a great modeling tool to standardize development and report editing
DAX Studio – optimize DAX calculations with its query editor and performance tracing
PBI Tools – an option for version control using JSON and .pbit files
When building an enterprise BI environment, developers and engineers are challenged to build high-quality BI products quickly. One reason they struggle is that several people work on the same report at different phases of the development lifecycle. It's common to have data engineers working in the data warehouse while BI developers work on data models and reports. Often the ownership lines are not clearly defined, so data engineers pick up BI developer tasks and vice versa. This overlap of work and responsibilities can lead to work being overwritten or lost.
The Power BI Summit offered multiple recommendations to improve data management tasks, starting with a thin report structure: the concept of breaking datasets apart from reports. This structure allows data modelers to work on dataset updates without slowing down report developers, and it reduces the responsibility overlap common to BI development. If many of your reports require Power Query for additional data transformations, it is recommended to leverage Power BI dataflows to create reusable assets for the common transformations. Report developers can then build new reports on these curated datasets.
The data management process needs to be structured in a way that new ideas can be tested without
interrupting the production environment. One presenter recommended developing in a dev, test, prod
structure to prevent unintended negative impacts to production data. This is where DevOps can offer
another solution to managing data infrastructure updates. DevOps offers a repeatable process to follow
for every request. Plan and assign the work. Estimate work and deliver by estimate. Test during multiple development stages. Get approval from project owners once complete to confirm requested data is now available.
Large datasets can also slow the development of BI tools, so it's recommended to reduce a dataset to the minimum amount of data needed. Since that is not always an option, Power BI offers incremental refresh to cut refresh times on large datasets. After the initial full load, subsequent refreshes pull only new data and append it to the historical partitions. Two datetime parameters, RangeStart and RangeEnd, are required to define the refresh window, and the dataset owner configures the refresh policy. Keep a local copy of any report configured for incremental refresh, because once published, the dataset can no longer be downloaded from the service. Review the data after incremental refresh is first enabled to confirm nothing was duplicated. Hybrid tables, a related Premium feature, extend incremental refresh with a DirectQuery partition for the most recent data; that setting must be enabled separately if near-real-time data is needed.
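The RangeStart/RangeEnd logic can be sketched in a few lines. The snippet below is a toy illustration in Python (not Power BI code): it mimics how a half-open date filter keeps a boundary row from being loaded into two partitions, which is what prevents duplication. The table contents and dates are invented.

```python
from datetime import date

# Toy source table of (order_date, amount) rows; invented data.
source = [
    (date(2023, 1, 5), 100),
    (date(2023, 2, 10), 250),
    (date(2023, 3, 1), 75),
]

def refresh_partition(range_start, range_end):
    """Mimic the RangeStart/RangeEnd filter applied per partition.
    The half-open interval (>= start, < end) keeps a row that falls
    exactly on a boundary date out of two partitions at once."""
    return [row for row in source if range_start <= row[0] < range_end]

# Refresh the January and February partitions, then append;
# old partitions are never re-loaded.
jan = refresh_partition(date(2023, 1, 1), date(2023, 2, 1))
feb = refresh_partition(date(2023, 2, 1), date(2023, 3, 1))
history = jan + feb
print(len(history))  # 2 rows, no duplicates across partitions
```

If the filter used <= on both ends, a row dated exactly on the first of the month would land in two partitions, which is the duplication the review step above is meant to catch.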
During the summit, some tools were recommended to help with the report development process. Versioning is a challenge for report developers, so the following tools can help with version control:
Azure Dev Ops offers an approach to source control and standardized deployment pipelines
PowerShell has a programmatic command line to automate many administrative tasks
Tabular Editor uses JSON files for versioning control to easily make changes to a PBI model
PBI Tools offers features like source control, code comparison, and governance
Power Automate and APIs – automate report deployment, administration, and provisioning
Bad data modeling foundations are a main driver of a less-than-optimal Power BI experience. Power BI is designed around the star schema approach, and models that follow it unlock the tool's full capabilities. Bad model design can lead to a familiar list of problems: models that are too big or too slow, poor designs, unexpected behavior, or inaccurate data.
Fortunately, the summit offered plenty of options to improve the data modeling experience. First, it's important to understand why good data model practices matter: a good data model improves understanding of the data, increases performance, and is more resilient to change. Some newly introduced features make data model management even easier. The properties pane has been updated to manage relationships in fewer steps, require fewer data model refreshes, and allow model changes in bulk. Parameters were recommended as an option to reduce model size during testing; once a report is published to the service, a parameter can be changed to import the full volume of data.
There were a few recommendations on how to use DAX to support data model maintenance. Use virtual tables to create new segmented views of the business; virtual DAX tables can be unioned or intersected to create larger datasets with reduced refresh times. Iterator (X) functions cover cases where a sum, average, or other statistic must be computed over a row-level expression. Calculation groups should be used when creating slicers with multiple calculation options, and DAX can build a virtual table to drive slicer selections.
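To illustrate what the X functions do, here is a small Python sketch (illustrative only, not DAX): a SUMX-style iterator evaluates an expression for each row of a table and then aggregates the per-row results. The table and column names are invented.

```python
# Invented sales rows; in DAX this would be a model table.
sales = [
    {"qty": 2, "price": 10.0},
    {"qty": 1, "price": 25.0},
    {"qty": 4, "price": 5.0},
]

def sumx(table, expression):
    """Mimic DAX SUMX: evaluate the expression for each row of the
    table, then sum the per-row results."""
    return sum(expression(row) for row in table)

# Equivalent in spirit to: SUMX(Sales, Sales[qty] * Sales[price])
revenue = sumx(sales, lambda row: row["qty"] * row["price"])
print(revenue)  # 65.0
```

The point of the iterator is that qty * price is computed per row before aggregating, something a plain column sum cannot express.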
The highlight of the data modeling theme was the number of tools that can help automate tasks across the BI development lifecycle. Here is a list of recommended tools to assist with data model management.
Power BI and Power Automate – team up to automate Power BI administration
ALM Toolkit – make incremental metadata changes to datasets in bulk and publish to the service
DAX Studio – improve the efficiency of your measure writing and publish bulk updates
Power BI Helper – reduce file size and simplify models, along with version control
Measure Killer – identify unused columns in a data model and remove them to reduce file size
Tabular Editor – fast editing, copying, and moving of measures
The challenge a Power BI administrator faces is making access to data easy for end users. They balance making data available against securing it from bad actors, and they should be most concerned with how easily users can reach reporting. To increase utilization of data, an administrator must make access to, and understanding of, the data as simple as possible. The administrator must monitor multiple aspects of the Power BI experience: access, incidents, gateways, capacity, and support requests. At the end of the day, you are the doctor responsible for solving the ailments that users experience.
Your role as an administrator is to keep your Power BI environment as healthy as possible. An administrator needs to monitor the environment for issues and proactively resolve them, which requires a view into all activity being conducted. Monitoring is critical to the experience, so use automation to understand activity. Review activity and audit logs to see who your heaviest users are, which reports are viewed most, and how many workspaces exist. These metrics help identify which parts of the Power BI ecosystem can provide feedback or need support. Upgrading to Premium helps by offering higher limits on dataset size, dedicated compute hardware, increased control over refresh rates, improved query performance, and additional Power BI enhancements. Monitoring the health of Premium capacity is very important to avoid performance issues.
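As a sketch of that kind of log review, the Python below tallies the heaviest users and most-viewed reports from a handful of mocked-up activity events. The real activity log (for example, from the Get-PowerBIActivityEvent cmdlet or the admin REST API) carries many more fields; all names here are invented.

```python
from collections import Counter

# Mocked-up activity events; real log entries carry many more fields.
events = [
    {"UserId": "ana@contoso.com", "Activity": "ViewReport", "ReportName": "Sales"},
    {"UserId": "ben@contoso.com", "Activity": "ViewReport", "ReportName": "Sales"},
    {"UserId": "ana@contoso.com", "Activity": "ViewReport", "ReportName": "Finance"},
    {"UserId": "ana@contoso.com", "Activity": "CreateDataset", "ReportName": None},
]

# Tally report views to surface heavy users and popular reports.
views = [e for e in events if e["Activity"] == "ViewReport"]
top_users = Counter(e["UserId"] for e in views).most_common(1)
top_reports = Counter(e["ReportName"] for e in views).most_common(1)
print(top_users)    # [('ana@contoso.com', 2)]
print(top_reports)  # [('Sales', 2)]
```

The same tallying scales to a daily export of the full log, which is where the "who needs support, who can give feedback" picture comes from.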
Once the users and assets requiring support are known, an administrator can provide the support needed to increase adoption. There is a balance between democratizing data and securing it: the administrator doesn't want security permissions to make data inaccessible, so they must protect the data while keeping it available. A single administrator can get buried by the administration responsibilities of a Power BI environment, so a few presenters recommended delegating administration to a group of people, guided by a Center of Excellence or Analytics Committee.
Separating datasets and reports helps delegate responsibilities across multiple people. Separating datasets allows authors to work independently, improves control of permissions, and creates a single source of truth for report developers. Best practices recommended by many presenters are to train anyone on how to use the tool, separate datasets from reports, thoroughly test any deployed assets, identify report owners with contact information for questions, and monitor dataset creation with audit logs and usage reports. If multiple people are working, you need a way to manage different versions. One presenter recommended managing version control with template files (.pbit) rather than desktop files (.pbix): since Power BI Desktop files are binary, they are difficult to share and compare.
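One reason .pbit files diff well is that they are ordinary zip archives with the model definition stored as JSON (in a DataModelSchema entry, commonly UTF-16 encoded). As an illustrative sketch, the Python below pulls that JSON out so it can be committed and compared as text; the miniature archive it builds is a stand-in for a real .pbit.

```python
import io
import json
import zipfile

def extract_model_schema(pbit_bytes):
    """Pull the DataModelSchema JSON out of a .pbit archive so the
    model definition can be committed to source control and diffed
    as text. Assumes the entry is UTF-16 encoded, as it commonly is."""
    with zipfile.ZipFile(io.BytesIO(pbit_bytes)) as z:
        raw = z.read("DataModelSchema")
    model = json.loads(raw.decode("utf-16"))
    # Stable key order makes diffs between versions readable.
    return json.dumps(model, indent=2, sort_keys=True)

# Miniature stand-in archive; a real .pbit holds the full model definition.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("DataModelSchema", json.dumps({"name": "SalesModel"}).encode("utf-16"))

schema_text = extract_model_schema(buf.getvalue())
print(schema_text)
```

Writing the sorted, indented JSON to a file alongside each release gives a plain-text history that ordinary diff tools can compare, which the binary .pbix cannot offer.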
Automation should also be leveraged to reduce administrative overhead. Use Power Apps and Power Automate to automate some of the tasks administrators are responsible for; Power Automate also works with other task tools like GitHub, Trello, or SharePoint to automate workflow tasks. An administrator could, for example, create a Power App for Power BI access requests instead of managing access manually. Power Automate, Teams, and Power BI together can alert on data quality issues and automatically email the data entry person when an error has occurred, an example of proactive rather than reactive data quality management. Dataflows (with some features requiring Premium) can automate data ingestion and reduce refresh times, refreshing downstream assets when new data is available rather than on a fixed schedule.
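As a sketch of the access-request idea, the Python below builds (but does not send) the REST call such a flow could issue, following the shape of the Power BI REST API's add-group-user operation. The workspace id and email are placeholders, and a real flow would POST this with an Azure AD token attached.

```python
def build_add_user_request(workspace_id, email, access="Viewer"):
    """Assemble the URL and JSON body for granting a user workspace
    access, per the Power BI REST API's add-group-user operation."""
    url = f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}/users"
    payload = {"emailAddress": email, "groupUserAccessRight": access}
    return url, payload

# Placeholder workspace id and requester email for illustration.
url, payload = build_add_user_request(
    "00000000-0000-0000-0000-000000000000", "new.analyst@contoso.com")
print(url)
print(payload)
```

In a Power Automate or Power Apps flow, the same request would be issued by an HTTP action after an approval step, so access grants are logged and consistent instead of ad hoc.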