Last week I attended the virtual Tableau Conference 2022. I always get excited by this event (although I’ve never attended in person) because I know that I’m going to walk away with inspiration, education, and new relationships. The conference covers the Tableau suite of tools along with examples of how the tools have helped companies with their data initiatives. The conference is important because it keeps the community enthusiastic about data in general and offers ideas for getting past data challenges. I wanted to share what I thought were the key takeaways from the conference: create good data before good dashboards, increase accessibility for faster problem solving, and build a data community to drive data adoption and literacy.
Good Data Before Good Dashboards
The first theme that stood out for me at the Tableau Conference was the need for good data throughout all levels of the organization. Three ideas covered during the conference speak to getting people the data they need: provide a unified view of all data touchpoints, increase accessibility at all levels of the organization, and deepen understanding of the business to build applicable data tools.
Example: Using technology to enable the sales team with a 360-degree view of the customer
Everyone within the organization should work from a unified view of data. That doesn’t mean everyone gets access to everything, but anyone who needs data for effective decision making should have it. During the Building Effective Dashboards for the Duke University Health System talk, the presenters stressed that the purpose of Duke University Health System is to serve the patient. Therefore, all hospital staff and doctors should have access to the information they need to deliver the best patient experience. Understanding the patient journey helps Duke Health practitioners provide the best health recommendations.

To create a unified data architecture, Kevin Glover, Director of Product Management at Tableau, covered the concept of data mesh. Data mesh decentralizes the ownership of data, distilling that ownership down to the teams most familiar with the processes driving the collection of the data. Those teams can own the pipeline, clean the data, set SLAs, create new interfaces, and enforce governance. This requires additional data professional roles like a data product manager or owner. The data owners publish trusted content that other teams can use. Other teams can enhance the data by testing certified sources against other exploratory sources available to them, which can lead to new, more robust data sources. These enhanced data sources can be shared back to the trusted content repository and made available to the organization. This cycle of testing, blending, and sharing data leads to better and faster insight generation. On top of being made available company-wide, these consolidated data sources can be traced back to their original sources for improved traceability and governance.
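To make the data mesh idea more concrete, here is a minimal sketch of what a domain-owned data product might look like in code. This is purely my own illustration, not anything Tableau ships; the class, fields, and source names are all hypothetical.

```python
# Hypothetical sketch of data-mesh-style ownership metadata: each domain
# team publishes a data product with an owner, an SLA, and lineage back
# to its source systems.
from dataclasses import dataclass, field


@dataclass
class DataProduct:
    name: str
    owner: str                   # the domain team accountable for quality
    sla_hours: int               # maximum staleness the owner commits to
    sources: list[str] = field(default_factory=list)  # lineage to origins
    certified: bool = False      # marks trusted, shareable content


# The owning domain certifies its pipeline output as trusted content...
orders = DataProduct("orders", owner="sales-ops", sla_hours=4,
                     sources=["salesforce.opportunities"], certified=True)

# ...and another team blends it with exploratory data into a new product
# whose lineage still traces back to the certified source.
enriched = DataProduct("orders_with_web_sessions", owner="marketing",
                       sla_hours=24,
                       sources=[orders.name, "web_analytics.sessions"])
print(enriched.sources)   # ['orders', 'web_analytics.sessions']
```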
The second idea was that increasing accessibility to data leads to faster problem solving. Again, I turn back to the Duke Health team for how dashboard design can increase accessibility. Associate Director of IT Claire Howell and IT Analyst Katie Capaldi of Duke University covered building effective dashboards to improve accessibility at all levels of an organization. Katie and Claire believe in the power of data democratization. They know that everyone’s role at Duke University Health is to deliver the best patient experience possible, and they understand how their own roles play a part in that service delivery. That is why they are fighting to get the right data to the right people when they need it. It means dashboard designers need to think about how to make data more accessible to all levels of the organization. Katie and Claire suggest really understanding why a dashboard is being built, what information needs to be included, and what the user will do with the information once it’s available. This approach has multiple benefits: the dashboard is built for the client’s need, the designer increases their understanding of the data’s purpose, and collaboration across parties results in a more informative data product. Capaldi and Howell make sure dashboards are designed with the audience in mind. They know their audience is the staff and doctors delivering the patient experience, and they don’t want BI tools to be a hindrance. The presenters highlighted three tests for any dashboard: the user knows how to use it, the user needs the information, and the user knows what to do with it. If the designer can’t answer yes to all three, there is an accessibility problem and they should revisit the dashboard until it fulfills its intended purpose: easy access to information that is readily consumable and applicable to strategic decision making.
The third idea was for designers to get familiar with the business problems and build data products that solve the biggest and most complex ones. Principal Richard Starnes, Senior Manager Jitendra Kumar, and BI Analyst Brij Sharma of Deloitte displayed their understanding of the intersection of business, tech, and sales teams. They demoed an example of how they used Salesforce, Snowflake, Tableau, and primary and secondary data sources to deliver optimal sales results. They kept the outcome simple: use Salesforce to grow sales. With that waypoint, they used the technology available to deliver results. Their approach was to overlay data platforms at different points along the enterprise sales journey and gather the appropriate data points. The data could be shared in the form of dashboards, but even more powerful was the ability to augment their data sources for AI models predicting conversion, propensity, and opportunity scores. Better still, they could feed these prediction scores back into the customer profiles to help the sales force select the opportunities with the highest probability to convert.
The Deloitte team documented the sales journey, then overlaid the metrics needed at each touchpoint
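Here is a rough sketch of that score-and-feed-back loop using pandas and scikit-learn. The column names and the tiny dataset are invented for illustration; the actual Deloitte pipeline ran on Salesforce, Snowflake, and Tableau rather than in-memory frames.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical opportunity history exported from a CRM.
opps = pd.DataFrame({
    "touchpoints": [2, 7, 3, 9, 1, 6, 8, 4],
    "deal_size_k": [10, 80, 25, 120, 5, 60, 95, 30],
    "converted":   [0, 1, 0, 1, 0, 1, 1, 0],   # historical outcomes
})

# Fit a simple propensity model on past conversions.
features = ["touchpoints", "deal_size_k"]
model = LogisticRegression().fit(opps[features], opps["converted"])

# Score each opportunity and write the propensity back onto the profile,
# so sellers can prioritize by probability to convert.
opps["propensity"] = model.predict_proba(opps[features])[:, 1]
print(opps.sort_values("propensity", ascending=False).head(3))
```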
Increase Accessibility for Faster Problem Solving
The next theme that stood out for me was the need to give everyone access to quality data. While I highlighted the need earlier, the theme deserves its own attention because several presentations at the conference focused on it. Increasing accessibility to data at all levels requires data democratization, establishing an Analytics Center of Excellence (ACOE), and designing data products with the users in mind.
Data democratization allows more users to make faster decisions. Product Owner Sara Bonefas and Senior Software Developer Ashley Dierkes of Discover Financial Services shared how they drive data democratization by splitting their Tableau strategy into two parts: IT takes ownership of the platform and puts the responsibility of data management on the business departments. Placing the ownership of data management on the departments leads to the users feeling ownership of the data and its quality. While Dierkes and Bonefas don’t manage the data, they are responsible for the Tableau platform. To foster a community of excitement around Tableau, they hosted a mini-conference at Discover HQ. The event allowed analysts from every department to meet colleagues they had never met, collaborate around Tableau, and share their Tableau stories. Dierkes and Bonefas also led the Discover TUG, where other Tableau enthusiasts could gather, learn, and discuss, with attendance as high as 200 people at one point; it is another venue for attendees to present their success stories with the platform. These types of events and communities drive enthusiasm around the product and offer collaboration opportunities that can lead to better analytical outcomes.
Example of everyone working on data and sharing across teams, from the Tableau Keynote
Product Owner of Enterprise Data and Analytics at Insight Global Kaitlin Pisani spoke about accelerating your data culture with a center of excellence (CoE). She covered the challenges any org will encounter when trying to establish a CoE: resistance to change, siloed analytics teams, too many conflicting dashboards, no communicated vision, and an inability to quantify program benefits are all hurdles that have to be addressed during setup. With so many challenges, you have to start somewhere, and that’s exactly what Pisani recommends. Start small and make progress towards your CoE goals. Name those organizational challenges and solve them. Identify key players to partner with in the business, especially C-suite members, as they have the power and influence to make changes. However, Pisani felt that a bottom-up approach to adoption may deliver better results, since frontline employees are responsible for the numbers we see in reports. While leaders should define where we are going, they should lean on their employees to understand what actions make the data move, and support them with the tools and resources needed to move it in the preferred direction. Also, document everything along the way. With so much input from so many different people, you’ll want to record meetings, definitions, processes, and standards learned along the way for future reference and issue resolution. At the end of any CoE program, it’s really about the relationships you create, so learn to work with others effectively so everyone involved can reach a common goal.

To build those relationships, Insight Global’s CoE was established to promote the power of data. It consisted of four components: a Governance Council (GC), an Executive Steering Committee (ESC), a Tableau User Group (TUG), and technical leads. The model is a hub-and-spoke structure in which the GC receives direction from the CoE and then distributes communication to the technical leads of each department. The program is only effective because it is built upon trust and accountability. To establish that trust, meeting attendance is required, self-accountability for leaders is defined, data solutions are built collaboratively across teams, and everyone brings a dose of patience, because transformation takes time.
I wanted to cover two critical parts of deploying an effective CoE: the Executive Steering Committee and the Governance Council. Pisani described the executive steering committee as a body of decision makers who have the power to drive change and who are committed to building a data-driven culture. They must be willing to understand the pain points of employees in the trenches and get them the data tools they need to make the right business decisions. These executives must also support and defend the business intelligence needs of the staff, always asking what data is needed from which groups, how that data will be collected, and how findings from the data will be distributed.

A Governance Council is a governing body of subject matter experts who create and maintain data governance initiatives, such as setting data strategy, creating data policies, and establishing data management standards. If you’re not familiar with data governance, I recommend researching the concept, because I’d like to use this part of the article to cover a DG concept that was new to me: federated data governance. Federated data governance moves the ownership of data definitions from IT to the business units. Since business units are more educated on the processes within a department and the data those processes create, it makes sense to move responsibility from a centralized IT model to a decentralized business unit model. The GC still sets the standards for how data is handled, stored, and shared, but otherwise business units are responsible for data accuracy and metric standardization. Glover gave an example of how an HR department would manage Workday data within the larger master data management ecosystem. He points out that the HR team knows their data best, so they should own the data as a product. A department data owner works closely with data engineers to get the data the department needs. Transparency and SLAs are also set by the business to ensure data is available when needed. The engineers then work with HR to surface the data to the rest of the org, and if questions arise about data accuracy or completeness, the data owners field them.
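To picture federated governance in code, here is a tiny hypothetical sketch: the Governance Council defines the standard, and the owning domain enforces it locally before publishing. The field names, PII list, and function are all my own invention.

```python
# Standards set centrally by the Governance Council (hypothetical fields).
REQUIRED_FIELDS = {"employee_id", "hire_date", "department"}
PII_FIELDS = {"ssn", "home_address"}   # never surfaced outside the domain


def hr_publish(record: dict) -> dict:
    """HR-owned publish step: enforce the council's standard locally."""
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise ValueError(f"Record fails GC standard, missing: {missing}")
    # The owning domain strips PII before surfacing data to the org.
    return {k: v for k, v in record.items() if k not in PII_FIELDS}


published = hr_publish({"employee_id": 42, "hire_date": "2022-05-01",
                        "department": "Oncology", "ssn": "redacted"})
print(published)   # standard satisfied, PII removed
```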
An example of the hub and spoke center of excellence model for Insight Global
A major theme across the presentations I attended was designing dashboards for easier consumption by the user. To design a dashboard that is easier to consume, you need to know what type of information users want to consume. You can learn a lot about your users through user interviews, wireframing, and prototyping. That is the exact approach taken by the Lovelytics team. I really appreciated the depth to which VP of Visualization and Training Chantilly Jaggernauth goes to understand the needs of the users. Her company’s process for gathering requirements ensures dashboards align with the needs of the business users. They start with a user story capturing the core users and the information they need to answer their questions. Next, they capture a goal for the new dashboard, which includes the list of questions the dashboard is expected to answer. All of this information and more is captured in what they call the business discovery phase. This phase is the time to meet with the clients and understand their key business metrics and the data needed to answer those questions. During this phase, the Lovelytics team tries to home in on specific details about the dashboard, such as the intended audience, preferred display mode, mediums for display, filtering logic, and color preferences. They also ask the client for any example dashboards to help guide design. After completing business discovery, they create wireframes to share with the client and get buy-in. The wireframe guides the data discovery session to ensure all requested views can be supported by available data. Once all data is confirmed, a prototype is created for the client to test and approve. Finally, a last round of edits concludes the process, although there shouldn’t be any surprises; the client was involved in every step, so they should be ready to sign off on the completed dashboard, one built to their specifications and containing the data they need for strategic decision making.
The requirements gathering process Lovelytics uses for dashboard development
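One way I could imagine operationalizing that discovery phase is a simple checklist structure, so nothing gets skipped between business discovery and wireframing. This is my own sketch with made-up fields, not a Lovelytics tool.

```python
from dataclasses import dataclass, field


@dataclass
class DashboardBrief:
    """Business-discovery artifacts gathered before any design work."""
    user_story: str                    # who the core users are and why
    questions: list[str]               # what the dashboard must answer
    audience: str = ""
    display_mode: str = ""             # e.g. desktop, mobile, wall monitor
    filters: list[str] = field(default_factory=list)
    color_preferences: str = ""
    example_dashboards: list[str] = field(default_factory=list)

    def ready_for_wireframe(self) -> bool:
        # Don't start wireframing until the story and questions are set.
        return bool(self.user_story and self.questions)


brief = DashboardBrief(
    user_story="As a nurse manager, I need staffing vs. census at a glance.",
    questions=["Are we staffed to forecasted census?", "Where are the gaps?"],
)
print(brief.ready_for_wireframe())   # True
```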
Speed to insights is always a major focus for the analytics community, and Tableau Conference had several talks about speeding up design so you can spend less time on development and more time on information sharing. Two parts of the conference showed the benefit of standardized and templatized solutions for Tableau. Tableau Speed Tips are always one of my favorite parts of the conference, because any time I can speed up my design process, it’s a big win for insight generation. With so many great presentations, I didn’t get to attend every speed tips session, but I definitely caught Head Coach Ann Jackson of The Data School New York, as I always learn great speed tips from her. Her session covered statistical process control charts and how to create them faster. Speed tips around pre-aggregation of measures, copy/editing, and the Index function will shave minutes, if not hours, off my design process. Templates are another great way to speed up your design process, especially if you’re working with common business metrics. Senior Product Manager John Demby and Director of Product Management Nicholas Oury highlighted Tableau’s Accelerators to help you jumpstart your dashboard development. Many of the available Accelerator templates are built from dashboard design best practices and aligned with the most relevant data across multiple industries. It’s another way to do less design and more analysis.
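Since the session was about control charts, here is the core math in a few lines of Python so you can see what those speed tips are accelerating. The numbers are made up; the limits follow the standard individuals-chart approach, estimating sigma from the average moving range.

```python
# Individuals control chart: center line plus 3-sigma limits, with sigma
# estimated from the average moving range (d2 = 1.128 for subgroups of 2).
values = [102, 98, 110, 95, 105, 99, 101, 230, 97, 104]   # made-up measure

center = sum(values) / len(values)
moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
sigma_hat = (sum(moving_ranges) / len(moving_ranges)) / 1.128

ucl = center + 3 * sigma_hat   # upper control limit
lcl = center - 3 * sigma_hat   # lower control limit
flagged = [x for x in values if not lcl <= x <= ucl]

print(f"center={center:.1f}, UCL={ucl:.1f}, LCL={lcl:.1f}")
print("out-of-control points:", flagged)   # the 230 spike is flagged
```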
Community Drives Data Adoption and Literacy
The final major theme that stood out to me at the conference was driving data adoption and literacy within the organization. This section covers the need for executive champions to lead the charge for better analytics, the development of future data leaders, and the power a community has to drive adoption.
The three pillars to drive analytics culture, adoption, and literacy within your organization
Let’s discuss the need for executive champions and developing new ones. Any effective data adoption strategy requires a leader who can influence other executives to drive organization-wide analytics. According to Pisani, one of the challenges to an effective data culture is the lack of an established vision, and the vision of the company comes from the leadership team. Having a leader in the planning process who requires analytics to be part of that process drives adoption in the C-suite. By promoting the benefits of analytics to other executives, the right champion can drive interest among their peers; soon, all the executives are requesting analytical support for their decision making. Brian Smith, Sr. Advisor at Cardinal Health, also supports getting buy-in from senior leadership. His analytics program includes the Tableau Heads, a community of company leaders willing to collaborate to drive better analytics. However, it’s not enough to have executive champions today; you also need to develop the executive champions of the future. Smith covered his approach to building champions within an organization. Starting 10 years ago, he was responsible for building a community of Tableau users, and he led the Tableau User Group (TUG) at Cardinal Health. The group met about every two months with the purpose of bringing together people who had never met to discuss Tableau. However, the TUG didn’t address the issue of Tableau training and development, so Smith adopted the Tableau Quest idea originally presented by Fiona Gordon. The program offered a light online training schedule mixed with practical exercises focused on the Cardinal way, making the content relevant and applicable.
Creating the executives of the future requires training and professional development. Brian Smith would be quick to point out that training initiatives sometimes don’t work. Time can be a major hurdle for employees when it comes to training programs; they view training as a burden that takes important time away from projects that matter. However, Data and Analytics Senior Director of Mercado Libre Adrian Quilis has had success with his training program by favoring a pull method over a push method: instead of pushing more training onto an already lean staff with limited availability, the business creates content that staff can access when they have time. He established an organizational Data Academy providing all employees access to the training tools they need to develop their data skills. While it’s great to have the resources available, that doesn’t mean people will use them, so Quilis offered incentives for skilling up. The Data Academy is strongly aligned with company and department initiatives, and upskilling at the academy leverages real company data. By learning new data skills, employees are able to use data tools to hit their personal goals faster than if they hadn’t used data. The company also offers promotions based on data expertise: employees who show initiative and progress in the Data Academy get prioritized during promotion conversations. The nice thing about Quilis’ Data Academy is that there are beginner and intermediate paths, so you can start your data education at your current level of expertise. Even the most novice person can attend, get training, and leverage that training in their job to drive results.
Mercado Libre’s data learning path offered through the Data Academy
A community is extremely helpful in growing engagement and excitement around analytics, and it is also a training ground for future data leaders. There were so many examples of building out a Tableau community throughout the conference, and I’d like to share some of them here. First, the University of South Carolina faced a daunting challenge in 2019 to create a single source of trusted data, and Vice Dean of Graduate Education Angelina Sylvian and Business Intelligence Strategist Caroline Maulauna knew they would have to engage the community if they wanted to be successful. They had also bought into the idea of a federated data governance approach, acting as a liaison between end users and technology. This meant they would need to rely on the analytics groups across campus and work on getting their buy-in. They established an Analytics Community of Practice consisting of professionals with similar job functions across departments and kept them engaged with monthly newsletters, events, and trainings. The group would discuss how to communicate more effectively with data. Capaldi and Howell contend that sharing with others is powerful. At Duke University, on top of the Governance Council and Center of Excellence, they have also established a Tableau User Group and a Duke Analytics Community. In these groups, Duke allows Tableau users to develop through group events, challenges, and meetups where data professionals can meet and collaborate on all things Tableau. Capaldi and Howell learned that the way you maintain a thriving analytics community is through strong communication and commitment. Define the results you want to see from your BI platform, and then develop communities around those goals.
The Tableau Conference is a great event to get excited about data and learn how to get others excited with you. I walked away this year with renewed excitement about democratizing data across the organization. When I return to the office, I can’t wait to share my learnings with my executive team, my peers, and other co-workers. The more you can engage and excite others around using data effectively, the better your insights will be. Open up the data, educate the org on how to use it, and create tools that make data easy to consume, and you’ll enjoy the benefits of a data-savvy workforce. Head back to the office, start implementing what I shared today, and enjoy the benefits of your new approach to data.