Sales Operation Performance Dashboard
User research | Design
Visualizing data to provide business process transparency
DURATION
January 2013 - January 2014
MY ROLE
Lead designer
TEAM
• Product owner(s)
• Designer
• Data subject matter experts
• Developers
CLIENT
Large, globally distributed IT organization
Highlights
• Provided a framework to guide the team in gathering the requirements data needed for designing an effective dashboard
• Iteratively refined the dashboard design using input from users and subject matter experts
• Applied data visualization best practices
• Learned the capabilities of the available reporting engine so they could be applied in the design
• Worked with the developer to understand feasibility and helped map data to the design
Problem
Provide executives and managers with transparent views of the end-to-end quote-to-cash sales process to help them know where to focus their efforts to improve the business process.
Project goals
• Portray metrics within a dashboard to help the business improve the efficiency and effectiveness of the business process, respond to workload variation and improve quality.
Challenges
• The dashboard was to be built in Cognos, which did not fully support all of the data visualization functionality we might have wanted to include in the design.
• Obtaining appropriate data to surface on the dashboard
• Educating the business team to think in terms of solving business problems for the target audience rather than just presenting KPIs.
• Ensuring correct and consistent understanding of the data presented.
To comply with NDA, I have intentionally hidden and replaced content in this case study.
Process
I involved users and subject matter experts to identify the business questions the dashboard needed to help answer and the corresponding metrics, while applying data visualization best practices.
Methods
• Performed user research to understand what users needed from the dashboard.
• Provided a framework for translating business needs into appropriate dashboard elements.
• Iteratively refined the design using input from users and by applying data visualization best practices.
• Assisted the team with mapping available data sources to the intended dashboard view.
• Learned reporting engine (Cognos) capabilities and worked with development on feasibility.
• Provided feedback on the implementation as the dashboard was being built.
Discovery
Worked with users and subject matter experts to understand how the dashboard could provide executives and managers with answers to key business questions.
Provided a structure for gathering requirements
Methods
Involved a user council
Reviewed business questions of interest with users and obtained their reaction to the proposed dashboard.
Involved executives and managers
Reviewed the dashboard designs with users and subject matter experts, iteratively refining the design.
User roles
Operations executive
Transformation executive
Operations manager
Transformation manager
User goals
Gain visibility into process bottlenecks
Ex. Are we billing accurately?
Be responsive as a business
Ex. How quickly does the business respond to volume fluctuations?
Improve the customer experience
Ex. What aspects of the process most impact customer satisfaction?
Identify process improvements
Ex. Are there parts of the business that we can learn from?
Meet financial objectives
Ex. What is the cost per transaction?
User scenarios
Below is an example user scenario for an executive role.
As an executive, once or even a few times a day, I want to find answers to business questions that I have. Along with input and data from other sources, this will help me make strategic decisions about where to focus when improving business processes and delivery speed, how to allocate resources, and where to focus efforts to obtain revenue from our proposals. To answer some of the questions, I will run the dashboard multiple times, for different time periods, sectors, and channels.
Example business questions
Based on quotes that have completed the various stages of the end-to-end process:
• Are we meeting our brand specific targets for processing quotes through the end-to-end pipeline?
• What portion of our quotes is not meeting those targets? How much potential revenue is being impacted?
Among our proposals that are currently waiting for decisions from the customer:
• How long are they waiting?
• How much of our potential revenue is tied up in proposals that have been waiting a long time? (A rough sketch of these calculations follows this list.)
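The production measures were calculated in Cognos against the organization's data sources, which I cannot show here. Purely as an illustration, the pandas sketch below uses invented field names and sample values to show the shape of two of these calculations: the share of quotes missing their brand-specific targets (and the value affected), and the potential revenue sitting in proposals that have been waiting a long time. The 30-day threshold for "a long time" is hypothetical.

    # Illustrative sketch only: invented field names and sample data, not the
    # actual Cognos queries or enterprise data sources.
    import pandas as pd

    quotes = pd.DataFrame({
        "brand": ["A", "A", "B", "B", "B"],
        "cycle_days": [12, 30, 8, 22, 40],     # end-to-end processing time
        "target_days": [15, 15, 10, 10, 10],   # brand-specific target
        "value_usd": [50_000, 120_000, 20_000, 75_000, 200_000],
    })

    # Share of quotes missing their target, and the value of those quotes.
    missed = quotes[quotes["cycle_days"] > quotes["target_days"]]
    pct_missed = len(missed) / len(quotes)
    revenue_impacted = missed["value_usd"].sum()
    print(f"{pct_missed:.0%} of quotes missed target; ${revenue_impacted:,} potential revenue impacted")

    # Potential revenue tied up in proposals awaiting a customer decision.
    pending = pd.DataFrame({
        "days_waiting": [5, 45, 90, 12],
        "value_usd": [30_000, 150_000, 220_000, 60_000],
    })
    aged = pending[pending["days_waiting"] > 30]  # hypothetical "long time" cutoff
    print(f"${aged['value_usd'].sum():,} in proposals waiting more than 30 days")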
Design
The design was iteratively refined through user feedback and application of data visualization best practices, while addressing feasibility constraints.
• Performance comparisons between brands and countries were supported.
• Sales quotes that took longer to complete than the business target timeframe were highlighted on both a percentage and value basis.
• Drill-through links were provided to allow users to access quote-level details.
Phase 1 version
In this first build of the dashboard, chart use was limited for feasibility. The dashboard helped answer where quotes were taking a long time to produce, as well as show what portion of sales proposals had been waiting an extended period for a decision.
• Sparklines were used to indicate trends in a compact manner.
• Bullet charts provided performance context relative to targets.
• A line chart for volume provided both past and future facing views.
• Problem areas that deserved attention were highlighted.
• Filtering options were provided to slice the data. (A mock-up sketch of these chart types follows this list.)
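The actual dashboard was built in Cognos, so none of its implementation can be shown here. As a rough illustration only, the following matplotlib sketch mocks up three of the chart idioms named above (a sparkline, a bullet chart against a target, and a volume line with a forward-facing projection) using invented example data.

    # Mock-up of the Phase 1 chart idioms with invented data; not the Cognos build.
    import matplotlib.pyplot as plt
    import numpy as np

    weeks = np.arange(12)
    cycle_time = np.array([9.5, 9.8, 10.2, 9.9, 10.6, 11.0,
                           10.4, 10.1, 9.7, 10.8, 11.2, 10.9])

    fig, (ax_spark, ax_bullet, ax_volume) = plt.subplots(3, 1, figsize=(6, 4))

    # Sparkline: a compact trend with no axis clutter.
    ax_spark.plot(weeks, cycle_time, color="steelblue", linewidth=1.5)
    ax_spark.axis("off")
    ax_spark.set_title("Avg. quote cycle time, last 12 weeks", fontsize=8, loc="left")

    # Bullet chart: current value as a bar against qualitative bands and a target line.
    target, current = 10.0, cycle_time[-1]
    for bound, shade in [(14, "0.9"), (12, "0.8"), (10, "0.7")]:
        ax_bullet.barh(0, bound, height=0.6, color=shade)
    ax_bullet.barh(0, current, height=0.2, color="black")
    ax_bullet.axvline(target, color="red", linewidth=2)
    ax_bullet.set_yticks([])
    ax_bullet.set_title("Cycle time vs. target (days)", fontsize=8, loc="left")

    # Volume line: past actuals plus a forward-facing projection.
    actual = np.array([120, 135, 128, 150, 160, 155, 170, 165])
    forecast = np.array([165, 172, 180, 178, 185])
    ax_volume.plot(np.arange(8), actual, color="steelblue", label="actual")
    ax_volume.plot(np.arange(7, 12), forecast, color="gray", linestyle="--", label="forecast")
    ax_volume.legend(fontsize=7)
    ax_volume.set_title("Quote volume, past and projected", fontsize=8, loc="left")

    plt.tight_layout()
    plt.show()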
Complete version
In the full version of the dashboard design, additional chart types were used to help visualize data at a glance across a number of key metrics involved with the sales process.
Reflection
This was one of the projects where I had the opportunity to lead the business in considering the unique factors that go into dashboard design. I am thankful for the opportunity to learn and work in this space.
Here are a few of my takeaways:
Consider platform and data measurement decisions early
An effective dashboard that provides the right insights depends on having the right measurement data available, as well as on the reporting platform's ability to surface that data in a visually efficient manner. These factors should be considered early in the project so that they can be accounted for in the design and, where possible, addressed directly through platform selection and data enhancements.
Make dashboard testing real
To evaluate a dashboard, it's helpful to have users think through the data presented and attempt to use it to extract real insights. That being the case, building an interactive prototype that uses real business data could go a long way toward identifying issues early and making refinements ahead of the full dashboard build-out. However, this must be done in a rapid and inexpensive manner. One way to do this would be to build a sandbox environment that could be used to rapidly test early versions of many different dashboards across projects, gaining cost and time efficiency.