Stephen Few defines a dashboard as: "A data dashboard is a visual display of the most important information needed to achieve one or more objectives, with the data consolidated and arranged on a single screen so the information can be monitored at a glance" (Few, 2004).
Data dashboards typically include several visualisations such as graphs or other visual representations of data and minimal text to describe the indicators being displayed on the dashboard. By displaying these visualisations on a single screen, the user can directly compare and draw conclusions from the data ‘at a glance’, which is not possible if the data is split across several screens or requires scrolling to view.
Data dashboards are increasingly being used in the social sector to monitor performance of projects, programs, teams and organisations. When appropriately designed and executed, dashboards allow effective tracking of performance in an engaging manner, and support timely interventions. Dashboards are also commonly used within cross-agency collaborations (e.g. collective impact initiatives) to track shared measures of change.
Unlike infographics where the information displayed is generally static, data dashboards are dynamic, with information being regularly refreshed and updated, often in ‘real time’ (i.e., as soon as new information becomes available).
Key characteristics of data dashboards include:
- Displays all visualisations on a single screen
- Displays the most important indicators to be monitored over time
- Is updated regularly (ideally automatically)
- Is easy to understand and use by anyone with access
- Often includes filtering and ‘drill down’ functions that enable users to view the data of most interest to them (e.g., filtering by location, age or gender); the visualisations then update to display only the data that meet the chosen characteristics.
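The filtering behaviour described in the last point can be sketched in a few lines. This is a hypothetical illustration (the records, field names and values are invented), not code from any particular dashboard product:

```python
# Minimal sketch of dashboard-style filtering ('drill down').
# Records, field names and values are hypothetical illustrations.
records = [
    {"location": "North", "age_group": "0-4", "enrolments": 120},
    {"location": "North", "age_group": "5-9", "enrolments": 95},
    {"location": "South", "age_group": "0-4", "enrolments": 80},
]

def apply_filters(records, **filters):
    """Keep only the records that match every characteristic the user chose."""
    return [r for r in records
            if all(r.get(key) == value for key, value in filters.items())]

# A user selects location = "North"; the visualisations would then
# redraw using only the filtered subset of the data.
north = apply_filters(records, location="North")
total_enrolments = sum(r["enrolments"] for r in north)  # indicator recomputed on the subset
```

The same pattern generalises: each filter control on a dashboard simply narrows the underlying data set before the indicator values and charts are recalculated.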
When well designed, dashboards enable the transformation of data repositories into consumable information. The visualisations facilitate identification of trends and patterns, and ideally, the information presented in a dashboard is used to guide decision making and action.
A dashboard may be designed for different purposes, including as a communication tool to engage with different audiences (internal and/or external to an organisation) or as a management tool to regularly monitor and evaluate the progress of a project, program, team or organisation. Commonly, dashboards are intended to support learning and improvement; by providing regular updates on progress, information can be regularly reviewed and used to inform any necessary changes to design or implementation.
A common criticism of dashboards is that they don’t provide adequate space for the narrative explanation or reflection that may be critical to understanding the data presented. One strategy to mitigate this is to create both a dashboard (a single screen of visualised metrics) and a dashboard report (a multi-page document with one or two visualised metrics per page, accompanied by explanatory narrative).
Dashboard software options
Dashboards may be developed using a variety of free or paid software, most commonly business intelligence (BI) software packages. Popular options include Microsoft Power BI which has both free and paid subscription options, Tableau and Yellowfin. Although not specifically designed for this purpose, Microsoft Excel can also be used to create dashboards.
Most data dashboard software allows for manual and/or automatic import of data in a variety of common formats (e.g., .csv and .xls files). When selecting software for your dashboard, make sure to read the terms and conditions carefully and consider issues such as where (in which country) the data will be stored, who retains ownership of the data, the fee structure (e.g., per user or per month) and whether the software is well established and likely to remain operational over the next few years.
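As an illustration of what a manual import step involves, the sketch below parses indicator values from .csv data using Python's standard library. The column names and values are assumptions made for the example, not a real data source:

```python
import csv
import io

# Hypothetical indicator export; in practice this would be a .csv file
# downloaded from an administrative data source or survey system.
raw = io.StringIO(
    "indicator,period,value\n"
    "school_attendance,2017-Q1,91.5\n"
    "school_attendance,2017-Q2,92.3\n"
)

rows = list(csv.DictReader(raw))             # parse the export into dictionaries
values = [float(r["value"]) for r in rows]   # numeric series for a chart
latest = values[-1]                          # value for the most recent period
```

Dashboard software performs an equivalent parsing step behind the scenes; automating it (e.g., via a scheduled import) is what keeps a dashboard refreshed without manual effort.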
Many dashboards are web-based and thus, can be accessed from anywhere via an internet connection. Some data dashboards are publicly available (see below for examples), while others are only accessible to those who have been provided with access such as relevant staff within an organisation.
COAG Performance Reporting Dashboard
The Council of Australian Governments (COAG) Performance Dashboard reflects the joint performance of all Australian governments across multiple sectors, including housing, education, healthcare and infrastructure. The data used in the dashboard come from a variety of surveys, administrative data sources and censuses, and generally display progress over time since 2008. The dashboard allows the user to filter by state or territory of interest, and display indicators for each sector. For each indicator, there is a clear statement about whether that indicator is on track to reach the benchmark (target) set by COAG. The dashboard was developed by Data61, a CSIRO entity that is the largest data innovation group in Australia.
The COAG Performance Dashboard is a good example of a dashboard that displays a multitude of information in a relatively clear and accessible format. However, the number of indicators included can be overwhelming at first glance.
South Australia Health Emergency Department Dashboard
The SA Health Emergency Department Dashboard displays in near real time the current status of all public emergency hospital departments in South Australia. This includes the current waiting times, triage categories of patients presenting, expected arrivals and departures and a rolling 24 hour summary of patients presenting and departing across all emergency departments. The data on the dashboard is updated every 30 minutes and there is a detailed glossary available to help users understand the eight sections of the dashboard.
Although the dashboard is not very visually appealing, it is a good example of public services being able to display their service data in near real-time, increasing accountability and transparency, and promoting consumer choice. The colour coding of yellow and red helps the user immediately identify potential areas of concern and the glossary provides more information for those wanting to understand in more detail the data displayed on the dashboard.
Loddon Region Children and Youth Area Partnership Dashboard
The Loddon Region Children and Youth Area Partnership Dashboard displays seven key indicators relating to early childhood. The partnership involves a collaboration of over 30 organisations from a variety of sectors. The indicators displayed were selected in a collaborative process from an initial long list of 243 indicators, as they were deemed the most important to track over time.
Once the indicators were selected, the dashboard was designed using a free subscription to Microsoft Power BI, with data automatically imported from a variety of government administrative data sets. A graphic designer assisted in creating the final visual display. The dashboard allows the user to select region(s) of interest.
This is a good example of a dashboard that was developed at relatively little cost, using freely available software. Once the indicators had been selected, the main costs associated with the dashboard were three to four days of a designer’s time to design its visual appearance (with feedback from stakeholders), and five days of one person’s time to identify the data sources, clean and import the data, and develop processes for regularly refreshing these data.
Advice for choosing this method
- Have you identified the main outcomes (changes you desire) and indicators (measures of these changes) for your project, program, team or organisation? If not, these should be identified before you consider whether a dashboard is an appropriate monitoring tool for you.
- Is it possible and meaningful to measure the indicators you have chosen on a regular basis (anywhere from daily to quarterly)? If not, a dashboard is unlikely to be a helpful tool for you, as dashboards are best suited to monitoring changes over shorter time periods. A more static format, such as an infographic or webpage, is sufficient for data that is not updated or available on a regular basis.
- Are most of your indicators quantitative measures? The focus on data visualisations rather than text means dashboards are best suited where performance can be adequately monitored through quantitative (numeric) rather than qualitative indicators. Dashboard reports (described earlier) or other reporting formats (e.g. photo stories) may be more appropriate when qualitative indicators predominate.
- Do you have resources available (particularly people and time) to develop and maintain a dashboard? Although dashboards are not necessarily expensive to build, nor do they require specific programming expertise (beyond familiarity with data spreadsheets), they do require some initial investment in deciding what information will be displayed, how it will be displayed, and how the data will be regularly collected and uploaded (refreshed) to the dashboard. It is essential to include resources for stakeholder engagement in your planning, as without ongoing leadership sponsorship and stakeholder support it is unlikely that your dashboard will be used.
Advice for using this method
This advice assumes that you have answered ‘yes’ to the questions above on choosing to use a dashboard.
- Define your primary purpose for the dashboard: As described above, dashboards may be designed to serve different purposes. It is important to clarify the primary purpose of the dashboard before commencing design - a dashboard is only a tool, not an ‘end’ in itself, and will likely only be successful if part of a broader communication, evaluation, implementation or learning strategy.
- Define your key audience(s) for the dashboard: Before embarking on any dashboard design, you should define who your key audiences are, what information they will want or need and when, and how they would like to access the dashboard (e.g., via desktop computer, smartphone or tablet). The best way to ensure you meet your audience needs is to ask them directly what their information needs are!
- Consider how you will build your dashboard: This includes what software you will use to create your dashboard and if you will engage a designer and/or developer to help you build your dashboard or ‘do it yourself’.
- Identify the indicators that you will display on your dashboard: Our experience suggests you should identify no more than seven indicators to display on your dashboard – any more than this and your dashboard is likely to become visually cluttered, and thus ineffective at communicating information ‘at a glance’. You want to ensure that you choose indicators that are specific, valid and reliable measures of the change(s) you desire, and can be measured on a regular basis.
Defining which indicators to include on a dashboard is often the most difficult step in dashboard design. A useful process is to first develop a ‘longer list’ of potential indicators, and then convene a collaborative workshop with members of your key audiences to prioritise indicators for inclusion. You may find it useful to agree on indicator criteria and then visually map each indicator against the criteria using a spider diagram or scoring matrix, or use discussions and/or voting to identify the most important indicators to include. For the chosen indicators, you should then decide which filters (if any) dashboard users will be able to apply (e.g., by geographic location or age).
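The scoring-matrix approach mentioned above can be as simple as totalling workshop scores per indicator and ranking the results. The criteria, indicator names and scores below are invented for illustration:

```python
# Hypothetical workshop scores (0-5) for each candidate indicator
# against agreed selection criteria. All names and numbers are invented.
scores = {
    "kindergarten enrolment":      {"valid": 5, "reliable": 4, "measurable quarterly": 5},
    "family service referrals":    {"valid": 3, "reliable": 3, "measurable quarterly": 4},
    "developmental vulnerability": {"valid": 5, "reliable": 5, "measurable quarterly": 2},
}

# Rank indicators by total score; the top entries become shortlist
# candidates for the dashboard.
ranked = sorted(scores, key=lambda name: sum(scores[name].values()), reverse=True)
```

A spreadsheet achieves the same thing; the point is that making the criteria and scores explicit turns a contested prioritisation discussion into a transparent, repeatable exercise.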
- Decide what you would like your dashboard to look like: Once you have identified your indicators, rather than immediately building your dashboard in your chosen tool, it is helpful to first develop ‘mock ups’ of what the dashboard will look like. This includes how each indicator will be presented (e.g., the type of graph or other data visualisation), as well as the design and layout of the overall dashboard. The mock up can (and should) be shared with the intended dashboard users for review and feedback before you begin to build, often saving considerable time and energy that would otherwise be spent making changes once the data visualisations and associated text have already been developed.
- Conduct usability testing with the intended audience(s): Once you have a prototype version of your dashboard, recruit some representatives from the intended audiences who have not been involved in its development to review it. Observe how they go about using the dashboard, and follow up with questions such as whether they understood what is being displayed and whether they could easily identify data that may require action (e.g., negative trends, unexpected results). Plan for at least two rounds of usability testing to maximise the likelihood that the final dashboard will meet the needs of your audiences.
- Develop a dashboard implementation strategy: This should include where and how the dashboard will be publicised, processes for automatic or manual upload of new data and responsibilities for ongoing maintenance of the dashboard (e.g., responding to any errors or bugs, collating feedback from users).
- Develop a dashboard evaluation strategy: You should specify at the outset when the dashboard design and use will be evaluated so that it can be revised as needed. Typically, this would occur after some weeks or months of use by the primary audience, depending on the purpose of the dashboard and how often the data is updated.
Rainbow Framework Tasks
BetterEvaluation (2017). Visualise Data. http://www.betterevaluation.org/en/plan/describe/visualise_data
BetterEvaluation (2017). Infographics. http://www.betterevaluation.org/en/evaluation-options/infographics
BetterEvaluation (2017). Reporting Needs Analysis. http://www.betterevaluation.org/evaluation-options/reporting_needs_analysis
BetterEvaluation (2017). Methods for Monitoring and Evaluation. http://www.betterevaluation.org/en/guides/swot/methods_for_monitoring_evaluating
Chiang, A. S. (2011). 'What is a dashboard: Defining dashboards, visual analysis tools and other data presentation media'. Dashboard Insight. Retrieved from: http://www.dashboardinsight.com/articles/digital-dashboards/fundamentals/what-is-a-dashboard.aspx (archived link)
Department of Prime Minister and Cabinet (2017). Performance Reporting Dashboard. http://performancedashboard.d61.io/aus
Evergreen, S. (2017). Various blogs. http://stephanieevergreen.com/blog/
Few, S. (2004). 'Dashboard Confusion'. Perceptual Edge, March 2004. Retrieved from: https://www.perceptualedge.com/articles/ie/dashboard_confusion.pdf
Kania, J. & Kramer, M. (2011). Collective Impact. https://ssir.org/articles/entry/collective_impact
Komischke, T. (2015). How to Design Effective Dashboards.
Rawool, A. (2017). Beginners Guide to Dashboard Design. https://webapphuddle.com/beginners-guide-to-dashboard-design/
SA Health (2017). Emergency Department Dashboard. http://www.sahealth.sa.gov.au/...
Smith, V. S. (2013). Data dashboard as evaluation and research communication tool. Data visualization, Part 2. New Directions for Evaluation, 140, 21–45.
Victorian Department of Health and Human Services (2017). Loddon Dashboard. https://app.powerbi.com/view...