Design Principles for Data Visualisation in a Healthcare Setting

The need for user-focused iterative design when presenting healthcare data to a wider audience

Lucy Lindsay
Towards Data Science

--

Healthcare generates vast quantities of data, and with it great potential for data analysis. Prior to any analysis or comparisons being made, it is important to fully understand the data and how it is going to be used. As part of a project into data visualisation, funded by the Health Foundation, the North East Quality Observatory Service (NEQOS) formulated seven design principles.

Whilst the focus of the project was to convert flat file formats to interactive tools, we had a great deal of variation, both in the data sources and in the type of output. During the course of the project, we found it beneficial to draw up some guidelines to aid the development of the tools. Following a brief literature review and interactions with clinicians as part of the project, these were further developed into the seven design principles:

  1. Clinicians are involved every step of the way
  2. Lead with the right clinical question
  3. Make data timely and relevant
  4. There’s no learning curve for users
  5. There’s no need to reinvent the wheel
  6. Data is flexible and shareable
  7. The right tool for the job

Design Principle 1. Clinicians are involved every step of the way

End-user (clinician) engagement in the design stages is essential for getting the functionality of an interactive data visualisation (DV) tool/dashboard right for them and ensuring acceptability.

For a DV tool/dashboard to be used to inform quality improvement it needs to be acceptable to the end-users, in this case clinicians. Acceptability is increased if users are involved in the design stages. Involving clinicians in the design phase allows exploration of how they or members of their team would use a DV tool/dashboard, so that functionality is not based on supposition from the development team. Due to competing priorities and clinical commitments, engagement with clinicians had to be shaped and adapted to fit a clinical model.

Figure: Clinicians’ requirements and key components for a data visualisation tool

This principle was formulated with regard to a healthcare project but holds true across many fields: the end-user should be involved in every step of the design process, as this ensures that the functionality of the tool behaves as expected and that uptake of the tool is effective. It is a misuse of resources to design a tool that is not fit for purpose and therefore not routinely used.

Design Principle 2. Lead with the right clinical question

Be clear about the main clinical question(s) for end-users and design the DV tool/dashboard accordingly.

When working with a large dataset, there is a temptation to try and answer questions that are not being asked. This can cause data overload, potentially leading to key information being missed. A dashboard or data visualisation tool should “filter the noise” so that the key “signal” is easier to find in all the information.

Ultimately, a clinical question can be focussed, with a number of follow-up questions asked if necessary. When end-users would like an overview of their clinical unit they are asking a different set of questions than when exploring the data to find in-depth information about different aspects of quality. A DV tool/dashboard should therefore ideally be designed to allow for various levels of detail depending on the questions being asked; it should be possible to obtain an overview of how a department or hospital is functioning as well as to answer in-depth queries into different aspects of the actual care quality.

Design Principle 3. Make data timely and relevant

Data should be as up-to-date as possible (and consider excluding data that is too old to answer the question(s)).

Data may lose some of its power and become irrelevant if it is “old”. Decisions on quality of care should be made based on the most recent data available that is considered useful, and it may be worthwhile excluding metrics based on underlying data that clinicians consider to be too old to provide useful insights. This is an idealised situation: the data used in this project was based on publicly available national audits. Production of these audits requires time to collect, cleanse and analyse the data, and therefore there is a natural delay; it is possible that the most recent data available may already be obsolete to a degree. In a clinical setting it may be possible to collect real-time data; however, in this scenario there will be little or no opportunity for data cleaning or aggregation, and consequently comparison of either clinical units or quality of care may be impeded.
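As a rough illustration of this principle (not code from the NEQOS project), the sketch below shows how metrics whose underlying data is too old might be filtered out of a pandas DataFrame before they reach a dashboard; the “audit_date” column name and the 24-month cut-off are assumptions for the example.

```python
# Illustrative sketch only: column names and the 24-month cut-off are assumptions,
# not values from the NEQOS project.
import pandas as pd

def drop_stale_metrics(metrics: pd.DataFrame, max_age_months: int = 24) -> pd.DataFrame:
    """Exclude rows whose underlying audit data is older than the agreed cut-off."""
    cutoff = pd.Timestamp.today() - pd.DateOffset(months=max_age_months)
    return metrics[metrics["audit_date"] >= cutoff]

# Tiny made-up example dataset
metrics = pd.DataFrame({
    "metric": ["readmission_rate", "door_to_needle_time"],
    "audit_date": pd.to_datetime(["2018-06-01", "2023-03-01"]),
})
recent_metrics = drop_stale_metrics(metrics)
```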

Design Principle 4. There’s no learning curve for users

Ensure that the DV tool/dashboard is self-explanatory for end-users and easy to navigate.

The end-users should not require significant IT skills to use the tool or dashboard. End-users have a limited amount of time to learn how to use a new tool, so in order to maximise usage, the tools and their navigation need to be self-explanatory; interacting with the data should not require a qualification in IT/computing and should involve little lead-time. There should be no need for a conventional user manual. To assist with usage of the dashboard, in this project we were able to incorporate detailed “how to” sections on the visualisations. These clarified how the tool would work without impeding the displays, reinforcing the ease of use of the DV tool/dashboard. Floating “how to” text can also provide explanatory notes, explaining what the data shows and how best to interpret it.
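To make the floating “how to” idea concrete, here is a minimal sketch assuming Plotly as the charting library (the project’s actual tooling is not named here); the metric and the hover wording are invented for illustration.

```python
# Minimal sketch: floating "how to" guidance delivered as hover text so it does
# not clutter the chart itself. Plotly and the metric shown are assumptions.
import pandas as pd
import plotly.express as px

df = pd.DataFrame({
    "month": ["Jan", "Feb", "Mar"],
    "readmission_rate": [4.1, 3.8, 4.5],
})

fig = px.line(df, x="month", y="readmission_rate",
              title="30-day readmission rate")
fig.update_traces(
    hovertemplate=("Readmissions in %{x}: %{y}%<br>"
                   "How to read: lower is better; compare with your unit's baseline.")
)
fig.show()
```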

It is important that navigation is intuitive within the DV tool. The order and flow of metrics should make sense to clinicians, and the destination of any link should be expected. The involvement of end-users in the design stages should aid this navigation process, giving an indication of how a user approaches the product. It is important that users are able to spend their time focussing on the data and the story it is telling rather than trying to navigate the DV tool/dashboard.

Design Principle 5. There’s no need to reinvent the wheel

Be cognisant of recognised visual design principles. Include interpretive narrative where interpretation is required, ensuring that you avoid “busy” visuals and information overload.

Layout and formatting are important: there needs to be a good balance between visual complexity and information utility. If there is inconsistency throughout the DV tool/dashboard then users may be overwhelmed. Similarly, colours need to be visually appealing, appropriate for colour vision deficiency (CVD), and free of unintended implications; for example, using red and green outside of a RAG-rated context can imply judgements that are not intended. Navigation elements should be consistent (same place, same appearance, clear labelling) and should look as though they are related or part of a collection. Once functionality is set up on one viz (visualisation), users expect all further vizzes to have the same functionality and feel.
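As a small, hedged example of the CVD point (matplotlib/seaborn are assumed here; the article does not specify the software used), the sketch below applies seaborn’s built-in “colorblind” palette rather than an ad hoc red/green pairing; the units and values are invented.

```python
# Sketch only: seaborn's "colorblind" palette is one readily available
# CVD-friendly qualitative palette; the data plotted is invented.
import matplotlib.pyplot as plt
import seaborn as sns

units = ["Unit A", "Unit B", "Unit C"]
compliance = [82, 75, 90]
colors = sns.color_palette("colorblind", len(units))  # CVD-friendly colours

fig, ax = plt.subplots()
ax.bar(units, compliance, color=colors)
ax.set_ylabel("Audit compliance (%)")
ax.set_title("Audit compliance by unit")
plt.show()
```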

Developers of a dashboard/DV tool should be cognisant of PARC principles (Proximity, Alignment, Repetition and Contrast) and Gestalt principles; it is human nature to look for patterns whether they exist or not. If two items look similar, it will be assumed that they function similarly. When creating a dashboard displaying multiple metrics and graphs, human nature dictates that those placed close together will be assumed to be associated metrics.

Where interpretive narration is felt to be required, it needs to be sufficient to be helpful and provide answers, without making the visualisation too complex to read at a glance. It is important that the visualisations on any dashboard are clear and not cluttered with unnecessary text.

Design Principle 6. Data is flexible and shareable

Ensure that the DV tool/dashboard has the capability for outputs to be extracted for inclusion in users’ own reports and presentations.

The concept of data recall underlies the navigation touched on in earlier design principles (Design Principles 4 and 5): one of the key features of a DV tool/dashboard is that any selection made when the tool is first used is retained throughout the product. This improves navigation and ensures that visualisations are correctly interpreted.

To maximise the productivity of an interactive dashboard, and that of the end-users, it should be possible for a user to set up their own queries of the data, which are retained within the software. These can then be shared or re-run at regular intervals. Being able to save the queries ensures that there is no variation in the question being asked at the next time-point; this should assist in validating the results and in highlighting any issues (e.g. being an outlier).
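A rough sketch of what a saved, re-runnable query might look like is given below; it assumes the data sits in a pandas DataFrame and that selections can be serialised to JSON, and the field names (“trust”, “metric”, “period”) are hypothetical rather than taken from the project.

```python
# Sketch of the "saved query" idea: store the user's selection once, then
# re-apply it unchanged at the next time-point. Field names are hypothetical.
import json
import pandas as pd

def save_query(selection: dict, path: str) -> None:
    with open(path, "w") as f:
        json.dump(selection, f, indent=2)

def run_query(data: pd.DataFrame, path: str) -> pd.DataFrame:
    with open(path) as f:
        selection = json.load(f)
    mask = (
        (data["trust"] == selection["trust"])
        & (data["metric"].isin(selection["metrics"]))
        & (data["period"] >= selection["period_from"])
    )
    return data[mask]

# The same selection can be re-run when the next audit cycle is published.
save_query(
    {"trust": "Trust X", "metrics": ["readmission_rate"], "period_from": "2023-01"},
    "my_query.json",
)
```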

It is important that data and data visualisations can be shared; this also provides a way to improve the quality of care being provided, where data can be presented either to highlight any issues or to reinforce any positive improvement. It is therefore important that any product has the capability for outputs to be extracted for inclusion in users’ own reports and presentations. It should be possible to save individual charts, with the layout of the chart being conducive to extraction (i.e. any interpretive text does not overlap the chart). The charts should then be saved in a format that allows them to be shared, either by email or by inserting them into a different document or presentation.
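A minimal export sketch, assuming matplotlib (again, the project’s charting software is not named here): the interpretive caption is kept in its own margin so it does not overlap the chart, and the figure is saved in formats that can be dropped straight into a report or slide deck. The chart content and caption are invented for illustration.

```python
# Sketch only: chart content and caption are invented for illustration.
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.plot([1, 2, 3, 4], [12, 9, 11, 8], marker="o")
ax.set_title("Emergency readmissions per 100 discharges")

# Reserve space below the axes so the interpretive text never overlaps the chart.
fig.subplots_adjust(bottom=0.25)
fig.text(0.01, 0.05,
         "Interpretation: downward trend since Q2; no units flagged as outliers.",
         ha="left", fontsize=8)

# Formats that can be emailed or inserted into reports and presentations.
fig.savefig("readmissions.png", dpi=200)
fig.savefig("readmissions.svg")
```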

Design Principle 7. The right tool for the job

Static and interactive dashboards both have a useful role: static reports may still be required to provide overviews of care quality, while interactive DV tools/dashboards may be required to allow for ‘deep dives’ into the data.

The production of an interactive data visualisation tool does not mean that a static product is no longer required. Both types of report (static and interactive) have distinct roles, each beneficial to clinicians. Whilst static reports have limited content, in that the questions asked are predetermined and cannot be changed, they also provide content that is literally in the hands of the end-user. Some clinicians appeared reluctant to use online tools due to the outdated or restricted IT systems that may operate within the National Health Service (NHS), and are prepared to sacrifice the interactive nature of the DV tool for stability and reliability.

The interactive software used should also provide stability and reliability in terms of output, while allowing for discussion and variation in the questions being asked of the data. Assuming that the underlying computer hardware and network provision are adequate, it would be beneficial to choose the right tool for the right job: the static report providing the overview of the data and high-level analysis, and the interactive tool allowing further investigation of more specific questions.

Conclusions

These design principles were formulated as part of a project to investigate whether interactive data visualisation helps clinicians improve patient care. Although they were intended to relate to DV tools and dashboards, the overarching principles can be extrapolated to any reporting system. Similarly, the design principles could be applied to alternative fields, outside of healthcare, where client involvement would be beneficial in the design of a tool measuring outcomes.

Adoption of these design principles may encourage uptake of a product; guidance from the end-users and their involvement in the design phases may make the product more appealing by being more “fit for purpose”. Ultimately the principles should support and reinforce the design of a usable product that requires little lead time and behaves as expected.
