
Evaluating the impact of health information - HTML version

5. Tools for evaluating legacy content
Reviewing and evaluating legacy content is a challenge for many organisations. During the COVID-19 pandemic, health information providers rose to the challenge of creating new information rapidly to meet the needs of their audiences. As a result of this rapid approach, review cycles slipped. The PIF TICK includes helpful mechanisms organisations can use to make sure their information content maintains its quality and continues to improve.
Legacy content can take many forms but webpages and leaflets are the biggest content challenge for organisations providing health information. Evaluating your content can help you prioritise what to review and update immediately, what to review later and what to archive as no longer relevant or helpful.
PIF TICK criterion 1.7
Minimum: A resource review schedule is in place, together with a process which explains the different ways you will review information based on information type, last review date, updates in evidence and feedback. Some level of review must take place every three years.
Desirable: The different levels of review based on impact of resources, risk and use are documented and evidence provided for the rationale.
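The three-year minimum above lends itself to an automated check. Below is a minimal sketch, assuming resources are held as records with a last-review date; the titles and dates are hypothetical placeholders.

```python
from datetime import date

# Hypothetical resource records: title and last review date.
resources = [
    {"title": "Managing asthma leaflet", "last_review": date(2020, 5, 1)},
    {"title": "Diabetes diet webpage", "last_review": date(2024, 1, 15)},
]

def overdue_for_review(resource, today, max_age_years=3):
    """Flag a resource whose last review is more than max_age_years ago."""
    cutoff = today.replace(year=today.year - max_age_years)
    return resource["last_review"] < cutoff

today = date(2025, 6, 1)
for r in resources:
    if overdue_for_review(r, today):
        print(f"Review needed: {r['title']}")
```

A real review schedule would also weight information type, evidence updates and feedback, as the criterion describes; this sketch only covers the date test.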
Thinking about your legacy content
It can help to start by asking these questions when dealing with legacy content:
- Why was it created?
- When was it last reviewed?
- Do you have any impact data or feedback?
- Does it meet your style guide and reading age guidance?
- Is there still a need for the content?
- Does the content align with the strategy of the organisation?
Asking these questions can help streamline your legacy content. You can:
- Combine information and reduce duplication.
- Decide if you’re the right organisation to provide the information.
- Streamline and signpost to other credible information providers.
PIF members use a number of tools to manage legacy content based on evaluation data.
Impact versus effort matrix
Using a tool such as the “impact versus effort” matrix can help organisations to prioritise where their efforts are best spent.
Information with high impact and low effort to update is a priority – top left quadrant.
Information with low impact and high effort, bottom right quadrant, could be archived, particularly if another provider covers the topic and you could signpost instead.
Impact high, effort low: prioritise legacy content.
Impact low, effort high: consider archiving legacy content and signposting to other providers.
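The matrix logic above can be expressed as a simple rule. This is an illustrative sketch; the labels and suggested actions are assumptions, not part of any formal PIF tool.

```python
def prioritise(impact, effort):
    """Map impact/effort ratings ('high' or 'low') to a suggested action.

    Follows the matrix above: high impact with low effort is the top
    priority; low impact with high effort is a candidate for archiving.
    """
    if impact == "high" and effort == "low":
        return "update now"
    if impact == "low" and effort == "high":
        return "consider archiving and signposting"
    return "review later"

print(prioritise("high", "low"))  # update now
```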
Content performance matrix
The charity Mind uses a content performance matrix to evaluate the performance of its online content.
Metric | Target | Source
Publication/review date | Less than 3 years since last review | Editorial records
Reach | Page traffic: views and sessions | Online feedback form, Google Analytics
Accessibility of language | Reading age below 11 years (grade 7) | Hemingway Editor, which is free (www.hemingwayapp.com), or a paid web governance platform such as Silktide
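The accessibility-of-language target can be estimated in-house as well as with tools like Hemingway. The sketch below uses the standard Flesch-Kincaid grade formula with a naive vowel-group syllable count; real readability tools use more refined rules, so treat this only as a rough screen.

```python
import re

def fk_grade(text):
    """Rough Flesch-Kincaid grade estimate.

    Syllables are approximated by counting vowel groups, which
    over- or under-counts some words; a grade of 7 or below
    roughly matches a reading age of 11.
    """
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n = max(1, len(words))
    syllables = sum(
        max(1, len(re.findall(r"[aeiouy]+", w.lower()))) for w in words
    )
    return 0.39 * (n / sentences) + 11.8 * (syllables / n) - 15.59
```

For example, short sentences of one-syllable words score well below grade 7, while long clause-heavy sentences score above it.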
The content matrix developed by Mind looks at page views versus user satisfaction ratings. This helps the charity prioritise content for review.
- Content with high page traffic and high satisfaction is ranked as great content.
- Content with high page traffic and low satisfaction is ranked as high risk with opportunity for improvement.
- Content with low page traffic but high satisfaction is viewed as niche or potential.
- Content with low page traffic and low satisfaction is considered for archiving.
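The four quadrants above amount to a two-way split on traffic and satisfaction. A minimal sketch follows; the threshold values are hypothetical and would need to be set from your own analytics and feedback data, not taken from Mind's matrix.

```python
def classify(page_views, satisfaction,
             traffic_threshold=1000, satisfaction_threshold=4.0):
    """Place a page in one of the four quadrants described above.

    Thresholds are illustrative placeholders; satisfaction is assumed
    to be an average rating (e.g. out of 5) from a feedback form.
    """
    high_traffic = page_views >= traffic_threshold
    high_satisfaction = satisfaction >= satisfaction_threshold
    if high_traffic and high_satisfaction:
        return "great content"
    if high_traffic:
        return "high risk - improve"
    if high_satisfaction:
        return "niche or potential"
    return "consider archiving"
```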
Tool: Surveys
Data collected
- Mainly quantitative
- Qualitative questions can be included
Pros
- Relatively easy to set up
- Can use one or more channels and keep the survey consistent
- Some online tools will analyse the data
- Can be cheap and easy to use
Cons
- Questions need to be worded carefully to avoid influencing the answers
- May result in a lot of data to analyse
- Paper-based surveys will require manual processing
- Some questionnaires can only be used under licence so may attract a cost.
Comments
Two key categories of question:
- Closed – providing two or more options to choose from
- Open – allowing people to answer in their own words
Open questions provide qualitative data and offer detailed insights into people’s experiences.
Answering open questions can take much longer than choosing a response from a list, so they should be used with care.
Top tips:
- Use plain language. Aim for a reading age between 9 and 11.
- Test your questionnaire with a small group of volunteers before you start using it. This provides an opportunity to check people understand your questions and ensures you have addressed any accessibility issues.
- Consider using corpus linguistics or another AI tool to analyse free text.
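Even without specialist corpus tools, a crude word-frequency count over free-text answers can surface recurring topics. A minimal sketch, assuming responses are plain strings; the stop-word list is an illustrative placeholder.

```python
from collections import Counter
import re

# Small illustrative stop-word list; real analyses use fuller lists.
STOP_WORDS = {"the", "a", "an", "and", "to", "of", "it", "was", "i", "is"}

def top_terms(responses, n=5):
    """Return the n most frequent non-stop-words across free-text answers."""
    words = []
    for text in responses:
        words += [
            w for w in re.findall(r"[a-z']+", text.lower())
            if w not in STOP_WORDS
        ]
    return Counter(words).most_common(n)
```

Frequency counts are only a starting point: they miss context and sentiment, which is why the text recommends corpus linguistics or AI tools for anything beyond a first pass.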
Tool: Validated questionnaires
(A questionnaire or scale developed to be used among intended respondents. Tested using a representative sample to demonstrate reliability and validity.)
Data collected
- Mainly quantitative
- Qualitative questions can be included
Pros
- Can be disease-specific or generic to people’s skills and confidence
- Can gather data on people’s experiences of care or their quality of life
- Standardised questions to enable comparison
Cons
- Some questionnaires need to be licensed and so may attract a cost
Comments
Key examples include:
- Patient Activation Measures (PAMs)(8)
- Patient Reported Outcome Measures (PROMs)(9)
Top tips:
- Questionnaires should be used before developing your information intervention to establish ‘baseline’ data.
- Questionnaires should be repeated after the intervention, or at intervals during the period of intervention, to evidence changes over time.
Tool: Interviews
Data collected
- Mainly qualitative
- Quantitative questions can be included
Pros
- Rich insight into individual experiences
- Accessible for people who might find other formats challenging
- Gives flexibility to use different channels
- Thematic analysis can be used to identify recurring or important themes
Cons
- Time-consuming to analyse
- Need a skilled interviewer
- Individual experiences may not be relevant to others
- Can be expensive to facilitate, especially if in person
Comments
Interviews offer an opportunity to gain insight on a one-to-one basis. They can take place in person, via the telephone, or online.
Interviews can be structured, with pre-planned open and closed questions.
Include adverse event reporting if needed, for example when the work is funded by a pharmaceutical company.
Top tips:
- The interviewer should have a clear understanding of the questions they need to ask. They should have skills and experience in conducting interviews and be aware of any additional communication requirements.
- Additional support needs should be in place. These might include a personal assistant, facilities to accommodate a guide dog or wheelchair, or a BSL or language interpreter.
- The interviewer should explain how the interview will be run and what will be done with the data, and should obtain and record the interviewee's consent.
- Ensure you have support available to follow up if any critical issues are raised or if either the interviewee or the interviewer becomes distressed as a result of the conversation.
- Plan a suitable location for the interview, ensuring it offers a safe, confidential space, and is accessible for anyone attending in person.
- Consider incentives for interviewees.
Tool: Focus groups
Data collected
- Mainly qualitative data
Pros
- Rich discussion drawing on multiple perspectives
- Participants may contribute more as the discussion develops within the group
Cons
- Requires skilled facilitation
- Can be time-consuming to analyse
- Can be expensive to facilitate, especially if in person
- Risk of only involving people who are already favourable towards your information
Comments
A facilitator should guide the discussion. A second facilitator should take notes and offer practical support with the organisation and management of the session.
The facilitators should encourage all participants to contribute. The interactions between individuals in the group offer an additional opportunity for insight, as they can stimulate discussion and debate.
People with similar backgrounds and experiences are more likely to relax and contribute openly.
You may want to plan two or more groups to involve a broad range of people. Focus groups can run in person or online. They tend to be 45-90 minutes long, and they need to be planned carefully with skilled facilitators.
Top tips:
- In general, focus groups should involve between 6 and 10 participants. If the group is too small there is limited opportunity for discussion and debate; if it is too large, quieter participants may find it difficult to contribute.
- In the context of evaluating your impact you are most likely to be looking for a group of people who have used your information or your service. They will already have some shared characteristics.
- Consider incentives for participants.
Tool: Complaints and compliments
Data collected
- Mainly qualitative data
Pros
- Your organisation should already collect this data
- You can use your recorded actions to demonstrate that you are implementing improvements
Cons
- Some complaints and compliments may be specific to one individual experience and therefore not universally relevant
- Data tends to be submitted by people who have had particularly good, or particularly bad experiences
Comments
This data is most helpful as a collated set, identifying trends or themes over defined periods of time.
This form of engagement can open opportunities to work more closely with people who have direct experience of your offer.
Sharing information about the actions you have taken, changes you have made, and what you have learned from feedback with your stakeholders can offer further opportunities for engagement.
Top tip:
- As a team you should have processes in place to record and respond to any feedback you receive, whether that is positive or negative, and what actions you took because of it.
Tools: User stories and case studies
Data collected
- Mainly qualitative data
Pros
- Rich and detailed insights
- Powerful communication tools
Cons
- Analysis can be time-consuming
- Stories can be very personal and experiences may not be replicated for others
Comments
Some people will be happy to write about their experience, or draw a picture to illustrate it.
Some people will be happy to tell their story to someone, and check that it has been recorded accurately.
Some people may need significant support to explain their experiences, through an advocate, carer, or friend.
Top tips:
- Individual stories can be a way to bring your data to life when you are looking at how to share what you have learnt about your impact.
- Ensure you obtain appropriate consents for individual stories and comply with your data protection policy.
Tool: Outcomes Star
Data collected
- Quantitative and qualitative
Pros
- Simple visual format listing key outcomes
- Records information over time
- Range of Outcome Stars are available across a variety of sectors
Cons
- Only available under licence, so costs will be incurred
Comments
The Outcomes Star is a set of person-centred, evidence-based tools designed to measure and support change when working with people. Outcomes Stars are designed to be used with individuals. There are 7 stars specific to health and wellbeing.