Users regret health decisions based on bad information

More than 4 in 10 adults say they regret a decision they have made based on bad health information. The 2024 Edelman Trust Barometer Special Report: Trust and Health found those aged 18-34 are most likely to report regret. The top three sources of misinformation people reported were product advertisements, friends and family, and user-generated content.

The report is based on more than 15,000 responses from 16 countries. It also found that, while 2 in 3 people feel empowered to manage their health, fewer than 4 in 10 feel empowered and also trust the health system. Higher trust and empowerment led to better personal health outcomes. Other key findings include:

  • 2 in 3 people say political polarisation has harmed their health.
  • Friends and family are as trusted as scientists and medical experts to tell the truth on health.
  • 1 in 3 agree that by doing their own research the average person can know as much as a doctor.

Almost 6 in 10 say contradictory advice, changing recommendations and a lack of information prevent them from taking better care of their health. The report highlights that being a "reliable source of trustworthy information" would increase trust by more than 10% for businesses, NGOs and governments.

Read the full report via the Edelman website here.

Digital and media literacy curriculum evaluation 

A new report assesses changes in students' digital citizenship, media literacy and attitudes towards misinformation and disinformation after the delivery of a digital citizenship curriculum. The London School of Economics report is based on delivery of the curriculum in primary and secondary schools across the UK. Testing data suggested consistent improvement across all age groups after as little as six weeks. Factors influencing resilience to misinformation and disinformation include learning about digital health and digital safety.

The report also highlights a "persistent digital divide" affecting digital citizenship. Students from "media rich" and "digitally experienced" households demonstrated a more intuitive grasp of how to navigate digital tools. Those from less experienced households faced a steeper learning curve, impacting their engagement and the benefit drawn from digital citizenship interventions. 

Read the full report via the LSE website here.

Global Principles for Information Integrity

The United Nations has launched new Global Principles for Information Integrity. The principles emphasise the need for immediate action to address the harms caused by misinformation, disinformation and hate speech. Launching the principles, UN Secretary-General António Guterres said algorithms should not control what people see online.

Read more about the principles via the United Nations website here.

Evidence summaries: Making health information more accessible

In its latest bulletin, Evidence Aid highlights four summaries that focus on the different ways healthcare can be made more accessible. It says, in many cases, low health literacy – coupled with the use of complex medical jargon and language barriers – results in misuse and misunderstanding of health information, further deepening inequalities and disparities among social groups.

Access the summaries via the Evidence Aid website here.

Study: Co-producing health information with young people

A new paper reflects on how children and young people acted as advisors to co-produce information about Long COVID. The work was underpinned by the Lundy model, a framework which provides guidance on helping children and young people contribute to matters which affect them. The report highlights the importance of developing rapport, facilitated by using approaches and activities which create a space where children and young people feel comfortable and listened to.

Read the full report via the Wiley Online Library website here.

Priorities for an AI in healthcare strategy

The Health Foundation has published a report outlining priorities for an artificial intelligence (AI) in healthcare strategy. It says this strategy must be developed under the guiding principle of responsibility, to ensure the use of AI is not only legal and ethical but also works for the greater social good. The Health Foundation says existing ethics frameworks and guidance are insufficient in this regard, and that a renewed approach with responsibility at its core is needed to ensure AI works for all.

Read the full report via the Health Foundation website here.

Study: Identifying the needs of people with Long COVID

A qualitative study aiming to identify the needs of people with Long COVID has highlighted the importance of tailored support and public education. The authors say, alongside support and increased research, there is a need for societal awareness of the condition. This should be promoted via local and national initiatives which aim to educate the public and reduce stigma.

Read the study findings in full via the BMJ Open website here.

Invitation to take part in research

Dr Bethan Treadgold is currently leading a programme of research funded by the NIHR School for Primary Care Research, called Accessibility. The aim is to explore possibilities around primary healthcare professionals getting more involved in online support groups to, for example, quality-approve information and advice. Bethan would like to invite primary healthcare professionals and members of the public with experience of using online support groups to take part in a one-off informal interview. If you would be interested in participating or have any questions, please email [javascript protected email address].

Listen: Embedding patient perspectives into research

The Not Just Patients! podcast focuses on breaking down barriers to meaningful patient involvement in healthcare. In the latest episode, Robert Joyce talks about the critical role of patient involvement in clinical research and the many ways in which patients can bring rich perspectives.

Catch up on all recent episodes via the Not Just Patients! website here.