
Bizzotto, N., Schulz, P., De Bruijn, G. The "Loci" of Misinformation and Its Correction in Peer- and Expert-Led Online Communities for Mental Health: Content Analysis. Journal of Medical Internet Research, 2023;e44656. doi:10.2196/44656. https://hdl.handle.net/10807/272021

The “Loci” of Misinformation and Its Correction in Peer- and Expert-Led Online Communities for Mental Health: Content Analysis

Bizzotto, N.; Schulz, P.; De Bruijn, G.
2023

Abstract

Background: Mental health problems are recognized as a pressing public health issue, and an increasing number of individuals are turning to online mental health communities for information and support. Although these virtual platforms have the potential to provide emotional support and access to anecdotal experiences, they can also expose users to large amounts of potentially inaccurate information. Despite the importance of this issue, limited research has been conducted, especially on the differences that might emerge from the type of content moderation an online community adopts: peer-led or expert-led. Objective: We aim to fill this gap by examining the prevalence, communicative context, and persistence of mental health misinformation in Facebook online communities for mental health, focusing on the mechanisms that enable effective correction of inaccurate information and on differences between expert-led and peer-led groups. Methods: We conducted a content analysis of 1534 statements (from 144 threads) in 2 Italian-speaking Facebook groups. Results: An alarming proportion of comments (26.1%) contained medically inaccurate information. Furthermore, nearly 60% of the threads presented at least one misinformation statement without any correction attempt. Moderators were more likely than members to correct misinformation; unexpectedly, however, they were not immune to posting content containing misinformation themselves. Discussions about aspects of treatment (including side effects or treatment interruption) significantly increased the probability of encountering misinformation. Additionally, misinformation posted in the comments of a thread, rather than in the first post, was less likely to be corrected, particularly in peer-led communities.
Conclusions: The high prevalence of misinformation in online communities, particularly when left uncorrected, underscores the importance of further research to identify effective mechanisms to prevent its spread. This is especially important given the study's finding that misinformation tends to cluster around specific "loci" of discussion that, once identified, can serve as starting points for developing strategies to prevent and correct misinformation.
Language: English
Files in this record:

jmir-2023-1-e44656.pdf
Open access
File type: Publisher's version (PDF)
License: Creative Commons
Size: 129.51 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10807/272021