Insights and Recommendations From Moderators and Community Members for Keeping Online Peer Support Safe: Thematic Analysis

Background: Online peer support can help people living with long-term physical health conditions to manage their mental well-being. Although the potential negative events and risks associated with web-based peer communities are well recognized, our understanding of how best to moderate these spaces is relatively limited, particularly with regard to new communities. Previous work has focused on the experiences of either moderators or community members, rather than both.

Objective: This study aims to explore the perspectives of both members and moderators of a new online peer support community to evaluate the moderation procedures and inform recommendations for best practice.

Methods: Community members (n=39) who participated in a research trial of a new online peer community, CommonGround, were interviewed. The moderation team (n=5) was invited to a focus group. Community member interviews explored their opinions of moderation policies and the behavior of the moderation team. The moderator focus group explored their experiences of moderating the community, including perceived benefits, common challenges, and areas for improvement. All interviews and the focus group were conducted online, audio-recorded, and transcribed verbatim. An inductive thematic analysis was conducted to sort the data into overarching themes through an iterative process.

Results: Effective moderation was considered critical in creating a safe space that members wanted to engage with and in mitigating risks, particularly around the spread of medical misinformation. Both moderators and community members felt that the moderation policies and practices were appropriate and applicable to the community. Moderators found navigating the moderation threshold, where they balanced safety against free speech, challenging when deciding whether or not to intervene. Being part of a team with mixed clinical expertise helped moderators build confidence in navigating this threshold and offered further benefits, including easy access to support and more consistent moderation practices. It was suggested that, for a community to flourish, community members should self-moderate. However, moderators and members felt that the strong community culture and high levels of member engagement needed to support self-moderation had not yet evolved. Proposed improvements to moderation included new features to make identifying new content for review more efficient and a reconsideration of the anonymity rule.

Conclusions: Moderation is critical in making online peer communities feel safe and engaging. Moderation practices should be co-produced with the target audience to ensure that they are aligned with the community's unique moderation wants and needs, including clear escalation pathways, transparent communication patterns, and plans to review and update policies or procedures as the community evolves. Technological features that promote self-moderation should be provided, as the community may shift towards self-moderation as it matures. It is also critical to ensure that moderators feel supported so that they are best placed to support the broader community.

Trial Registration: ClinicalTrials.gov NCT06222346; https://clinicaltrials.gov/study/NCT06222346

