
South African university students’ experiences of online group cognitive behavioural therapy: Implications for delivering digital mental health interventions to young people

Published online by Cambridge University Press:  28 July 2023

Xanthe Hunt*
Affiliation:
Institute for Life Course Health Research, Department of Global Health, Faculty of Medicine and Health Sciences, Stellenbosch University, Cape Town, South Africa
Dionne C. Jivan
Affiliation:
Department of Psychology, Faculty of Arts and Social Sciences, Stellenbosch University, Stellenbosch, South Africa
John A. Naslund
Affiliation:
Department of Global Health and Social Medicine, Harvard Medical School, Boston, MA, USA
Elsie Breet
Affiliation:
Institute for Life Course Health Research, Department of Global Health, Faculty of Medicine and Health Sciences, Stellenbosch University, Cape Town, South Africa
Jason Bantjes
Affiliation:
Institute for Life Course Health Research, Department of Global Health, Faculty of Medicine and Health Sciences, Stellenbosch University, Cape Town, South Africa; Alcohol, Tobacco and Other Drugs Research Unit, South African Medical Research Council, Cape Town, South Africa
*
Corresponding author: Xanthe Hunt; Email: xanthe@sun.ac.za

Abstract

Mental disorders are common among university students. In the face of a large treatment gap, resource constraints and low uptake of traditional in-person psychotherapy services by students, there has been interest in the role that digital mental health solutions could play in meeting students’ mental health needs. This study is a cross-sectional, qualitative inquiry into university students’ experiences of an online group cognitive behavioural therapy (GCBT) intervention. A total of 125 respondents who had participated in an online GCBT intervention completed a qualitative questionnaire, and 12 participated in in-depth interviews. The findings provide insights into how the context in which the intervention took place, students’ need for and expectations about the intervention, and the online format impacted their engagement and perception of its utility. The findings also suggest that, while online GCBT can capitalise on some of the strengths of both digital and in-person approaches to mental health programming, it also suffers from some of the weaknesses of each.

Type
Research Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2023. Published by Cambridge University Press

Impact statement

This study found that online group therapy for university students can capitalise on some of the strengths of both digital and in-person approaches to mental health programming: it is easily accessible (akin to many digital interventions) and allows for interpersonal connection (akin to many in-person therapies). However, we also found that online therapy suffers from some of the weaknesses of each. For instance, because digital interventions are designed to be scalable, they are often manualised, and manualisation made some users feel that the programme lacked personalisation, flexibility and responsiveness in the here-and-now. Other weaknesses of the digital platform included the lack of accountability and the difficulty of managing group dynamics online. The implications of this study are that questions remain about whether – from an implementation perspective – it is more useful to think of online therapies as digital interventions (like apps) or simply as group therapy that happens to be held on a digital platform (like telepsychiatry). Many of the issues raised in this study are germane to the literature on mental health apps in low- and middle-income countries, including convenience and the personalisation of scalable interventions. However, our findings show that the relational elements of the intervention – the ‘human’ elements – are important to participants and to users’ sense of the programme as ‘real’ (rather than virtual). Difficulties arise, however, because precisely the factors which give the programme its ‘real’ feel for participants (synchronous delivery, the requirement of a clinician to deliver content, and the need for strong Internet bandwidth among users) are those which pose barriers to scale.

Introduction

Mental disorders are common among university students (hereafter ‘students’) (Auerbach et al., 2018; Bantjes et al., 2019). A recent survey of first-year students across eight countries reported that the lifetime prevalence of common mental disorders was 35.3% and the 12-month prevalence was 31.4% (Auerbach et al., 2018). Major depressive disorder (MDD) and generalised anxiety disorder (GAD) are the most common conditions, with 12-month prevalence rates of 18.5 and 16.7%, respectively (Auerbach et al., 2018). Left untreated, mental disorders pose serious risks to students’ educational achievement and functioning (Alonso et al., 2019). Compared to their peers, students with mental disorders are less likely to cope with the transition to university (Al-Qaisy, 2011; Grøtan et al., 2019), have lower academic achievement (Grøtan et al., 2019) and experience worse long-term employment, productivity, relationship and health outcomes (Eisenberg et al., 2012). Mental health concerns among students have been exacerbated over the course of the COVID-19 pandemic, as reflected in numerous studies from diverse settings reporting a worsening of mood, greater perceived stress and increased alcohol consumption during this period (Charles et al., 2021; Copeland et al., 2021; Visser and Law-van Wyk, 2021).

Yet, despite the high burden of need and the risks of foregoing treatment, many students with mental disorders do not receive any treatment at all (Eisenberg et al., 2012; Bruffaerts et al., 2019). Reasons for the treatment gap include stigma, a lack of awareness of one’s need for care, and low knowledge and/or acceptability of available resources (Ibrahim et al., 2019; Hilliard et al., 2022). For young people specifically, the desire to deal with challenges alone, beliefs about the ineffectiveness of therapy, and competing priorities and demands on their time are key barriers to seeking care (Ennis et al., 2019). Resource constraints in low- and middle-income countries (LMICs) exacerbate the treatment gap on university campuses, as mental health services are often under-resourced, over-stretched or entirely absent (Demyttenaere, 2004; Saxena et al., 2007; Docrat et al., 2019). Furthermore, traditional in-person psychotherapeutic treatment options are often not feasible, affordable or easily scalable to address the large need for care. And, given low rates of engagement in services (Vanheusden et al., 2008; Bantjes et al., 2020), it is also possible that these modalities are not the optimal approach for reaching most (if not all) students.

In the face of these circumstances, there has been interest in the role that digital mental health solutions could play in meeting youth mental health needs, particularly in resource-constrained settings (Waegemann, 2010; Grist et al., 2017; Punukollu and Marques, 2019; Leech et al., 2021; Bantjes et al., 2022). The COVID-19 pandemic accelerated the utilisation of digital interventions to provide remote psychiatric care, helping to establish digital technologies as viable treatments (Stein et al., 2022a). There has been a proliferation of digital interventions brought to market in the past decade (Lehtimaki et al., 2021).

However, a recent systematic review and meta-analysis demonstrated mixed effectiveness of these interventions, and experts have cautioned against unrealistic optimism about digital mental health interventions’ potential to address the broad need for mental health services or supplant in-person services (Grist et al., 2017). Evidence of implementation challenges facing digital interventions broadly (Ford II et al., 2015; van Olmen et al., 2020) has highlighted the need for consistent demonstrations of their usability and effectiveness (Lehtimaki et al., 2021; Stein et al., 2022b). Moreover, researchers and practitioners have noted context-specific challenges facing users in LMICs, including practical barriers such as the cost of cell phone data, lack of connectivity and limited access to smart devices (van Olmen et al., 2020). Despite increasing access to digital devices across most contexts, understanding the barriers to using digital mental health solutions in LMICs is essential to close the persisting digital divide (Bantjes, 2022).

Digital interventions exist at a range of intensities and levels of digitisation. Some, like online therapy, have digital delivery but otherwise resemble traditional, in-person mental health services, as reflected in the synchronous connection to a mental health provider. Others, like apps, are fully digitised and exist at high (e.g. artificial intelligence chatbots) and low (e.g. mood-monitoring calendars) intensities. Digital interventions can thus be conceptualised as existing along these two dimensions (digitisation and intensity), as per Figure 1. There are of course other dimensions along which digital interventions can be conceptualised, including whether a practitioner is required, the potential to scale up interventions, and the degree to which individuals can choose how and when to engage with them.

Figure 1. Digital interventions as organised along the dimensions of digitisation and intensity.
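To make the two-dimensional framing above concrete, the following Python sketch records a few example intervention types along the digitisation and intensity dimensions described in the text and in Figure 1. The class, names and placements are ours, offered only as an illustration of the framework, not as part of the study.

```python
from dataclasses import dataclass
from enum import Enum


class Level(Enum):
    LOW = "low"
    HIGH = "high"


@dataclass(frozen=True)
class DigitalIntervention:
    """Illustrative placement of an intervention on the two dimensions in Figure 1."""
    name: str
    digitisation: Level  # how much of the intervention is delivered by software alone
    intensity: Level     # how therapeutically intensive the engagement is


# Hypothetical example placements, mirroring the examples given in the text.
EXAMPLES = [
    DigitalIntervention("Online group CBT (videoconference)", Level.LOW, Level.HIGH),
    DigitalIntervention("AI chatbot therapy app", Level.HIGH, Level.HIGH),
    DigitalIntervention("Mood-monitoring calendar app", Level.HIGH, Level.LOW),
]

for item in EXAMPLES:
    print(f"{item.name}: digitisation={item.digitisation.value}, intensity={item.intensity.value}")
```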

Different types of digital mental health interventions have specific strengths, such as being able to scale to many individuals at low cost or being easily accessible at the time and place of an individual’s choosing. However, weaknesses are also notable, such as the requirement for continuous data access and challenges sustaining user engagement (Martinez-Martin and Kreitmair, 2018).

In an attempt to address the twin challenges of the limited scale of in-person psychotherapy and the criticisms levelled against fully digitised interventions such as apps, a group of researchers from South Africa (SA) and the United States (US) developed an online group cognitive behavioural therapy (GCBT) intervention. The programme, designed for delivery via videoconferencing to students with mental health problems, aimed to capitalise on the cost and scale benefits of a digital platform without losing the effectiveness and acceptability of in-person therapy. The programme was piloted in 2020 by Bantjes et al. (2021) and found to show promise as an effective and sustainable intervention for the treatment of anxiety and depression among students.

However, the pilot results do not allow for a nuanced understanding of how the intervention was experienced by students, including important questions about whether this ‘hybrid’ format – a traditional therapy modality delivered on a digital platform – was acceptable to the programme’s users. Moreover, recent reviews have called for additional research into online group therapy, particularly given the paucity of evidence on the implementation of these programmes (Weinberg, 2020). The objective of this study was to conduct qualitative interviews with participants from the 2020 pilot study to explore their perspectives on the intervention’s content and delivery, and to provide insights to guide future implementation of the GCBT intervention.

Methods

Design

This study is a cross-sectional, qualitative inquiry into students’ experiences of an online GCBT intervention. It is a sub-study of a pilot open-label trial in which 158 students were enrolled in the GCBT intervention (Bantjes et al., 2021). Qualitative data were collected at two points during the study: first, through a single open-ended online questionnaire item sent to all 158 participants, and second, through in-depth interviews with 12 randomly selected participants. While the appropriate methods for determining sample size in qualitative studies are debated, we followed the guidelines of Clarke and Braun (2013).

Setting

Students for this intervention were recruited from a university in SA, a country with high rates of mental disorders and low treatment rates (Bruwer et al., 2011; Newson et al., 2021). Mental disorders are particularly common among young people, with a recent study reporting a 12-month prevalence of common mental disorders of 31.5% among first-year students, with MDD and GAD being the most common disorders (12-month prevalence of 13.6 and 20.8%, respectively) (Bantjes et al., 2019). In SA, roughly 92% of individuals in need of mental health services do not have access to them (Docrat et al., 2019). Moreover, where resources are available, utilisation rates are often low (Bantjes et al., 2020). In this study, no symptom threshold was set for participation, so participants included individuals both with and without clinically significant symptoms of depression and/or anxiety.

Participants and procedure

Recruitment for the pilot open-label trial is described in depth in Bantjes et al. (2021) but, briefly, involved information about the intervention being posted once on a student affairs Facebook page at a university in South Africa in mid-2020. The post explained that web-based groups were being offered to help students learn psychological skills to reduce symptoms of anxiety and depression. The 175 students who responded to the notice within 24 h provided informed consent and completed a baseline assessment before being randomised to one of 15 GCBT groups. No symptom threshold was placed on participants, and anyone who wanted to participate was eligible to do so. For the qualitative sub-study, which is the subject of this article, data were collected in two formats. First, in September 2020, all 158 students who participated in the GCBT intervention were invited via an online questionnaire to give qualitative feedback about the online intervention. Then, a random sample of participants was invited to attend semi-structured in-depth interviews. Five rounds of recruitment emails were sent to 12 participants each time (60 in total). In each recruitment batch, the 12 participant email addresses were randomly selected from the total sample using a random number generator. One reminder email was sent to each participant before a new batch of emails was sent out. The process of sending recruitment emails continued until the sample size for the in-depth interviews had been achieved (n = 12). These participants were then interviewed via MS Teams by a trainee clinical psychologist. The interviews were recorded, transcribed and analysed. Ethical approval was obtained from the Psychology Ethics Committee, Stellenbosch University (N19/10/145, Project ID: 12977). All participants provided informed consent.
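As a minimal illustration of the batched random selection described above (rounds of 12 randomly chosen email addresses, one reminder, continuing until 12 interviews were booked), the Python sketch below simulates the logic. The email addresses, response behaviour and variable names are hypothetical, not the study’s actual data or code.

```python
import random

random.seed(1)  # for a reproducible illustration

# Hypothetical pool of participant email addresses (the real study had 158 participants).
pool = [f"student{i:03d}@example.ac.za" for i in range(158)]

TARGET_INTERVIEWS = 12
BATCH_SIZE = 12
MAX_ROUNDS = 5

completed = []
remaining = pool.copy()

for round_number in range(1, MAX_ROUNDS + 1):
    if len(completed) >= TARGET_INTERVIEWS:
        break
    # Randomly select a batch of addresses not yet invited (the random number generator step).
    batch = random.sample(remaining, k=min(BATCH_SIZE, len(remaining)))
    for address in batch:
        remaining.remove(address)
    # Simulate uptake: assume (purely for illustration) ~20% of invitees book an interview
    # after the invitation and one reminder.
    responders = [a for a in batch if random.random() < 0.2]
    completed.extend(responders[: TARGET_INTERVIEWS - len(completed)])
    print(f"Round {round_number}: invited {len(batch)}, total interviews booked = {len(completed)}")
```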

Instruments

The online questionnaire comprised a single open-ended field which prompted participants to respond to the question, “Please use the space below to give us any other feedback about your experience of the group. Tell us what you liked or did not like, and what you think we can do to make this group more helpful to other students in the future”. For the in-depth interviews, a semi-structured interview guide was used, containing open-ended questions relating to students’ experiences of the online GCBT intervention. These questions were guided by our desire to understand the following:

  1. Acceptability of the ‘hybrid’ format – a traditional therapy modality delivered on a digital platform

  2. Barriers and facilitators to engagement with the psychological skills training group intervention

  3. Reflections on the digital intervention as it compares to any other experiences with mental health treatments

  4. Contextual and cultural appropriateness of the intervention and its method of delivery

The semi-structured interview guide for the individual interviews can be found in Supplementary File 1.

Details of the GCBT intervention

The online GCBT intervention, delivered via MS Teams, consisted of 10 weekly group sessions of 60–75 min each. The content of the intervention is described in detail in Bantjes et al. (2021), but session topics included emotional triggers and automatic thoughts, identifying emotional triggers, challenging automatic thoughts and core beliefs, recognising stressors and using strategies to solve interpersonal and emotional problems, overcoming rumination and guilt, behavioural activation, and coping with difficult emotions. Each group had between 10 and 12 members who attended 10 workshops organised into 5 topics. The membership of groups was largely fixed. Each participant received an interactive workbook which served as a guide and included activities and summaries focusing on the main ideas and skills for each session.
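For readers who find a structured summary easier to scan, the following illustrative Python snippet gathers the programme parameters described above into a single configuration object. All identifiers are ours, and details not stated in the article (such as how the 10 workshops map onto the 5 topics) are flagged as unspecified.

```python
# Illustrative summary of the GCBT programme structure as described in the text;
# identifiers are ours, and details not stated in the article are marked as such.
GCBT_PROGRAMME = {
    "platform": "MS Teams",
    "format": "online group sessions",
    "duration_weeks": 10,
    "n_workshops": 10,            # organised into 5 topics (mapping not specified in the text)
    "n_topics": 5,
    "session_length_minutes": (60, 75),
    "group_size_range": (10, 12),
    "membership": "largely fixed",
    "materials": "interactive workbook with activities and session summaries",
    "content_areas": [
        "emotional triggers and automatic thoughts",
        "identifying emotional triggers",
        "challenging automatic thoughts and core beliefs",
        "recognising stressors and solving interpersonal and emotional problems",
        "overcoming rumination and guilt",
        "behavioural activation",
        "coping with difficult emotions",
    ],
}

print(f"{GCBT_PROGRAMME['n_workshops']} workshops over {GCBT_PROGRAMME['duration_weeks']} weeks")
```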

Data analysis

For analysis, the data from the survey and the in-depth interviews were coded separately. This was because, at the time of analysis, the survey data were seen as routine monitoring and evaluation data, while the in-depth interviews were designed as a qualitative sub-study. Data from each dataset were anonymised and pseudonyms were assigned. The data were then analysed via inductive thematic analysis, following the six-phase approach outlined by Braun and Clarke (2006). The phases of this approach are familiarisation, coding, generating themes, reviewing themes, defining and naming themes, and writing up (Braun and Clarke, 2006). All data were managed and analysed in Microsoft Excel (Microsoft Corporation, 2018): units of meaning from the survey and interview transcripts were pasted into a spreadsheet, and a code was assigned to each new unit of meaning using adjacent columns. Codes were then organised into larger, descriptive groups, and names for these groups (themes) were assigned. Once the analysis was complete, it became apparent that the prompt for the survey data collection had generated some unique responses which were not captured in the in-depth interviews. The decision was therefore taken to include both sets of data – from the online questionnaire and the in-depth interviews – in one final analysis. Data were combined and results were written up. All analyses were conducted by two independent coders: E.B. and D.C.J. for the survey data, and X.H. and D.C.J. for the in-depth interview data. All analyses were reviewed for quality control by J.B.
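The spreadsheet workflow described above (units of meaning in one column, codes in an adjacent column, codes then grouped into themes) was carried out manually in Excel. As a hypothetical illustration of the same organising step, the pandas sketch below groups invented coded excerpts under theme labels drawn from the Results section; it is not the study’s actual codebook or data.

```python
import pandas as pd

# Hypothetical coded units of meaning (invented for illustration only).
coded_units = pd.DataFrame(
    {
        "unit_of_meaning": [
            "The group gave me structure during lockdown",
            "It was easy to forget to join the online session",
            "Hearing others' stories made me feel less alone",
        ],
        "code": ["routine and stability", "low accountability online", "normalisation through peers"],
    }
)

# Analyst-defined mapping of codes to higher-order themes (also invented; theme
# labels are borrowed from the theme headings reported in the Results).
code_to_theme = {
    "routine and stability": "Opportunities for connection and continuity",
    "low accountability online": "Group format and online spaces as barriers and facilitators",
    "normalisation through peers": "Group format and online spaces as barriers and facilitators",
}

coded_units["theme"] = coded_units["code"].map(code_to_theme)

# Group coded excerpts under their themes, as one would do across spreadsheet columns.
for theme, rows in coded_units.groupby("theme"):
    print(theme)
    for _, row in rows.iterrows():
        print(f"  [{row['code']}] {row['unit_of_meaning']}")
```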

Results

A total of 125 respondents completed the online questionnaire (out of N = 158; a 79% response rate), and 12 of these respondents were randomly selected to participate in in-depth interviews. Eighty-six percent of the questionnaire respondents self-identified as female, and the mean age of the sample was 21.96 years (see Table 1 for detail). The demographic profile of the in-depth interview participants was broadly similar to that of the larger sample (70% self-identified as female, and their mean age was 21.01 years).

Table 1. Participant demographics for the whole sample

Note: The value in bold is significant at the <0.05 level.

The findings from both the online questionnaire and the in-depth interviews provide insights into how the context in which the intervention took place, students’ need for and expectations about the intervention, and the online format impacted their engagement and perception of its utility. The themes identified in the data are summarised in Table 2. Themes arising primarily from the online questionnaire are flagged in italics.

Table 2. Themes

Opportunities for connection and continuity

The online intervention was delivered in 2020 during the first wave of the COVID-19 pandemic in SA. Participants reflected on this, noting that the pandemic and the measures required for its containment created a greater need for mental health services. Many participants noted that the intervention provided them with a much-needed therapeutic space for their mental health difficulties, as well as space to learn skills to deal effectively with the new stresses associated with the pandemic. Furthermore, the weekly group sessions provided opportunities for interacting with other students and created a sense of community. One participant explained:

There was some really hard-core isolation going on and the group was a nice way of hearing people’s voices … There was a cage effect, we were stuck not knowing what to do and were stuck with our own thoughts and in that way the content was good coz it gave us a structure of how to proceed going through these thoughts… I think a lot of people also felt that way, they used the group as a social platform as well.

– May, 25-year-old female, in-depth interview

Many participants said the intervention served as a stabilising force in their lives, providing routine and predictability at a time of instability. As one participant noted:

I’m a person that likes to have things to do and likes processes, rules and methods. For me I found [the group] very helpful and reassuring… particularly in this context of Corona.

– James, 19-year-old male, in-depth interview

While many students felt that the context of the pandemic amplified the need for an online mental health intervention, others reflected on the way in which the reality of the pandemic and the broader socio-political context of the country butted up against the very pragmatic approach of CBT. One participant, for instance, felt that some of the content delivered through the intervention was not particularly sensitive to current realities or socio-political issues. She noted:

It was only through the check-ins and check-outs and the little bit of personal connection that we touched on the fact of COVID. I thought maybe to make it more practical, for example, this year they could have done a little bit more. Like, how do you then for example, do SMART goals in a situation of a pandemic or in the previous years where there were major student protests.

– Nomanono, 22-year-old female, in-depth interview

Reality versus expectations

In general, participants had low initial expectations of the online GCBT intervention. Since it had been called a ‘psychological skills group’ in the advertising material circulated via on-campus social media, participants appeared to expect a psychoeducational programme rather than therapy. One respondent described her experience, saying:

Initially my impression of it was, it was going to be a course, and I am generally interested in CBT and all things psychology.

– Sarabi, 22-year-old female, in-depth interview

Other participants had understood that the intervention was going to be group therapy, but admitted that they did not expect it to have any impact on their mental health. Despite low expectations, many participants noted that the intervention was surprisingly helpful and engaging. One young man explained:

I saw an email and I didn’t really expect much from it, I didn’t think it would be helpful because I was like okay, I am in need of some form of you know, we were in lockdown and not on campus anymore, that sort of thing. So, I didn’t expect it to be as helpful and such a nice safe environment as it was.

– Isaac, 23-year-old male, in-depth interview

Group format and online spaces as barriers and facilitators

Many participants found the online format of the group helped create a ‘safe space’ for self-disclosure and self-discovery, without the anxiety of direct face-to-face contact with others. This was particularly salient for participants with social anxiety, as exemplified in the following participant’s account:

I’m a relatively shy person and being online gives you a bit of confidence because there is a barrier between you and other people to an extent, you kind of feel a little braver and sharing.

– Kavitha, 20-year-old female, in-depth interview

In contrast, some participants experienced the online environment as a barrier to creating and maintaining interpersonal connections. One young man noted:

I think the fact that it is online is just kind of limited, so I think they did all they could to make it very, uhm to connect us well on the medium that was used… I think the human connection was lacking.

– Isaac, 23-year-old male, in-depth interview

Participants spoke about the lack of visual cues as a barrier to their engagement and an impediment to the therapeutic process, as one participant explained:

A lot of things seemed quite distant, like you’re still working through a computer and like with the screen when it came to like really practical like almost role play stuff it was difficult.

– Simon, 19-year-old male, in-depth interview

The online delivery of the intervention helped to make the intervention convenient compared to traditional in-person services. Participants said that this ease of use facilitated attendance and engagement. As one participant explained:

When you are online … then it’s easier to plan around your schedule because sometimes it’s difficult for everyone to meet in one place for an hour like that, like once a week.

– Tracey, 19-year-old female, in-depth interview

However, as much as convenience was a strength in many participants’ eyes, some individuals also noted that because they did not have to invest much effort in attending, they also easily forgot to join. This was also tied to the fact that the program was delivered online. As one respondent noted:

I felt a lot more accountable when I had to attend in person meetings because … it’s just a lot easier to forget an online meeting than to forget an in-person meeting.

– Gina, 22-year-old female, in-depth interview

Some participants also found themselves dividing their attention between the intervention and other internet media:

I could check Twitter or I could just reply to a WhatsApp or like you know there are 1000 other things. I am sitting in the comfort of my own home and there are so many things that distract you, you know so it’s really hard to … focus.

– Alex, 20-year-old female, in-depth interview

Many participants mentioned that the group provided a sense of community and cohesiveness that validated their own experiences and feelings. One respondent reflected on this, saying:

I think it’s the fact that you are not alone in what you’re going through. That kind of thing was quite eye opening to me is you always think that, you know, you’re facing this whole thing on your own.

– Gina, 22-year-old female, in-depth interview

Participants spoke specifically about the intervention helping to normalise their experiences by allowing them to see the similarities between their own struggles and those of other students. One participant explained:

Even though they are going through very different things than any of us were, we had the same sort of things that were triggering our emotions or the reasons for us being there.

– James, 19-year-old male, in-depth interview

While most students were pleasantly surprised by the programme and found the content useful, there were some who spoke of not having an entirely positive experience. For instance, one participant described how her individual needs were not met due to the inability of the facilitators to manage individual versus group needs:

[The facilitators] didn’t make anyone speak but then they also didn’t allow everyone to speak … the one lady she almost at times would have her own one on one counselling therapy and we would just be spectators and she would happily talk on for 20 minutes.

– Nomanono, 22-year-old female, in-depth interview

Perception of therapeutic value of the intervention

Participants reported that the skills-based group therapy, which included a workbook, was very helpful to their learning and growth over the 10 weeks. Although some mentioned that they might have preferred a more relaxed group for general conversation, they still felt they benefitted from the skills, even when these were not directly applicable to them. One respondent said:

I really liked the CBT approach to therapy and kind of helping to retrain your thoughts. It was very insightful and I learned a lot, I think the content was really great, very educational, easy to follow and well set out.

– Isaac, 23-year-old male, in-depth interview

Participants reported that they appreciated having the workbooks and found the practical activities in these helpful. One participant affirmed this saying:

The worksheets that students received after a group session were very helpful because students could always go through what was discussed during the group session using the worksheets later on.

– Mihlali, 17-year-old female, online questionnaire

The content of the workbooks served as a summary of the skills covered in the sessions, but also provided participants with opportunities to continue engaging with the content between sessions. However, not all the participants found the structured worksheets and workbook helpful. One participant, for example, said:

I thought this group could be a place where we could just share and talk through our stuff, but there was a schedule and activities and stuff to read through and actually it just felt like it was contributing to my workload.

– Katie, 19-year-old female, online questionnaire

While many participants appreciated the content and skills-based approach of the intervention, many also expressed a wish for a less structured, less content-driven and less directive approach. They had an unmet expectation that the groups would provide more space for personal disclosure, discussion and interaction with other participants. Having time and opportunities to interact with other students was a valued component of the intervention, as shown in the following feedback from one participant:

I enjoyed the interaction within the group and hearing people’s opinions or how they deal with certain situations.

– Shanice, 22-year-old female, online questionnaire

Finally, participants attributed the success of the intervention in large part to the skill and attitude of the facilitators, saying things like:

I felt that facilitators were open minded, encouraging and empathic and also knew a lot.

– Jack, 27-year-old male, online questionnaire

It was evident that most participants felt a personal connection to the facilitators and perceived them to be warm, welcoming, non-judgmental, skilful and knowledgeable. These positive feelings towards the facilitators seemed to promote participants’ engagement in the process and enabled them to receive the skills being offered. One respondent articulated this by saying:

It was a great experience and a safe environment where I could share my feelings knowing I would not be judged.

– Sabrina, 18-year-old female, online questionnaire

The feeling of safety and the non-judgmental environment created by the facilitators appears to have been an integral component of the success of the intervention.

Discussion

Students who participated in a 10-week online GCBT intervention delivered at a university in South Africa largely found the intervention engaging and helpful. However, our findings highlight several key considerations for implementing these kinds of interventions with young people. These considerations are particularly important to address if such programmes are to fulfil their potential to lessen the mental health treatment gap among university students in LMICs.

As noted in the introduction to this article, our team developed and tested an online GCBT programme as an intervention which takes a middle road between digital and in-person services. Our findings suggest that, while the programme capitalises on some of the strengths of each approach, it also suffers from some of the weaknesses of both digital delivery and in-person therapy. Moreover, some features of digital delivery appear to dilute the strengths associated with in-person therapies.

We also found that context plays a central role in determining how, for whom, and when digital interventions work. The pandemic made online delivery facilitative of engagement, and students’ busy schedules meant that the flexibility afforded by the online group was appreciated. However, when it came to intervention content (as opposed to delivery), context posed a challenge: because digital interventions are designed to be scalable, they are often manualised, and manualisation made some students feel that the programme lacked personalisation. An important direction for future work refining scalable digital interventions will be to understand how a programme can retain sufficient fidelity to the evidence-based treatment manual whilst also responding to the needs and priorities of students. Flexibility and responsiveness in the here-and-now are potentially among the biggest strengths of online group interventions compared to other digital interventions (like apps), where the content is often fixed. Finding ways to harness the flexibility of synchronous online group therapies while retaining fidelity to the core CBT skills which make up this online intervention is integral to maximising its benefits.

Interestingly, most participants felt that the intervention exceeded their expectations. On the surface, this is a positive outcome for the intervention pilot. However, part of the reason that expectations had been exceeded was that participants thought they were signing up for psychoeducation rather than therapy. This highlights the need to develop marketing messages that are better tailored to the target population of students, and to manage students’ expectations when recruiting them to digital interventions, a point that has been made by other authors (Gericke et al., 2021). Expectations about psychotherapy are an important determinant of treatment outcome (Greenberg et al., 2006), so managing students’ expectations of digital interventions is important.

Other weaknesses of online GCBT included the lack of accountability (students could easily forget about or miss the sessions) and the limited opportunities for interpersonal connection. Many initiatives in mental health are examining how peer support can be delivered online (Melling and Houguet-Pincham, 2011; Ali et al., 2015). Some of the lessons from this literature might well be used to improve the peer-to-peer engagement aspect of synchronous therapeutic groups. Developing approaches to improve accountability, however, requires additional research.

Relatedly, while the group setting seemed to have offered some participants a sense of kinship with their peers, which made them feel at ease, others felt that the group dynamics had not ‘worked’ online. Management of group dynamics is a well-established area of study in traditional psychotherapy (Bion, 1952; Sutherland, 1985; Scheidlinger, 1997), and is receiving increasing attention for online groups (Biagianti et al., 2018; Weinberg, 2020, 2021). Emerging studies have also found that back-and-forth interaction between peers within online platforms appears to promote retention (Sharma et al., 2020). For digital interventions to be effective, careful consideration needs to be given to how peer-to-peer interaction is facilitated. In online GCBT, this could be achieved, for example, by using break-out rooms on videoconferencing platforms.

At the start of this article, we introduced the idea that digital interventions exist at a range of intensities and levels of digitisation, and that different degrees of digitisation and intensity afford different strengths and weaknesses. One of the questions raised by this study is whether – from an implementation perspective – it is more useful to think of online GCBT as a digital intervention (akin to an app) or simply as group therapy that happens to be held on a digital platform (akin to telepsychiatry). Many of the issues raised in this study are germane to the literature on mental health apps, including convenience (Carolan and de Visser, 2018) and the personalisation of scalable interventions (Borghouts et al., 2021). However, other insights shared by respondents point to the importance of the relational elements of the intervention, and to group participants’ sense of the programme as ‘real’ (rather than virtual). Difficulties arise, however, because precisely the factors which give the programme its ‘real’ feel for participants are those which pose barriers to scale (synchronous delivery, the requirement of a clinician to deliver content, and the need for strong Internet bandwidth among users). These contrasting findings also point to the potential benefits of combining multiple forms of digital intervention, such as augmenting a mobile app with access to group therapy on a digital platform, which could enable further customisation.

Finally, the experiences of students remind us that no intervention is likely to meet the needs and preferences of all users. While the online group appealed to many students, there were others who felt that the intervention was not well suited to them. This highlights the importance of person-centred approaches to digital solutions, and the need to include a range of interventions within student counselling centres so that students can be matched with those most likely to help them.

The implications of the findings of this study for the delivery of digital mental health interventions include:

  1. There is a need for psychoeducation among students about different types of digital mental health intervention, to set expectations prior to engagement;

  2. Ways to increase opportunities for peer engagement in digital mental health interventions need to be identified;

  3. Interventions must be developed with in-built mechanisms that support responsiveness to group needs and context;

  4. Users need to be supported to minimise distractions during engagement with online mental health programming;

  5. Efforts need to be made to expand the range of digital mental health intervention options available to students and to allow students to exercise autonomy in selecting the one best suited to their needs; these options should include programmes of low and high intensity, and of low and high digitisation; and

  6. It will be important for the field to improve understanding of the individual-level factors which predict engagement and treatment response with different types of digital interventions, so that prediction algorithms can be developed to personalise triage.

We need to leverage these findings to support the uptake and implementation of various digital mental health resources across university campuses, capitalising on the flexibility of digital offerings to meet the demand for mental health support among students.

Despite the value of these findings, some limitations of the study must be noted. Firstly, only 12 out of 60 randomly recruited participants completed an in-depth interview, indicating some degree of selection bias. However, we assessed the degree to which the codes and themes identified in the questionnaire dataset were mirrored in the in-depth interviews, and this suggested that many of the major findings resonated across the majority of participants. Secondly, the university at which the study was conducted is significantly better resourced than many other universities in SA, and its student population is not representative of the broader demographics of the country’s university population, having a higher proportion of students who are White and from higher socioeconomic backgrounds, and a lower proportion of first-generation students, than some other universities. Moreover, the sample (both for the intervention itself and the qualitative evaluation) was overwhelmingly female. In SA, this is often the case in voluntary psychosocial and well-being interventions, possibly related to gender norms regarding distress and help-seeking (Atik and Yalçin, 2011; Juvrud and Rennels, 2017). As such, caution should be applied in considering the implications of these findings for digital mental health programming in SA and other LMICs, and efforts must be made to replicate this methodology across a range of different campuses. Specifically, there is a need to conduct similar research on the potential of digital mental health interventions at lower-resourced universities in SA to compare and contrast findings with the current study. Further, this study took place during the more acute phases of SA’s COVID-19 pandemic, so it will also be important for future studies to examine how the factors identified here play out outside of the pandemic context. Finally, no formal inter-rater agreement reliability statistics were calculated. While disagreements between coders were minimal and were resolved by a senior coder, this remains a limitation of the present analysis.
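As a pointer for future work on the inter-rater agreement issue noted in this final limitation, the short Python sketch below shows how Cohen’s kappa could be computed for two coders’ assignments on a double-coded subset of excerpts. The helper function and example codings are ours and purely illustrative; no such statistic was calculated in the present study.

```python
from collections import Counter


def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' categorical code assignments over the same excerpts."""
    assert len(coder_a) == len(coder_b) and coder_a, "need paired, non-empty codings"
    n = len(coder_a)
    # Observed agreement: proportion of excerpts where the two coders assigned the same code.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement: expected overlap given each coder's marginal code frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in set(coder_a) | set(coder_b)) / (n * n)
    return (observed - expected) / (1 - expected) if expected != 1 else 1.0


# Invented example: two coders assigning codes to the same six excerpts.
coder_1 = ["connection", "accountability", "connection", "expectations", "connection", "workbook"]
coder_2 = ["connection", "accountability", "expectations", "expectations", "connection", "workbook"]
print(round(cohens_kappa(coder_1, coder_2), 2))  # prints 0.77 for this illustrative pairing
```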

Open peer review

To view the open peer review materials for this article, please visit http://doi.org/10.1017/gmh.2023.39.

Supplementary material

The supplementary material for this article can be found at https://doi.org/10.1017/gmh.2023.39.

Data availability statement

Transcripts supporting this study are available from the authors upon request.

Acknowledgements

We would like to acknowledge the hard work of the therapists who implemented the intervention, as well as the students who gave their time to complete the survey and participate in interviews.

Author contribution

Analysis: D.C.J., E.B., J.A.N., X.H.; Conceptualisation: J.B.; Editing and revising drafts: All authors; First draft: X.H.; Methodology and data collection: D.C.J., E.B.

Financial support

The work reported herein was made possible through funding by the South African Medical Research Council (SAMRC) through its Division of Research Capacity Development under the MCSP (awarded to J.B.) and the National Research Foundation (NRF) (Grant number 142143, awarded to J.B.). The content hereof is the sole responsibility of the authors and does not necessarily represent the official views of the SAMRC or NRF.

Competing interest

The authors declare none.

Ethics statement

Ethical approval was obtained from the Psychology Ethics Committee, Stellenbosch University (N19/10/145, Project ID: 12977). All participants provided informed consent. Interviews were recorded, transcribed and analysed.

References

Ali, K, Farrer, L, Gulliver, A and Griffiths, KM (2015) Online peer-to-peer support for young people with mental health problems: A systematic review. JMIR Mental Health 2(2), e4418.
Alonso, J, Vilagut, G, Mortier, P, Auerbach, RP, Bruffaerts, R, Cuijpers, P, Demyttenaere, K, Ebert, DD, Ennis, E and Gutiérrez‐García, RA (2019) The role impairment associated with mental disorder risk profiles in the WHO World Mental Health International College Student Initiative. International Journal of Methods in Psychiatric Research 28(2), e1750.
Al-Qaisy, LM (2011) The relation of depression and anxiety in academic achievement among group of university students. International Journal of Psychology and Counselling 3(5), 96–100.
Atik, G and Yalçin, Y (2011) Help-seeking attitudes of university students: The role of personality traits and demographic factors. South African Journal of Psychology 41(3), 328–338.
Auerbach, RP, Mortier, P, Bruffaerts, R, Alonso, J, Benjet, C, Cuijpers, P, Demyttenaere, K, Ebert, DD, Green, JG and Hasking, P (2018) WHO World Mental Health Surveys International College Student Project: Prevalence and distribution of mental disorders. Journal of Abnormal Psychology 127(7), 623.
Bantjes, J (2022) Digital solutions to promote adolescent mental health: Opportunities and challenges for research and practice. PLoS Medicine 19(5), e1004008.
Bantjes, J, Hunt, X and Stein, DJ (2022) Public health approaches to promoting university students’ mental health: A global perspective. Current Psychiatry Reports 24, 809–818.
Bantjes, J, Kazdin, AE, Cuijpers, P, Breet, E, Dunn-Coetzee, M, Davids, C, Stein, DJ and Kessler, RC (2021) A web-based group cognitive behavioral therapy intervention for symptoms of anxiety and depression among university students: Open-label, pragmatic trial. JMIR Mental Health 8(5), e27400.
Bantjes, J, Lochner, C, Saal, W, Roos, J, Taljaard, L, Page, D, Auerbach, RP, Mortier, P, Bruffaerts, R and Kessler, RC (2019) Prevalence and sociodemographic correlates of common mental disorders among first-year university students in post-apartheid South Africa: Implications for a public mental health approach to student wellness. BMC Public Health 19(1), 1–12.
Bantjes, J, Saal, W, Lochner, C, Roos, J, Auerbach, RP, Mortier, P, Bruffaerts, R, Kessler, RC and Stein, DJ (2020) Inequality and mental healthcare utilisation among first-year university students in South Africa. International Journal of Mental Health Systems 14(1), 1–11.
Biagianti, B, Quraishi, SH and Schlosser, DA (2018) Potential benefits of incorporating peer-to-peer interactions into digital interventions for psychotic disorders: A systematic review. Psychiatric Services 69(4), 377–388.
Bion, WR (1952) Group dynamics: A review. International Journal of Psycho-Analysis 33, 235–247.
Borghouts, J, Eikey, E, Mark, G, De Leon, C, Schueller, SM, Schneider, M, Stadnick, N, Zheng, K, Mukamel, D and Sorkin, DH (2021) Barriers to and facilitators of user engagement with digital mental health interventions: Systematic review. Journal of Medical Internet Research 23(3), e24387.
Braun, V and Clarke, V (2006) Using thematic analysis in psychology. Qualitative Research in Psychology 3(2), 77–101.
Bruffaerts, R, Mortier, P, Auerbach, RP, Alonso, J, Hermosillo De la Torre, AE, Cuijpers, P, Demyttenaere, K, Ebert, DD, Green, JG and Hasking, P (2019) Lifetime and 12‐month treatment for mental disorders and suicidal thoughts and behaviors among first year college students. International Journal of Methods in Psychiatric Research 28(2), e1764.
Bruwer, B, Sorsdahl, K, Harrison, J, Stein, DJ, Williams, D and Seedat, S (2011) Barriers to mental health care and predictors of treatment dropout in the South African Stress and Health Study. Psychiatric Services 62(7), 774–781.
Carolan, S and de Visser, RO (2018) Employees’ perspectives on the facilitators and barriers to engaging with digital mental health interventions in the workplace: Qualitative study. JMIR Mental Health 5(1), e9146.
Charles, NE, Strong, SJ, Burns, LC, Bullerjahn, MR and Serafine, KM (2021) Increased mood disorder symptoms, perceived stress, and alcohol use among college students during the COVID-19 pandemic. Psychiatry Research 296, 113706.
Clarke, V and Braun, V (2013) Successful Qualitative Research: A Practical Guide for Beginners. Thousand Oaks, CA: Sage.
Copeland, WE, McGinnis, E, Bai, Y, Adams, Z, Nardone, H, Devadanam, V, Rettew, J and Hudziak, JJ (2021) Impact of COVID-19 pandemic on college student mental health and wellness. Journal of the American Academy of Child & Adolescent Psychiatry 60(1), 134–141.e2.
Demyttenaere, K (2004) WHO World Mental Health Survey Consortium: Prevalence, severity, and unmet need for treatment of mental disorders in the World Health Organization World Mental Health Surveys. JAMA 291, 2581–2590.
Docrat, S, Besada, D, Cleary, S, Daviaud, E and Lund, C (2019) Mental health system costs, resources and constraints in South Africa: A national survey. Health Policy and Planning 34(9), 706–719.
Eisenberg, D, Hunt, J and Speer, N (2012) Help seeking for mental health on college campuses: Review of evidence and next steps for research and practice. Harvard Review of Psychiatry 20(4), 222–232.
Ennis, E, McLafferty, M, Murray, E, Lapsley, C, Bjourson, T, Armour, C, Bunting, B, Murphy, S and O’Neill, S (2019) Readiness to change and barriers to treatment seeking in college students with a mental disorder. Journal of Affective Disorders 252, 428–434.
Ford II, JH, Alagoz, E, Dinauer, S, Johnson, KA, Pe-Romashko, K and Gustafson, DH (2015) Successful organizational strategies to sustain use of A-CHESS: A mobile intervention for individuals with alcohol use disorders. Journal of Medical Internet Research 17(8), e3965.
Gericke, F, Ebert, DD, Breet, E, Auerbach, RP and Bantjes, J (2021) A qualitative study of university students’ experience of internet‐based CBT for depression. Counselling and Psychotherapy Research 21(4), 792–804.
Greenberg, RP, Constantino, MJ and Bruce, N (2006) Are patient expectations still relevant for psychotherapy process and outcome? Clinical Psychology Review 26(6), 657–678.
Grist, R, Porter, J and Stallard, P (2017) Mental health mobile apps for preadolescents and adolescents: A systematic review. Journal of Medical Internet Research 19(5), e7332.
Grøtan, K, Sund, ER and Bjerkeset, O (2019) Mental health, academic self-efficacy and study progress among college students – The SHoT study, Norway. Frontiers in Psychology 10, 45.
Hilliard, RC, Watson, JC and Zizzi, SJ (2022) Stigma, attitudes, and intentions to seek mental health services in college student-athletes. Journal of American College Health 70, 1476–1485.
Ibrahim, N, Amit, N, Shahar, S, Wee, LH, Ismail, R, Khairuddin, R, Siau, CS and Safien, AM (2019) Do depression literacy, mental illness beliefs and stigma influence mental health help-seeking attitude? A cross-sectional study of secondary school and university students from B40 households in Malaysia. BMC Public Health 19, 1–8.
Juvrud, J and Rennels, JL (2017) “I don’t need help”: Gender differences in how gender stereotypes predict help-seeking. Sex Roles 76, 27–39.
Leech, T, Dorstyn, D, Taylor, A and Li, W (2021) Mental health apps for adolescents and young adults: A systematic review of randomised controlled trials. Children and Youth Services Review 127, 106073.
Lehtimaki, S, Martic, J, Wahl, B, Foster, KT and Schwalbe, N (2021) Evidence on digital mental health interventions for adolescents and young people: Systematic overview. JMIR Mental Health 8(4), e25847.
Martinez-Martin, N and Kreitmair, K (2018) Ethical issues for direct-to-consumer digital psychotherapy apps: Addressing accountability, data protection, and consent. JMIR Mental Health 5(2), e9423.
Melling, B and Houguet-Pincham, T (2011) Online peer support for individuals with depression: A summary of current research and future considerations. Psychiatric Rehabilitation Journal 34(3), 252.
Microsoft Corporation (2018) Microsoft Excel.
Newson, J, Sukhoi, O, Taylor, J, Topalo, O and Thiagarajan, T (2021) Mental State of the World Report 2021.
Punukollu, M and Marques, M (2019) Use of mobile apps and technologies in child and adolescent mental health: A systematic review. Evidence-Based Mental Health 22(4), 161–166.
Saxena, S, Thornicroft, G, Knapp, M and Whiteford, H (2007) Resources for mental health: Scarcity, inequity, and inefficiency. The Lancet 370(9590), 878–889.
Scheidlinger, S (1997) Group dynamics and group psychotherapy revisited: Four decades later. International Journal of Group Psychotherapy 47(2), 141–159.
Sharma, A, Choudhury, M, Althoff, T and Sharma, A (2020) Engagement patterns of peer-to-peer interactions on mental health platforms. Proceedings of the International AAAI Conference on Web and Social Media 14, 614–625.
Stein, DJ, Naslund, JA and Bantjes, J (2022a) COVID-19 and the global acceleration of digital psychiatry. The Lancet Psychiatry 9(1), 8–9.
Stein, DJ, Shoptaw, SJ, Vigo, DV, Lund, C, Cuijpers, P, Bantjes, J, Sartorius, N and Maj, M (2022b) Psychiatric diagnosis and treatment in the 21st century: Paradigm shifts versus incremental integration. World Psychiatry 21(3), 393–414.
Sutherland, JD (1985) Bion Revisited: Group Dynamics and Group Psychotherapy. London: Routledge and Kegan Paul.
van Olmen, J, Erwin, E, García-Ulloa, AC, Meessen, B, Miranda, JJ, Bobrow, K, Iwelunmore, J, Nwaozuru, U, Umeh, CO and Smith, C (2020) Implementation barriers for mHealth for non-communicable diseases management in low and middle income countries: A scoping review and field-based views from implementers. Wellcome Open Research 5, 7.
Vanheusden, K, van der Ende, J, Mulder, CL, van Lenthe, FJ, Verhulst, FC and Mackenbach, JP (2008) The use of mental health services among young adults with emotional and behavioural problems: Equal use for equal needs? Social Psychiatry and Psychiatric Epidemiology 43(10), 808–815.
Visser, M and Law-van Wyk, E (2021) University students’ mental health and emotional wellbeing during the COVID-19 pandemic and ensuing lockdown. South African Journal of Psychology 51(2), 229–243.
Waegemann, CP (2010) mHealth: The next generation of telemedicine? Telemedicine and e-Health 16(1), 23–26.
Weinberg, H (2020) Online group psychotherapy: Challenges and possibilities during COVID-19 – A practice review. Group Dynamics: Theory, Research, and Practice 24(3), 201.
Weinberg, H (2021) Obstacles, challenges, and benefits of online group psychotherapy. American Journal of Psychotherapy 74(2), 83–88.
Figure 1. Digital interventions as organized along the dimensions of digitization and intensity.

Table 1. Participant demographics for the whole sample

Table 2. Themes

Supplementary material: Hunt et al. supplementary material (File, 17.3 KB)

Author comment: South African university students’ experiences of online group cognitive behavioural therapy: Implications for delivering digital mental health interventions to young people — R0/PR1

Comments

Dear Editors

It gives us great pleasure to submit our paper entitled “South African university students’ experiences of online group cognitive behavioural therapy: Implications for delivering digital mental health interventions to young people”. We believe that the piece makes an important contribution to the literature on digital interventions among young people in low- and middle-income countries, as it shows how implementation and contextual factors shape delivery and perceived impact.

An additional highlight of the work is its conceptual contribution to thinking about online group therapy as a modality.

We have no conflicts of interest to disclose.

Thank you for your kind consideration of this piece.

Yours sincerely,

Dr Xanthe Hunt, on behalf of the authors

Review: South African university students’ experiences of online group cognitive behavioural therapy: Implications for delivering digital mental health interventions to young people — R0/PR2

Conflict of interest statement

I work at the same university as many of the authors of this paper, but in a different department. I have worked with the last author in the past; however, we last worked together in 2019 (over 3 years ago) and we are not currently collaborating on any projects.

Comments

This paper sets out to evaluate participants’ experiences of an online group cognitive behavioural intervention conducted over 10 weeks during the Covid-19 pandemic in 2020 in South Africa. The intervention itself addresses an important problem, with high levels of mental disorders among students in South Africa and substantial treatment gaps and resource constraints limiting the uptake of traditional in-person psychotherapy solutions. To understand the participants’ experiences of this intervention the authors collected responses to a qualitative questionnaire (n = 125) and further interviewed 12 participants.

The paper is clearly written and generally follows well-established conventions for this type of study. As such, I only have a few comments for the authors at this point.

Introduction:

The introduction and background provide a solid foundation and framing for the study. I appreciate the framework provided in figure 1 that conceptualises digital interventions as falling along two dimensions: digitisation and intensity. It is perhaps worth 1) returning to this in the discussion and 2) expanding the treatment of this framework. In particular, the authors should consider the necessity for a practitioner, the ability of interventions to scale or not, and the degree to which individuals can choose how and when to engage with the interventions (beyond simply the degree of digitalisation).

Following the description of the framework the authors describe the specific intervention adopted in this study - an online group behavioural therapy programme that took the form of video calls with groups of participants. The motivations for this choice—to capitalise on the cost- and scale-benefits of digital accessibility while not losing the effectiveness of in-person therapy—are sound. Despite this, the initial focus and discussion of apps and app-based interventions does appear somewhat out of place given the specific nature of the intervention in question.

The introduction concludes with the provision of a research objective for the study - “to conduct qualitative interviews with participants from the 2020 pilot study to explore their perspectives of the intervention’s content and delivery, and to provide insights to guide future implementation of the GCBT intervention.” Were there particular guiding research questions or aspects of their experiences that the study aimed to uncover other than merely exploring? Did these guide the construction of the interview guide? If not, what informed the development of the interview guide?

Methods:

The methods are well described and follow well-established norms and conventions for this type of study. To further understand the sample and the intervention itself, the paper would benefit from more information on how participants were originally recruited for the GCBT intervention in the first place. In particular, given that this comes up in the qualitative data, the initial framing of the purpose of the intervention would provide further context and understanding for these findings.

To help the reader further understand how the data were collected, assuming that no ethics approval for sharing the actual study data was provided, I recommend including the interview guide as an appendix/in the online supplementary material. Further, the instruments section should motivate the factors that guided the selection of questions and areas to focus on in the interviews.

What software was used to code the interviews and survey responses?

Findings:

The description and presentation of the findings are good. However, the framing of the themes and sub-themes as they appear in the section titles and in table 1 appears to be misaligned with thematic analysis as defined by Braun and Clarke. To me, the current themes read more like domain summaries than themes. Braun and Clarke define a theme as follows:

“A theme captures a common, recurring pattern across a dataset, organised around a central organising concept. A theme tends to describe the different facets of a pattern across the dataset. A subtheme exists ‘underneath’ the umbrella of a theme. It shares the same central organising concept as the theme, but focuses on one notable specific element.”

They then define a domain summary as follows:

“The difference between a theme and a domain summary is a source of frequent confusion in much published TA research. A domain summary is a summary of an area (domain) of the data; for example, a summary of everything the participants said in relation to a particular topic or interview question. Unlike themes, there isn’t anything that unifies the description of what participants said about this topic – there is no underlying concept that ties everything together and organises the analytic observations. In our approach to TA, themes are conceptualised as patterns in the data underpinned by a central concept that organises the analytic observations; this is rather different from a domain summary, and the two should ideally not be confused when using our approach. More simply put, a theme identifies an area of the data and tells the reader something about the shared meaning in it, whereas a domain summary simply summarises participant’s responses relating to a particular topic (so shared topic but not shared meaning).”

Currently the themes are:

- Context of the intervention

- Expectations

- Factors affecting uptake and engagement

- Perception of therapeutic value of the intervention.

These are domains within the data (topics) within which one would expect to find themes that capture shared meaning. I think the actual write-up of the findings is fine, but the framing and organisation merit revision.

It is also important to acknowledge the role of the pandemic in the intervention and how this might differ going forward. Many of the benefits only existed due to the context at the time. This perhaps limits transferability of some of these findings to other times and contexts.

Similarly, themes around expectations will relate specifically to how the GCBT intervention was framed during the initial recruitment and advertising phases. As I noted above, more information on this is needed to provide the necessary context for these findings.

Overall Evaluation

Overall, this is an interesting and timely study that provides valuable insights into students’ experiences with online video-conferencing based cognitive behavioural interventions. My comments are generally fairly high level and primarily relate to the framing of the study and the organisation of the themes. In terms of implications, in addition to the six outlined, I think it is also necessary to understand and manage the expectations of participants in any future such interventions.

Review: South African university students’ experiences of online group cognitive behavioural therapy: Implications for delivering digital mental health interventions to young people — R0/PR3

Conflict of interest statement

Reviewer declares none.

Comments

Summary

This paper presents a qualitative study to understand university students’ experience with an online group cognitive behavioral therapy intervention. While students overall had a positive experience, there were barriers to engagement related to the digital and group format of the therapy sessions.

The manuscript presents interesting findings and is overall straightforward to read. I have listed my suggestions for improvement below, which I believe can be addressed in a revision of the paper.

Comments

• L19: Spell out LMIC when it is first used in the impact statement.

• L125-126: I assume that the intervention is described in more detail in the other publication, but it would be helpful to provide a bit more detail on the intervention here to better contextualize the paper findings. For example, what efforts were made to ensure the intervention does “not lose the effectiveness and acceptability of in-person therapy” and how was it “found to show promise as an effective and sustainable intervention”?

• L149: how were students recruited? Was everyone from the university sent an email, was it promoted during a specific class, was it promoted on social media, etc.? How many students were reached and how many enrolled in the study?

• It is not clear how participants were randomly selected for the interviews. The Method section reads as if a random sample of 12 participants was selected, but the Discussion section implies that 60 participants were selected and that out of these, 12 responded to participate. A more detailed explanation would help here.

• L149: spell out SA when it is first used.

• L164: was one email sent to a different group of 12 participants 5 times, thus resulting in a group of 60? Were people sent reminder emails?

• Did participants have any prior experience with therapy? Participants with no mental health challenges or prior experience with therapy may have a different experience.

• L202-204: were any reliability checks done between coders?

• One of the interview themes related to people having low initial expectations. Is there any data available related to people’s motivations to participate, if they had low expectations of the intervention?

• I would move study limitations to the end of the Discussion, as it reduces the strength and impact of the paper by opening the Discussion with this.

• L455: do the authors have any suggestions on how to increase male participation in future studies?

• The paper presents manualization as a characteristic of the therapy being a digital format, but aren’t in-person CBT and group therapies often manualized as well?

• The paper uses quite a few abbreviations, and it would be helpful to have a list of the abbreviations and what each stands for at the end of the paper.

Recommendation: South African university students’ experiences of online group cognitive behavioural therapy: Implications for delivering digital mental health interventions to young people — R0/PR4

Comments

Thank you for submitting your manuscript for review. The reviewers have provided valuable comments and feedback.

We invite you to address the comments and suggestions in your response.

Decision: South African university students’ experiences of online group cognitive behavioural therapy: Implications for delivering digital mental health interventions to young people — R0/PR5

Comments

No accompanying comment.

Author comment: South African university students’ experiences of online group cognitive behavioural therapy: Implications for delivering digital mental health interventions to young people — R1/PR6

Comments

Dear Editor and Reviewers

Thank you very much for the opportunity to revise our manuscript. We found the reviewers’ feedback helpful, and hope that you find the manuscript strengthened. We would be happy to make any further revisions if needed.

Our responses are detailed in the table below.

Yours sincerely,

Xanthe Hunt, on behalf of the authors

General

Please include the abstract in the main text document.

This was included in the original submission. We are not sure why it was not visible to the reader. However, these sections are reflected in our proofs on the online ‘view submission’ document, too.

Please include an Impact Statement below the abstract (max. 300 words). This must not be a repetition of the abstract but a plain worded summary of the wider impact of the article.

This was included in the original submission. We are not sure why it was not visible to the reader. However, these sections are reflected in our proofs on the online ‘view submission’ document, too. We have ensured that the Impact Statement aligns with the journal’s specifications.

Submission of graphical abstracts is encouraged for all articles to help promote their impact online. A Graphical Abstract is a single image that summarises the main findings of a paper, allowing readers to quickly gain an overview and understanding of your work. Ideally, the graphical abstract should be created independently of the figures already in the paper, but it could include a (simplified version of) an existing figure or a combination thereof. If you do not wish to include a graphical abstract please let me know.

We have included a graphical abstract in the submission.

Please ensure references are correctly formatted. In text citations should follow the author and year style. When an article cited has three or more authors the style ‘Smith et al. 2013’ should be used on all occasions. At the end of the article, references should first be listed alphabetically, with a full title of each article, and the first and last pages. Journal titles should be given in full.

Statements of the following are required in the main text document at the end of all articles: ‘Author Contribution Statement’, ‘Financial Support’, ‘Conflict of Interest Statement’, ‘Ethics statement’ (if appropriate), ‘Data Availability Statement’. Please see the author guidelines for further information.

Thank you – these were also included with the original submission. We hope that they are visible to the reader.

Reviewer 1

This paper sets out to evaluate participants’ experiences of an online group cognitive behavioural intervention conducted over 10 weeks during the Covid-19 pandemic in 2020 in South Africa. The intervention itself addresses an important problem, with high levels of mental disorders among students in South Africa and substantial treatment gaps and resource constraints limiting the uptake of traditional in-person psychotherapy solutions. To understand the participants’ experiences of this intervention the authors collected responses to a qualitative questionnaire (n = 125) and further interviewed 12 participants.

The paper is clearly written and generally follows well-established conventions for this type of study. As such, I only have a few comments for the authors at this point.

Thank you very much for your feedback on our paper, and the constructive comments. We hope that you find the submission strengthened following our revision of it.

Introduction:

The introduction and background provide a solid foundation and framing for the study. I appreciate the framework provided in figure 1 that conceptualises digital interventions as falling along two dimensions: digitisation and intensity. It is perhaps worth 1) returning to this in the discussion and 2) expanding the treatment of this framework. In particular, the authors should consider the necessity for a practitioner, the ability of interventions to scale or not, and the degree to which individuals can choose how and when to engage with the interventions (beyond simply the degree of digitalisation).

Thank you for this reflection. We have added a section to the discussion to draw this link – between the findings and this original framing – more clearly.

Following the description of the framework the authors describe the specific intervention adopted in this study - an online group behavioural therapy programme that took the form of video calls with groups of participants. The motivations for this choice—to capitalise on the cost- and scale-benefits of digital accessibility while not losing the effectiveness of in-person therapy—are sound. Despite this, the initial focus and discussion of apps and app-based interventions does appear somewhat out of place given the specific nature of the intervention in question.

Thank you for drawing our attention to this. We have refined and reduced some of the discussion of apps in the background section, and increased the discussion of online therapies, so that the specific intervention described is better aligned with the background.

The introduction concludes with the provision of a research objective for the study - “to conduct qualitative interviews with participants from the 2020 pilot study to explore their perspectives of the intervention’s content and delivery, and to provide insights to guide future implementation of the GCBT intervention.” Were there particular guiding research questions or aspects of their experiences that the study aimed to uncover other than merely exploring? Did these guide the construction of the interview guide? If not, what informed the development of the interview guide?

We have added detail on this to section 2.4. Please see the section which starts as follows: “These questions were guided by our desire to understand the following…”

Methods:

The methods are well described and follow well-established norms and conventions for this type of study. To further understand the sample and the intervention itself, the paper would benefit from more information on how participants were originally recruited for the GCBT intervention in the first place. In particular, given that this comes up in the qualitative data, the initial framing of the purpose of the intervention would provide further context and understanding for these findings.

Thank you for raising this. We have added some information about the recruitment for the intervention study to the methods section. See the section starting, “Recruitment for the pilot open label trial is…”

To help the reader further understand how the data were collected, assuming that no ethics approval for sharing the actual study data was provided, I recommend including the interview guide as an appendix/in the online supplementary material. Further, the instruments section should motivate the factors that guided the selection of questions and areas to focus on in the interviews.

We have included the interview schedule as a Supplementary File and provided an explanation of the choice of focus areas for the questionnaire under 2.4 Instruments.

What software was used to code the interviews and survey responses?

We used Microsoft Excel to code the transcripts. We have added detail on this process under 2.6 Data Analysis.

Findings:

The description and presentation of the findings are good. However, the framing of the themes and sub-themes as they appear in the section titles and in table 1 appears to be misaligned with thematic analysis as defined by Braun and Clarke. To me, the current themes read more like domain summaries than themes. Braun and Clarke define a theme as follows:

“A theme captures a common, recurring pattern across a dataset, organised around a central organising concept. A theme tends to describe the different facets of a pattern across the dataset. A subtheme exists ‘underneath’ the umbrella of a theme. It shares the same central organising concept as the theme, but focuses on one notable specific element.”

They then define a domain summary as follows:

“The difference between a theme and a domain summary is a source of frequent confusion in much published TA research. A domain summary is a summary of an area (domain) of the data; for example, a summary of everything the participants said in relation to a particular topic or interview question. Unlike themes, there isn’t anything that unifies the description of what participants said about this topic – there is no underlying concept that ties everything together and organises the analytic observations. In our approach to TA, themes are conceptualised as patterns in the data underpinned by a central concept that organises the analytic observations; this is rather different from a domain summary, and the two should ideally not be confused when using our approach. More simply put, a theme identifies an area of the data and tells the reader something about the shared meaning in it, whereas a domain summary simply summarises participant’s responses relating to a particular topic (so shared topic but not shared meaning).”

Thank you for providing this detailed description and these recommendations for the presentation of our findings. Please see our specific responses to your comments below.

Currently the themes are:

- Context of the intervention

- Expectations

- Factors affecting uptake and engagement

- Perception of therapeutic value of the intervention.

These are domains within the data (topics) within which one would expect to find themes that capture shared meaning. I think the actual write-up of the findings is fine, but the framing and organisation merit revision.

Thank you very much for this useful reflection. We have renamed the themes to reflect more accurately what is ‘going on’ within each domain: the theme titles have moved from being designations of the area of information covered to being more descriptive of the findings encompassed in each theme. The new theme names are:

• Opportunities for connection and continuity

• Reality versus expectations

• Group format and online spaces as barriers and facilitators

• Perception of therapeutic value of the intervention

It is also important to acknowledge the role of the pandemic in the intervention and how this might differ going forward. Many of the benefits only existed due to the context at the time. This perhaps limits transferability of some of these findings to other times and contexts.

This is an important caveat to our findings, and we have now mentioned it in the limitations section (see the paragraph starting “Before proceeding with an in-depth…”).

Similarly, themes around expectations will relate specifically to how the GCBT intervention was framed during the initial recruitment and advertising phases. As I noted above, more information on this is needed to provide the necessary context for these findings.

We have revised the section on expectations to discuss more explicitly how the intervention was initially framed to the participants. We have also added detail on the original recruitment process to the methods section (see above).

Overall, this is an interesting and timely study that provides valuable insights into students’ experiences with online video-conferencing based cognitive behavioural interventions. My comments are generally fairly high level and primarily relate to the framing of the study and the organisation of the themes. In terms of implications, in addition to the six outlined, I think it is also necessary to understand and manage the expectations of participants in any future such interventions.

Thank you so much for your comments on the paper – they were extremely helpful. We have also added some more information to the limitations section regarding how participant expectations and advertising/recruitment can be appropriately managed in the context of digital interventions like this one.

Reviewer 2

This paper presents a qualitative study to understand university students’ experience with an online group cognitive behavioural therapy intervention. While students overall had a positive experience, there were barriers to engagement related to the digital and group format of the therapy sessions. The manuscript presents interesting findings and is overall straightforward to read. I have listed my suggestions for improvement below, which I believe can be addressed in a revision of the paper.

Thank you so much for your helpful comments. We hope that you find the revised paper strengthened.

L19: Spell out LMIC when it is first used in the impact statement.

Amended in text.

L125-126: I assume that the intervention is described in more detail in the other publication, but it would be helpful to provide a bit more detail on the intervention here to better contextualize the paper findings. For example, what efforts were made to ensure the intervention does “not lose the effectiveness and acceptability of in-person therapy” and how was it “found to show promise as an effective and sustainable intervention”?

We have added this detail to the methods section.

L149: how were students recruited? Was everyone from the university sent an email, was it promoted during a specific class, was it promoted on social media, etc.? How many students were reached and how many enrolled in the study?

Thank you for raising this point. We have added a detailed piece on recruitment to section 2.3 Participants and Procedure. A total of 175 students were enrolled in the pilot RCT from which the qualitative sub-sample for this analysis was taken.

It is not clear how participants were randomly selected for the interviews. The Method section reads as if a random sample of 12 participants was selected, but the Discussion section implies that 60 participants were selected and that out of these, 12 responded to participate. A more detailed explanation would help here.

We have significantly revised the section 2.3 Participants and Procedure to provide greater clarity on the process.

L149: spell out SA when it is first used.

Amended in text.

L164: was one email sent to a different group of 12 participants 5 times, thus resulting in a group of 60? Were people sent reminder emails?

This detail has also been added to section 2.3:

Five rounds of recruitment emails were sent to 12 participants each time (60 in total). In each recruitment batch, the 12 participant email addresses were randomly selected from the total sample using a random number generator. One reminder email was sent to each participant before a new batch of emails was sent out. The process of sending recruitment emails was continued until the sample size for the in-depth interviews had been achieved (n=12).
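For illustration only, a minimal sketch (in Python, hypothetical rather than the script actually used in the study) of the kind of batched random selection described above, assuming a simple list of participant email addresses:

```python
import random

def draw_recruitment_batches(email_addresses, batch_size=12, max_batches=5, seed=None):
    """Randomly draw non-overlapping recruitment batches from the full sample.

    Each batch contains up to `batch_size` randomly selected email addresses;
    drawing stops after `max_batches` batches or when the pool is exhausted.
    """
    rng = random.Random(seed)
    pool = list(email_addresses)
    rng.shuffle(pool)  # a shuffled pool stands in for the random number generator

    batches = []
    while pool and len(batches) < max_batches:
        batches.append(pool[:batch_size])
        pool = pool[batch_size:]
    return batches

# Hypothetical example: five batches of 12 drawn from a pilot sample of 175 addresses
sample = [f"student{i}@example.ac.za" for i in range(175)]
for round_number, batch in enumerate(draw_recruitment_batches(sample, seed=1), start=1):
    print(f"Recruitment round {round_number}: {len(batch)} invitations")
```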

Did participants have any prior experience with therapy? Participants with no mental health challenges or prior experience with therapy may have a different experience.

This is an important point. We have added context on the recruitment, as noted above, and in this section, we have specified that participants did not need to have any symptoms in order to participate (there were no symptom thresholds for participation).

L202-204: were any reliability checks done between coders?

No quantitative reliability checks were conducted, so no inter-rater agreement statistics can be reported (such checks are not always recommended for qualitative work). This has been added to the limitations section; see the section starting, “Finally, because coding was conducted in Microsoft Excel, no formal inter-rater agreement…”

One of the interview themes related to people having low initial expectations. Is there any data available related to people’s motivations to participate, if they had low expectations of the intervention?

Unfortunately, we do not have data on motivations to participate (although this is something we would like to explore in future work). Our sense, however, regarding the expectation-setting piece, is that this particular group of students assumed ‘psychological skills training’ to be more like a course than like therapy. Perhaps this is partly because the programme was delivered on a university campus, where contextual cues are heavily weighted towards educational offerings rather than therapeutic ones.

I would move study limitations to the end of the Discussion, as it reduces the strength and impact of the paper by opening the Discussion with this.

Amended in text, thank you.

L455: do the authors have any suggestions on how to increase male participation in future studies?

This is the million-dollar question. One option, which we have now mentioned in text, is that specific social marketing strategies need to be tested and their effectiveness evaluated using gender-disaggregated data.

The paper presents manualization as a characteristic of the therapy being a digital format, but aren’t in-person CBT and group therapies often manualized as well?

This is a good point: many in-person CBT groups are manualised, too. However, digitisation brings immense opportunities to scale, and it is this rapid, large-scale expansion that risks making manualisation reductive. Where an intervention – say, the in-person version of Thinking Healthy – is manualised but training is still done at a relatively small scale, there are opportunities to refine delivery for the audience and by context and setting. With the scale afforded by digital platforms, such opportunities for refinement may be lacking.

The paper uses quite a few abbreviations, and it would be helpful to have a list of the abbreviations and what each stands for at the end of the paper.

We are asking the journal whether they permit a list of abbreviations to be included at the start of the article. If they do, we will certainly include one.

Review: South African university students’ experiences of online group cognitive behavioural therapy: Implications for delivering digital mental health interventions to young people — R1/PR7

Conflict of interest statement

No known conflicts of interest. I know two of the authors but they are at different institutions and we haven’t worked together in over 4 years.

Comments

I am satisfied with the revisions made in response to my comments and am happy to recommend acceptance.

Review: South African university students’ experiences of online group cognitive behavioural therapy: Implications for delivering digital mental health interventions to young people — R1/PR8

Conflict of interest statement

Reviewer declares none.

Comments

I thank the authors for their responses to reviewer comments and revision of the paper.

I am happy with the changes made and suggest acceptance of the paper.

Recommendation: South African university students’ experiences of online group cognitive behavioural therapy: Implications for delivering digital mental health interventions to young people — R1/PR9

Comments

Thank you for responding to the reviewers’ comments and suggestions and for submitting an updated manuscript. Both reviewers are satisfied with your responses and the updated manuscript and have recommended that we accept it.

Decision: South African university students’ experiences of online group cognitive behavioural therapy: Implications for delivering digital mental health interventions to young people — R1/PR10

Comments

No accompanying comment.