Learner training is essential for realizing the potential of CALL. It not only safeguards the smooth implementation of CALL activities and facilitates the concomitant learning, but also enhances learners’ active and effective engagement with technological resources for language learning on their own in informal learning contexts. This chapter gives an overview of how learner training is conceptualized and operationalized in the existing literature. It argues for greater research attention to learner training in and for informal language learning in technological spaces, and for in-depth exploration of the intersection of contextual factors and learner training. It further advocates more differentiated and personalized approaches to learner training.
The prevalence of digital technologies, augmented by the emergence of generative AI, expands opportunities for language learning and use, empowers new modes of learning, and blurs the boundaries of in-class and out-of-class language learning. The language education community is challenged to reconceptualize the paradigm of language learning and utilize the affordances of technologies to synergize in-class and out-of-class language learning. To achieve this, in-depth understanding of in-class learning and out-of-class digital experiences in relation to one another is needed to inform curriculum and pedagogy conceptualization and implementation. With this aim in mind, we put forth a research agenda around six research themes. We hope that this Thinking Allowed piece can stimulate and guide systematic research efforts towards unleashing the potential of technologies to synergize in-class and out-of-class language learning and create holistic and empowering learning experiences for language learners.
Terminal cancer patients often endure significant distress, impacting their quality of life. Spiritual well-being provides peace and meaning during this challenging period.
Objectives
This study explored the spiritual well-being of terminally ill patients and their next-of-kin caregivers in hospice care, focusing on factors influencing their spiritual experiences.
Methods
This mixed-methods study included 30 terminally ill patients and 17 next-of-kin caregivers in hospice care. Spiritual well-being was assessed using the Functional Assessment of Chronic Illness Therapy – Spiritual Well-Being Scale (FACIT-Sp-12), and symptom distress with the Edmonton Symptom Assessment Scale. Qualitative data were collected through semi-structured interviews at baseline, 1 week, and 1 month. Data were analyzed using quantitative methods and thematic analysis.
Results
Patients showed a significant improvement in spiritual well-being over time, with FACIT-Sp-12 scores increasing from 28.6 at baseline to 31.3 at 1 month (p < .01). Symptoms such as shortness of breath (β = –1.19, p < .001), drowsiness (β = –1.27, p = .01), and anxiety (β = –0.60, p = .03) were negatively associated with spiritual well-being. Caregiver spiritual well-being positively influenced patient scores, especially with female caregivers (β = 0.26, p < .001). Qualitative findings supported these results, revealing themes of spiritual adjustment, the impact of physical symptoms on spiritual well-being, and the crucial role of caregivers in providing emotional and spiritual support.
Significance of results
Early palliative care facilitates spiritual adjustment in terminally ill patients. A holistic approach addressing physical symptoms and psychological distress is essential. Supporting caregivers, particularly female ones, positively impacts patient spiritual well-being. Tailored interventions considering the unique needs of patients and caregivers are recommended to enhance palliative care quality.
Previous studies investigating the effectiveness of augmentation therapy have been limited.
Aims
To evaluate the effectiveness of antipsychotic augmentation therapies among patients with treatment-resistant depression.
Method
We included patients diagnosed with depression receiving two antidepressant courses within 1 year between 2009 and 2020 and used the clone-censor-weight approach to address time-lag bias. Participants were assigned to either an antipsychotic or a third-line antidepressant. Primary outcomes were suicide attempt and suicide death. Cardiovascular death and all-cause mortality were considered as safety outcomes. Weighted pooled logistic regression and non-parametric bootstrapping were used to estimate approximate hazard ratios and 95% confidence intervals.
Results
The cohort included 39 949 patients receiving antipsychotics and the same number of matched antidepressant patients. The mean age was 51.2 (standard deviation 16.0) years, and 37.3% of participants were male. Compared with patients who received third-line antidepressants, those receiving antipsychotics had reduced risk of suicide attempt (sub-distribution hazard ratio 0.77; 95% CI 0.72–0.83) but not suicide death (adjusted hazard ratio 1.08; 95% CI 0.93–1.27). After applying the clone-censor-weight approach, there was no association between antipsychotic augmentation and reduced risk of suicide attempt (hazard ratio 1.06; 95% CI 0.89–1.29) or suicide death (hazard ratio 1.22; 95% CI 0.91–1.71). However, antipsychotic users had increased risk of all-cause mortality (hazard ratio 1.21; 95% CI 1.07–1.33).
Conclusions
Antipsychotic augmentation was not associated with reduced risk of suicide-related outcomes when time-lag bias was addressed; however, it was associated with increased all-cause mortality. These findings do not support the use of antipsychotic augmentation in patients with treatment-resistant depression.
Supporting family caregivers (FCs) is a critical core function of palliative care. Brief, reliable tools suitable for busy clinical work in Taiwan are needed to assess bereavement risk factors accurately. The aim is to develop and evaluate a brief bereavement scale that is completed by FCs and usable by medical staff.
Methods
This study adopted convenience sampling: participants were recruited from among patients’ FCs at 1 palliative care center in Taiwan. This cross-sectional study drew on 4 theories to generate the initial version of the Hospice Foundation of Taiwan Bereavement Assessment Scale (HFT-BAS). A 9-item questionnaire was initially developed by 12 palliative care experts through a Delphi process and verified for content validity. A combination of exploratory factor analysis (EFA), reliability measures including item analysis, Cronbach’s alpha, and inter-subscale correlations, and confirmatory factor analysis (CFA) was employed to test its psychometric properties.
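The reliability measure named above, Cronbach’s alpha, is the ratio-based statistic α = (k/(k−1))·(1 − Σσᵢ²/σ_total²) over k items. The sketch below is a minimal illustration of that formula; the response matrix used in any test would be synthetic, not the HFT-BAS data.

```python
# Minimal Cronbach's alpha computation, as used to evaluate internal
# consistency of multi-item scales such as the 9-item HFT-BAS.
# Any input data is illustrative; this is not the study's dataset.

def cronbach_alpha(items):
    """items: list of per-item score lists, each of equal length (one
    entry per respondent). Returns Cronbach's alpha."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # sum of individual item variances
    item_vars = sum(var(item) for item in items)
    # variance of each respondent's total score across items
    totals = [sum(item[j] for item in items) for j in range(n)]
    return (k / (k - 1)) * (1 - item_vars / var(totals))
```

Perfectly correlated items yield α = 1, and items whose totals have variance equal to the summed item variances yield α = 0, which matches the formula’s intent as a consistency index.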
Results
Two hundred seventy-eight participants completed the questionnaire. Three dimensions were subsequently extracted by EFA: “Intimate relationship,” “Existential meaning,” and “Disorganization.” The Cronbach’s alpha of the HFT-BAS was 0.70, and all 3 dimensions were significantly correlated with total scores. The CFA measurement model showed a good fit: chi-squared/degrees of freedom ratio = 1.9, Goodness of Fit Index = 0.93, Comparative Fit Index = 0.92, root mean square error of approximation = 0.08, confirming the scale’s construct validity.
Significance of results
This study developed an HFT-BAS and assessed its psychometric properties. The scale can evaluate the bereavement risk factors of FCs in clinical palliative care.
In this paper, the authors introduce a new notion called the quantum wreath product, which is the algebra $B \wr _Q \mathcal {H}(d)$ produced from a given algebra B, a positive integer d and a choice $Q=(R,S,\rho ,\sigma )$ of parameters. Important examples that arise from their construction include many variants of the Hecke algebras, such as the Ariki–Koike algebras, the affine Hecke algebras and their degenerate version, Wan–Wang’s wreath Hecke algebras, Rosso–Savage’s (affine) Frobenius Hecke algebras, Kleshchev–Muth’s affine zigzag algebras and the Hu algebra that quantizes the wreath product $\Sigma _m \wr \Sigma _2$ between symmetric groups.
In the first part of the paper, the authors develop a structure theory for the quantum wreath products. Necessary and sufficient conditions for these algebras to afford a basis of suitable size are obtained. Furthermore, a Schur–Weyl duality is established via a splitting lemma and mild assumptions on the base algebra B. Their uniform approach encompasses many known results that were previously proved case by case. The second part of the paper involves the problem of constructing natural subalgebras of Hecke algebras that arise from wreath products. A bar-invariant basis of the Hu algebra is also described, via an explicit formula for its extra generator.
Game riskiness is an index to describe the variance of outcomes of choosing cooperation relative to that of choosing defection in prisoner’s dilemmas (PD). When the variance of cooperation is larger (smaller) than that of defection, the PD is labeled as a more-risky PD (less-risky PD). This article extends the previous work on game riskiness by examining its moderating role on the effect of expectation on cooperation under various PDs. We found across three studies that game riskiness moderated the effect of expectation on cooperation such that the effect of expectation on cooperation was larger in more-risky PDs than in less-risky counterparts. This effect was observed in N-person PD (Study 1), PD presented in both gain and loss domains (Study 2), and PD where expectation was manipulated instead of measured (Study 3). Furthermore, we found that participants cooperated more in PDs presented in the gain domain compared to those presented in the loss domain, and this effect was again moderated by game riskiness. In addition, we illustrated mathematically that game riskiness is related to other established indices of PD, including the index of cooperation, fear index, and greed index. This article identified game riskiness as a robust situational factor that can impact decisions in social dilemmas. It also provided insights into the underlying motivations of cooperation and defection under different expectations and how game riskiness can be utilized in cooperation research.
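The riskiness index described above can be made concrete for a standard two-person PD with payoffs R (mutual cooperation), S (sucker’s payoff), T (temptation), and P (punishment): a cooperator’s possible outcomes are {R, S} and a defector’s are {T, P}, so comparing the two variances classifies the game. The sketch below assumes, purely for illustration, that the partner’s move is equiprobable; the payoff values in any example are hypothetical and not taken from the article.

```python
# Illustrative "game riskiness" classification for a two-person PD:
# compare the variance of a player's possible payoffs when cooperating
# ({R, S}) vs. when defecting ({T, P}), assuming (for illustration) the
# partner cooperates or defects with equal probability.

def variance(xs):
    # population variance over equally likely outcomes
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def classify_pd(R, S, T, P):
    """R: mutual cooperation, S: sucker's payoff,
    T: temptation, P: punishment."""
    var_coop = variance([R, S])    # outcomes facing a cooperator
    var_defect = variance([T, P])  # outcomes facing a defector
    if var_coop > var_defect:
        return "more-risky PD"
    if var_coop < var_defect:
        return "less-risky PD"
    return "equal-risk PD"
```

For example, the textbook payoffs (R=3, S=0, T=5, P=1) give a cooperation variance of 2.25 against a defection variance of 4, a less-risky PD under this definition.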
The research field of online informal English learning has revealed associations of various informal digital English activities and second language vocabulary development. However, most of these studies have regarded digital resources as uniform entities when investigating their potential for vocabulary development and have failed to consider learners’ idiosyncratic interaction with the resources driven by self-defined purposes of use. Informed by the uses and gratifications theory, this study explored how three purposes of extramural digital experience (entertainment, socialization and information) relate to vocabulary knowledge, based on the survey responses from 322 undergraduate Chinese EFL learners and their receptive vocabulary knowledge. PLS-SEM analysis uncovered differential associations of the three media use purposes with receptive vocabulary knowledge. The study also revealed that the associations between the purposes of informal digital activities and vocabulary knowledge differed depending on whether the vocabulary was high frequency or low frequency. Additionally, it was found that the strategic use of digital resources, in terms of cognitive attention to and processing of lexical information that are facilitative of vocabulary learning during and/or after the interaction, played a significant moderating role in the relationship between digital activities for information purposes and receptive knowledge of high-frequency vocabulary. The findings highlight the importance of considering media use purposes in future research and pedagogical practices.
This study aimed to develop and evaluate a scenario-based nutrition literacy (NL) online programme for Taiwanese college students.
Design:
A randomised pilot trial design was used in this study.
Setting:
The study was conducted at a university in Taiwan. The intervention consisted of a five-unit web-based NL programme including videos of real-life scenario-based stories, situational analysis teaching and after-unit quizzes. Theme-related website information and smartphone apps (both iOS and Android systems) were offered for reference in every unit. The NL measure consisted of a self-rated scale, a scenario-based test and a healthy eating behaviour survey. Paired sample t-tests and ANCOVA were performed to test the effects on NL and healthy eating behaviour.
Participants:
Participants were ninety-eight students, with a retention rate of 98 %. The ratio of men to women was 0·2:1. Most students were freshmen (48 %).
Results:
Compared with the control group, the experimental group showed significant post-intervention improvements in the NL and healthy eating behaviours after controlling for pretest scores.
Conclusions:
This pilot study offers preliminary evidence of the potential positive effects of implementing a scenario-based NL online programme for college students. It offers a possibly novel strategy to enhance health-promoting behaviours in Taiwanese universities. Further research with larger sample sizes and more rigorous designs is warranted to confirm and build upon these initial findings.
Alcohol use is influenced by genetic and environmental factors. We examined the interactive effects between genome-wide polygenic risk scores for alcohol use (alc-PRS) and social support in relation to alcohol use among European American (EA) and African American (AA) adults across sex and developmental stages (emerging adulthood, young adulthood, and middle adulthood). Data were drawn from 4,011 EA and 1,274 AA adults from the Collaborative Study on the Genetics of Alcoholism who were between ages 18–65 and had ever used alcohol. Participants completed the Semi-Structured Assessment for the Genetics of Alcoholism and provided saliva or blood samples for genotyping. Results indicated that social support from friends, but not family, moderated the association between alc-PRS and alcohol use among EAs and AAs (only in middle adulthood for AAs); alc-PRS was associated with higher levels of alcohol use when friend support was low, but not when friend support was high. Associations were similar across sex but differed across developmental stages. Findings support the important role of social support from friends in buffering genetic risk for alcohol use among EA and AA adults and highlight the need to consider developmental changes in the role of social support in relation to alcohol use.
This study investigates the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) transmission potential in North Dakota, South Dakota, Montana, Wyoming, and Idaho from March 2020 through January 2021.
Methods:
Time-varying reproduction numbers, Rt, of a 7-d-sliding-window and of non-overlapping-windows between policy changes were estimated using the instantaneous reproduction number method. Linear regression was performed to evaluate if per-capita cumulative case-count varied across counties with different population size or density.
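The instantaneous reproduction number method named above (Cori et al., implemented in the R package EpiEstim) estimates Rt as incident cases divided by the “total infectiousness” of past cases weighted by the serial-interval distribution, aggregated over a window. The sketch below is a simplified point-estimate analogue of that idea in Python, not the EpiEstim implementation (which is Bayesian); the incidence series and serial-interval weights are illustrative assumptions.

```python
# Simplified point estimate of the instantaneous reproduction number R_t
# over a sliding window (an analogue of the Cori et al. method used by
# EpiEstim, without the Bayesian gamma posterior).

def rt_sliding_window(incidence, si_weights, window=7):
    """incidence: daily case counts; si_weights: serial-interval
    distribution by lag in days (si_weights[0] unused, set to 0).
    Returns {day: R_t estimate} for each window ending at that day."""
    rts = {}
    for t in range(window, len(incidence)):
        # cases observed in the window
        num = sum(incidence[t - window + 1 : t + 1])
        # total infectiousness in the window: past cases weighted by
        # the serial-interval distribution
        den = 0.0
        for u in range(t - window + 1, t + 1):
            den += sum(
                incidence[u - s] * si_weights[s]
                for s in range(1, min(u, len(si_weights) - 1) + 1)
            )
        if den > 0:
            rts[t] = num / den
    return rts
```

With a flat incidence series and serial-interval weights that sum to 1, the estimate settles at Rt = 1, the epidemic’s steady state, which is a useful sanity check for any implementation.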
Results:
The median 7-d-sliding-window Rt estimates across the studied region varied between 1 and 1.25 during September through November 2020. Between November 13 and 18, Rt was reduced by 14.71% (95% credible interval, CrI, [14.41%, 14.99%]) in North Dakota following a mask mandate; Idaho saw a 1.93% (95% CrI [1.87%, 1.99%]) reduction and Montana saw a 9.63% (95% CrI [9.26%, 9.98%]) reduction following the tightening of restrictions. High-population and high-density counties had higher per-capita cumulative case-count in North Dakota on June 30, August 31, October 31, and December 31, 2020. In Idaho, North Dakota, South Dakota, and Wyoming, there were positive correlations between population size and per-capita weekly incident case-count, adjusted for calendar time and social vulnerability index variables.
Conclusions:
The decline in Rt after the mask mandate, during the region’s case-count spike, suggests a reduction in SARS-CoV-2 transmission.
We obtained 24 air samples in 8 general wards temporarily converted into negative-pressure wards admitting coronavirus disease 2019 (COVID-19) patients infected with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) omicron variant BA.2.2 in Hong Kong. SARS-CoV-2 RNA was detected in 19 (79.2%) of 24 samples despite enhanced indoor air dilution. It is difficult to prevent airborne transmission of SARS-CoV-2 in hospitals.
We aimed to examine how public health policies influenced the dynamics of coronavirus disease 2019 (COVID-19) time-varying reproductive number (Rt) in South Carolina from February 26, 2020, to January 1, 2021.
Methods:
COVID-19 case series (March 6, 2020, to January 10, 2021) were shifted by 9 d to approximate the infection date. We analyzed the effects of state and county policies on Rt using EpiEstim. We performed linear regression to evaluate if per-capita cumulative case count varies across counties with different population size.
Results:
Rt shifted from 2–3 in March to <1 during April and May. Rt rose over the summer and fluctuated between 0.7 and 1.4. The introduction of statewide mask mandates was associated with a decline in Rt (−15.3%; 95% CrI, −16.8%, −13.6%), and school re-opening with an increase of 12.3% (95% CrI, 10.1%, 14.4%). Less densely populated counties had higher attack rates (P < 0.0001).
Conclusions:
The Rt dynamics over time indicated that public health interventions substantially slowed COVID-19 transmission in South Carolina, while their relaxation may have promoted further transmission. Policies encouraging people to stay home, such as closing nonessential businesses, were associated with Rt reduction, while policies that encouraged more movement, such as re-opening schools, were associated with Rt increase.
Contrasting the well-described effects of early intervention (EI) services for youth-onset psychosis, the potential benefits of the intervention for adult-onset psychosis are uncertain. This paper aims to examine the effectiveness of EI on functioning and symptomatic improvement in adult-onset psychosis, and the optimal duration of the intervention.
Methods
360 psychosis patients aged 26–55 years were randomized to receive either standard care (SC, n = 120), or case management for two (2-year EI, n = 120) or 4 years (4-year EI, n = 120) in a 4-year rater-masked, parallel-group, superiority, randomized controlled trial of treatment effectiveness (Clinicaltrials.gov: NCT00919620). Primary (i.e. social and occupational functioning) and secondary outcomes (i.e. positive and negative symptoms, and quality of life) were assessed at baseline, 6-month, and yearly for 4 years.
Results
Compared with SC, patients with 4-year EI had better Role Functioning Scale (RFS) immediate [interaction estimate = 0.008, 95% confidence interval (CI) = 0.001–0.014, p = 0.02] and extended social network (interaction estimate = 0.011, 95% CI = 0.004–0.018, p = 0.003) scores. Specifically, these improvements were observed in the first 2 years. Compared with the 2-year EI group, the 4-year EI group had better RFS total (p = 0.01), immediate (p = 0.01), and extended social network (p = 0.05) scores at the fourth year. Meanwhile, the 4-year (p = 0.02) and 2-year EI (p = 0.004) group had less severe symptoms than the SC group at the first year.
Conclusions
Specialized EI treatment for psychosis patients aged 26–55 should be provided for at least the initial 2 years of illness. Further treatment up to 4 years conferred little additional benefit in this age range over the course of the study.
This study examined the pattern of medical utilization and the distribution of comorbidities shortly before death among adolescents who died from suicide and compared these data with those of living controls.
Methods
From Taiwan's National Health Insurance Research Database, this study identified adolescents aged 10–19 years who died from suicide (n = 935) between 1 January 2000, and 31 December 2016, by linking each patient with the national mortality database. The researchers conducted a nested case–control study through risk set sampling, and for each case, 20 age- and sex-matched controls (n = 18 700) were selected from the general population. The researchers applied conditional logistic regression to investigate differences in medical utilization and physical and psychiatric comorbidities between cases and controls.
Results
Cases had a higher proportion of contact with the psychiatric department but a similar proportion of contact with any non-psychiatric medical department within 1 year before suicide compared with controls. Among adolescent suicide victims, 18.6% had contact only with a psychiatric department in the 3 months before suicide. Moreover, cases had a higher proportion of contact with non-psychiatric services within 3 months before suicide, particularly with the emergency, surgery, and internal medicine departments. Cases had higher risks of several psychiatric disorders and physical illnesses, including heart disease, pneumonia, and ulcer disease, than did controls.
Conclusions
The findings of increased medical utilization and higher risks of physical and psychiatric comorbidities in adolescent suicide victims are crucial for developing specific interventions to prevent suicide in this population.
This study aimed to investigate coronavirus disease (COVID-19) epidemiology in Alberta, British Columbia, and Ontario, Canada.
Methods:
Using data through December 1, 2020, we estimated time-varying reproduction number, Rt, using EpiEstim package in R, and calculated incidence rate ratios (IRR) across the 3 provinces.
Results:
In Ontario, 76% (92 745/121 745) of cases were in Toronto, Peel, York, Ottawa, and Durham; in Alberta, 82% (49 878/61 169) were in Calgary and Edmonton; in British Columbia, 90% (31 142/34 699) were in Fraser and Vancouver Coastal. Across the 3 provinces, Rt dropped to ≤1 after April. In Ontario, Rt would have remained <1 in April if congregate-setting-associated cases were excluded. Over the summer, Rt remained <1 in Ontario, ~1 in British Columbia, and ~1 in Alberta, except in early July when Rt was >1. In all 3 provinces, Rt was >1 from September through November, reflecting surges in case counts. Compared with British Columbia (684.2 cases per 100 000), Alberta (IRR = 2.0; 1399.3 cases per 100 000) and Ontario (IRR = 1.2; 835.8 cases per 100 000) had higher cumulative case counts per 100 000 population.
Conclusions:
Alberta and Ontario had a higher incidence rate than British Columbia, but Rt trajectories were similar across all 3 provinces.
Studies suggest that alcohol consumption and alcohol use disorders have distinct genetic backgrounds.
Methods
We examined whether polygenic risk scores (PRS) for consumption and problem subscales of the Alcohol Use Disorders Identification Test (AUDIT-C, AUDIT-P) in the UK Biobank (UKB; N = 121 630) correlate with alcohol outcomes in four independent samples: an ascertained cohort, the Collaborative Study on the Genetics of Alcoholism (COGA; N = 6850), and population-based cohorts: Avon Longitudinal Study of Parents and Children (ALSPAC; N = 5911), Generation Scotland (GS; N = 17 461), and an independent subset of UKB (N = 245 947). Regression models and survival analyses tested whether the PRS were associated with the alcohol-related outcomes.
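A polygenic risk score of the kind examined above is, at its core, a weighted sum: for each variant, the individual’s count of effect alleles (0, 1, or 2, or a dosage) multiplied by the effect size estimated in the discovery GWAS. The sketch below illustrates only that core arithmetic; the SNP identifiers and weights are hypothetical placeholders, not AUDIT-C/AUDIT-P summary statistics.

```python
# Minimal polygenic risk score (PRS) arithmetic: a weighted sum of
# effect-allele dosages. SNP IDs and effect sizes below are hypothetical
# illustrations, not real GWAS weights.

def polygenic_score(dosages, weights):
    """dosages: {snp_id: effect-allele count (0, 1, or 2)};
    weights: {snp_id: per-allele effect size from the discovery GWAS}.
    Variants missing from the genotype data are simply skipped here;
    real pipelines handle missingness and strand/allele matching."""
    return sum(
        dosages[snp] * weights[snp] for snp in weights if snp in dosages
    )
```

In practice, tools such as PLINK or PRSice perform this sum across hundreds of thousands of clumped, p-value-thresholded variants and then standardize the score before regression against phenotypes.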
Results
In COGA, AUDIT-P PRS was associated with alcohol dependence, AUD symptom count, maximum drinks (R2 = 0.47–0.68%, p = 2.0 × 10−8–1.0 × 10−10), and increased likelihood of onset of alcohol dependence (hazard ratio = 1.15, p = 4.7 × 10−8); AUDIT-C PRS was not an independent predictor of any phenotype. In ALSPAC, the AUDIT-C PRS was associated with alcohol dependence (R2 = 0.96%, p = 4.8 × 10−6). In GS, AUDIT-C PRS was a better predictor of weekly alcohol use (R2 = 0.27%, p = 5.5 × 10−11), while AUDIT-P PRS was more associated with problem drinking (R2 = 0.40%, p = 9.0 × 10−7). Lastly, AUDIT-P PRS was associated with ICD-based alcohol-related disorders in the UKB subset (R2 = 0.18%, p < 2.0 × 10−16).
Conclusions
AUDIT-P PRS was associated with a range of alcohol-related phenotypes across population-based and ascertained cohorts, while AUDIT-C PRS showed less utility in the ascertained cohort. We show that AUDIT-P is genetically correlated with both use and misuse and demonstrate the influence of ascertainment schemes on PRS analyses.
We conducted a survey of 16,914 patients to determine the point prevalence of healthcare-associated catheter-associated urinary tract infection (HA-CAUTI) and urinary catheter care in public hospitals in Hong Kong. Overall HA-CAUTI prevalence was 0.27%. Compliance was generally good, except for documenting the date of planned removal and securing the catheter properly.
This article discusses some of the current research on technology in relation to learner autonomy, outlining major findings on the relationship between technology and learner autonomy in formal and informal learning contexts. Extant literature has discussed both teacher-initiated technology-enhanced formal learning environments and learner-constructed self-directed learning experiences in informal learning contexts. Although valuable for the insights they provide into how technology aids learner autonomy, these two bodies of literature have remained largely independent of each other, which may constrain our understanding.
To determine the efficacy of 2 types of antimicrobial privacy curtains in clinical settings and the costs involved in replacing standard curtains with antimicrobial curtains.
Design
A prospective, open-labeled, multicenter study with a follow-up duration of 6 months.
Setting
This study included 12 rooms of patients with multidrug-resistant organisms (MDROs) (668 patient bed days) and 10 cubicles (8,839 patient bed days) in the medical, surgical, neurosurgical, orthopedics, and rehabilitation units of 10 hospitals.
Method
Culture samples were collected from curtain surfaces twice a week for 2 weeks, followed by weekly intervals.
Results
With a median hanging time of 173 days, antimicrobial curtain B (quaternary ammonium chlorides [QAC] plus polyorganosiloxane) was highly effective in reducing the bioburden (colony-forming units/100 cm², 1 vs 57; P < .001) compared with the standard curtain. The percentages of MDRO contamination were also significantly lower on antimicrobial curtain B than the standard curtain: methicillin-resistant Staphylococcus aureus, 0.5% vs 24% (P < .001); carbapenem-resistant Acinetobacter spp, 0.2% vs 22.1% (P < .001); multidrug-resistant Acinetobacter spp, 0% vs 13.2% (P < .001). Notably, the median time to first contamination by MDROs was 27.6 times longer for antimicrobial curtain B than for the standard curtain (138 days vs 5 days; P = .001).
Conclusions
Antimicrobial curtain B (QAC plus polyorganosiloxane) but not antimicrobial curtain A (built-in silver) effectively reduced the microbial burden and MDRO contamination compared with the standard curtain, even after extended use in an active clinical setting. The antimicrobial curtain provided an opportunity to avert indirect costs related to curtain changing and laundering in addition to improving patient safety.