Securing Democracies examines the attacks on voting processes and the broader informational environment in which elections take place. The volume's global cadre of scholars and practitioners highlights the interconnections among efforts to target vulnerable democratic systems and identifies ways to prevent, defend against, and mitigate their effects on both the technical and the informational aspects of cybersecurity. The work takes a wider view of defending democracy by recognizing that both techniques—attacking infrastructure and using misinformation and disinformation—are means to undermine trust and confidence in democratic institutions. As such, the book proposes a wide range of policy responses to tackle these cyber-enabled threats, focusing on the geopolitical front lines, namely Eastern Europe, the Middle East, and East Asia. This title is also available as open access on Cambridge Core.
The next-generation radio astronomy instruments are providing a massive increase in sensitivity and coverage, largely through increasing the number of stations in the array and the frequency span sampled. The two primary problems encountered when processing the resultant avalanche of data are the need for abundant storage and the constraints imposed by I/O, as I/O bandwidths drop significantly on cold storage. An example of this is the data deluge expected from the SKA Telescopes of more than 60 PB per day, all to be stored on the buffer filesystem. While compressing the data is an obvious solution, the impacts on the final data products are hard to predict. In this paper, we chose an error-controlled compressor – MGARD – and applied it to simulated SKA-Mid and real pathfinder visibility data, in noise-free and noise-dominated regimes. As the data have an implicit error level in the system temperature, using an error bound in compression provides a natural metric for compression. MGARD ensures that the errors incurred by compression adhere to the user-prescribed tolerance. To measure the degradation of images reconstructed from the lossy compressed data, we proposed a list of diagnostic measures and, through a series of experiments, explored the trade-off between these error bounds and the corresponding compression ratios, as well as the impact on the science quality of the lossy compressed data products. We studied the global and local impacts on the output images for continuum and spectral line examples. We found that relative error bounds of as much as 10%, which provide compression ratios of about 20, have a limited impact on continuum imaging, as the increased noise is less than the image RMS, whereas a 1% error bound (compression ratio of 8) introduces an increase in noise about an order of magnitude less than the image RMS.
For extremely sensitive observations and for very precious data, we would recommend a 0.1% error bound, with compression ratios of about 4; the noise impact is then two orders of magnitude less than the image RMS levels. At these levels, the limits are due to instabilities in the deconvolution methods. We compared the results to the alternative compression tool DYSCO, in terms of both the impact on the images and the relative flexibility. MGARD provides better compression for similar error bounds and has a host of potentially powerful additional features.
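The error-bound guarantee that underlies this workflow can be illustrated with a minimal sketch. This is plain uniform scalar quantization, not MGARD's multigrid algorithm, and the data and tolerances are arbitrary; the point is only the guarantee style: reconstruction error never exceeds the user-prescribed relative tolerance.

```python
import numpy as np

# Uniform scalar quantization with a relative error bound (illustrative
# stand-in for an error-controlled compressor such as MGARD). With bin
# width = 2 * tol * max|x|, every reconstruction error is <= tol * max|x|.
def quantize(x, tol):
    scale = np.max(np.abs(x))
    step = 2.0 * tol * scale           # quantization bin width
    q = np.round(x / step).astype(np.int64)
    return q, step

def dequantize(q, step):
    return q * step

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)           # synthetic "visibility" samples
for tol in (0.10, 0.01, 0.001):        # 10%, 1%, 0.1% relative bounds
    q, step = quantize(x, tol)
    err = np.max(np.abs(dequantize(q, step) - x))
    assert err <= tol * np.max(np.abs(x)) + 1e-12
```

In a real codec the quantized integers would then be losslessly entropy-coded, which is where tighter tolerances translate into lower compression ratios.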
Patients with posttraumatic stress disorder (PTSD) exhibit smaller regional brain volumes in commonly reported regions, including the amygdala and hippocampus, which are associated with fear and memory processing. In the current study, we conducted a voxel-based morphometry (VBM) meta-analysis using whole-brain statistical maps with neuroimaging data from the ENIGMA-PGC PTSD working group.
Methods
T1-weighted structural neuroimaging scans from 36 cohorts (PTSD n = 1309; controls n = 2198) were processed using a standardized VBM pipeline (ENIGMA-VBM tool). We meta-analyzed the resulting statistical maps for voxel-wise differences in gray matter (GM) and white matter (WM) volumes between PTSD patients and controls, performed subgroup analyses considering the trauma exposure of the controls, and examined associations between regional brain volumes and clinical variables including PTSD (CAPS-4/5, PCL-5) and depression severity (BDI-II, PHQ-9).
Results
PTSD patients exhibited smaller GM volumes across the frontal and temporal lobes and cerebellum, with the most significant effect in the left cerebellum (Hedges’ g = 0.22, p_corrected = .001), and smaller cerebellar WM volume (peak Hedges’ g = 0.14, p_corrected = .008). We observed similar regional differences when comparing patients to trauma-exposed controls, suggesting these structural abnormalities may be specific to PTSD. Regression analyses revealed that PTSD severity was negatively associated with GM volumes within the cerebellum (p_corrected = .003), while depression severity was negatively associated with GM volumes within the cerebellum and superior frontal gyrus in patients (p_corrected = .001).
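The effect sizes quoted above are Hedges’ g, a standardized mean difference with a small-sample bias correction. A short sketch with simulated groups (group sizes taken from the abstract, means chosen only to mimic g ≈ 0.22; not the study data):

```python
import numpy as np

# Hedges' g: Cohen's d computed with the pooled SD, multiplied by the
# small-sample correction factor J.
def hedges_g(a, b):
    na, nb = len(a), len(b)
    sp = np.sqrt(((na - 1) * np.var(a, ddof=1) + (nb - 1) * np.var(b, ddof=1))
                 / (na + nb - 2))                  # pooled SD
    d = (np.mean(a) - np.mean(b)) / sp             # Cohen's d
    J = 1 - 3 / (4 * (na + nb) - 9)                # bias correction
    return J * d

rng = np.random.default_rng(1)
controls = rng.normal(0.0, 1.0, 2198)    # n from the abstract
patients = rng.normal(-0.22, 1.0, 1309)  # hypothetical volume deficit
g = hedges_g(controls, patients)         # ~0.2 for this simulation
```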
Conclusions
PTSD patients exhibited widespread, regional differences in brain volumes where greater regional deficits appeared to reflect more severe symptoms. Our findings add to the growing literature implicating the cerebellum in PTSD psychopathology.
Individuals with schizophrenia experience significantly higher rates of chronic physical health conditions, driving a 20-year reduction in life expectancy. Poor diet quality is a key modifiable risk factor; however, owing to side-effects of antipsychotic medication, cognitive challenges and food insecurity, standard dietary counselling may not be sufficient for this population group.
Aim
To evaluate the feasibility, acceptability and preliminary effectiveness of two dietary interventions – pre-prepared meals and meal kits – for individuals with schizophrenia.
Method
The Schizophrenia, Nutrition and Choices in Kilojoules (SNaCK) study is a 12-week, three-arm, cross-over, randomised controlled trial. Eighteen participants aged 18–64 years diagnosed with schizophrenia or schizoaffective disorder will be recruited from community mental health services in Australia. Participants will be randomised to receive pre-prepared meals, meal kits or a supermarket voucher as a control, crossing over at the end of weeks 4 and 8, so that all participants experience all three study arms. Primary outcomes include feasibility (recruitment rate and retention, number of days participants use pre-prepared meals or meal kits, adherence to meals as prescribed, difficulty in meal preparation and meal wastage) and acceptability (meal provision preference ranking and implementation) of the nutrition interventions. Secondary outcomes include the effects of the intervention on metabolic syndrome components, dietary intake, quality of life and food security measures.
Conclusions
Feasible, acceptable and effective dietary interventions for people with schizophrenia are urgently needed. Findings from this trial will inform future larger randomised controlled trials that have the potential to influence policy and improve health outcomes for this vulnerable population.
Indaziflam (Rejuvra®), a preemergence herbicide first registered in vine and tree nut crops, was recently approved for applications to rangeland for winter annual grass control. Indaziflam controls cheatgrass (Bromus tectorum L.) for at least 3 yr, and control can extend into a fourth and fifth year; however, it is very difficult to find indaziflam residues in the soil 2 yr after application. Indaziflam could be absorbed by seeds still retained on the plant and on the soil surface in sufficient concentrations to stop establishment. To test this hypothesis, B. tectorum seeds and jointed goatgrass (Aegilops cylindrica Host) spikelets were treated with indaziflam and imazapic at rates from 5.4 to 175 g ai ha−1 using a greenhouse track sprayer delivering 187 L ha−1. Treated seeds were planted into field soil, and plants were allowed to grow for 21 d under greenhouse conditions. Growth was compared with that of non-treated controls. In addition, a second set of treated seeds was exposed to rainfall 1 and 24 h after treatment, with rainfall amounts ranging from 3 to 24 mm, to determine whether rainfall impacted herbicide performance. Bromus tectorum was so sensitive to indaziflam that establishment was eliminated at all rates. Imazapic inhibited B. tectorum establishment with an ED90 of 67 g ai ha−1. Indaziflam effectively inhibited A. cylindrica establishment with an ED90 of 7.4 g ai ha−1, compared with imazapic, which had an ED50 of 175 g ai ha−1. Indaziflam’s impact on A. cylindrica establishment was not significantly affected by rainfall, indicating that the herbicide was absorbed into the seed coat. These findings support the hypothesis that indaziflam’s long-term control could result from its ability to inhibit establishment of seeds retained in the canopy and those on the soil surface at the time of application.
Cattle (Bos spp.) grazing on weed-mixed forage biomass may potentially spread weed seeds, leading to plant invasions across pasturelands. Understanding the possibility and intensity of this spread is crucial for developing effective weed control methods in grazed areas. This research undertook an in vitro experiment to evaluate the germination and survival of five dominant weed species in the southern United States [Palmer amaranth (Amaranthus palmeri S. Watson), yellow foxtail [Setaria pumila (Poir.) Roem. & Schult.], johnsongrass [Sorghum halepense (L.) Pers.], field bindweed (Convolvulus arvensis L.), and pitted morningglory (Ipomoea lacunosa L.)] upon incubation in rumen fluid for eight time periods (0, 4, 8, 12, 24, 48, 72, and 96 h). For the 96-h treatment, a full Tilley and Terry procedure was applied after 48 h to stop fermentation, followed by incubation for another 48 h simulating abomasum digestion. Seed germination, upon incubation, varied significantly among weed species, with I. lacunosa reaching zero germination after only 24 h of incubation, whereas A. palmeri and S. halepense retained up to 3% germination even after 96 h of incubation. The hard seed coats of A. palmeri and S. halepense likely made them highly resistant, whereas the I. lacunosa seed coat became easily permeable and ruptured under rumen fluid incubation. This suggests that cattle grazing can selectively affect seed distribution and invasiveness of weeds in grazed grasslands and rangelands, including the designated invasive and noxious weed species. As grazing is a significant component of animal husbandry, a major economic sector in the U.S. South, our research provides important insights into the potential role of grazing as a dispersal mechanism for some of the troublesome arable weeds in the United States. The results offer opportunities for devising customized feeding and grazing practices, combined with timely removal of weeds in grazeable lands at the pre-flowering stage, for effective containment of weeds.
Background: Nipocalimab is a human IgG1 monoclonal antibody targeting FcRn that selectively reduces IgG levels without impacting antigen presentation or T- and B-cell functions. This study describes the effect of nipocalimab on vaccine response. Methods: This open-label, parallel, interventional study randomized participants 1:1 to receive intravenous nipocalimab at 30 mg/kg at Week 0 and 15 mg/kg at Weeks 2 and 4 (active) or no drug (control). On Day 3, participants received Tdap and PPSV®23 vaccinations and were followed through Week 16. Results: Twenty-nine participants completed the study and are included (active, n = 15; control, n = 14). The proportion of participants with a positive anti-tetanus IgG response was comparable between groups at Weeks 2 and 16, but lower at Week 4 (nipocalimab 3/15 [20%] vs control 7/14 [50%]; P = 0.089). All maintained anti-tetanus IgG above the protective threshold (0.16 IU/mL) through Week 16. While anti-pneumococcal-capsular-polysaccharide (PCP) IgG levels were lower during nipocalimab treatment, the percent increase from baseline at Weeks 2 and 16 was comparable between groups. Post-vaccination, anti-PCP IgG remained above 50 mg/L and showed a 2-fold increase from baseline throughout the study in both groups. Nipocalimab co-administration with vaccines was safe and well tolerated. Conclusions: These findings suggest that nipocalimab does not impact the development of an adequate IgG response to T-cell-dependent/independent vaccines and that nipocalimab-treated patients can follow recommended vaccination schedules.
The goal of this study was to unpack processes that may lead to child emotional insecurity. Guided by emotional security theory (EST/EST-R), we examined the mediational role of parental depressive symptomatology between interparental conflict (IPC), both constructive and destructive, and child emotional insecurity at age 36 months. We partitioned the unique variance of IPC from the shared variance using an extension of the common fate model. We used two-wave data from the Building Strong Families project, which consisted of racially diverse couples/parents (N = 4,424) who were low income and unmarried at the conception of their child. We found gendered differences in how mothers and fathers experience IPC, with mothers more influenced by their relational circumstances. We also found that fathers were vulnerable to experiencing depressive symptoms following aspects of destructive IPC. Consistent with EST-R, constructive IPC did not promote emotional security in children. Rather, both destructive and constructive IPC related to greater levels of emotional insecurity, with destructive IPC showing stronger effects. The proposed mediation was found for fathers only. Our findings may appeal to scholars who focus on untangling the complexity of IPC, as well as to intervention specialists and clinicians interested in process-oriented approaches to the development of child psychopathology.
Recent changes to US research funding are having far-reaching consequences that imperil the integrity of science and the provision of care to vulnerable populations. Resisting these changes, the BJPsych Portfolio reaffirms its commitment to publishing mental science and advancing psychiatric knowledge that improves the mental health of one and all.
Malnutrition is prevalent in older adults and frequently coexists with sarcopenia(1), a condition characterised by low muscle mass and physical performance(2). Malnutrition and sarcopenia are associated with adverse outcomes in older adults including mortality(3), and thus require early detection. The Mini Nutritional Assessment (MNA) is a validated nutrition screening and assessment tool in older adults(4), but its ability to predict poor muscle mass and physical performance is unclear. This study aimed to determine the association between MNA-determined (risk of) malnutrition and muscle mass and physical performance in community-dwelling older adults. This is a cross-sectional analysis of baseline data from the Capacity of Older Individuals after Nut Supplementation (COINS) study, a randomised controlled trial investigating the effect of peanut butter on functional capacity in older adults. Participants were generally healthy and at risk for falls (simplified fall risk screening score ≥ 2). Participants were screened for malnutrition risk (MNA-Screening score range 0–14; at-risk or malnourished if < 11), followed by assessment (MNA-Assessment score range 0–16) to obtain the MNA-Total score (range 0–30; at-risk or malnourished if score < 23.5). Skeletal muscle mass index (SMMI) was derived from bio-impedance analysis. Physical performance, including muscle strength, gait speed, balance and power, was objectively measured using multiple standard tests. Linear regression analyses were performed and adjusted for age and sex. A total of 120 participants were included (70% females, age 74.8 ± 4.5 years). MNA-Screening, MNA-Assessment and MNA-Total scores were (median [IQR]) 14 [12–14], 14 [13.5–15] and 27.5 [26.0–28.4], respectively. Malnutrition (or risk of malnutrition) was found in 18 (15.0%) and 8 (6.7%) participants according to MNA-Screening and MNA-Total scores, respectively.
A higher MNA-Screening score was associated with higher knee extension strength [β = 1.36 (standard error, SE 0.65) kg, p = 0.039]. A higher MNA-Assessment score was associated with higher gait speed [β = 0.04 (0.01) m/s, p = 0.007] and shorter timed up-and-go test time [β = −0.19 (0.09) seconds, p = 0.035]. MNA-Total score was not significantly associated with muscle mass or physical performance. Malnutrition status as determined by MNA-Screening score (but not MNA-Total score) was associated with lower muscle mass (SMMI [β = −0.52 (0.26) kg/m2, p = 0.043]), but not with strength or physical performance. In summary, the MNA-Screening score was predictive of muscle mass and strength, whereas the MNA-Assessment score predicted physical performance, particularly gait speed, in community-dwelling older adults at risk of falls. Periodic malnutrition screening with the MNA may help early detection of poor muscle mass and function in generally healthy older adults.
Metabolic dysfunction-associated fatty liver disease (MAFLD) is the most common liver disease globally, affecting 1 in 3 Australian adults and up to 39% of rural communities(1). Behaviour changes targeting diet and physical activity to achieve weight loss are considered the cornerstones of MAFLD management. A Mediterranean diet (MedDiet) rich in wholegrains, vegetables, fruits, fish, olives, raw nuts and seeds is recommended in key global guidelines as the optimal dietary pattern for MAFLD(2). Additionally, research evidence indicates moderate-intensity aerobic exercise is effective in reducing liver fat and improving cardiometabolic health(3). Given the higher rates of MAFLD in rural communities and their limited access to healthcare services, digital health interventions present a valuable opportunity to improve the accessibility, availability and personalisation of healthcare services to address this important unmet need. However, no digital interventions addressing health risk behaviours in MAFLD, including diet and physical activity, are currently available. This research aimed to use best-practice co-design methodology to develop a web-based healthy living intervention for people with MAFLD. An iterative co-design process using the Double Diamond Framework, comprising four key phases, was undertaken over 12 months. Twenty-seven adults (≥ 18 years) were recruited from The Alfred Hospital, Australia. This included people with MAFLD (n = 10; 50% female; mean age: 63.6 years) and healthcare professionals (HCPs) (n = 17; 59% female; mean age: 37.1 years) [dietitians (n = 5), exercise professionals (n = 6), and clinicians/hepatologists (n = 6)]. Phase 1–discover. Barriers and facilitators were explored through semi-structured interviews to understand the needs of the target population regarding accessibility, appearance, resources and application of the web-based intervention.
Interviews were virtual, conducted one-on-one via Zoom™, transcribed and inductively analysed using NVivo. Phase 2–define. A reflexive thematic analysis identified five key themes within the data. These included: i) web-based functionality, navigation and formatting, ii) holistic behaviour change including MedDiet and physical activity, iii) digital health accessibility, iv) knowledge and resources, and v) intervention duration and reminders. Phase 3–develop. The knowledge gained from this process led to the development of the web-based intervention, taking into consideration expressed preferences for features that can enhance knowledge about the condition, offer dietary and physical activity support via targeted resources and videos, and increase engagement via a chat group and frequent reminders. Phase 4–deliver. The co-design has led to the development of a web-based healthy living intervention that will be further evaluated for feasibility and implementation in a pilot trial. The resulting intervention aims to achieve behavioural change and promote healthier living amongst Australians with MAFLD. This knowledge has the potential to drive strategies to reduce barriers to accessing healthcare remotely, making the web-based intervention a valuable tool for both patients and professionals.
Older adults are at an increased risk for both malnutrition and cognitive decline(1,2). However, the relationship between nutritional status and cognitive decline remains unclear, and was investigated in this study. This is a cross-sectional analysis of baseline data from the Capacity of Older Individuals after Nut Supplementation (COINS) study, a randomised controlled trial investigating the effect of peanut butter on functional capacity in older adults. Older adults aged 65 years and over, who were community-dwelling, generally healthy and at risk for falls (simplified fall risk screening score ≥ 2), were recruited as part of the COINS study. Nutritional status was measured using the Mini Nutritional Assessment (MNA) tool (score range 0 to 30). An MNA score of ≥ 24 indicated normal nutritional status, while an MNA score of < 24 indicated risk of malnutrition. Cognitive performance was measured using the validated Montreal Cognitive Assessment (MoCA; range 0 to 30) and the Trail Making Tests A and B (TMT-A, TMT-B; scored as time taken to complete the tasks). The MoCA test further provided scores on visuospatial/executive function, naming, language, attention, abstraction, delayed recall, and orientation domains. Multivariable linear regression analysis was used to investigate the association between nutritional status and cognitive function, adjusted for age, sex and BMI. A total of 118 older adults with complete data were analysed (83% females; age (mean ± SD) = 74 ± 4 years; BMI = 27.5 ± 4.2 kg/m2), of whom 93.2% (n = 110) were considered to have normal nutritional status and the remaining 6.8% (n = 8) were deemed at risk of malnutrition. In terms of cognitive function status, 40.7% (n = 48) had normal cognitive function (MoCA score ≥ 26), 57.6% (n = 68) had mild cognitive impairment (MoCA score 18–25), and 1.7% (n = 2) had severe cognitive impairment (MoCA score 10–17).
After adjusting for age, sex, and BMI, MNA score was positively associated with both overall MoCA score (β (95% CI): 0.29 (0.04, 0.54), p = 0.024) and the visuospatial/executive function domain (β (95% CI): 0.16 (0.05, 0.28), p = 0.006), but not with other cognitive domains or TMT performance. In summary, our findings suggest that nutritional status assessed via the MNA may be predictive of global cognitive function. Future studies are needed to determine whether the MNA could be a surrogate marker or risk factor for cognitive decline.
Posttraumatic stress disorder (PTSD) is often chronic and impairing. Mechanisms that maintain symptoms remain poorly understood because of heterogeneous presentation. We parsed this heterogeneity by examining how individual differences in stress-symptom dynamics relate to the long-term maintenance of PTSD.
Methods
We studied 7,308 trauma-exposed World Trade Center responders who self-reported PTSD symptoms and stressful life events at annual monitoring visits for up to 20 years (average = 8.8 visits; range = 4–16). We used multilevel structural equation models to separate the stable and time-varying components of symptoms and stressors. At the within-person level, we modeled stress reactivity by cross-lagged associations between stress and future symptoms, stress generation by cross-lagged associations between symptoms and future stress, and symptom and stress persistence by autoregressive effects. The clinical utility of the stress-symptom dynamics was evaluated by associations with PTSD chronicity and mental health care use.
Results
Stress reactivity, stress generation, and symptom persistence were significant on average (bs = 0.03–0.16). There were significant individual differences in the strength of each dynamic (interquartile ranges = 0.06–0.12). Correlations among within-person processes showed that some dynamics are intertwined (e.g. more reactive people also generate stress in a vicious cycle) while others represent distinct phenotypes (e.g. people are reactive or have persistent symptoms). Initial trauma severity amplified some dynamics. People in the top deciles of most dynamics had clinically significant symptom levels across the monitoring period, and their health care costs were 6–17× higher per year than those of people at median levels.
Conclusions
Individual differences in stress-symptom dynamics contribute to the chronicity and clinical burden of PTSD.
Reliable population estimates are one of the most elementary needs for the management of wildlife, particularly for introduced ungulates on oceanic islands. We aimed to produce accurate and precise density estimates of Philippine deer (Rusa marianna) and wild pigs (Sus scrofa) on Guam using motion-triggered cameras combined with distance sampling, which estimates densities from observations of unmarked animals while accounting for imperfect detection. We used an automated digital data processing pipeline for species recognition and to estimate the distance to detected animals. Our density estimates were slightly lower than published estimates, consistent with management to reduce populations. We estimated that the number of camera traps needed to obtain a coefficient of variation of 0.1 was substantial, requiring a more than ten-fold increase in camera traps, while estimates with a precision of 0.2 or 0.3 were more achievable, requiring a doubling to quadrupling of the number of camera traps. We provide best practices for establishing and conducting distance sampling with camera-trap surveys for density estimation, based on lessons learned during this study. Future studies should consider distance sampling with camera traps to efficiently survey and monitor unmarked animals, particularly medium-sized ungulates, in tropical, oceanic island ecosystems.
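The survey-effort arithmetic here follows the standard rule of thumb that the CV of a density estimate shrinks roughly as one over the square root of the number of sampling units. A sketch with hypothetical pilot values (30 cameras and a pilot CV of 0.35; not the study's actual numbers) reproduces the reported pattern: a more than ten-fold increase for CV = 0.1, but only a few-fold increase for CV = 0.2–0.3.

```python
import math

# Effort scaling under CV ∝ 1/sqrt(K): to reach cv_target from a pilot
# survey of k_pilot cameras with cv_pilot, scale effort by (cv_pilot/cv_target)^2.
def cameras_needed(k_pilot, cv_pilot, cv_target):
    return math.ceil(k_pilot * (cv_pilot / cv_target) ** 2)

k_pilot, cv_pilot = 30, 0.35            # hypothetical pilot design
targets = {cv: cameras_needed(k_pilot, cv_pilot, cv) for cv in (0.1, 0.2, 0.3)}
# CV 0.1 needs ~12x the cameras; CV 0.2 about 3x; CV 0.3 under 1.5x.
```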
Despite advances in antiretroviral treatment (ART), human immunodeficiency virus (HIV) can detrimentally affect everyday functioning. Neurocognitive impairment (NCI) and current depression are common in people with HIV (PWH) and can contribute to poor functional outcomes, but potential synergies between the two conditions are less understood. Thus, the present study aimed to compare the independent and combined effects of NCI and depression on everyday functioning in PWH. We predicted worse functional outcomes with comorbid NCI and depression than either condition alone.
Methods:
PWH enrolled at the UCSD HIV Neurobehavioral Research Program were assessed for neuropsychological performance, depression severity (≤minimal, mild, moderate, or severe; Beck Depression Inventory-II), and self-reported everyday functioning.
Results:
Participants were 1,973 PWH (79% male; 66% racial/ethnic minority; age: M = 48.6; education: M = 13.0; 66% AIDS; 82% on ART; 42% with NCI; 35% BDI > 13). ANCOVA models found effects of NCI and depression symptom severity on all functional outcomes (ps < .0001). With NCI and depression severity included in the same model, both remained significant (ps < .0001), although the effects of each were attenuated; the combined model also yielded better fit (i.e., lower AIC values) than models with only NCI or only depression.
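The model-comparison logic (a lower AIC for the joint model than for either single-predictor model) can be sketched with simulated data in which two predictors, stand-ins for NCI and depression severity, each carry independent signal. All numbers below are illustrative, not the study data.

```python
import numpy as np

# Gaussian AIC for a linear model: n*ln(RSS/n) + 2*(number of parameters).
def aic(y, X):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    n, k = X.shape
    return n * np.log(rss / n) + 2 * (k + 1)   # +1 for the error variance

rng = np.random.default_rng(2)
n = 1973                                      # sample size from the abstract
nci = rng.integers(0, 2, n).astype(float)     # binary impairment flag
dep = rng.normal(size=n)                      # continuous depression severity
y = 0.5 * nci + 0.5 * dep + rng.normal(size=n)  # both carry independent signal

ones = np.ones(n)
aic_nci  = aic(y, np.column_stack([ones, nci]))
aic_dep  = aic(y, np.column_stack([ones, dep]))
aic_both = aic(y, np.column_stack([ones, nci, dep]))  # lowest AIC wins
```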
Conclusions:
Consistent with prior literature, NCI and depression had independent effects on everyday functioning in PWH. There was also evidence for combined effects of NCI and depression, such that their comorbidity had a greater impact on functioning than either alone. Our results have implications for informing future interventions to target common, comorbid NCI and depressed mood in PWH and thus reduce HIV-related health disparities.
In the analysis of late tonal music, analytical approaches which attempt to understand tonal function on the one hand, and harmonic transformation viewed through a neo-Riemannian lens on the other, often stand in an uneasy relation. Through analysis of Act 1, Scene 3 of Götterdämmerung, this chapter attempts to bring neo-Riemannian theory closer to its origin in Hugo Riemann’s functional theory, and so to point the way towards a new theoretical frame for understanding the tonal function of chromatic music. We urge this return to Riemann because it enables twenty-first-century listeners and theorists to appreciate the complex power of tonality as a system which, like the great socio-economic, legal, religious and scientific systems that have endured into the twenty-first century, has an indefatigable ability to subsume anything that might seem to pose a challenge to it back into itself, as a source of further power.
To date, the NIH Helping to End Addiction Long-term (HEAL) Initiative has funded over 1,000 projects that aim to identify new therapeutic targets for pain and substance use disorder (SUD), develop nonpharmacological strategies for pain management, and improve overdose and addiction treatment across settings. This study conducted a portfolio analysis of HEAL’s research to assess opportunities to advance translation and implementation.
Methods:
HEAL projects (FY 2018–2022) were classified into early (T0–T1) and later (T2–T4) translational stages. Eleven coders used a 54-item data collection tool based on the Consolidated Framework for Implementation Research (CFIR) to extract project characteristics (e.g., population, research setting) relevant to translation and implementation. Descriptive statistics and visualization techniques were employed to analyze and map aggregate characteristics onto CFIR’s domains (e.g., outer setting).
Results:
HEAL’s portfolio comprised 923 projects (33.7% T0–T1; 67.3% T2–T4), ranging from basic science (27.1%) and preclinical research (21.4%) to clinical (36.8%), implementation (27.1%), and dissemination research (13.1%). Most projects primarily addressed either addiction (46.3%) or pain (37.4%). Implementation-related gaps included the underrepresentation of certain populations (e.g., sexual/gender minorities: 0.5%). T0–T1 projects occurred primarily in laboratory settings (35.1%), while T2–T4 projects were concentrated in healthcare settings (e.g., hospitals: 21.6%) with limited transferability to other contexts (e.g., community: 12.9%).
Conclusion:
Opportunities to advance translational and implementation efforts include fostering interdisciplinary collaboration, prioritizing underserved populations, engaging with community leaders and policy stakeholders, and targeting evidence-based practices in nonclinical settings. Ongoing analyses can guide strategic investments to maximize HEAL’s impact on substance use and pain crises.
Objectives/Goals: We describe the prevalence of individuals with household exposure to SARS-CoV-2 who subsequently report symptoms consistent with COVID-19 while having PCR results persistently negative for SARS-CoV-2 (S[+]/P[-]). We assess whether paired serology can assist in identifying the true infection status of such individuals. Methods/Study Population: In a multicenter household transmission study, index patients with SARS-CoV-2 were identified and enrolled together with their household contacts within 1 week of the index patient's illness onset. For 10 consecutive days, enrolled individuals provided daily symptom diaries and nasal specimens for polymerase chain reaction (PCR) testing. Contacts were categorized into 4 groups based on presence of symptoms (S[+/-]) and PCR positivity (P[+/-]). Acute and convalescent blood specimens from these individuals (30 days apart) were subjected to quantitative serologic analysis for SARS-CoV-2 anti-nucleocapsid, spike, and receptor-binding domain antibodies. The antibody change in S[+]/P[-] individuals was assessed against thresholds derived from receiver operating characteristic (ROC) analysis of S[+]/P[+] (infected) versus S[-]/P[-] (uninfected) individuals. Results/Anticipated Results: Among 1,433 contacts, 67% had ≥1 SARS-CoV-2 PCR[+] result, while 33% remained PCR[-]. Among the latter, 55% (n = 263) reported symptoms for at least 1 day, most commonly congestion (63%), fatigue (63%), headache (62%), cough (59%), and sore throat (50%). A history of both previous infection and vaccination was present in 37% of S[+]/P[-] individuals, 38% of S[-]/P[-], and 21% of S[+]/P[+] (P < 0.05). Vaccination alone was present in 37%, 41%, and 52%, respectively. ROC analyses of paired serologic testing of S[+]/P[+] (n = 354) vs. S[-]/P[-] (n = 103) individuals found that anti-nucleocapsid data had the highest area under the curve (0.87).
Based on the 30-day antibody change, 6.9% of S[+]/P[-] individuals demonstrated an increased convalescent antibody signal, although a similar seroresponse was observed in 7.8% of the S[-]/P[-] group. Discussion/Significance of Impact: Respiratory symptoms were commonly reported among household contacts with persistently PCR[-] results. Paired serology found similar seroresponse rates in the S[+]/P[-] and S[-]/P[-] groups. The symptomatic-but-PCR-negative phenomenon, while frequent, is therefore unlikely to reflect true SARS-CoV-2 infections missed by PCR.
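The threshold derivation described above — ROC analysis of the infected (S[+]/P[+]) versus uninfected (S[-]/P[-]) reference groups, with the resulting cutoff then applied to the ambiguous S[+]/P[-] group — can be sketched as follows. This is a minimal illustration only: the antibody values are synthetic random draws (matched to the study's group sizes but not to its data), and the cutoff is chosen here by Youden's J statistic, one common way to derive a threshold from an ROC curve; the abstract does not state which criterion the investigators used.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 30-day changes in anti-nucleocapsid antibody signal
# (synthetic values for illustration; not the study's data).
infected = rng.normal(2.0, 1.0, 354)    # S[+]/P[+] reference group
uninfected = rng.normal(0.0, 1.0, 103)  # S[-]/P[-] reference group

def roc_auc(pos, neg):
    """Mann-Whitney estimate of the area under the ROC curve."""
    diff = pos[:, None] - neg[None, :]          # all pairwise comparisons
    return np.mean((diff > 0) + 0.5 * (diff == 0))

def youden_threshold(pos, neg):
    """Cutoff maximizing sensitivity + specificity - 1 (Youden's J)."""
    candidates = np.sort(np.concatenate([pos, neg]))
    sens = np.array([(pos >= t).mean() for t in candidates])
    spec = np.array([(neg < t).mean() for t in candidates])
    return candidates[np.argmax(sens + spec - 1)]

auc = roc_auc(infected, uninfected)
cutoff = youden_threshold(infected, uninfected)

# Classify the symptomatic-but-PCR-negative group against that cutoff.
symptomatic_pcr_neg = rng.normal(0.1, 1.0, 263)  # S[+]/P[-], synthetic
seroresponse_rate = (symptomatic_pcr_neg >= cutoff).mean()
```

With well-separated reference groups the AUC approaches 1; the study's observed AUC of 0.87 indicates good but imperfect discrimination, which is why a small fraction of truly uninfected contacts can still exceed the cutoff.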
Objectives/Goals: This study’s objective was to explore how a personal cancer diagnosis impacts the social connectedness (i.e., quality, structure, and functions of social relationships) of adolescent/young adult cancer survivors (AYACS; patients diagnosed with cancer between 15 and 39 years old), to inform the development of interventions fostering social health. Methods/Study Population: In this qualitative study (part of a larger study assessing AYACS’ psychosocial challenges), participants were 15–25 years old at the time of cancer diagnosis and within 6 years of cancer diagnosis. Participants (and consenting parents of participants 18 years old and older) had to have fluency in written and spoken English and access to a computer or smartphone. Qualitative interviewers used an interview guide to conduct individual participant interviews, which were audio-recorded and transcribed verbatim. Data were analyzed thematically, using a phenomenological approach to explore how a personal cancer diagnosis impacted social connectedness. Qualitative data related to social connectedness (corresponding to the code “Relationships and Support”) are presented. Results/Anticipated Results: Three themes emerged through thematic analysis: (1) AYACS experience substantial heterogeneity in social support needs; (2) AYACS leverage multiple relationships and resources when seeking support after a personal cancer diagnosis; and (3) AYACS’ individual experiences were unique, in that some noted positive changes whereas others noted negative changes in relationships within their social networks, specifically with peers. Discussion/Significance of Impact: AYACS experience varied social support needs and leverage multiple relationships when seeking social support.
These translational findings provide a foundation for developing AYACS social programming, fostering peer relationships, and incorporating social science methods into intervention development to strengthen AYACS’ social connectedness.