Single ventricle patients undergoing comprehensive stage II palliation have a higher incidence of severe acute kidney injury than those undergoing bidirectional Glenn palliation; however, the optimal method for early detection remains unknown. Several urinary biomarkers are increased in other patient populations with postoperative kidney injury. We explored the kinetics of these biomarkers in this high-risk population.
We conducted a prospective, observational study of 20 patients with single ventricle physiology who underwent second-stage palliation (July 2019–December 2021). Acute kidney injury was defined by the Kidney Disease: Improving Global Outcomes criteria, based on peak serum creatinine value and urine output. Urine samples were collected pre-operatively and at 1, 6, and 24 hours post-surgery. The urinary biomarkers neutrophil gelatinase-associated lipocalin, interleukin-18, liver fatty acid-binding protein, kidney injury molecule-1, and cystatin C were quantified by enzyme-linked immunosorbent assay, normalised to urinary creatinine, and reported as median [interquartile range].
Four patients (50%) undergoing comprehensive stage II and one patient (8%) undergoing bidirectional Glenn palliation developed stage ≥ 2 acute kidney injury. The comprehensive stage II group had higher median neutrophil gelatinase-associated lipocalin (1769 [1309–1961] versus 91 [18–1120] ng/mg) and liver fatty acid-binding protein (12,836 [5016–19,798] versus 1272 [220–5172] ng/mg) than the bidirectional Glenn group, with both peaking 1 hour post-surgery. Kidney injury molecule-1 was significantly greater at 1, 6, and 24 hours (greatest at 24 hours) post-surgery in the comprehensive stage II group than in the bidirectional Glenn group (24 hours: 11 [9–23] versus 2 [1–6] ng/mg).
Elevated urinary neutrophil gelatinase-associated lipocalin, liver fatty acid-binding protein, and kidney injury molecule-1 may be useful biomarkers for early detection of acute kidney injury in children following comprehensive stage II palliation.
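As a concrete illustration of the normalisation and summary statistics described in the methods, the following sketch (dummy values and hypothetical variable names, not study data) shows how a raw ELISA concentration would be normalised to urinary creatinine and reported as median [interquartile range]:

```python
# Illustrative sketch with dummy values; not study data.
import numpy as np

ngal_ng_per_ml = np.array([850.0, 1200.0, 430.0, 2100.0])  # ELISA result, ng/mL (dummy)
ucreat_mg_per_ml = np.array([0.9, 0.6, 1.4, 1.1])          # urinary creatinine, mg/mL (dummy)

# Dividing by urinary creatinine corrects for urine dilution and yields
# the ng/mg units reported in the results above.
normalised = ngal_ng_per_ml / ucreat_mg_per_ml

median = np.median(normalised)
q1, q3 = np.percentile(normalised, [25, 75])
print(f"{median:.0f} [{q1:.0f}-{q3:.0f}] ng/mg")
```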
Neuropsychiatry training in the UK currently lacks a formal scheme or qualification, and its demand and availability have not been systematically explored. We conducted the largest UK-wide survey of psychiatry trainees to examine their experiences in neuropsychiatry training.
Results
In total, 185 trainees from all UK training regions completed the survey. Although 43.6% expressed interest in a neuropsychiatry career, only 10% felt they would gain sufficient experience by the end of training. Insufficient access to clinical rotations was the most common barrier, with significantly better access in London compared with other regions. Most respondents were in favour of additional neurology training (83%) and a formal accreditation in neuropsychiatry (90%).
Clinical implications
Strong trainee interest in neuropsychiatry contrasts with the limited training opportunities currently available nationally. Our survey highlights the need for increased neuropsychiatry training opportunities, development of a formalised training programme and a clinical accreditation pathway for neuropsychiatry in the UK.
Duchenne muscular dystrophy is a devastating neuromuscular disorder characterized by the loss of dystrophin, inevitably leading to cardiomyopathy. Despite publications on prophylaxis and treatment with cardiac medications to mitigate cardiomyopathy progression, gaps remain in the specifics of medication initiation and optimization.
Method:
This document is an expert opinion statement, addressing a critical gap in cardiac care for Duchenne muscular dystrophy. It provides thorough recommendations for the initiation and titration of cardiac medications based on disease progression and patient response. Recommendations are derived from the expertise of the Advanced Cardiac Therapies Improving Outcomes Network and are informed by established guidelines from the American Heart Association, American College of Cardiology, and Duchenne Muscular Dystrophy Care Considerations. These expert-derived recommendations aim to navigate the complexities of Duchenne muscular dystrophy-related cardiac care.
Results:
Comprehensive recommendations for initiation, titration, and optimization of critical cardiac medications are provided to address Duchenne muscular dystrophy-associated cardiomyopathy.
Discussion:
The management of Duchenne muscular dystrophy requires a multidisciplinary approach. However, the diversity of healthcare providers involved in Duchenne muscular dystrophy care can result in variations in cardiac management, complicating treatment standardization and compromising patient outcomes. The aim of this report is to provide a roadmap for managing Duchenne muscular dystrophy-associated cardiomyopathy by elucidating the timing and dosage nuances crucial for optimal therapeutic efficacy, ultimately improving cardiac outcomes and quality of life for individuals with Duchenne muscular dystrophy.
Conclusion:
This document seeks to establish a standardized framework for cardiac care in Duchenne muscular dystrophy, aiming to improve cardiac prognosis.
Preliminary evidence suggests that a ketogenic diet may be effective for bipolar disorder.
Aims
To assess the impact of a ketogenic diet in bipolar disorder on clinical, metabolic and magnetic resonance spectroscopy outcomes.
Method
Euthymic individuals with bipolar disorder (N = 27) were recruited to a 6- to 8-week single-arm open pilot study of a modified ketogenic diet. Clinical, metabolic and MRS measures were assessed before and after the intervention.
Results
Of 27 recruited participants, 26 began and 20 completed the ketogenic diet. For participants completing the intervention, mean body weight fell by 4.2 kg (P < 0.001), mean body mass index fell by 1.5 kg/m² (P < 0.001) and mean systolic blood pressure fell by 7.4 mmHg (P = 0.041). Baseline and follow-up assessments were consistent with participants remaining in the euthymic range, with no statistically significant changes in the Affective Lability Scale-18, Beck Depression Inventory or Young Mania Rating Scale. In participants providing reliable daily ecological momentary assessment data (n = 14), there was a positive correlation between daily ketone levels and self-rated mood (r = 0.21, P < 0.001) and energy (r = 0.19, P < 0.001), and an inverse correlation between ketone levels and both impulsivity (r = −0.30, P < 0.001) and anxiety (r = −0.19, P < 0.001). On MRS, brain glutamate plus glutamine concentration decreased by 11.6% in the anterior cingulate cortex (P = 0.025) and by 13.6% in the posterior cingulate cortex (P < 0.001).
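The daily EMA correlations reported above can be illustrated with a short sketch (hypothetical file and column names, not the study's code; a pooled Pearson r also ignores the within-person clustering that a mixed model would handle more rigorously):

```python
# Illustrative sketch: correlating daily ketone readings with same-day
# EMA ratings. File and column names are hypothetical.
import pandas as pd
from scipy.stats import pearsonr

ema = pd.read_csv("daily_ema.csv")  # columns: participant, day,
                                    # ketone_mmol_l, mood, energy,
                                    # impulsivity, anxiety

for outcome in ["mood", "energy", "impulsivity", "anxiety"]:
    paired = ema[["ketone_mmol_l", outcome]].dropna()
    r, p = pearsonr(paired["ketone_mmol_l"], paired[outcome])
    print(f"{outcome}: r = {r:.2f}, p = {p:.3g}")
```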
Conclusions
These findings suggest that a ketogenic diet may be clinically useful in bipolar disorder, for both mental health and metabolic outcomes. Replication and randomised controlled trials are now warranted.
Increasing resources are devoted to osteoarthritis surgical care in Australia annually, with significant expenditure attributed to hip and knee arthroplasties. Safe, efficient, and sustainable models of care are required. This study aimed to determine the impact on healthcare costs of implementing an enhanced short-stay model of care (ESS-MOC) for arthroplasty at a national level.
Methods
A budget impact analysis was conducted for hospitals providing arthroplasty surgery over the years 2023 to 2030. Population-based projections of individuals receiving hip or knee arthroplasty for osteoarthritis, obtained from clinical registry and administrative datasets, were applied. The ESS-MOC assigned 30 percent of eligible patients to a shortened acute-ward-stay pathway with outpatient rehabilitation; the remaining 70 percent received a current-practice pathway. The primary outcome was total healthcare cost savings post-implementation of the ESS-MOC; the return on investment (ROI) ratio and hospital bed-days utilized were also estimated. Costs are presented in Australian dollars (AUD) and United States dollars (USD) at 2023 prices.
Results
Estimated hospital cost savings for the years 2023 to 2030 from implementing the ESS-MOC were AUD641 million (USD427 million) (95% CI: AUD99 million [USD66 million] to AUD1,250 million [USD834 million]). This corresponds to an ROI ratio of 8.88 (1.3 to 17.9) dollars returned for each dollar invested in implementing the care model. For the period 2023 to 2030, an estimated 337,000 (261,000 to 412,000) acute surgical ward bed-days and 721,000 (471,000 to 1,028,000) rehabilitation bed-days could be saved. Total implementation costs for the ESS-MOC were estimated at AUD72 million (USD46 million) over eight years.
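As a rough consistency check (an illustrative calculation only, not part of the published analysis), the ROI ratio reported above can be read as estimated savings returned per dollar of implementation cost, and the rounded headline figures reproduce it closely:

```python
# Rough check using the rounded figures above; the small discrepancy from
# the reported 8.88 reflects rounding of the inputs.
savings_aud_m = 641  # estimated hospital cost savings, AUD millions
cost_aud_m = 72      # estimated implementation cost, AUD millions

roi = savings_aud_m / cost_aud_m  # dollars returned per dollar invested
print(f"ROI ~ {roi:.2f}")         # ~8.90
```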
Conclusions
Implementation of an ESS-MOC for eligible arthroplasty patients in Australia would generate significant cost and healthcare resource savings. This budget impact analysis demonstrates a best practice approach to comprehensively assessing value, at a national level, of implementing sustainable models of care in high-burden healthcare contexts. Findings are relevant to other settings where hospital stay following joint arthroplasty remains excessively long.
Studies using the dietary inflammatory index often perform complete case analyses (CCA) to handle missing data, which may reduce the sample size and increase the risk of bias. Furthermore, population-level socio-economic differences in the energy-adjusted dietary inflammatory index (E-DII) have not been recently studied. Therefore, we aimed to describe socio-demographic differences in E-DII scores among American adults and compare the results using two statistical approaches for handling missing data, i.e. CCA and multiple imputation (MI).
Design:
Cross-sectional analysis. E-DII scores were computed using a 24-hour dietary recall. Linear regression was used to compare the E-DII scores by age, sex, race/ethnicity, education and income using both CCA and MI.
Setting:
USA.
Participants:
This study included 34 547 non-Hispanic White, non-Hispanic Black and Hispanic adults aged ≥ 20 years from the 2005–2018 National Health and Nutrition Examination Survey.
Results:
The MI and CCA subpopulations comprised 34 547 and 23 955 participants, respectively. Overall, 57 % of the American adults reported 24-hour dietary intakes associated with inflammation. Both methods showed similar patterns wherein 24-hour dietary intakes associated with high inflammation were commonly reported among males, younger adults, non-Hispanic Black adults and those with lower education or income. Differences in point estimates between CCA and MI were mostly modest at ≤ 20 %.
Conclusions:
The two approaches for handling missing data produced comparable point estimates and 95 % CI. Differences in the E-DII scores by age, sex, race/ethnicity, education and income suggest that socio-economic disparities in health may be partially explained by the inflammatory potential of diet.
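For readers unfamiliar with the two approaches, the sketch below shows schematically how a complete-case estimate and a pooled multiply-imputed estimate of the same regression could be compared. It uses hypothetical file, variable and column names, ignores the NHANES complex survey design and weights, and pools only the coefficients (full Rubin's rules would also pool variances); it is not the authors' analysis code.

```python
# Schematic CCA-versus-MI comparison; hypothetical names, survey design ignored.
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
import statsmodels.api as sm

def fit_ols(df):
    # Linear regression of E-DII score on socio-demographic covariates.
    X = sm.add_constant(df[["age", "female", "income"]])
    return sm.OLS(df["edii"], X).fit()

df = pd.read_csv("nhanes_edii.csv")  # hypothetical analysis file

# Complete-case analysis: drop every row with any missing value.
cca_params = fit_ols(df.dropna()).params

# Multiple imputation: create m completed data sets, fit the model on
# each, and average the coefficients across imputations.
m = 10
fits = []
for seed in range(m):
    completed = IterativeImputer(sample_posterior=True, random_state=seed).fit_transform(df)
    fits.append(fit_ols(pd.DataFrame(completed, columns=df.columns)).params)
mi_params = pd.concat(fits, axis=1).mean(axis=1)

print(pd.DataFrame({"CCA": cca_params, "MI": mi_params}))
```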
Background:
Nursing home (NH) residents are at high risk of COVID-19 from exposure to infected staff and other residents. Understanding SARS-CoV-2 viral RNA kinetics in residents and staff can guide testing, isolation, and return-to-work recommendations. We sought to determine the duration of antigen test and polymerase chain reaction (PCR) positivity in a cohort of NH residents and staff.
Methods:
We prospectively collected data on SARS-CoV-2 viral kinetics from April 2023 through November 2023. Staff and residents could enroll prospectively or upon a positive test (identified through routine clinical testing, screening, or outbreak response testing). Participating facilities performed routine clinical testing; asymptomatic testing of contacts was performed within 48 hours if an outbreak or known exposure occurred and upon (re-)admission. Enrolled participants who tested positive for SARS-CoV-2 were re-tested daily for 14 days with both nasal antigen and nasal PCR tests. All PCR tests were run by a central lab with the same assay. We conducted a Kaplan-Meier survival analysis of time to first negative test, restricted to participants who initially tested positive (day zero) and had at least one test ≥10 days after initially testing positive with the same test type; a participant could contribute to both the antigen and PCR survival curves. We compared survival curves for staff and residents using the log-rank test.
Results:
Twenty-four nursing homes in eight states participated; 587 participants (275 residents, 312 staff) enrolled in the evaluation, all identified through routine clinical or outbreak response testing. Seventy-two participants tested antigen-positive; of these, 63 tested PCR-positive. Residents were antigen- and PCR-positive longer than staff (Figure 1), but this difference was statistically significant only for duration of PCR positivity (p=0.006). Five days after the first positive test, 56% of 50 residents and 59% of 22 staff remained antigen-positive; 91% of 44 residents and 79% of 19 staff were PCR-positive. Ten days after the first positive test, 22% of 50 residents and 5% of 22 staff remained antigen-positive; 61% of 44 residents and 21% of 19 staff remained PCR-positive.
Conclusions:
Most NH residents and staff with SARS-CoV-2 remained antigen- or PCR-positive 5 days after the initial positive test; however, differences between staff and resident test positivity were noted at 10 days. These data can inform recommendations for testing, duration of NH resident isolation, and return-to-work guidance for staff. Additional viral culture data may strengthen these conclusions.
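The time-to-first-negative analysis described above can be sketched as follows (illustrative Python using the lifelines package, with a hypothetical file layout; not the study's code):

```python
# Kaplan-Meier curves for time to first negative test, residents versus
# staff, with a log-rank comparison. Column names are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("viral_kinetics.csv")
# Expected columns: days_to_neg (days from first positive test to first
# negative, or to last test if still positive), turned_neg (1 = negative
# observed, 0 = censored), role ("resident" or "staff").

res = df[df["role"] == "resident"]
stf = df[df["role"] == "staff"]

kmf = KaplanMeierFitter()
for group, label in [(res, "residents"), (stf, "staff")]:
    kmf.fit(group["days_to_neg"], event_observed=group["turned_neg"], label=label)
    print(label, "median days positive:", kmf.median_survival_time_)

# Log-rank test for a difference between the two survival curves.
result = logrank_test(res["days_to_neg"], stf["days_to_neg"],
                      event_observed_A=res["turned_neg"],
                      event_observed_B=stf["turned_neg"])
print("log-rank p =", result.p_value)
```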
Disclosure: Stefan Gravenstein: received consulting and speaker fees from most vaccine manufacturers (Sanofi, Seqirus, Moderna, Merck, Janssen, Pfizer, Novavax, GSK) and has or expects to receive grant funding from several (Sanofi, Seqirus, Moderna, Pfizer, GSK). Lona Mody: NIH, VA, CDC, Kahn Foundation; Honoraria: UpToDate; Contracted Research: Nano-Vibronix
Recent evidence from case reports suggests that a ketogenic diet may be effective for bipolar disorder. However, no clinical trials have been conducted to date.
Aims
To assess the recruitment and feasibility of a ketogenic diet intervention in bipolar disorder.
Method
Euthymic individuals with bipolar disorder were recruited to a 6–8 week trial of a modified ketogenic diet, and a range of clinical, economic and functional outcome measures were assessed. Study registration number: ISRCTN61613198.
Results
Of 27 recruited participants, 26 commenced and 20 completed the modified ketogenic diet for 6–8 weeks. The outcomes data-set was 95% complete for daily ketone measures, daily glucose measures and daily ecological momentary assessment of symptoms during the intervention period. Mean daily blood ketone readings were 1.3 mmol/L (s.d. = 0.77, median = 1.1) during the intervention period, and 91% of all readings indicated ketosis, suggesting a high degree of adherence to the diet. Over 91% of daily blood glucose readings were within the normal range, with 9% indicating mild hypoglycaemia. Eleven minor adverse events were recorded, including fatigue, constipation, drowsiness and hunger. One serious adverse event was reported (euglycaemic ketoacidosis in a participant taking SGLT2-inhibitor medication).
Conclusions
The recruitment and retention of euthymic individuals with bipolar disorder to a 6–8 week ketogenic diet intervention was feasible, with high completion rates for outcome measures. The majority of participants reached and maintained ketosis, and adverse events were generally mild and modifiable. A future randomised controlled trial is now warranted.
In this article we put forward an alternative account of the famous wristguards, or bracers, of the European Early Bronze Age. Combining new materialism with empirical microwear analysis, we study 15 examples from Britain in detail and suggest a different way of conceptualizing these objects. Rather than demanding they have a singular function, we treat these objects as ‘multiplicities’ and as always in process. This, in turn, has significant implications for the important archaeological concepts of typology and object biography and our understandings of material culture more widely.
We analyzed the efficacy of a centralized surveillance infection prevention (CSIP) program in a healthcare system on healthcare-associated infection (HAI) rates amid the coronavirus disease 2019 (COVID-19) pandemic. HAI rates were variable in CSIP and non-CSIP facilities. Central-line–associated bloodstream infection (CLABSI), C. difficile infection (CDI), and surgical-site infection (SSI) rates were negatively correlated with COVID-19 intensity in CSIP facilities.
To develop, implement, and evaluate the effectiveness of a unique centralized surveillance infection prevention (CSIP) program.
Design:
Observational quality improvement project.
Setting:
An integrated academic healthcare system.
Intervention:
The CSIP program comprises senior infection preventionists who are responsible for healthcare-associated infection (HAI) surveillance and reporting, allowing local infection preventionists (LIPs) to devote a greater portion of their time to non-surveillance patient safety activities. Four CSIP team members assumed HAI surveillance responsibilities at 8 facilities.
Methods:
We evaluated the effectiveness of the CSIP program using 4 measures: recovery of LIP time, efficiency of surveillance activities by LIPs and CSIP staff, surveys characterizing LIP perception of their effectiveness in HAI reduction, and nursing leaders’ perception of LIP effectiveness.
Results:
The amount of time spent by LIP teams on HAI surveillance was highly variable, while CSIP time commitment and efficiency was steady. Post-CSIP implementation, 76.9% of LIPs agreed that they spend adequate time on inpatient units, compared to 15.4% pre-CSIP; LIPs also reported more time to allot to non-surveillance activities. Nursing leaders reported greater satisfaction with LIP involvement with HAI reduction practices.
Conclusion:
CSIP programs are a little-reported strategy to ease the burden on LIPs by reallocating HAI surveillance. The analyses presented here will aid health systems in anticipating the benefit of CSIP programs.
Excavated over two centuries ago, the Upton Lovell G2a ‘Wessex Culture’ burial has held a prominent place in research on Bronze Age Britain. In particular, was it the grave of a ‘shaman’ or a metalworker? We take a new approach to the grave goods, employing microwear analysis and scanning electron microscopy to map a history of interactions between people and materials, identifying evidence for the presence of Bronze Age gold on five artefacts, four for the first time. Advancing a new materialist approach, we identify a goldworking toolkit, linking gold, stone and copper objects within a chaîne opératoire, concluding that modern categorisations of these materials miss much of their complexity.
To identify the informatics educational needs of clinical and translational research professionals whose primary focus is not informatics.
Introduction:
Informatics and data science skills are essential for the full spectrum of translational research, and an increased understanding of informatics issues on the part of translational researchers can alleviate the demand for informaticians and enable more productive collaborations when informaticians are involved. Identifying the level of interest in different topics among various types of translational researchers will help set priorities for the development and dissemination of informatics education.
Methods:
We surveyed clinical and translational science researchers in Clinical and Translational Science Award (CTSA) programs about their educational needs and preferences.
Results:
Researchers from 23 of the 62 CTSA hubs responded to the survey. Overall, 67% of respondents across roles and topics expressed interest in learning about informatics topics. There was high interest in all 30 topics included in the survey, with some variation in interest depending on the role of the respondents.
Discussion:
Our data support the need to advance training in clinical and biomedical informatics. As the complexity and use of information technology and data science in research studies grows, informaticians will continue to be a limited resource for research collaboration, education, and training. An increased understanding of informatics issues across translational research teams can alleviate this burden and allow for more productive collaborations. To inform a roadmap for informatics education for research professionals, we suggest strategies to use the results of this needs assessment to develop future informatics education.
The appearance of Beaker pottery in Britain and Ireland during the twenty-fifth century bc marks a significant archaeological horizon, being synchronous with the first metal artefacts. The adoption of arsenical copper, mostly from Ireland, was followed by that of tin-bronze around 2200 bc. However, whilst the copper mine of Ross Island in Ireland is securely dated to the Early Bronze Age, and further such mines in the UK have been dated to the Early and Middle Bronze Age, the evidence for the exploitation of tin ores, the other key ingredient to make bronze, has remained circumstantial. This article contains the detailed analyses of seven stone artefacts from securely dated contexts, using a combination of surface pXRF and microwear analysis. The results provide strong evidence that the tools were used in cassiterite processing. The combined analysis of these artefacts documents in detail the exploitation of Cornish tin during this early phase of metal use in Britain and Ireland.
The aim of this project was to create a Pan-London event to increase medical students' awareness of, and enthusiasm for, Psychiatry as a specialty. In addition to the longer-term goal of ultimately increasing recruitment to the specialty once students qualify, this event aimed to bring mental health to the forefront of the minds of future doctors.
Methods
Psychiatry Teaching Fellows from different trusts created a virtual educational event targeted at medical students in all years across London universities. It was co-produced with the student Psychiatry societies across London universities. This encouraged student engagement from the ground level and fostered an environment of collaboration between students and doctors. The event was free to attend and was supported by the Royal College of Psychiatrists, London Division. The conference programme showcased the various facets Psychiatry has to offer from a global perspective, including Women's Mental Health, Forensic Psychiatry, research and volunteering around the world.
Results
The conference welcomed 263 attendees, 92 of whom completed a feedback questionnaire at the end of the session. The majority of respondents were from London universities and were fairly evenly distributed across medical school year groups. Of those completing the questionnaire, 99% found the session interesting (scoring 3 or more out of 5 on a 5-point Likert scale) and 98% reported that the session widened their view of Psychiatry. 78% were already considering a career in Psychiatry, and 96% felt more likely to pursue a career in Psychiatry following the conference (scoring 3 or more out of 5 on a 5-point Likert scale). Open-text feedback indicated that attendees had found the sessions interesting and particularly valued the range of topics.
Conclusion
Extra-curricular events are a fantastic chance to broaden medical students' views of the specialty of Psychiatry. A virtual platform creates opportunities for audiences to hear from a vast array of expert speakers, which might not otherwise be possible in person, and creates a community of like-minded students in a safe environment. Whether or not students go on to pursue the field later in their training, events such as this bring awareness of Psychiatry and its impact to the foreground. It is hoped that further co-produced events between the Royal College of Psychiatrists and university Psychiatry societies can continue to inspire medical students.
To improve maternal health outcomes, increased diversity is needed among pregnant people in research studies and community surveillance. To expand the pool, we sought to develop a network encompassing academic and community obstetrics clinics. Typical challenges in developing a network include site identification, contracting, onboarding sites, staff engagement, participant recruitment, funding, and institutional review board approvals. While not insurmountable, these challenges became magnified as we built a research network during a global pandemic. Our objective is to describe the framework utilized to resolve pandemic-related issues.
Methods:
We developed a framework for site-specific adaptation of the generalized study protocol. Twice monthly video meetings were held between the lead academic sites to identify local challenges and to generate ideas for solutions. We identified site and participant recruitment challenges and then implemented solutions tailored to the local workflow. These solutions included the use of an electronic consent and videoconferences with local clinic leadership and staff. The processes for network development and maintenance changed to address issues related to the COVID-19 pandemic. However, aspects of the sample processing/storage and data collection elements were held constant between sites.
Results:
Adapting our consenting approach enabled maintaining study enrollment during the pandemic. The pandemic amplified issues related to contracting, onboarding, and IRB approval. Maintaining continuity in sample management and clinical data collection allowed for pooling of information between sites.
Conclusions:
Adaptability is key to maintaining network sites. Navigating the rapidly changing guidelines for beginning and continuing research during the pandemic required frequent intra- and inter-institutional communication.
In this paper we argue that to understand the difference Posthumanism makes to the relationship between archaeology, agency and ontology, several misconceptions need to be corrected. First, we emphasize that Posthumanism is multiple, with different elements, meaning any critique needs to be carefully targeted. The approach we advocate is a specifically Deleuzian and explicitly feminist approach to Posthumanism. Second, we examine the status of agency within Posthumanism and suggest that we may be better off thinking about affect. Third, we explore how the approach we advocate treats difference in new ways, not as a question of lack, or as difference ‘from’, but rather as a productive force in the world. Finally, we explore how Posthumanism allows us to re-position the role of the human in archaeology.
Our objective was to describe, for the first time in an English-speaking Caribbean country, the contribution of ultra-processed foods (UPFs) to nutrients linked to non-communicable disease. Using a cross-sectional study design, dietary data were collected from two non-consecutive 24-h dietary recalls. Recorded food items were then classified according to their degree of processing by the NOVA system. The present study took place in Barbados (2012–13). A representative population-based sample of 364 adult Barbadians (161 males and 203 females) aged 25–64 years participated in the study. UPFs represented 40⋅5 % (838 kcal/d; 95 % CI 791, 885) of mean energy intake. Sugar-sweetened beverages made the largest contribution to energy within the UPF category. Younger persons (25–44 years) consumed a significantly higher proportion of calories from UPF (NOVA group 4) than older persons (45–64 years). The mean energy shares of UPF ranged from 22⋅0 % in the lowest tertile to 58⋅9 % in the highest tertile. Within each tertile, the energy contribution was significantly higher in the younger age group (25–44 years) than in the older (45–64 years). One-quarter of persons consumed ≥50 % of their daily calories from UPF, the proportion being significantly higher among younger persons. The ultra-processed diet fraction contained about six times the mean of free sugars and about 0⋅8 times the dietary fibre of the non-ultra-processed fraction (NOVA groups 1–3). Targeted interventions to decrease the consumption of UPF, especially in younger persons, are thus a high priority for improving the diet quality of Barbadians.
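The per-person UPF energy shares and tertiles underpinning these results can be illustrated with a short sketch (hypothetical file and column names; not the study's code):

```python
# Share of daily energy from ultra-processed foods (NOVA group 4) per
# person, then tertiles of that share. Names are hypothetical.
import pandas as pd

items = pd.read_csv("recall_items.csv")  # one row per recalled food item:
                                         # person_id, kcal, nova_group (1-4)

upf_pct = items.groupby("person_id").apply(
    lambda g: 100 * g.loc[g["nova_group"] == 4, "kcal"].sum() / g["kcal"].sum()
).rename("upf_pct_energy")

print(upf_pct.describe())  # distribution of UPF energy share

# Tertiles of UPF energy share, as used in the analysis above.
tertile = pd.qcut(upf_pct, 3, labels=["lowest", "middle", "highest"])
print(upf_pct.groupby(tertile).mean())
```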
The COVID-19 pandemic prompted the development and implementation of hundreds of clinical trials across the USA. The Trial Innovation Network (TIN), funded by the National Center for Advancing Translational Sciences, was an established clinical research network that pivoted to respond to the pandemic.
Methods:
The TIN’s three Trial Innovation Centers, its Recruitment Innovation Center, and 66 Clinical and Translational Science Award Hub institutions collaborated to adapt to the pandemic’s rapidly changing landscape, playing central roles in the planning and execution of pivotal studies addressing COVID-19. Our objective was to summarize the results of these collaborations and the lessons learned.
Results:
The TIN provided 29 COVID-related consults between March 2020 and December 2020, including 6 trial participation expressions of interest and 8 community engagement studios from the Recruitment Innovation Center. Key lessons learned from these experiences include the benefits of leveraging an established infrastructure, innovations surrounding remote research activities, data harmonization and central safety reviews, and early community engagement and involvement.
Conclusions:
Our experience highlighted the benefits and challenges of a multi-institutional approach to clinical research during a pandemic.