The brain can be represented as a network, with nodes as brain regions and edges as region-to-region connections. Nodes with the most connections (hubs) are central to efficient brain function. Current findings on structural differences in Major Depressive Disorder (MDD) identified using network approaches remain inconsistent, potentially due to small sample sizes. It is still uncertain at what level of the connectome hierarchy differences may exist, and whether they are concentrated in hubs, disrupting fundamental brain connectivity.
Methods
We utilized two large cohorts, UK Biobank (UKB, N = 5104) and Generation Scotland (GS, N = 725), to investigate MDD case–control differences in brain network properties. Network analysis was done across four hierarchical levels: (1) global, (2) tier (nodes grouped into four tiers based on degree) and rich club (between-hub connections), (3) nodal, and (4) connection.
Results
In UKB, reductions in network efficiency were observed in MDD cases globally (d = −0.076, pFDR = 0.033), across all tiers (d = −0.069 to −0.079, pFDR = 0.020), and in hubs (d = −0.080 to −0.113, pFDR = 0.013–0.035). No differences in rich club organization or region-to-region connections were identified. The effect sizes and directions of these associations were generally consistent in GS, although not significant in our smaller replication sample.
Conclusion
Our results suggest that the brain's fundamental rich club structure is similar in MDD cases and controls, but subtle topological differences exist across the brain. Consistent with recent large-scale neuroimaging studies, our findings offer a connectomic perspective at a similar scale and support the idea that only minimal differences exist between MDD cases and controls.
Older adults have low levels of mental health literacy relating to anxiety, which may contribute to delayed help-seeking or not seeking help at all. Lifestyle interventions, including physical activity (PA), have increasing evidence supporting their effectiveness in reducing anxiety. The COVID-19 pandemic also highlighted the potential for technology to facilitate healthcare provision. This study aimed to investigate the perspectives of older adults on their understanding of anxiety, the possible use of PA interventions to reduce anxiety, and whether technology could help this process.
Methods:
The INDIGO trial evaluated a PA intervention for participants aged 60 years and above at risk of cognitive decline and not meeting PA guidelines. Twenty-nine of the INDIGO trial completers, including some with anxiety and/or cognitive symptoms, attended this long-term follow-up study including semi-structured qualitative interviews. Transcripts were analyzed thematically.
Results:
Understanding of anxiety was quite diverse amongst participants. Some participants were able to describe anxiety as involving worry, uncertainty and fear, as well as relating it to physical manifestations and feeling out of control. Others had less understanding of the concept of anxiety or found it confusing. Participants generally believed that PA could potentially reduce anxiety and thought that this could occur through a “mindfulness” and/or “physiological” process. Technology use was a more controversial topic: some participants clearly expressed a dislike or distrust of technology, or had limited access to it or literacy with it. Participants who were supportive of using technology described that it could help with motivation, information provision and health monitoring. Wearable activity monitors were described favorably, with online platforms and portable devices also being options.
Conclusion:
Our results highlight the importance of providing older adults with more information and education about anxiety. This may increase awareness of anxiety and reduce delays in seeking help, or failure to seek help at all. Findings also emphasize the need for clinicians to support understanding of anxiety in the older adults they see, and to provide information and education where needed. It is likely that PA interventions to reduce anxiety, offered with an optional, supported technology component, will be acceptable to most older adults.
Early environmental experience can have significant effects on an animal's ability to adapt to challenges in later life. Prior experience of specific situations may facilitate the development of behavioural skills that can be applied in similar situations later in life. In addition, exposure to a more complex environment may enhance cognitive development (e.g. increased synaptic density), which can then speed the acquisition of new behavioural responses when faced with novel challenges (Grandin 1989).
This paper provides an overview and appraisal of the International Design Engineering Annual (IDEA) challenge - a virtually hosted design hackathon run with the aim of generating a design research dataset that can provide insights into design activities at virtually hosted hackathons. The resulting dataset consists of 200+ prototypes with over 1300 connections providing insights into the products, processes and people involved in the design process. The paper also provides recommendations for future deployments of virtual hackathons for design research.
Theories of early cooperation in human society often draw from a small sample of ethnographic studies of surviving populations of hunter–gatherers, most of which are now sedentary. Borneo hunter–gatherers (Punan, Penan) have seldom figured in comparative research because of a decades-old controversy about whether they are the descendants of farmers who adopted a hunting and gathering way of life. In 2018 we began an ethnographic study of a group of still-nomadic hunter–gatherers who call themselves Punan Batu (Cave Punan). Our genetic analysis clearly indicates that they are very unlikely to be the descendants of neighbouring agriculturalists. They also preserve a song language that is unrelated to other languages of Borneo. Dispersed travelling groups of Punan Batu with fluid membership use message sticks to stay in contact, co-operate and share resources as they journey between rock shelters and forest camps. Message sticks were once widespread among nomadic Punan in Borneo, but have largely disappeared in sedentary Punan villages. Thus the small community of Punan Batu offers a rare glimpse of a hunting and gathering way of life that was once widespread in the forests of Borneo, where prosocial behaviour extended beyond the face-to-face community, facilitating successful collective adaptation to the diverse resources of Borneo's forests.
Due to shortages of N95 respirators during the coronavirus disease 2019 (COVID-19) pandemic, it is necessary to estimate the number of N95s required for healthcare workers (HCWs) to inform manufacturing targets and resource allocation.
Methods:
We developed a model to determine the number of N95 respirators needed for HCWs both in a single acute-care hospital and the United States.
Results:
For an acute-care hospital with 400 all-cause monthly admissions, the number of N95 respirators needed to manage COVID-19 patients admitted during a month ranges from 113 (95% interpercentile range [IPR], 50–229) if 0.5% of admissions are COVID-19 patients to 22,101 (95% IPR, 5,904–25,881) if 100% of admissions are COVID-19 patients (assuming single use per respirator, and 10 encounters between HCWs and each COVID-19 patient per day). The number of N95s needed decreases to a range of 22 (95% IPR, 10–43) to 4,445 (95% IPR, 1,975–8,684) if each N95 is used for 5 patient encounters. Varying monthly all-cause admissions to 2,000 requires 6,645–13,404 respirators with a 60% COVID-19 admission prevalence, 10 HCW–patient encounters, and reusing N95s 5–10 times. Nationally, the number of N95 respirators needed over the course of the pandemic ranges from 86 million (95% IPR, 37.1–200.6 million) to 1.6 billion (95% IPR, 0.7–3.6 billion) as 5%–90% of the population is exposed (single-use). This number ranges from 17.4 million (95% IPR, 7.3–41 million) to 312.3 million (95% IPR, 131.5–737.3 million) using each respirator for 5 encounters.
Conclusions:
We quantified the number of N95 respirators needed for a given acute-care hospital and nationally during the COVID-19 pandemic under varying conditions.
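The scaling in the results above can be illustrated with a simple deterministic sketch. The mean length of stay and all other parameter values below are illustrative assumptions, not the model's published inputs (the published model is stochastic, hence its interpercentile ranges), so the point estimates only approximate the reported figures.

```python
import math

def n95_needed(monthly_admissions, covid_fraction, encounters_per_day,
               mean_los_days, encounters_per_respirator=1):
    """Point estimate of monthly N95 demand for a single acute-care hospital.

    mean_los_days is an assumed mean COVID-19 length of stay; all values
    are illustrative, not the study's exact parameters.
    """
    covid_admissions = monthly_admissions * covid_fraction
    hcw_encounters = covid_admissions * encounters_per_day * mean_los_days
    return math.ceil(hcw_encounters / encounters_per_respirator)

# 400 monthly admissions, all COVID-19, 10 HCW encounters per patient-day,
# assumed ~5.5-day mean stay, single-use respirators:
print(n95_needed(400, 1.0, 10, 5.5))     # 22000 (study reported 22,101)
print(n95_needed(400, 1.0, 10, 5.5, 5))  # 4400 (study reported 4,445)
```

Reusing each respirator for five encounters divides demand by five, which is the main lever visible in the reported ranges.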
A classic example of microbiome function is its role in nutrient assimilation in both plants and animals, but other less obvious roles are becoming more apparent, particularly in terms of driving infectious and non-infectious disease outcomes and influencing host behaviour. However, numerous biotic and abiotic factors influence the composition of these communities, and host microbiomes can be susceptible to environmental change. How microbial communities will be altered by, and mitigate, the rapid environmental change we can expect in the next few decades remains to be seen. That said, given the enormous range of functional diversity conferred by microbes, there is currently something of a revolution in microbial bioengineering and biotechnology to address real-world problems, including human and wildlife disease and crop and biofuel production. All of these concepts are explored in further detail throughout the book.
The adsorption of the insecticide methomyl (S-methyl N-(methylcarbamoyloxy)thioacetimidate) by smectites with different layer charge (SWy and SAz montmorillonites and SH-Ca hectorite) has been determined. Adsorption has been expressed as the adsorbent/adsorbate distribution coefficient Kd, which increased when the layer charge of the smectite decreased. The Kd values for homoionic SWy montmorillonite samples increased when the ionic potential of the interlayer cation decreased, except for SWy-Fe3+. Infrared (IR) spectra and X-ray diffraction (XRD) analysis of the SWy-K+ and SWy-Na+ montmorillonites treated with methomyl suggest that pesticide molecules interact through polar bonds with the interlamellar cation. In the case of SWy-Fe3+, kinetic studies, high performance liquid chromatography (HPLC), IR and XRD data indicated that adsorption and degradation both account for the high Kd value obtained.
Sorption of the polar insecticide imidacloprid on organic-saturated octadecylammonium (C18) and dioctadecyldimethylammonium (DOD) and inorganic (Fe)-saturated Wyoming (W) and Arizona (A) montmorillonites has been investigated. Sorption isotherms were fitted to the Freundlich equation. Imidacloprid-montmorillonite complexes were studied by X-ray diffraction and FT-IR techniques. Imidacloprid sorption coefficients, Kf, decreased in the order WC18 > AC18 > WFe > WDOD ≥ ADOD. The low layer charge and saturation by a primary alkylammonium cation facilitate sorption of imidacloprid in the interlayer of the smectite, corroborated by the increase in basal spacing observed in X-ray diffraction patterns and by the presence of absorption band shifts in FT-IR spectra. Imidacloprid sorbs in the interlayer space of smectite mainly by hydrophobic interactions with the alkyl chains in organic smectites and with the uncharged siloxane surface in Fe(III)-smectite. Further polar bonds between the NO2 group of imidacloprid and the NH of the primary alkyl cations, and protonation of imidacloprid in Fe-smectites, enhanced sorption in these cases.
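As a rough illustration of the Freundlich fitting step mentioned above, the equation S = Kf·C^(1/n) is commonly linearized as log S = log Kf + (1/n) log C and fitted by least squares. The data below are synthetic, generated for illustration; they are not the study's measurements.

```python
import math

def fit_freundlich(c, s):
    """Fit S = Kf * C**(1/n) via least squares on log S vs. log C.

    c: equilibrium solution concentrations; s: sorbed amounts.
    Returns (Kf, n).
    """
    x = [math.log10(ci) for ci in c]
    y = [math.log10(si) for si in s]
    m = len(x)
    xbar, ybar = sum(x) / m, sum(y) / m
    slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
             / sum((xi - xbar) ** 2 for xi in x))   # slope = 1/n
    intercept = ybar - slope * xbar                 # intercept = log10(Kf)
    return 10 ** intercept, 1 / slope

# Synthetic isotherm generated with Kf = 2.0 and 1/n = 0.8:
c = [0.5, 1.0, 2.0, 5.0, 10.0]
s = [2.0 * ci ** 0.8 for ci in c]
kf, n = fit_freundlich(c, s)
print(round(kf, 2), round(1 / n, 2))  # recovers Kf = 2.0, 1/n = 0.8
```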
We hypothesized that a computerized clinical decision support tool for Clostridium difficile testing would reduce unnecessary inpatient tests, resulting in fewer laboratory-identified events. Census-adjusted interrupted time-series analyses demonstrated significant reductions following this intervention: 41% fewer tests and 31% fewer hospital-onset C. difficile infection laboratory-identified events.
Recent evidence suggests that exercise plays a role in cognition and that the posterior cingulate cortex (PCC) can be divided into dorsal and ventral subregions based on distinct connectivity patterns.
Aims
To examine the effects of physical activity and PCC subdivision on brain functional connectivity measures in subjective memory complainers (SMC) carrying the ɛ4 allele of apolipoprotein E (APOE ɛ4).
Method
Participants were 22 SMC carrying the APOE ɛ4 allele (ɛ4+; mean age 72.18 years) and 58 SMC non-carriers (ɛ4–; mean age 72.79 years). Connectivity of four dorsal and ventral seeds was examined. Relationships between PCC connectivity and physical activity measures were explored.
Results
ɛ4+ individuals showed increased connectivity between the dorsal PCC and dorsolateral prefrontal cortex, and the ventral PCC and supplementary motor area (SMA). Greater levels of physical activity correlated with the magnitude of ventral PCC–SMA connectivity.
Conclusions
The results provide the first evidence that ɛ4+ individuals at increased risk of cognitive decline show distinct alterations in dorsal and ventral PCC functional connectivity.
Introduction: Point of care ultrasound (PoCUS) has become an established tool in the initial management of patients with undifferentiated hypotension in the emergency department (ED). Current established protocols (e.g. RUSH and ACES) were developed from expert opinion rather than objective, prospective data. Recently the SHoC Protocol was published, recommending 3 core scans: cardiac, lung, and IVC; plus other scans when indicated clinically. We report the abnormal ultrasound findings from our international multicenter randomized controlled trial, to assess whether the recommended 3 core SHoC protocol scans were chosen appropriately for this population. Methods: Recruitment occurred at seven centres in North America (4) and South Africa (3). Screening at triage identified patients (SBP<100 or shock index>1) who were randomized to PoCUS or control (standard care with no PoCUS) groups. All scans were performed by PoCUS-trained physicians within one hour of arrival in the ED. Demographics, clinical details and study findings were collected prospectively. A threshold incidence of 10% for positive findings was established as significant for the purposes of assessing the appropriateness of the core recommendations. Results: 138 patients had a PoCUS screen completed. All patients had cardiac, lung, IVC, aorta, abdominal, and pelvic scans. Reported abnormal findings included hyperdynamic LV function (59; 43%); small collapsing IVC (46; 33%); pericardial effusion (24; 17%); pleural fluid (19; 14%); hypodynamic LV function (15; 11%); large poorly collapsing IVC (13; 9%); peritoneal fluid (13; 9%); and aortic aneurysm (5; 4%). Conclusion: The 3 core SHoC Protocol recommendations included appropriate scans to detect all pathologies recorded at a rate of greater than 10 percent. The 3 most frequent findings were cardiac and IVC abnormalities, followed by lung. Peritoneal fluid was seen at a rate of 9%. Aortic aneurysms were rare.
These data, from the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients, support the use of the prioritized SHoC protocol, though a larger study is required to confirm these findings.
Introduction: Point of care ultrasound (PoCUS) is an established tool in the initial management of patients with undifferentiated hypotension in the emergency department (ED). While PoCUS protocols have been shown to improve early diagnostic accuracy, there is little published evidence for any mortality benefit. We report the findings from our international multicenter randomized controlled trial, assessing the impact of a PoCUS protocol on survival and key clinical outcomes. Methods: Recruitment occurred at 7 centres in North America (4) and South Africa (3). Scans were performed by PoCUS-trained physicians. Screening at triage identified patients (SBP<100 or shock index>1), randomized to PoCUS or control (standard care and no PoCUS) groups. Demographics, clinical details and study findings were collected prospectively. Initial and secondary diagnoses were recorded at 0 and 60 minutes, with ultrasound performed in the PoCUS group prior to secondary assessment. The primary outcome measure was 30-day/discharge mortality. Secondary outcome measures included diagnostic accuracy, changes in vital signs, acid-base status, and length of stay. Categorical data were analyzed using Fisher's test, and continuous data by Student's t-test and multi-level log-regression testing (GraphPad/SPSS). Final chart review was blinded to initial impressions and PoCUS findings. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. There was no difference between groups for the primary outcome of mortality: PoCUS 32/129 (24.8%; 95% CI 14.3-35.3%) vs. Control 32/129 (24.8%; 95% CI 14.3-35.3%); RR 1.00 (95% CI 0.869 to 1.15; p=1.00). There were no differences in the secondary outcomes of ICU and total length of stay. Our sample size has a power of 0.80 (α:0.05) for a moderate effect size. Other secondary outcomes are reported separately.
Conclusion: This is the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients. We did not find any mortality or length of stay benefits with the use of a PoCUS protocol, though a larger study is required to confirm these findings. While PoCUS may have diagnostic benefits, these may not translate into a survival benefit.
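The relative risks reported above follow from a standard two-by-two computation. A minimal sketch, using a generic Wald log-scale confidence interval, which need not reproduce the exact intervals produced by the trial's statistical software:

```python
import math

def relative_risk(a, n1, b, n2, z=1.96):
    """Relative risk of events a/n1 (intervention) vs. b/n2 (control),
    with a textbook Wald confidence interval on the log scale."""
    p1, p2 = a / n1, b / n2
    rr = p1 / p2
    se = math.sqrt((1 - p1) / a + (1 - p2) / b)  # SE of log(RR)
    lo, hi = rr * math.exp(-z * se), rr * math.exp(z * se)
    return rr, lo, hi

# Mortality in the trial above: 32/129 in each arm, so RR is exactly 1.0.
rr, lo, hi = relative_risk(32, 129, 32, 129)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

Identical event counts in both arms force RR = 1.0 regardless of the interval method used.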
Introduction: Point of Care Ultrasound (PoCUS) protocols are commonly used to guide resuscitation for emergency department (ED) patients with undifferentiated non-traumatic hypotension. While PoCUS has been shown to improve early diagnosis, there is minimal evidence for any outcome benefit. We completed an international multicenter randomized controlled trial (RCT) to assess the impact of a PoCUS protocol on key resuscitation markers in this group. We report diagnostic impact and mortality elsewhere. Methods: The SHoC-ED1 study compared the addition of PoCUS to standard care within the first hour in the treatment of adult patients presenting with undifferentiated hypotension (SBP<100 mmHg or a Shock Index >1.0) with a control group that did not receive PoCUS. Scans were performed by PoCUS-trained physicians. 4 North American and 3 South African sites participated in the study. Resuscitation outcomes analyzed included volume of fluid administered in the ED and changes in shock index (SI), modified early warning score (MEWS), venous acid-base balance, and lactate, at one and four hours. Comparisons utilized a t-test as well as stratified binomial log-regression to assess for any significant improvement in resuscitation among the outcomes. Our sample size was powered at 0.80 (α:0.05) for a moderate effect size. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. There was no significant difference in mean total volume of fluid received between the control (1658 ml; 95% CI 1365-1950) and PoCUS groups (1609 ml; 95% CI 1385-1832; p=0.79). Significant improvements were seen in SI, MEWS, lactate and bicarbonate with resuscitation in both the PoCUS and control groups; however, there was no difference between groups. Conclusion: SHoC-ED1 is the first RCT to compare PoCUS to standard of care in hypotensive ED patients.
No significant difference in fluid used, or markers of resuscitation was found when comparing the use of a PoCUS protocol to that of standard of care in the resuscitation of patients with undifferentiated hypotension.
Introduction: Point of care ultrasonography (PoCUS) is an established tool in the initial management of hypotensive patients in the emergency department (ED). It has been shown to rule out certain shock etiologies and improve diagnostic certainty; however, evidence of benefit in the management of hypotensive patients is limited. We report the findings from our international multicenter RCT assessing the impact of a PoCUS protocol on diagnostic accuracy, as well as other key outcomes including mortality, which are reported elsewhere. Methods: Recruitment occurred at 4 North American and 3 Southern African sites. Screening at triage identified patients (SBP<100 mmHg or shock index >1) who were randomized to either PoCUS or control groups. Scans were performed by PoCUS-trained physicians. Demographics, clinical details and findings were collected prospectively. Initial and secondary diagnoses were recorded at 0 and 60 minutes, with ultrasound performed in the PoCUS group prior to secondary assessment. Final chart review was blinded to initial impressions and PoCUS findings. Categorical data were analyzed using Fisher's two-tailed test. Our sample size was powered at 0.80 (α:0.05) for a moderate effect size. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. The perceived shock category changed more frequently in the PoCUS group: 20/127 (15.7%) vs. control 7/125 (5.6%); RR 2.81 (95% CI 1.23 to 6.42; p=0.0134). There was no significant difference in change of diagnostic impression between groups: PoCUS 39/123 (31.7%) vs. control 34/124 (27.4%); RR 1.16 (95% CI 0.786 to 1.70; p=0.4879). There was no significant difference in the rate of correct category of shock between PoCUS (118/127; 93%) and control (113/122; 93%); RR 1.00 (95% CI 0.936 to 1.08; p=1.00), or for correct diagnosis: PoCUS 90/127 (70%) vs. control 86/122 (70%); RR 0.987 (95% CI 0.671 to 1.45; p=1.00).
Conclusion: This is the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients. We found that the use of PoCUS did change physicians’ perceived shock category. PoCUS did not improve diagnostic accuracy for category of shock or diagnosis.
Civilian suicide rates vary by occupation in ways related to occupational stress exposure. Comparable military research finds suicide rates elevated in combat arms occupations. However, no research has evaluated variation in this pattern by deployment history, the indicator of occupational stress widely considered responsible for the recent rise in the military suicide rate.
Method
The joint associations of Army occupation and deployment history in predicting suicides were analysed in an administrative dataset for the 729 337 male enlisted Regular Army soldiers in the US Army between 2004 and 2009.
Results
There were 496 suicides over the study period (22.4/100 000 person-years). Only two occupational categories, both in combat arms, had significantly elevated suicide rates: infantrymen (37.2/100 000 person-years) and combat engineers (38.2/100 000 person-years). However, the suicide rates in these two categories were significantly lower when currently deployed (30.6/100 000 person-years) than when never deployed or previously deployed (41.2–39.1/100 000 person-years), whereas the suicide rate of other soldiers was significantly higher when currently or previously deployed (20.2–22.4/100 000 person-years) than when never deployed (14.5/100 000 person-years). As a result, the adjusted suicide rate of infantrymen and combat engineers was most elevated when never deployed [odds ratio (OR) 2.9, 95% confidence interval (CI) 2.1–4.1], less so when previously deployed (OR 1.6, 95% CI 1.1–2.1), and not at all when currently deployed (OR 1.2, 95% CI 0.8–1.8). Adjustment for a differential ‘healthy warrior effect’ cannot explain this variation in the relative suicide rates of never-deployed infantrymen and combat engineers by deployment status.
Conclusions
Efforts are needed to elucidate the causal mechanisms underlying this interaction to guide preventive interventions for soldiers at high suicide risk.
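The person-year rates quoted throughout these results follow from a single formula. A minimal sketch; the person-time denominator below is back-calculated from the reported overall rate for illustration, since the study's exact person-time total is not given in the abstract:

```python
def rate_per_100k(events, person_years):
    """Crude incidence rate per 100,000 person-years."""
    return events / person_years * 100_000

# 496 suicides over an assumed ~2.21 million person-years
# (back-calculated from the reported 22.4/100,000 person-years):
print(round(rate_per_100k(496, 2_214_000), 1))  # 22.4
```

Person-years, rather than a simple head count, are the appropriate denominator here because soldiers enter and leave the Regular Army at different times within the 2004–2009 window.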
Q fever patients are often reported to experience a long-term impaired health status, including fatigue, which can persist for many years. During the large Q fever epidemic in The Netherlands, many patients with a laboratory-confirmed Coxiella burnetii infection were not notified as acute Q fever because they did not fulfil the clinical criteria of the acute Q fever case definition (fever, pneumonia and/or hepatitis). Our study assessed and compared the long-term health status of notified and non-notified Q fever patients at 4 years after onset of illness, using the Nijmegen Clinical Screening Instrument (NCSI). The study included 448 notified and 193 non-notified Q fever patients. The most severely affected subdomain in both patient groups was ‘Fatigue’ (50·5% of the notified and 54·6% of the non-notified patients had severe fatigue). Long-term health status did not differ significantly between the notified and non-notified patient groups, and patients scored worse on all subdomains compared to a healthy reference group. Our findings suggest that the magnitude of the 2007–2009 Q fever outbreak in The Netherlands is underestimated when only patients notified according to the European Union case definition are considered.
The Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS) has found that the proportional elevation in the US Army enlisted soldier suicide rate during deployment (compared with the never-deployed or previously deployed) is significantly higher among women than men, raising the possibility of gender differences in the adverse psychological effects of deployment.
Method
Person-month survival models based on a consolidated administrative database for active duty enlisted Regular Army soldiers in 2004–2009 (n = 975 057) were used to characterize the gender × deployment interaction predicting suicide. Four explanatory hypotheses were explored involving the proportion of females in each soldier's occupation, the proportion of same-gender soldiers in each soldier's unit, whether the soldier reported sexual assault victimization in the previous 12 months, and the soldier's pre-deployment history of treated mental/behavioral disorders.
Results
The suicide rate of currently deployed women (14.0/100 000 person-years) was 3.1–3.5 times the rates of other (i.e. never-deployed/previously deployed) women. The suicide rate of currently deployed men (22.6/100 000 person-years) was 0.9–1.2 times the rates of other men. The adjusted (for time trends, sociodemographics, and Army career variables) female:male odds ratio comparing the suicide rates of currently deployed v. other women v. men was 2.8 (95% confidence interval 1.1–6.8), became 2.4 after excluding soldiers with Direct Combat Arms occupations, and remained elevated (in the range 1.9–2.8) after adjusting for the hypothesized explanatory variables.
Conclusions
These results are valuable in excluding otherwise plausible hypotheses for the elevated suicide rate of deployed women and point to the importance of expanding future research on the psychological challenges of deployment for women.
We describe the efficacy of enhanced infection control measures, including those recommended in the Centers for Disease Control and Prevention’s 2012 carbapenem-resistant Enterobacteriaceae (CRE) toolkit, to control concurrent outbreaks of carbapenemase-producing Enterobacteriaceae (CPE) and extensively drug-resistant Acinetobacter baumannii (XDR-AB).
Design
Before-after intervention study.
Setting
Fifteen-bed surgical trauma intensive care unit (ICU).
Methods
We investigated the impact of enhanced infection control measures in response to clusters of CPE and XDR-AB infections in an ICU from April 2009 to March 2010. Polymerase chain reaction was used to detect the presence of blaKPC and resistance plasmids in CRE. Pulsed-field gel electrophoresis was performed to assess XDR-AB clonality. Enhanced infection-control measures were implemented in response to ongoing transmission of CPE and a new outbreak of XDR-AB. Efficacy was evaluated by comparing the incidence rate (IR) of CPE and XDR-AB before and after the implementation of these measures.
Results
The IR of CPE for the 12 months before the implementation of enhanced measures was 7.77 cases per 1,000 patient-days, whereas the IR of XDR-AB for the 3 months before implementation was 6.79 cases per 1,000 patient-days. All examined CPE shared endemic blaKPC resistance plasmids, and 6 of the 7 XDR-AB isolates were clonal. Following institution of enhanced infection control measures, the CPE IR decreased to 1.22 cases per 1,000 patient-days (P = .001), and no more cases of XDR-AB were identified.
Conclusions
Use of infection control measures described in the Centers for Disease Control and Prevention’s 2012 CRE toolkit was associated with a reduction in the IR of CPE and an interruption in XDR-AB transmission.