Cardiometabolic diseases, including type 2 diabetes (T2DM) and cardiovascular disease (CVD), are common. Approximately one in three deaths annually in Aotearoa New Zealand (AoNZ) is caused by CVD(1). The Mediterranean dietary pattern is associated with a reduced risk of cardiometabolic disease in epidemiological and interventional studies(2,3). However, implementing the Mediterranean diet in non-Mediterranean populations can be challenging(4). These challenges include facilitating consumption of unfamiliar foods and accommodating the cultural and social context of food consumption. AoNZ produces a rich supply of high-quality foods consistent with a Mediterranean dietary pattern. He Rourou Whai Painga is a collaborative project that combines contributions from food industry partners into a Mediterranean diet pattern and provides foods, recipes and other support to whole households/whānau. The aim was to test whether a New Zealand food-based Mediterranean diet (NZMedDiet) with a behavioural intervention improves cardiometabolic health and wellbeing in individuals at risk. This presentation reviews the background to the research, the process of forming a collaboration between researchers and the food industry, and the design and implementation of a complex study (see protocol paper)(5), with results from the initial randomised controlled trial. We conducted several pilot studies(6,7,8) to inform the final design of the research, which was a combination of two randomised controlled trials (RCT 1 and 2) and a longitudinal cohort study. RCT-1 compared 12 weeks of the NZMedDiet to usual diet in participants with increased cardiometabolic risk (metabolic syndrome severity score (MetSSS) >0.35). The intervention group was provided with food and recipes to meet 75% of their energy requirements, supported by a behavioural intervention to improve adherence. The primary outcome measure was MetSSS after 12 weeks.
Two hundred individuals (mean (SD) age 49.9 (10.9) years; 62% women) were enrolled with their household/whānau. After 12 weeks, the mean (SD) MetSSS was 1.0 (0.7) in the control (n = 98) and 0.8 (0.5) in the intervention (n = 102) group; estimated difference (95% CI) −0.05 (−0.16 to 0.06), p = 0.35. A Mediterranean diet score (PyrMDS) was greater in the intervention group (1.6 (1.1 to 2.1), p < 0.001), consistent with a change to a more Mediterranean dietary pattern. Weight was reduced in the NZMedDiet group compared with control (−1.9 kg (−2.0 to −0.34), p = 0.006), and wellbeing, assessed by the SF-36 quality of life questionnaire, improved across all domains (p < 0.001). In participants with increased cardiometabolic risk, food provision with a Mediterranean dietary pattern and a behavioural intervention did not improve a metabolic risk score but was associated with reduced weight and improved quality of life.
We present the Evolutionary Map of the Universe (EMU) survey conducted with the Australian Square Kilometre Array Pathfinder (ASKAP). EMU aims to deliver the touchstone radio atlas of the southern hemisphere. We introduce EMU and review its science drivers and key science goals, updated and tailored to the current ASKAP five-year survey plan. The development of the survey strategy and planned sky coverage is presented, along with the operational aspects of the survey and associated data analysis, together with a selection of diagnostics demonstrating the imaging quality and data characteristics. We give a general description of the value-added data pipeline and data products before concluding with a discussion of links to other surveys and projects and an outline of EMU’s legacy value.
Precision or “personalized” medicine and “big data” are growing trends in the biomedical research community, reflecting an increased focus on access to larger datasets to effectively explore disease processes at the molecular level, in contrast to the previously common one-size-fits-all approach. This focus necessitated a local transition from independent labs and siloed projects to a single software application utilizing a common ontology to provide access to data from multiple repositories. Use of a common system has eased collaboration and improved access to quality biospecimens that are extensively annotated with clinical, molecular, and patient-associated data. The software needed to function at an enterprise level while continuing to allow investigators the autonomy and security access they desire. To identify a solution, a working group comprising representatives from independent repositories and areas of research focus across departments was established and made responsible for the review and implementation of an enterprise-wide biospecimen management system. Central to this process was the creation of a unified vocabulary across all repositories, including consensus around the source of truth, standardized field definitions, and shared terminology.
This paper reports an experiment designed to assess the effects of a rotation in the marginal cost curve on convergence in a repeated Cournot triopoly. Increasing the cost curve's slope both reduces the serially undominated set to the Nash prediction and increases the peakedness of earnings. We observe higher rates of Nash equilibrium play in the design with the steeper marginal cost schedule, but only when participants are also rematched after each decision. Examination of response patterns suggests that the treatment with a steeper marginal cost curve and with re-matching of participants across periods induces the selection of Nash-consistent responses.
This article replicates and “stress tests” a recent finding by Eckel and Grossman (2003) that matching subsidies generate substantially higher charity receipts than theoretically comparable rebate subsidies. In a first replication treatment, we show that most choices are consistent with a “constant (gross) contribution” rule, suggesting that inattention to the subsidies’ differing net consequences may explain the higher revenues elicited with matching subsidies. Results of additional treatments suggest that (a) the charity dimension of the decision problems has little to do with the result, and (b) extra information regarding the net consequences of decisions reduces but does not eliminate the result.
Racial and ethnic variations in antibiotic utilization are well-reported in outpatient settings but little is known about inpatient settings. Our objective was to describe national inpatient antibiotic utilization among children by race and ethnicity.
Methods:
This study included hospital visit data from the Pediatric Health Information System between 01/01/2022 and 12/31/2022 for patients <20 years. Primary outcomes were the percentage of hospitalization encounters that received an antibiotic and antibiotic days of therapy (DOT) per 1000 patient days. Mixed-effect regression models were used to determine the association of race-ethnicity with outcomes, adjusting for covariates.
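The days-of-therapy rate described above can be illustrated with a minimal sketch. This is not code from the study; the encounter fields (`antibiotic_days`, `length_of_stay`) are hypothetical names standing in for the Pediatric Health Information System variables, and DOT here follows the usual convention of one DOT per antibiotic per calendar day administered:

```python
def dot_per_1000_patient_days(encounters):
    """Aggregate antibiotic days of therapy (DOT) per 1000 patient days.

    encounters: iterable of dicts with
      'antibiotic_days'  - total DOT for the encounter (sum over drugs of
                           calendar days on which each drug was given)
      'length_of_stay'   - patient days contributed by the encounter
    """
    total_dot = sum(e["antibiotic_days"] for e in encounters)
    total_patient_days = sum(e["length_of_stay"] for e in encounters)
    return 1000 * total_dot / total_patient_days

# Illustrative data only:
encounters = [
    {"antibiotic_days": 4, "length_of_stay": 5},   # e.g. 2 drugs x 2 days
    {"antibiotic_days": 0, "length_of_stay": 3},   # no antibiotics
    {"antibiotic_days": 7, "length_of_stay": 10},
]
rate = dot_per_1000_patient_days(encounters)  # 11 DOT / 18 patient days ≈ 611.1
```

Note that an encounter with two concurrent antibiotics accrues two DOT per day, so the rate can exceed 1000.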
Results:
There were 846,530 hospitalizations. Of these children, 45.2% were Non-Hispanic (NH) White, 27.1% Hispanic, 19.2% NH Black, 4.5% NH Other, 3.5% NH Asian, 0.3% NH Native Hawaiian/Other Pacific Islander (NHPI), and 0.2% NH American Indian. Adjusting for covariates, NH Black children had lower odds of receiving antibiotics compared to NH White children (aOR 0.96, 95% CI 0.94–0.97), while NH NHPI children had higher odds (aOR 1.16, 95% CI 1.05–1.29). Children who were Hispanic, NH Asian, NH American Indian, or NH Other received fewer antibiotic DOT compared to NH White children, while NH NHPI children received more antibiotic DOT.
Conclusions:
Antibiotic utilization in children’s hospitals differs by race and ethnicity. Hospitals should assess policies and practices that may contribute to disparities in treatment; antibiotic stewardship programs may play an important role in promoting inpatient pharmacoequity. Additional research is needed to examine individual diagnoses, clinical outcomes, and drivers of variation.
Attention-deficit/hyperactivity disorder (ADHD) is a highly prevalent psychiatric condition that frequently originates in early development and is associated with a variety of functional impairments. Despite a large functional neuroimaging literature on ADHD, our understanding of the neural basis of this disorder remains limited, and existing primary studies on the topic include somewhat divergent results.
Objectives
The present meta-analysis aims to advance our understanding of the neural basis of ADHD by identifying the most statistically robust patterns of abnormal neural activation throughout the whole-brain in individuals diagnosed with ADHD compared to age-matched healthy controls.
Methods
We conducted a meta-analysis of task-based functional magnetic resonance imaging (fMRI) activation studies of ADHD. Following PRISMA guidelines, this included a comprehensive PubMed search and predetermined inclusion criteria, as well as two independent coding teams who evaluated studies and included all task-based, whole-brain, fMRI activation studies that compared participants diagnosed with ADHD to age-matched healthy controls. We then performed multilevel kernel density analysis (MKDA), a well-established, whole-brain, voxelwise approach that quantitatively combines existing primary fMRI studies, with ensemble thresholding (p < 0.05–0.0001) and multiple-comparisons correction.
Results
Participants diagnosed with ADHD (N=1,550), relative to age-matched healthy controls (N=1,340), exhibited statistically significant (p<0.05–0.0001; FWE-corrected) patterns of abnormal activation in multiple brain regions of the cerebral cortex and basal ganglia across a variety of cognitive control tasks.
Conclusions
This study advances our understanding of the neural basis of ADHD and may aid in the development of new brain-based clinical interventions as well as diagnostic tools and treatment matching protocols for patients with ADHD. Future studies should also investigate the similarities and differences in neural signatures between ADHD and other highly comorbid psychiatric disorders.
This article examines the development, early operation and subsequent failure of the Tot-Kolowa Red Cross irrigation scheme in Kenya’s Kerio Valley. Initially conceived as a technical solution to address regional food insecurity, the scheme aimed to scale up food production through the implementation of a fixed pipe irrigation system and the provision of agricultural inputs for cash cropping. A series of unfolding circumstances, however, necessitated numerous modifications to the original design as the project became increasingly entangled with deep and complex histories of land use patterns, resource allocation and conflict. Failure to understand the complexity of these dynamics ultimately led to the project’s collapse as the region spiralled into a period of significant unrest. In tracing these events, we aim to foreground the lived realities of imposed development, including both positive and negative responses to the scheme’s participatory obligations and its wider impact on community resilience.
The status of the genera Euparagonimus Chen, 1963 and Pagumogonimus Chen, 1963 relative to Paragonimus Braun, 1899 was investigated using DNA sequences from the mitochondrial cytochrome c oxidase subunit I (CO1) gene (partial) and the nuclear ribosomal DNA second internal transcribed spacer (ITS2). In the phylogenetic trees constructed, the genus Pagumogonimus is clearly not monophyletic and therefore not a natural taxon. Indeed, the type species of Pagumogonimus, P. skrjabini from China, is very closely related to Paragonimus miyazakii from Japan. The status of Euparagonimus is less obvious. Euparagonimus cenocopiosus lies distant from other lung flukes included in the analysis. It can be placed as sister to Paragonimus in some analyses and falls within the genus in others. A recently published morphological study placed E. cenocopiosus within the genus Paragonimus, and probably this is where it should remain.
Soil amelioration via strategic deep tillage is occasionally utilized within conservation tillage systems to alleviate soil constraints, but its impact on weed seed burial and subsequent weed growth within the agronomic system is poorly understood. This study assessed the effects of different strategic deep-tillage practices, including soil loosening (deep ripping), soil mixing (rotary spading), or soil inversion (moldboard plow), on weed seed burial and subsequent weed growth, compared with a no-till control. The tillage practices were applied in 2019 at Yerecoin and Darkan, WA, and data on weed seed burial and growth were collected during the following 3-yr winter crop rotation (2019 to 2021). Soil inversion buried 89% of rigid ryegrass (Lolium rigidum Gaudin) and ripgut brome (Bromus diandrus Roth) seeds to a depth of 10 to 20 cm at both sites, while soil loosening and mixing left between 31% and 91% of the seeds in the top 0 to 10 cm of soil, with broad variation between sites. Few seeds were buried beyond 20 cm despite tillage working depths exceeding 30 cm at both sites. Soil inversion reduced the density of L. rigidum to <1 plant m−2 for 3 yr after strategic tillage. Bromus diandrus density was initially reduced to 0 to 1 plant m−2 by soil inversion, but increased to 4 plants m−2 at Yerecoin in 2020 and 147 plants m−2 at Darkan in 2021. Soil loosening or mixing did not consistently decrease weed density. The field data were used to parameterize a model that predicted weed density following strategic tillage with greater accuracy for soil inversion than for loosening or mixing. The findings provide important insights into the effects of strategic deep tillage on weed management in conservation agriculture systems and demonstrate the potential of models for optimizing weed management strategies.
Different fertilization strategies can be adopted to optimize the productive components of an integrated crop–livestock system. The current research evaluated how the application of P and K to soybean (Glycine max (L.) Merr.) or Urochloa brizantha (Hochst. ex A. Rich.) R. D. Webster cv. BRS Piatã, with or without nitrogen in the pasture phase, affects the accumulation and chemical composition of forage and animal productivity. The treatments were distributed in randomized blocks with three replications. Four fertilization strategies were tested: (1) conventional fertilization with P and K in the crop phase (CF–N); (2) conventional fertilization with nitrogen in the pasture phase (CF + N); (3) system fertilization with P and K in the pasture phase (SF–N); (4) system fertilization with nitrogen in the pasture phase (SF + N). System fertilization increased forage accumulation from 15 710 to 20 920 kg DM ha/year compared to conventional fertilization without nitrogen. Stocking rate (3.1 vs. 2.8 AU/ha; SEM = 0.12) and gain per area (458 vs. 413 kg BW/ha; SEM = 27.9) were higher in the SF–N than CF–N, although the average daily gain was lower (0.754 vs. 0.792 kg LW/day; SEM = 0.071). N application in the pasture phase, under both conventional and system fertilization, resulted in higher crude protein, stocking rate and gain per area. Applying nitrogen and relocating P and K from the crop to the pasture phase increases animal productivity and improves forage chemical composition in integrated crop–livestock systems.
Area-based conservation is a widely used approach for maintaining biodiversity, and there are ongoing discussions over what is an appropriate global conservation area coverage target. To inform such debates, it is necessary to know the extent and ecological representativeness of the current conservation area network, but this is hampered by gaps in existing global datasets. In particular, although data on privately and community-governed protected areas and other effective area-based conservation measures are often available at the national level, it can take many years to incorporate these into official datasets. This suggests a complementary approach is needed based on selecting a sample of countries and using their national-scale datasets to produce more accurate metrics. However, every country added to the sample increases the costs of data collection, collation and analysis. To address this, here we present a data collection framework underpinned by a spatial prioritization algorithm, which identifies a minimum set of countries that are also representative of 10 factors that influence conservation area establishment and biodiversity patterns. We then illustrate this approach by identifying a representative set of sampling units that cover 10% of the terrestrial realm, which included areas in only 25 countries. In contrast, selecting 10% of the terrestrial realm at random included areas across a mean of 162 countries. These sampling units could be the focus of future data collation on different types of conservation area. Analysing these data could produce more rapid and accurate estimates of global conservation area coverage and ecological representativeness, complementing existing international reporting systems.
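The core idea of the prioritization above, choosing a small set of sampling units that still represents every class of each factor, can be illustrated with a greedy set-cover sketch. This is an illustration only: the units and factor classes below are invented, and the paper's actual spatial prioritization algorithm and representativeness targets differ in detail:

```python
def greedy_representative_set(units):
    """Greedy minimum-set-cover heuristic.

    units: dict mapping a sampling-unit name to the set of factor
    classes (e.g. biomes, governance types) it contains.
    Returns a small list of units whose union covers every class.
    """
    remaining = set().union(*units.values())
    chosen = []
    while remaining:
        # Pick the unit covering the most still-uncovered classes.
        best = max(units, key=lambda u: len(units[u] & remaining))
        if not units[best] & remaining:
            break  # remaining classes are uncoverable
        chosen.append(best)
        remaining -= units[best]
    return chosen

# Hypothetical units annotated with factor classes:
units = {
    "unit_A": {"biome1", "gov_private"},
    "unit_B": {"biome2", "gov_state"},
    "unit_C": {"biome1", "biome2", "gov_community"},
}
picked = greedy_representative_set(units)
```

The contrast with random selection in the abstract (25 countries vs. a mean of 162) reflects exactly this kind of optimization: a targeted cover needs far fewer units than a random draw of equal area.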
Depression and anxiety are common and highly comorbid, and their comorbidity is associated with poorer outcomes, posing clinical and public health concerns. We evaluated the polygenic contribution to comorbid depression and anxiety, and to each in isolation.
Methods
Diagnostic codes were extracted from electronic health records for four biobanks [N = 177 865 including 138 632 European (77.9%), 25 612 African (14.4%), and 13 621 Hispanic (7.7%) ancestry participants]. The outcome was a four-level variable representing the depression/anxiety diagnosis group: neither, depression-only, anxiety-only, and comorbid. Multinomial regression was used to test for association of depression and anxiety polygenic risk scores (PRSs) with the outcome while adjusting for principal components of ancestry.
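The four-level outcome variable described above is a simple cross-classification of the two diagnosis flags. A minimal sketch (function and flag names are illustrative, not from the study's codebase):

```python
def outcome_group(has_depression: bool, has_anxiety: bool) -> str:
    """Map two diagnosis flags to the four-level depression/anxiety outcome."""
    if has_depression and has_anxiety:
        return "comorbid"
    if has_depression:
        return "depression-only"
    if has_anxiety:
        return "anxiety-only"
    return "neither"
```

In the multinomial regression, "neither" serves as the reference level against which the other three groups are compared.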
Results
In total, 132 960 patients had neither diagnosis (74.8%), 16 092 depression-only (9.0%), 13 098 anxiety-only (7.4%), and 16 584 comorbid (9.3%). In the European meta-analysis across biobanks, both PRSs were higher in each diagnosis group compared to controls. Notably, depression-PRS (OR 1.20 per s.d. increase in PRS; 95% CI 1.18–1.23) and anxiety-PRS (OR 1.07; 95% CI 1.05–1.09) had the largest effect when the comorbid group was compared with controls. Furthermore, the depression-PRS was significantly higher in the comorbid group than the depression-only group (OR 1.09; 95% CI 1.06–1.12) and the anxiety-only group (OR 1.15; 95% CI 1.11–1.19) and was significantly higher in the depression-only group than the anxiety-only group (OR 1.06; 95% CI 1.02–1.09), showing a genetic risk gradient across the conditions and the comorbidity.
Conclusions
This study suggests that depression and anxiety have partially independent genetic liabilities and the genetic vulnerabilities to depression and anxiety make distinct contributions to comorbid depression and anxiety.
Rabies virus (RABV) is a deadly zoonosis that circulates in wild carnivore populations in North America. Intensive management within the USA and Canada has been conducted to control the spread of the raccoon (Procyon lotor) variant of RABV and work towards elimination. We examined RABV occurrence across the northeastern USA and southeastern Québec, Canada during 2008–2018 using a multi-method, dynamic occupancy model. Using a 10 km × 10 km grid overlaid on the landscape, we examined the probability that a grid cell was occupied by RABV and its relationships with management activities (oral rabies vaccination (ORV) and trap-vaccinate-release efforts), habitat, neighbour effects and temporal trends. We compared raccoon RABV detection probabilities between different surveillance samples (e.g. strange-acting animals, road-kill, public health samples). Management of RABV through ORV was found to be the greatest driver in reducing the occurrence of rabies on the landscape. Additionally, RABV occupancy declined further with increasing duration of ORV baiting programmes. Grid cells north of ORV management were at or near elimination ($\hat{\psi }_{{\rm north}}$ = 0.00, s.e. = 0.15), managed areas had low RABV occupancy ($\hat{\psi }_{{\rm managed}}$ = 0.20, s.e. = 0.29) and enzootic areas had the highest level of RABV occupancy ($\hat{\psi }_{{\rm south}}$ = 0.83, s.e. = 0.06). These results provide evidence that past management actions have been successful in reducing and controlling the raccoon variant of RABV. At a finer scale, we also found that vaccine bait type and bait density affected RABV occupancy. Detection probabilities varied; samples from strange-acting animals and public health had the highest detection rates. Our results support moving the ORV zone south within the USA, given the high elimination probabilities along the US border with Québec. Additional enhanced rabies surveillance is still needed to ensure that elimination is maintained.
We provide an overview of diagnostic stewardship with key concepts that include the diagnostic pathway and the multiple points where interventions can be implemented, strategies for interventions, the importance of multidisciplinary collaboration, and key microbiologic diagnostic tests that should be considered for diagnostic stewardship. The document focuses on microbiologic laboratory testing for adult and pediatric patients and is intended for a target audience of healthcare workers involved in diagnostic stewardship interventions and all workers affected by any step of the diagnostic pathway (ie, ordering, collecting, processing, reporting, and interpreting results of a diagnostic test). This document was developed by the Society for Healthcare Epidemiology of America Diagnostic Stewardship Taskforce.
Consumers now demand evidence of welfare assurance at all stages of animal production, marketing, transport and slaughter. In response, retailers have increasingly adopted preferred supply chain relationships which preclude sourcing animals via livestock auction markets. One of the criteria dictating this action is a perceived improvement in animal welfare resulting from direct transport from farm to abattoir.
A survey of complete journey structures of 18 393 slaughterweight lambs from farm to abattoir was conducted between April and July 1997. Journeys were characterized in terms of distances travelled, duration and the number of discrete components within a whole journey which comprised: transport; trans-shipping (when animals were transferred from one vehicle to another); multiple pickups from a number of farms; and holding at either assembly points, lairages or auction markets. The results identified that journeys in the livestock distribution system are diverse and range in complexity, irrespective of marketing channel. Journey complexity was found to be positively related to distance travelled.
The study demonstrates that discussions concerning welfare of livestock in transit should consider the journey structure and not just the marketing channel per se. Furthermore, it also shows that changes taking place in the infrastructure of the marketing and meat processing sectors may result in a reduction in animal welfare.
The impact of the coronavirus disease 2019 (COVID-19) pandemic on mental health is still being unravelled. It is important to identify which individuals are at greatest risk of worsening symptoms. This study aimed to examine changes in depression, anxiety and post-traumatic stress disorder (PTSD) symptoms using prospective and retrospective assessments of symptom change, and to identify and examine the effects of key risk factors.
Method
Online questionnaires were administered to 34 465 individuals (aged 16 years or above) in April/May 2020 in the UK, recruited from existing cohorts or via social media. Around one-third (n = 12 718) of included participants had prior diagnoses of depression or anxiety and had completed pre-pandemic mental health assessments (between September 2018 and February 2020), allowing prospective investigation of symptom change.
Results
Prospective symptom analyses showed small decreases in depression (PHQ-9: −0.43 points) and anxiety (generalised anxiety disorder scale, 7-item (GAD-7): −0.33 points) and increases in PTSD symptoms (PCL-6: 0.22 points). Conversely, retrospective symptom analyses demonstrated large, significant increases (PHQ-9: 2.40; GAD-7: 1.97), with 55% reporting worsening mental health since the beginning of the pandemic on a global change rating. Across both prospective and retrospective measures of symptom change, worsening depression, anxiety and PTSD symptoms were associated with prior mental health diagnoses, female gender, young age and unemployed/student status.
Conclusions
We highlight the effect of prior mental health diagnoses on worsening mental health during the pandemic and confirm previously reported sociodemographic risk factors. Discrepancies between prospective and retrospective measures of changes in mental health may reflect recall-bias-related underestimation of prior symptom severity.