A loglinear IRT model is proposed that relates polytomously scored item responses to a multidimensional latent space. The analyst may specify a response function for each response, indicating which latent abilities are necessary to arrive at that response. Each item may have a different number of response categories, so that free-response items are more easily analyzed. Conditional maximum likelihood estimates are derived, and the models may be tested generally or against alternative loglinear IRT models.
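The flavour of model described above can be illustrated with a small numerical sketch. This is not the paper's exact parameterisation or its conditional maximum likelihood machinery; it simply shows, under an assumed design matrix and intercepts, how a loglinear predictor over a multidimensional latent space yields category probabilities for a polytomous item:

```python
import numpy as np

# Illustrative sketch only (assumed design, not the paper's estimation
# procedure): the log-probability of response r to item i is a linear
# combination of latent abilities theta, selected by an analyst-specified
# 0/1 design row, plus a response-category intercept.
def response_probabilities(theta, Q_i, beta_i):
    """theta  : (D,) latent ability vector
       Q_i    : (R, D) 0/1 matrix; row r marks which abilities are
                necessary to arrive at response r of item i
       beta_i : (R,) response-category intercepts
       Returns the (R,) vector of category probabilities."""
    logits = Q_i @ theta + beta_i          # loglinear predictor per category
    p = np.exp(logits - logits.max())      # numerically stable softmax
    return p / p.sum()

# Example: a three-category free-response item in a two-dimensional latent
# space; category 2 requires both abilities, category 1 only the first.
theta = np.array([0.5, -0.2])
Q_i = np.array([[0, 0], [1, 0], [1, 1]])
beta_i = np.array([0.0, -0.3, -0.8])
print(response_probabilities(theta, Q_i, beta_i))
```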
Head and neck squamous cell carcinomas (HNSCCs) are aggressive tumours lacking a standardised timeline for treatment initiation post-diagnosis. Delays beyond 60 days are linked to poorer outcomes and higher recurrence risk.
Methods:
A retrospective review was conducted on patients over 18 with HNSCC treated with (chemo)radiation at a rural tertiary care centre (September 2020–2022). Data on patient demographics, oncologic characteristics, treatment details and delay causes were analysed using SPSS.
Results:
Out of 93 patients, 35.5% experienced a time to treatment initiation (TTI) of more than 60 days. Median TTI was 73 days for delayed cases, compared with 41.5 days otherwise. No significant differences in demographics or cancer characteristics were observed between groups. The primary reasons for delay were care coordination (69.7%) and patient factors (18.2%). AJCC cancer stage showed a trend towards longer delays in advanced stages.
Conclusion:
One-third of patients faced delayed TTI, primarily due to care coordination and lack of social support. These findings highlight the need for improved multidisciplinary communication and patient support mechanisms, suggesting potential areas for quality improvement in HNSCC treatment management.
Prior studies evaluating the impact of discontinuation of contact precautions (DcCP) on methicillin-resistant Staphylococcus aureus (MRSA) outcomes have characterized all healthcare-associated infections (HAIs) rather than those likely preventable by contact precautions. We aimed to analyze the impact of DcCP on the rate of MRSA HAIs, including transmission events identified through whole genome sequencing (WGS) surveillance.
Design:
Quasi-experimental interrupted time series.
Setting:
Acute care medical center.
Participants:
Inpatients.
Methods:
The effect of DcCP (discontinuing the use of gowns and gloves) for encounters with patients with MRSA carriage was evaluated using time series analysis of MRSA HAI rates from January 2019 through December 2022, and by comparing WGS-defined attributable transmission events before and after DcCP in December 2020.
Results:
The MRSA HAI rate was 4.22/10,000 patient days before and 2.98/10,000 patient days after DcCP (incidence rate ratio [IRR] 0.71 [95% confidence interval 0.56–0.89]), with a significant immediate decrease (P = .001). There were 7 WGS-defined attributable transmission events before and 11 events after DcCP (incidence rate ratio 0.90 [95% confidence interval 0.30–2.55]).
Conclusions:
DcCP did not result in an increase in MRSA HAIs or in WGS-defined attributable transmission events. Comprehensive analyses of the effect of transmission prevention measures should include outcomes specifically measuring transmission-associated HAI.
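As a rough illustration of the interrupted time series design described above, the sketch below simulates monthly MRSA HAI counts with a level drop at DcCP and recovers the immediate change as an incidence rate ratio via segmented Poisson regression. All counts, denominators, and covariates are simulated assumptions, not study data:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)

# Simulated monthly series, Jan 2019 - Dec 2022, with DcCP at month 24
# (December 2020). Counts and patient-day denominators are invented.
months = np.arange(48)
step = (months >= 24).astype(int)              # 1 after DcCP
patient_days = rng.integers(9000, 11000, 48)
rate = 4.22e-4 * (0.71 ** step)                # per patient-day, level drop at DcCP
events = rng.poisson(rate * patient_days)

df = pd.DataFrame({"events": events, "month": months, "step": step,
                   "patient_days": patient_days})

# Segmented Poisson regression with an offset for exposure; exp(coef of
# `step`) estimates the immediate level change as an incidence rate ratio.
fit = smf.glm("events ~ month + step", data=df,
              family=sm.families.Poisson(),
              offset=np.log(df["patient_days"])).fit()
print(np.exp(fit.params["step"]))
```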
NASA’s all-sky survey mission, the Transiting Exoplanet Survey Satellite (TESS), is specifically engineered to detect exoplanets that transit bright stars. Thus far, TESS has successfully identified approximately 400 transiting exoplanets, in addition to roughly 6 000 candidate exoplanets pending confirmation. In this study, we present the results of our ongoing project, the Validation of Transiting Exoplanets using Statistical Tools (VaTEST). Our dedicated effort is focused on the confirmation and characterisation of new exoplanets through the application of statistical validation tools. Through a combination of ground-based telescope data, high-resolution imaging, and the utilisation of the statistical validation tool known as TRICERATOPS, we have successfully discovered eight potential super-Earths. These planets bear the designations: TOI-238b ($1.61^{+0.09}_{-0.10}$ R$_\oplus$), TOI-771b ($1.42^{+0.11}_{-0.09}$ R$_\oplus$), TOI-871b ($1.66^{+0.11}_{-0.11}$ R$_\oplus$), TOI-1467b ($1.83^{+0.16}_{-0.15}$ R$_\oplus$), TOI-1739b ($1.69^{+0.10}_{-0.08}$ R$_\oplus$), TOI-2068b ($1.82^{+0.16}_{-0.15}$ R$_\oplus$), TOI-4559b ($1.42^{+0.13}_{-0.11}$ R$_\oplus$), and TOI-5799b ($1.62^{+0.19}_{-0.13}$ R$_\oplus$). Among all these planets, six fall within the region known as ‘keystone planets’, which makes them particularly interesting for study. Based on the locations of TOI-771b and TOI-4559b below the radius valley, we characterised them as likely super-Earths, though radial velocity mass measurements for these planets will provide more details about their characterisation. It is noteworthy that planets within the size range investigated herein are absent from our own solar system, making their study crucial for gaining insights into the evolutionary stages between Earth and Neptune.
Data compilations expand the scope of research; however, data citation practice lags behind advances in data use. It remains uncommon for data users to credit data producers in professionally meaningful ways. In paleontology, databases like the Paleobiology Database (PBDB) enable assessment of patterns and processes spanning millions of years, up to global scale. The status quo for data citation creates an imbalance wherein publications drawing data from the PBDB receive significantly more citations (median: 4.3 ± 3.5 citations/year) than the publications producing the data (1.4 ± 1.3 citations/year). By accounting for data reuse where citations were neglected, the projected citation rate for data-provisioning publications approached parity (4.2 ± 2.2 citations/year) and the impact factor of paleontological journals (n = 55) increased by an average of 13.4% (maximum increase = 57.8%) in 2019. Without rebalancing the distribution of scientific credit, emerging “big data” research in paleontology—and science in general—is at risk of undercutting itself through a systematic devaluation of the work that is foundational to the discipline.
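To make the impact-factor adjustment above concrete: it amounts to adding previously uncredited data reuses to the numerator of the standard impact-factor ratio. The figures below are hypothetical, chosen only to reproduce a 13.4% rise:

```python
# Hypothetical figures (not from the study), chosen to yield a 13.4% rise.
citations = 500        # conventional citations to the journal's prior two years
data_reuses = 67       # assumed uncredited data reuses of those same items
citable_items = 250

conventional_if = citations / citable_items                # 2.00
adjusted_if = (citations + data_reuses) / citable_items    # 2.27
print(conventional_if, adjusted_if)
print(100 * (adjusted_if / conventional_if - 1))           # +13.4%
```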
As coronavirus disease 2019 (COVID-19) spread, efforts were made to preserve resources for the anticipated surge of COVID-19 patients in British Columbia, Canada. However, the relationship between COVID-19 hospitalizations and access to cancer surgery is unclear. In this project, we analyze the impact of COVID-19 patient volumes on wait time for cancer surgery.
Methods:
We conducted a retrospective study using population-based datasets of regional surgical wait times and COVID-19 patient volumes. Weekly median wait times for urgent, nonurgent, cancer, and noncancer surgeries, and maximum volumes of hospitalized patients with COVID-19 were studied. The results were qualitatively analyzed.
Results:
A sustained association between weekly median wait times for priority and other cancer surgeries and increased hospital COVID-19 patient volumes was not qualitatively discernible. In response to the first phase of COVID-19 patient volumes, relative to pre-COVID-19 pandemic levels, wait times were shortened for urgent cancer surgery but increased for nonurgent surgeries. During the second phase, wait times for all diagnostic groups returned to pre-COVID-19 pandemic levels. During the third phase, wait times for all surgeries increased.
Conclusion:
Cancer surgery access may have been influenced by other factors, such as policy directives and local resource issues, independent of hospitalized COVID-19 patient volumes. The initial access limitations gradually improved with provincial and institutional resilience and vaccine rollout.
Artificial illumination is a fundamental human need. Burning wood and other materials, usually in hearths and fireplaces, extended daylight hours, whilst the use of flammable substances in torches offered light on the move. It is increasingly understood that pottery played a role in light production. In this study, we focus on ceramic oval bowls, made and used primarily by hunter-gatherer-fishers of the circum-Baltic over a c. 2000-year period beginning in the mid-6th millennium cal BC. Oval bowls commonly occur alongside larger (cooking) vessels. Their function as ‘oil lamps’ for illumination has been proposed on many occasions, but only limited direct evidence has been secured to test this functional association. This study presents the results of molecular and isotopic analysis of preserved organic residues obtained from 115 oval bowls from 25 archaeological sites representing a wide range of environmental settings. Our findings confirm that the oval bowls of the circum-Baltic were used primarily for burning fats and oils, predominantly for the purposes of illumination. The fats derive from the tissues of marine, freshwater, and terrestrial organisms. Bulk isotope data of charred surface deposits show a consistently different pattern of use when oval bowls are compared to other pottery vessels within the same assemblage. It is suggested that hunter-gatherer-fishers around the 55th parallel commonly deployed material culture for artificial light production, but the evidence is restricted to times and places where more durable technologies were employed, including the circum-Baltic.
U.S. veterans report high rates of traumatic experiences and mental health symptomology [e.g. posttraumatic stress disorder (PTSD)]. The stress sensitization hypothesis posits that experiences of adversity sensitize individuals to stress reactions, which can lead to greater psychiatric problems. We extend this hypothesis by exploring how multiple adversities, such as early childhood adversity, combat-related trauma, and military sexual trauma, relate to heterogeneity in stress over time and, subsequently, greater risk for PTSD.
Methods
A total of 1230 veterans were recruited for an observational, longitudinal study. Veterans responded to questionnaires on PTSD, stress, and traumatic experiences five times over an 18-month study period. We used latent transition analysis to understand how heterogeneity in adverse experiences is related to transitions into stress trajectory classes. We also explored how transition patterns related to PTSD symptomology.
Results
Across all models, we found support for stress sensitization. In general, combat trauma in combination with other types of adverse experiences, namely early childhood adversity and military sexual trauma, imposed a greater probability of transitioning into higher-risk stress profiles. We also showed differential effects of early childhood and military-specific adversity on PTSD symptomology.
Conclusion
The present study rigorously integrates both military-specific and early life adversity into analysis of stress sensitivity, and is the first to examine how sensitivity might affect trajectories of stress over time. Our study provides a nuanced, and specific, look at who is at risk for sensitization to stress based on previous traumatic experiences, as well as what transition patterns are associated with greater PTSD symptomology.
Research indicates that sexual harassment and assault commonly occur during archaeological field research, and students, trainees, and early career professionals are more frequently subjected to harassing behaviors compared to mid-career and senior scientists. Specific to archaeological education, the undergraduate educational requirement of a field school puts students and trainees in situations where harassment historically has been unchecked. We present the results of a systematic content analysis of 24 sets of field school documents. We analyzed these documents with attention to how field school policies, procedures, and language may impact students’ perceptions of their expected behaviors, logistics and means of reporting, and stated policies surrounding sexual harassment and assault. Coding was conducted using an a priori coding scheme to identify practices that should lead to a safe and supportive field learning environment. Our coding scheme resulted in 11 primary codes that we summarized as three primary themes: (1) field school organization and expected student behavior, (2) logistics of the course, and (3) stated policies surrounding sexual harassment and assault. Based on these themes, we provide recommendations to modify field school documents and practices to create a field school that provides safe opportunities for students to learn.
Prompt diagnosis of and intervention for ventilator-associated pneumonia (VAP) are critical but can lead to overdiagnosis and overtreatment.
Objectives:
We investigated healthcare provider (HCP) perceptions and challenges associated with VAP diagnosis, and we sought to identify opportunities for diagnostic stewardship.
Methods:
We conducted a qualitative study of 30 HCPs at a tertiary-care hospital. Participants included attending physicians, residents and fellows (trainees), advanced practice providers (APPs), and pharmacists. Interviews were composed of open-ended questions in 4 sections: (1) clinical suspicion and thresholds for respiratory culture ordering, (2) preferences for respiratory sample collection, (3) culture report interpretation, and (4) VAP diagnosis and treatment. Interview transcripts were analyzed using NVivo 12 software, and responses were organized into themes.
Results:
Overall, 10 attending physicians (75%) and 16 trainees and APPs (75%) believed they were overdiagnosing VAP; this response was frequent among HCPs in practice 5–10 years (91%, n = 12). Increased identification of bacteria as a result of frequent respiratory culturing, misinterpretation of culture data, and fear of missing a diagnosis were recognized as drivers of overdiagnosis and overtreatment. Although most HCPs rely on clinical and radiographic changes to initiate work-up, the fear of missing a diagnosis leads to sending cultures even in the absence of those changes.
Conclusions:
HCPs believe that VAP overdiagnosis and overtreatment are common due to fear of missing a diagnosis, overculturing, and difficulty distinguishing colonization from infection. Although we identified opportunities for diagnostic stewardship, interventions influencing the ordering of cultures and starting of antimicrobials will need to account for strongly held beliefs and ICU practices.
Previous research on the depression scale of the Patient Health Questionnaire (PHQ-9) has found that different latent factor models have maximized empirical measures of goodness-of-fit. The clinical relevance of these differences is unclear. We aimed to investigate whether depression screening accuracy may be improved by employing latent factor model-based scoring rather than sum scores.
Methods
We used an individual participant data meta-analysis (IPDMA) database compiled to assess the screening accuracy of the PHQ-9. We included studies that used the Structured Clinical Interview for DSM (SCID) as a reference standard and split those into calibration and validation datasets. In the calibration dataset, we estimated unidimensional, two-dimensional (separating cognitive/affective and somatic symptoms of depression), and bi-factor models, and the respective cut-offs to maximize combined sensitivity and specificity. In the validation dataset, we assessed the differences in (combined) sensitivity and specificity between the latent variable approaches and the optimal sum score (⩾10), using bootstrapping to estimate 95% confidence intervals for the differences.
Results
The calibration dataset included 24 studies (4378 participants, 652 major depression cases); the validation dataset, 17 studies (4252 participants, 568 cases). In the validation dataset, optimal cut-offs of the unidimensional, two-dimensional, and bi-factor models had higher sensitivity (by 0.036, 0.050, and 0.049 points, respectively) but lower specificity (by 0.017, 0.026, and 0.019 points, respectively) compared with the sum score cut-off of ⩾10.
Conclusions
In a comprehensive dataset of diagnostic studies, scoring using complex latent variable models does not meaningfully improve the screening accuracy of the PHQ-9 compared with the simple sum-score approach.
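A minimal sketch of the validation comparison described above, on simulated data (the stand-in "latent" score, its cut-off, and the simulated prevalence are assumptions, not the study's fitted models): sensitivity and specificity at a cut-off, with a percentile bootstrap for the difference in sensitivity between a model-based score and the sum score:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated validation data (all values invented): SCID status, PHQ-9 sum
# scores, and a continuous stand-in for a latent-variable score.
n = 4252
truth = rng.random(n) < 568 / n                   # major depression indicator
sum_score = rng.integers(0, 28, n)                # PHQ-9 sum scores (0-27)
factor_score = sum_score + rng.normal(0, 2, n)    # hypothetical model-based score

def sens_spec(score, truth, cutoff):
    pred = score >= cutoff
    sens = (pred & truth).sum() / truth.sum()
    spec = (~pred & ~truth).sum() / (~truth).sum()
    return sens, spec

# Percentile bootstrap for the difference in sensitivity between the
# model-based score (illustrative cut-off of 10) and the sum score (>= 10).
diffs = []
for _ in range(1000):
    idx = rng.integers(0, n, n)                   # resample with replacement
    s_model, _ = sens_spec(factor_score[idx], truth[idx], 10)
    s_sum, _ = sens_spec(sum_score[idx], truth[idx], 10)
    diffs.append(s_model - s_sum)
print(np.percentile(diffs, [2.5, 97.5]))          # 95% CI for the difference
```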
The EAT–Lancet Commission promulgated a universal reference diet. Subsequently, researchers constructed an EAT–Lancet diet score (0–14 points), with minimum intake values for various dietary components set at 0 g/d, and reported inverse associations with risks of major health outcomes in a high-income population. We assessed associations between EAT–Lancet diet scores, without or with lower bound values, and the mean probability of micronutrient adequacy (MPA) among nutrition-insecure women of reproductive age (WRA) from low- and middle-income countries (LMIC). We analysed single 24-h diet recall data (n 1950) from studies in rural DRC, Ecuador, Kenya, Sri Lanka and Vietnam. Associations between EAT–Lancet diet scores and MPA were assessed by fitting linear mixed-effects models. Mean EAT–Lancet diet scores were 8·8 (SD 1·3) and 1·9 (SD 1·1) without or with minimum intake values, respectively. Pooled MPA was 0·58 (SD 0·22) and energy intake was 10·5 (SD 4·6) MJ/d. A one-point increase in the EAT–Lancet diet score, without minimum intake values, was associated with a 2·6 (SD 0·7) percentage points decrease in MPA (P < 0·001). In contrast, the EAT–Lancet diet score, with minimum intake values, was associated with a 2·4 (SD 1·3) percentage points increase in MPA (P = 0·07). Further analysis indicated positive associations between EAT–Lancet diet scores and MPA adjusted for energy intake (P < 0·05). Our findings indicate that the EAT–Lancet diet score requires minimum intake values for nutrient-dense dietary components to avoid positively scoring non-consumption of food groups and subsequently predicting lower MPA of diets, when applied to rural WRA in LMIC.
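A minimal sketch of the pooled model described above, assuming a tidy recall-level dataset with hypothetical column and file names (`MPA`, `eat_lancet_score`, `energy_mj`, `site`); a random intercept per site stands in for the study-level clustering:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical recall-level table: one row per 24-h recall. The file name
# and column names are invented for illustration.
df = pd.read_csv("diet_recalls.csv")

# Linear mixed-effects model: MPA on the diet score, adjusted for energy
# intake, with a random intercept per study site.
model = smf.mixedlm("MPA ~ eat_lancet_score + energy_mj", df, groups=df["site"])
result = model.fit()
print(result.summary())
```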
Archaeologists have struggled to combine remotely sensed datasets with preexisting information for landscape-level analyses. In the American Southeast, for example, analyses of lidar data using automated feature extraction algorithms have led to the identification of over 40 potential new pre-European-contact Native American shell ring deposits in Beaufort County, South Carolina. Such datasets are vital for understanding settlement distributions, yet a comprehensive assessment requires remotely sensed and previously surveyed archaeological data. Here, we use legacy data and airborne lidar-derived information to conduct a series of point pattern analyses using spatial models that we designed to assess the factors that best explain the location of shell rings. The results reveal that ring deposit locations are highly clustered and best explained through a combination of environmental conditions such as distance to water and elevation as well as social factors.
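In the spirit of the point pattern analyses described above (though not the authors' specific spatial models), a quick first check for clustering is the Clark–Evans ratio, which compares the observed mean nearest-neighbour distance with the expectation under complete spatial randomness; values below 1 indicate clustering. Coordinates and study-area extent here are hypothetical:

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)

# Hypothetical site coordinates (metres) in a 1 km x 1 km study window.
points = rng.random((40, 2)) * 1000.0
area = 1000.0 * 1000.0

# k=2 because each point's nearest query result is itself (distance 0).
d, _ = cKDTree(points).query(points, k=2)
observed = d[:, 1].mean()
expected = 0.5 / np.sqrt(len(points) / area)   # CSR expectation
print(observed / expected)                     # Clark-Evans R; < 1 => clustered
```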
Given the hierarchical nature and structure of field schools, enrolled students are particularly susceptible to harassment and assault. In 2018, the National Academies of Sciences, Engineering, and Medicine (NASEM) released recommendations to help prevent sexual harassment and assault of women in academia. Although these recommendations are specific to higher education and exclusive to women, some can be modified and applied to the context of archaeological field schools. We review the NASEM's recommendations, with particular attention to those applicable to the field school setting, and provide suggestions for making field schools safer and more inclusive learning environments for all students. Although we present recommendations for practices that can be implemented at field schools, additional research is needed to understand how sexual harassment occurs at field schools and how the implementation of these recommendations can make learning safer.
Intermittent energy restriction (IER) involves short periods of severe energy restriction interspersed with periods of adequate energy intake, and can induce weight loss. Insulin sensitivity is impaired by short-term, complete energy restriction, but the effects of IER are not well known. In randomised order, fourteen lean men (age: 25 (sd 4) years; BMI: 24 (sd 2) kg/m2; body fat: 17 (4) %) consumed 24-h diets providing 100 % (10 441 (sd 812) kJ; energy balance (EB)) or 25 % (2622 (sd 204) kJ; energy restriction (ER)) of estimated energy requirements, followed by an oral glucose tolerance test (OGTT; 75 g of glucose drink) after fasting overnight. Plasma/serum glucose, insulin, NEFA, glucagon-like peptide-1 (GLP-1), glucose-dependent insulinotropic peptide (GIP) and fibroblast growth factor 21 (FGF21) were assessed before and after (0 h) each 24-h dietary intervention, and throughout the 2-h OGTT. Homoeostatic model assessment of insulin resistance (HOMA2-IR) assessed the fasted response and incremental AUC (iAUC) or total AUC (tAUC) were calculated during the OGTT. At 0 h, HOMA2-IR was 23 % lower after ER compared with EB (P<0·05). During the OGTT, serum glucose iAUC (P<0·001), serum insulin iAUC (P<0·05) and plasma NEFA tAUC (P<0·01) were greater during ER, but GLP-1 (P=0·161), GIP (P=0·473) and FGF21 (P=0·497) tAUC were similar between trials. These results demonstrate that severe energy restriction acutely impairs postprandial glycaemic control in lean men, despite reducing HOMA2-IR. Chronic intervention studies are required to elucidate the long-term effects of IER on indices of insulin sensitivity, particularly in the absence of weight loss.
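For reference, the OGTT summary measures used above can be computed with the trapezoidal rule; incremental AUC is taken here as the area above the fasting baseline with negative deflections ignored, which is one common convention. All times and concentrations below are hypothetical:

```python
import numpy as np

def trapezoid_auc(y, t):
    """Area under y(t) by the trapezoidal rule."""
    return float(np.sum((t[1:] - t[:-1]) * (y[1:] + y[:-1]) / 2.0))

# Hypothetical OGTT series: sampling times (h) and serum glucose (mmol/L).
t = np.array([0.0, 0.25, 0.5, 1.0, 1.5, 2.0])
glucose = np.array([5.0, 7.8, 8.6, 7.4, 6.3, 5.6])

tauc = trapezoid_auc(glucose, t)                                   # total AUC
iauc = trapezoid_auc(np.clip(glucose - glucose[0], 0.0, None), t)  # incremental AUC
print(tauc, iauc)
```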
Nearby star-forming galaxies offer a unique environment to study the populations of young (<100 Myr) accreting binaries. These systems are tracers of past populations of massive stars that heavily affect their immediate environment and parent galaxies. Using a Chandra X-ray Visionary program, we investigate the young neutron-star binary population in the low-metallicity environment of the Small Magellanic Cloud (SMC) by reaching quiescent X-ray luminosity levels (a few times 10$^{32}$ erg s$^{-1}$). We present the first measurement of the formation efficiency of high-mass X-ray binaries (HMXBs) as a function of the age of their parent stellar populations by using three indicators: the number ratio of HMXBs to OB stars, to the SFR, and to the stellar mass produced during the specific star-formation burst they are associated with. In all cases, we find that the HMXB formation efficiency increases as a function of time up to ~40–60 Myr, and then gradually decreases.
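The three formation-efficiency indicators named above are simple ratios; the sketch below shows the arithmetic with entirely hypothetical inputs:

```python
# All inputs hypothetical; the ratios mirror the three indicators above.
n_hmxb = 25                 # HMXBs associated with a star-formation burst
n_ob_stars = 5.0e4          # OB stars in the same region
sfr = 0.05                  # star-formation rate (Msun/yr)
burst_mass = 2.0e6          # stellar mass formed in the burst (Msun)

print(n_hmxb / n_ob_stars)  # HMXBs per OB star
print(n_hmxb / sfr)         # HMXBs per unit SFR
print(n_hmxb / burst_mass)  # HMXBs per unit stellar mass formed
```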
Education and training of staff need to be at the heart of surgical care, particularly in low-income and/or resource-poor settings. This is the primary means by which self-sufficiency and sustainability will ultimately be achieved. As such, training and education should be integrated into any surgical programme that is undertaken. Numerous resources are available to help achieve this goal, and an open approach to novel, inexpensive training methods is likely to be helpful in this type of setting.
The need for appropriately trained audiologists in low-income countries is well recognised and clearly goes beyond providing support for ear surgery. However, where ear surgery is being undertaken, it is vital to have audiology services established in order to correctly assess patients requiring surgery, and to be able to assess and manage outcomes of surgery. The training requirements of the two specialties are therefore intimately linked.
Objective
This article highlights various methods, resources, and considerations for both otolaryngology and audiology training, which should prove a useful resource to those undertaking and organising such education, and to those staff members receiving it.
Objectives:
Rates of cognitive, academic, and behavioral comorbidities are elevated in children with epilepsy. The contribution of environmental and genetic influences to comorbidity risk is not fully understood. This study investigated children with epilepsy, their unaffected siblings, and controls to determine the presence and extent of risk associated with family relatedness across a range of epilepsy comorbidities.
Methods:
Participants were 346 children (8–18 years): n=180 with recent-onset epilepsy, their unaffected siblings (n=67), and healthy first-degree cousin controls (n=99). Assessments included: (1) Child Behavior Checklist/6-18 (CBCL), (2) Behavior Rating Inventory of Executive Function (BRIEF), (3) history of education and academic services, and (4) lifetime attention deficit hyperactivity disorder (ADHD) diagnosis. Analyses consisted of linear mixed-effects models for continuous variables and logistic mixed models for binary variables.
Results:
Differences were detected between the three groups of children across all measures (p<.001). For ADHD, academic problems, and executive dysfunction, children with epilepsy exhibited significantly more problems than unaffected siblings and controls; siblings and controls did not differ statistically significantly from each other. For social competence, children with epilepsy and their unaffected siblings displayed more abnormality compared with controls, with no statistically significant difference between children with epilepsy and unaffected siblings. For behavioral problems, children with epilepsy had more abnormality than siblings and controls, but unaffected siblings also exhibited more abnormalities than controls.
Conclusions:
The contribution of epilepsy and family relatedness varies across specific neurobehavioral comorbidities. Family relatedness was not significantly associated with rates of ADHD, academic problems, and executive dysfunction, but was associated with competence and behavioral problems. (JINS, 2018, 24, 1–9)
Drug use during pregnancy and lactation remains an underdeveloped area of clinical pharmacology and drug research. Pregnancy risk factors, together with an increased incidence of chronic diseases and a rise in the average maternal age, predict that medication use will continue to rise during gestation. Common exposure categories include over-the-counter (OTC) medication, psychiatric agents, gastrointestinal medications, herbals, vitamins, antibiotics, and topical products. Only a few medications have been tested specifically for safety and efficacy during human gestation. Profound physiologic changes occur during both normal and pathologic pregnancy that may dramatically alter drug clearance, efficacy, and safety. Under such circumstances, the danger of a drug to mothers, their fetuses, and nursing infants cannot be determined with any confidence until it has been widely used. It is important that women with medical disorders such as diabetes, hypertension, epilepsy, and inflammatory bowel disease continue necessary therapy while pregnant. Unfortunately, many physicians stop or delay medically important agents precisely because of this lack of information.