To clarify the incidence, progression and quality-of-life impact of shoulder/neck disability, oral asymmetry, neuropathic pain and numbness following neck dissection.
Methods
This prospective telephone-interview study delivered the Neck Dissection Impairment Index, Neuropathic Pain Questionnaire, House–Brackmann Scale and questions assessing numbness to patients before and three times after neck dissection.
Results
Mean Neck Dissection Impairment Index scores (6.43 vs 22.17; p = 0.004) and Neuropathic Pain Questionnaire scores (0.76 vs 2.30; p = 0.004), and the proportions of patients with oral asymmetry (3 per cent vs 33.3 per cent; p = 0.016), ear numbness (5.9 per cent vs 46.7 per cent; p = 0.002), jaw numbness (5.9 per cent vs 53.3 per cent; p < 0.001) and neck numbness (5.9 per cent vs 53.3 per cent; p < 0.001), each increased significantly from pre-operation to 12 weeks post-operation. The increase in neuropathic pain diagnoses did not reach significance. No outcome returned to baseline, and the progression of each was illustrated over time.
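For readers who want to reproduce this kind of paired pre/post comparison, the following is a minimal sketch assuming matched pre-operative and 12-week scores from the same patients; the abstract does not specify the exact test used, and all values below are hypothetical placeholders rather than study data.

```python
# A minimal sketch, assuming paired pre-operative and 12-week scores from the same
# patients; the values below are hypothetical placeholders, not the study data.
import numpy as np
from scipy.stats import wilcoxon

ndii_pre = np.array([5, 8, 0, 10, 3, 6, 12, 4, 7, 9])         # pre-operative NDII
ndii_12w = np.array([20, 25, 10, 30, 18, 22, 35, 15, 24, 28])  # 12 weeks post-op

stat, p = wilcoxon(ndii_pre, ndii_12w)  # non-parametric test for paired scores
print(f"Wilcoxon signed-rank: W = {stat:.1f}, p = {p:.3f}")
```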
Conclusion
The findings demonstrated that these complications are common and persist throughout short-term recovery. Screening to identify and manage complications could improve post-operative care.
Understanding complex three-dimensional cardiac structures is fundamental to the study of CHD. Many learners have limited access to cadaveric specimens, and most alternative teaching modalities are two-dimensional. We therefore developed virtual cardiac models using photogrammetry of actual heart specimens to address this educational need.
Methods:
A descriptive study was conducted at a single institution during a week-long cardiac morphology conference in October 2022 and 2023. Conference attendees viewed virtual cardiac models via laptop screen and virtual reality headset. Learners were surveyed on their opinions of the virtual models and their perceived effectiveness compared to existing educational materials.
Results:
Forty-six learners completed the survey. Participants reported the virtual cardiac models to be more effective than textbook diagrams (60%), and equally or more effective than didactic teaching (78%) and specimen videos (78%). Approximately half of participants (54%) found the virtual models to be less effective than hands-on cadaveric specimen inspection. Attitudes towards the virtual specimens were positive overall, with most respondents finding the tool engaging (87%) and enjoyable (85%). A majority reported that the models deepened their understanding of cardiac morphology (79%) and that they would recommend them to other trainees (87%).
Conclusions:
This study demonstrates that a novel teaching tool, virtual cardiac specimens, is equivalent to or more effective than many current materials for learning cardiac morphology. While they may not replace direct cadaveric specimen review, virtual models are an engaging alternative with the ability to reach a wider audience.
‘Inhalants’ have been associated with poorer mental health in adolescence, but little is known of associations with specific types of inhalants.
Aims
We aimed to investigate associations of using volatile substances, nitrous oxide and alkyl nitrates with mental health problems in adolescence.
Method
We conducted a cross-sectional analysis using data from 13- to 14-year-old adolescents across England and Wales collected between September 2019 and March 2020. Multilevel logistic regression examined associations between lifetime use of volatile substances, nitrous oxide and alkyl nitrates with self-reported symptoms of probable depression, anxiety, conduct disorder and auditory hallucinations.
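As a rough illustration of the kind of model described here, the sketch below fits a logistic regression of one outcome on lifetime inhalant use. It is not the authors' code: the multilevel structure (pupils nested in schools) is approximated with school-clustered standard errors rather than random intercepts, and the file and column names are hypothetical.

```python
# A minimal sketch, not the authors' code: the multilevel structure is approximated
# with school-clustered standard errors, and all names below are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("adolescent_survey.csv")  # hypothetical dataset

model = smf.logit(
    "probable_depression ~ volatile_use + nitrous_use + nitrate_use",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})

# Odds ratios and 95% confidence intervals on the odds-ratio scale
print(np.exp(model.params))
print(np.exp(model.conf_int()))
```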
Results
Of the 6672 adolescents in the study, 5.1% reported use of nitrous oxide, 4.9% use of volatile solvents and 0.1% use of alkyl nitrates. After accounting for multiple testing, adolescents who had used volatile solvents were significantly more likely to report a probable depressive disorder (odds ratio = 4.59, 95% CI 3.58, 5.88), anxiety disorder (odds ratio = 3.47, 95% CI 2.72, 4.43) or conduct disorder (odds ratio = 7.52, 95% CI 5.80, 9.76), or auditory hallucinations (odds ratio = 5.35, 95% CI 4.00, 7.17), than those who had not. Nitrous oxide use was significantly associated with probable depression and conduct disorder but not with anxiety disorder or auditory hallucinations. Alkyl nitrate use was rare and not associated with mental health outcomes. Adjustment for use of other inhalants, tobacco and alcohol resulted in marked attenuation of these associations, whereas socioeconomic disadvantage had little effect.
Conclusion
To our knowledge, this study provides the first general population evidence that volatile solvents and nitrous oxide are associated with probable mental health disorders in adolescence. These findings require replication, ideally with prospective designs.
To compare rates of Clostridioides difficile infection (CDI) recurrence following an initial occurrence treated with tapered enteral vancomycin versus standard vancomycin.
Design:
Retrospective cohort study.
Setting:
Community health system.
Patients:
Adults ≥18 years of age hospitalized with positive C. difficile polymerase chain reaction or toxin enzyme immunoassay who were prescribed either standard 10–14 days of enteral vancomycin four times daily or a 12-week tapered vancomycin regimen.
Methods:
Retrospective propensity score pair-matched cohort study. Groups were matched on age (<65 or ≥65 years) and receipt of non-C. difficile antibiotics during hospitalization or within 6 months post-discharge. Recurrence rates were analyzed via logistic regression conditioned on matched pairs and reported as conditional odds ratios. The primary outcome was the rate of CDI recurrence, compared between standard and tapered vancomycin for treatment of initial CDI.
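A minimal sketch of a logistic regression conditioned on matched pairs of this kind is shown below, using statsmodels' ConditionalLogit; it assumes one row per patient and a shared pair identifier from the propensity-score matching, and the file and column names are hypothetical.

```python
# A minimal sketch of a conditional (matched-pair) logistic regression; the file
# and column names are hypothetical assumptions, not the study's data dictionary.
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

df = pd.read_csv("cdi_matched_cohort.csv")
# Expected columns: recurrence (1 = CDI recurrence within 6 months, 0 = none),
# taper (1 = tapered vancomycin, 0 = standard course), pair_id (matched-pair label)

result = ConditionalLogit(df["recurrence"], df[["taper"]], groups=df["pair_id"]).fit()

print("conditional OR:", np.exp(result.params))
print("95% CI:", np.exp(result.conf_int()))
```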
Results:
The CDI recurrence rate at 6 months was 5.3% (4/75) in the taper cohort versus 28% (21/75) in the standard vancomycin cohort. The median time to CDI recurrence was 115 days versus 20 days in the taper and standard vancomycin cohorts, respectively. When adjusted for matching, patients in the taper arm were less likely to experience CDI recurrence at 6 months when compared to standard vancomycin (cOR = 0.19, 95% CI 0.07–0.56, p < 0.002).
Conclusions:
Larger prospective trials are needed to elucidate the clinical utility of tapered oral vancomycin as a treatment option to achieve sustained clinical cure in first occurrences of CDI.
Workload is a useful construct in human factors and neuroergonomics research that describes “the perceived relationship between the amount of mental [and physical] processing capability or resources and the amount required by the task”. We apply this concept to neuropsychology and assess several dimensions of workload as it relates to performance on the Trail Making Test.
Participants and Methods:
Twenty college students completed the Trail Making Test (TMT). After completion of each of Parts A and B, workload was assessed with the NASA Task Load Index (NASA-TLX), a widely used self-report measure of workload comprising the subscales Mental Demand, Physical Demand, Temporal Demand, Performance, Effort and Frustration, along with an overall average total score.
Results:
Completion time, as expected, differed between Parts A and B (p < .001). Of more interest, overall workload differed between TMT A (M = 20.33, SD = 13.32) and TMT B (M = 35.79, SD = 17.37) (p < .001, η² = .68). The greatest subscale differences were in Mental Demand (p < .001, η² = .68) and Effort (p < .001, η² = .59), but Physical Demand also showed a difference (p < .007, η² = .33). Temporal Demand showed the smallest, nonsignificant difference (p = .081, η² = .152).
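For context, the sketch below shows a paired-samples comparison with an eta-squared effect size of the kind reported here; the NASA-TLX ratings are hypothetical placeholders, not the study data.

```python
# A minimal sketch of a paired t-test with an eta-squared effect size; the ratings
# below are hypothetical placeholders, not the study data.
import numpy as np
from scipy.stats import ttest_rel

tlx_a = np.array([18, 25, 10, 22, 30, 15, 20, 12, 28, 23])  # overall workload, TMT A
tlx_b = np.array([35, 40, 28, 38, 45, 30, 36, 25, 42, 39])  # overall workload, TMT B

t, p = ttest_rel(tlx_a, tlx_b)

# Eta squared for a paired t-test: t^2 / (t^2 + df), with df = n - 1
dof = len(tlx_a) - 1
eta_sq = t**2 / (t**2 + dof)
print(f"t = {t:.2f}, p = {p:.4f}, eta^2 = {eta_sq:.2f}")
```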
Conclusions:
Based on previous research in our lab, most results were expected and interpretable. As is well established for the TMT, Part B is more cognitively demanding (in various ways) than Part A. The greater Physical Demand with Part B is a somewhat more complex finding that requires further explanation. Finally, the NASA-TLX appears to be a valid measure of workload when paired with a standard neuropsychological test. We argue it can provide useful information in the assessment of cognitive status in clinical populations.
Cannabis has been associated with poorer mental health, but little is known of the effect of synthetic cannabinoids or cannabidiol (often referred to as CBD).
Aims
To investigate associations of cannabis, synthetic cannabinoids and cannabidiol with mental health in adolescence.
Method
We conducted a cross-sectional analysis with 13- to 14-year-old adolescents across England and Wales in 2019–2020. Multilevel logistic regression was used to examine the association of lifetime use of cannabis, synthetic cannabinoids and cannabidiol with self-reported symptoms of probable depression, anxiety, conduct disorder and auditory hallucinations.
Results
Of the 6672 adolescents who participated, 5.2% reported using cannabis, 1.9% reported using cannabidiol and 0.6% reported using synthetic cannabinoids. After correction for multiple testing, adolescents who had used these substances were significantly more likely to report a probable depressive, anxiety or conduct disorder, as well as auditory hallucinations, than those who had not. Adjustment for socioeconomic disadvantage had little effect on associations, but adjustment for weekly tobacco use resulted in marked attenuation of associations. The association of cannabis use with probable anxiety and depressive disorders was weaker in those who reported using cannabidiol than in those who did not. There was little evidence of an interaction between synthetic cannabinoids and cannabidiol.
Conclusions
To our knowledge, this study provides the first general population evidence that synthetic cannabinoids and cannabidiol are associated with probable mental health disorders in adolescence. These associations require replication, ideally with prospective cohorts and stronger study designs.
The persistence of seed-dispersing animals in degraded habitats could be critical for ensuring the long-term conservation value and restoration of forests. This is particularly important in Southeast Asia, where > 70% of the remaining forest areas are within 1 km of a forest edge, and many are degraded (e.g. logged). We synthesized information on the habitat associations of the binturong Arctictis binturong, a large, semi-arboreal, frugivorous civet and one of the most important seed dispersers in the region, especially for figs (Ficus spp). We adopted a multiscale approach by employing ensemble species distribution modelling from presence-only records, assessing landscape-scale variation in detection rates in published camera-trap studies and using hierarchical occupancy modelling to assess local (i.e. within-landscape) patterns observed from 20 new camera-trap surveys. Contrary to prior reports that binturongs are strongly associated with intact forests, the species was equally present in degraded forests and near forest edges where sufficient forest cover was maintained (> 40% forest cover within a 20-km radius). The species also tolerates moderate incursions of oil palm plantations (< 20% of the area within a 20-km radius covered by oil palm plantations). The relative resilience of binturongs to habitat degradation could be in part because of behavioural adaptations towards increased nocturnal activity. These results support the notion that key seed dispersers can persist and maintain their ecological function in degraded forests.
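To make the "ensemble" part of the distribution modelling above more concrete, the sketch below fits two off-the-shelf classifiers to presence versus pseudo-absence points and averages their predicted habitat suitability. The pseudo-absence strategy, predictor columns and choice of learners are illustrative assumptions, not the authors' pipeline.

```python
# A minimal sketch of an ensemble species distribution model from presence-only
# records; the file, column names and learners are hypothetical assumptions.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

occ = pd.read_csv("binturong_points.csv")  # hypothetical: presence=1 / pseudo-absence=0
predictors = ["forest_cover_20km", "oil_palm_20km", "dist_to_edge_km", "elevation_m"]
X, y = occ[predictors], occ["presence"]

models = [
    RandomForestClassifier(n_estimators=500, random_state=0),
    GradientBoostingClassifier(random_state=0),
]
for m in models:
    m.fit(X, y)

# Ensemble prediction: mean of the member models' habitat-suitability scores
occ["suitability"] = np.mean([m.predict_proba(X)[:, 1] for m in models], axis=0)
```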
The recent Covid-19 pandemic highlighted stark social inequalities, notably around access to food, nutrition and green or blue space (i.e. outdoor spaces with vegetation and water). Obesity is socio-economically patterned by this inequality; and while the environmental drivers of obesity are widely acknowledged, there is currently little upstream intervention. We know that living with obesity contributes to widening health inequalities and places healthcare systems under huge strain. Our environment could broadly be described as obesogenic, in the sense of supporting unhealthful eating patterns and sedentary behaviour. Evidence points to the existence of nearly 700 UK obesity policies, all of which have had little success. Obesity prevention and treatment has focused on educational and behavioural interventions targeted at individual consumers. A more sustainable approach would be to change the environments that promote less healthy eating, high energy intake and sedentary behaviour. Approaches that modify the environment have the potential to assist in the prevention of this complex condition. This review paper focuses on the role of wider food environments, or foodscapes. While the evidence base relating to the role of the foodscape in the obesity crisis is imperfect, policy, practice, civic society and industry must work together and take action now in areas where current evidence suggests change is required. Despite the current cost-of-living crisis, shaping the foodscape to better support healthful eating decisions has the potential to be a key aspect of a successful obesity prevention intervention.
Individuals are often ambiguity-averse when choosing among purely chance-based prospects (Ellsberg, 1961). However, they often prefer apparently ambiguous ability-based prospects to unambiguous chance-based prospects. According to the competence hypothesis (Heath & Tversky, 1991), this pattern derives from favorable perceptions of one’s competence. In most past tests of the competence hypothesis, ambiguity is confounded with personal controllability and the source of the ambiguity (e.g., chance vs. missing information). We unconfound these factors in three experiments and find strong evidence for independent effects of both ambiguity aversion and competence. In Experiment 1, participants preferred an unambiguous chance-based option to an ambiguous ability-based option when the ambiguity derived from chance rather than uncertainty about one’s own ability. In Experiments 2 and 3, which used different operationalizations of ambiguity in choice contexts with actual consequences, participants attempted to avoid both ambiguity and chance insofar as they could. These findings support and extend the competence hypothesis by demonstrating ambiguity aversion independent of personal controllability and source of ambiguity.
To determine the reliability of teleneuropsychological (TNP) assessments compared to in-person assessments (IPA) in people with HIV (PWH) and without HIV (HIV−).
Methods:
Participants included 80 PWH (Mage = 58.7, SDage = 11.0) and 23 HIV− individuals (Mage = 61.9, SDage = 16.7). Participants completed two comprehensive in-person neuropsychological assessments (IPA1 and IPA2) before one TNP assessment during the COVID-19 pandemic (March–December 2020). The neuropsychological tests included: Hopkins Verbal Learning Test-Revised (HVLT-R Total and Delayed Recall), Controlled Oral Word Association Test (COWAT; FAS-English or PMR-Spanish), Animal Fluency, Action (Verb) Fluency, Wechsler Adult Intelligence Scale 3rd Edition (WAIS-III) Symbol Search and Letter Number Sequencing, Stroop Color and Word Test, Paced Auditory Serial Addition Test (Channel 1), and Boston Naming Test. Total raw scores and sub-scores were used in analyses. In the total sample and by HIV status, test-retest reliability and performance-level differences were evaluated between the two consecutive in-person assessments (IPA1 and IPA2), and between the mean in-person score (IPA-M) and TNP.
Results:
There were statistically significant test-retest correlations between IPA1 and IPA2 (r or ρ = .603–.883, ps < .001), and between IPA-M and TNP (r or ρ = .622–.958, ps < .001). In the total sample, significantly lower scores on TNP relative to IPA-M were found on the COWAT (PMR), Stroop Color and Word Test, WAIS-III Letter Number Sequencing, and HVLT-R Total Recall (ps < .05). Results were similar in PWH only.
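A minimal sketch of the test-retest and IPA-M versus TNP correlations of this kind follows, using Pearson's r and Spearman's ρ; the scores are hypothetical placeholders, not the study data.

```python
# A minimal sketch of test-retest and tele-vs-in-person correlations; all scores
# below are hypothetical placeholders, not the study data.
import numpy as np
from scipy.stats import pearsonr, spearmanr

# Raw scores for one test (e.g. HVLT-R Total Recall) at the two in-person visits
ipa1 = np.array([24, 28, 19, 31, 22, 27, 25, 30, 21, 26])
ipa2 = np.array([25, 27, 20, 30, 23, 28, 24, 31, 22, 27])
ipa_mean = (ipa1 + ipa2) / 2                               # IPA-M: mean in-person score
tnp = np.array([23, 26, 18, 29, 22, 27, 23, 30, 21, 25])   # tele-assessment score

r_retest, p_retest = pearsonr(ipa1, ipa2)   # in-person test-retest reliability
rho_tnp, p_tnp = spearmanr(ipa_mean, tnp)   # IPA-M vs TNP (rank-based)
print(f"IPA1-IPA2 r = {r_retest:.2f}; IPA-M vs TNP rho = {rho_tnp:.2f}")
```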
Conclusions:
This study demonstrates the reliability of TNP assessments in PWH and HIV− individuals. TNP assessments are a promising way to improve access to traditional neuropsychological services and to maintain ongoing clinical research studies during the COVID-19 pandemic.
A ubiquitous arrangement in nature is a free-flowing fluid coupled to a porous medium, for example a river or lake lying above a porous bed. Depending on the environmental conditions, thermal convection can occur and may be confined to the clear fluid region, forming shallow convection cells, or it can penetrate into the porous medium, forming deep cells. Here, we combine three complementary approaches – linear stability analysis, fully nonlinear numerical simulations and a coarse-grained model – to determine the circumstances that lead to each configuration. The coarse-grained model yields an explicit formula for the transition between deep and shallow convection in the physically relevant limit of small Darcy number. Near the onset of convection, all three of the approaches agree, validating the predictive capability of the explicit formula. The numerical simulations extend these results into the strongly nonlinear regime, revealing novel hybrid configurations in which the flow exhibits a dynamic shift from shallow to deep convection. This hybrid shallow-to-deep convection begins with small, random initial data, progresses through a metastable shallow state and arrives at the preferred steady state of deep convection. We construct a phase diagram that incorporates information from all three approaches and depicts the regions in parameter space that give rise to each convective state.
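For readers less familiar with this setup, the block below records the standard non-dimensional groups for a fluid layer overlying a porous layer; these are textbook definitions only, not the explicit deep/shallow transition formula derived by the coarse-grained model.

```latex
% Standard non-dimensional groups for a fluid layer of depth d_f overlying a
% porous layer of depth d_m with permeability K (textbook definitions; this is
% not the explicit transition formula derived in the paper).
\[
  \mathrm{Da} = \frac{K}{d_m^{2}}, \qquad
  \hat d = \frac{d_f}{d_m}, \qquad
  Ra_f = \frac{g \alpha\, \Delta T_f\, d_f^{3}}{\nu \kappa_f}, \qquad
  Ra_m = \frac{g \alpha\, \Delta T_m\, K\, d_m}{\nu \kappa_m},
\]
% where \Delta T_f and \Delta T_m are the temperature drops across the fluid and
% porous layers, \nu is the kinematic viscosity, and \kappa_f, \kappa_m are the
% respective thermal diffusivities.
```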
The development of criminological theory has been grounded on urban centres with urban youth as the focus. From Walter Miller’s explanation of the focal concerns to the Chicago School’s development of social disorganization theory, criminological theories have focused almost solely on urban youth and urban centres. The lack of theoretical explanations for crime in rural areas has hampered our understanding of crime and deviant behaviour (see Donnermeyer, 2007).
Rural areas are distinct from urban areas. Research has demonstrated that drug use, gun availability and poverty differ in rural and urban centres. Moreover, some crimes are unique to rural areas, such as livestock theft. Scholars have also pointed out that rural areas are often seen as homogenous by criminologists. However, rural areas display a wide variety of characteristics, such as poverty level, racial composition and unemployment. Therefore, there is a need for the development of criminological theory based on rural areas.
One of the areas of interest may be the examination of youth sub-cultures within rural areas. Rural areas are often believed to be devoid of gangs or any type of sub-culture that may produce crime (see Weisheit and Wells, 2001). Criminologists have argued that any gang activity in rural areas was due to migration from urban areas. For example, families may move from an urban area and bring their gang affiliations and activity to the rural area. Another line of reasoning is that rural youth may copy the gang symbols and activities they see youth in urban areas using. Yet some scholars have argued that migration and copying do not explain the activity found in rural areas. Research has shown that most gang members in rural areas are homegrown.
Rural areas have a unique culture. Research has demonstrated that gang activity in rural areas does not mimic that of urban areas (see Dukes and Stein, 2003; Howell and Egley, 2005). Urban gang activity may not have the same appeal to youth in rural areas. Instead, rural youth may create their own sub-culture and gangs based on their unique experience. For example, drug use is a problem in urban and rural communities.
To describe inpatient fluoroquinolone use and susceptibility data over a 10-year period after the implementation of an antimicrobial stewardship program (ASP) led by an infectious diseases pharmacist starting in 2011.
Design:
Retrospective surveillance study.
Setting:
Large community health system.
Methods:
Fluoroquinolone use was quantified by days of therapy (DOT) per 1,000 patient days (PD) and reported quarterly. Use data are reported for inpatients from 2016 to 2020. Levofloxacin susceptibility is reported for Pseudomonas aeruginosa and Escherichia coli isolates from inpatients from 2011 to 2020 at a health system comprising four adult hospitals.
Results:
Inpatient fluoroquinolone use decreased by 74% over a 5-year period, with an average decrease of 3.45 DOT per 1,000 PD per quarter (P < .001). Over a 10-year period, inpatient levofloxacin susceptibility increased by 57% for P. aeruginosa and by 15% for E. coli. P. aeruginosa susceptibility to levofloxacin increased by an average of 2.73% per year (P < .001) and had a strong negative correlation with fluoroquinolone use, r = −0.99 (P = .002). E. coli susceptibility to levofloxacin increased by an average of 1.33% per year (P < .001) and had a strong negative correlation with fluoroquinolone use, r = −0.95 (P = .015).
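For context, the sketch below computes the two quantities reported here: the DOT per 1,000 patient-days metric and the Pearson correlation between annual fluoroquinolone use and levofloxacin susceptibility. All numbers are hypothetical placeholders, not the study data.

```python
# A minimal sketch of the usage metric and the use-susceptibility correlation;
# all numbers below are hypothetical placeholders, not the study data.
import numpy as np
from scipy.stats import pearsonr

# Quarterly usage metric
dot = 1250            # fluoroquinolone days of therapy in the quarter
patient_days = 98000  # patient days in the same quarter
dot_per_1000_pd = dot / patient_days * 1000
print(f"{dot_per_1000_pd:.1f} DOT per 1,000 patient days")

# Use-susceptibility correlation across years (hypothetical annual values)
annual_use = np.array([42, 38, 30, 24, 19, 15, 12, 10, 9, 8])           # DOT/1,000 PD
pa_susceptibility = np.array([55, 58, 63, 68, 73, 77, 80, 83, 85, 86])  # % susceptible

r, p = pearsonr(annual_use, pa_susceptibility)
print(f"r = {r:.2f}, p = {p:.3f}")  # a strong negative r is expected in this scenario
```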
Conclusions:
A substantial decrease in fluoroquinolone use and an increase in P. aeruginosa and E. coli levofloxacin susceptibility were observed after implementation of an antimicrobial stewardship program. These results demonstrate the value of stewardship services and highlight the effectiveness of an infectious diseases pharmacist–led antimicrobial stewardship program.
The aim of this note is to discuss the Weil restriction of schemes and algebraic spaces, highlighting pathological phenomena that appear in the theory and are not widely known. It is shown that the Weil restriction of a locally finite algebraic space along a finite flat morphism is an algebraic space.
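As a reminder of the definition in play (the standard universal property, not specific to the pathologies discussed in the note): for a morphism S′ → S and an algebraic space X′ over S′, the Weil restriction, when it exists, is characterized as follows.

```latex
% Standard defining property of the Weil restriction (textbook definition).
% For a morphism S' \to S and an algebraic space X' over S', the Weil restriction
% R_{S'/S}(X'), when it exists, represents the functor of points
\[
  \mathrm{R}_{S'/S}(X')(T) \;=\; X'\bigl(T \times_S S'\bigr)
  \qquad \text{for } T \in (\mathrm{Sch}/S).
\]
```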
In this study, we evaluated the impact of a microbiology nudge on de-escalation to first-generation cephalosporins in hospitalized patients with urinary tract infections secondary to Escherichia coli, Klebsiella pneumoniae, and Proteus mirabilis isolates with minimum inhibitory concentrations (MICs) ≤ 16 µg/mL. De-escalation to first-generation cephalosporins was uncommon at MICs of 4–16 µg/mL.
Background:
Candida auris is an emerging multidrug-resistant yeast that is transmitted in healthcare facilities and is associated with substantial morbidity and mortality. Environmental contamination is suspected to play an important role in transmission, but additional information is needed to inform environmental cleaning recommendations to prevent spread.
Methods:
We conducted a multiregional (Chicago, IL; Irvine, CA) prospective study of environmental contamination associated with C. auris colonization of patients and residents of 4 long-term care facilities and 1 acute-care hospital. Participants were identified by screening or clinical cultures. Samples were collected from participants' body sites (eg, nares, axillae, inguinal creases, palms and fingertips, and perianal skin) and their environment before room cleaning. Daily room cleaning and disinfection by facility environmental service workers was followed by targeted cleaning of high-touch surfaces by research staff using hydrogen peroxide wipes (an EPA List P product approved for C. auris). Samples were collected from high-touch surfaces immediately after cleaning and at 4-hour intervals up to 12 hours. A pilot phase (n = 12 patients) was conducted to identify the value of testing specific high-touch surfaces to assess environmental contamination. High-yield surfaces were included in the full evaluation phase (n = 20 patients) (Fig. 1). Samples were submitted for semiquantitative culture of C. auris and other multidrug-resistant organisms (MDROs), including methicillin-resistant Staphylococcus aureus (MRSA), vancomycin-resistant Enterococcus (VRE), extended-spectrum β-lactamase–producing Enterobacterales (ESBLs), and carbapenem-resistant Enterobacterales (CRE). Times to room surface contamination with C. auris and other MDROs after effective cleaning were analyzed.
Results:
Candida auris colonization was most frequently detected in the nares (72%) and on the palms and fingertips (72%). Cocolonization of body sites with other MDROs was common (Fig. 2). Surfaces located close to the patient were commonly recontaminated with C. auris by 4 hours after cleaning, including the overbed table (24%), bed handrail (24%), and TV remote or call button (19%). Environmental cocontamination was more common with resistant gram-positive organisms (MRSA and VRE) than with resistant gram-negative organisms (Fig. 3). C. auris was rarely detected on surfaces located outside a patient's room (1 of 120 swabs; <1%).
Conclusions:
Environmental surfaces near C. auris–colonized patients were rapidly recontaminated after cleaning and disinfection. Cocolonization of skin and environment with other MDROs was common, with resistant gram-positive organisms predominating over gram-negative organisms on environmental surfaces. Limitations include the lack of organism sequencing or typing to confirm that environmental contamination came from the room resident. Rapid recontamination of environmental surfaces after manual cleaning and disinfection suggests that alternative mitigation strategies should be evaluated.
We implemented a parent–teacher Vanderbilt agreement program to increase return rates of Vanderbilt assessment scales for children in our primary care practice, and compared the assessment return rate before and after agreement signature.
Methods
We retrospectively reviewed children diagnosed with attention-deficit/hyperactivity disorder (ADHD) who had a signed Vanderbilt agreement and were under continuous care at our clinic. Return rates were compared 1 year before and 1 year after the agreement date.
Results
Among 195 children, prior to the agreement, 71% returned teacher assessments and 59% returned parent forms; after the intervention, return rates were not significantly different (76%, p = .255, and 65%, p = .185, respectively). The median number of returned assessments increased after the agreement.
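One common way to compare paired before/after return rates in the same children is McNemar's test; the abstract does not report which test was used, and the 2×2 counts below are hypothetical placeholders.

```python
# A minimal sketch of a paired before/after comparison of return rates using
# McNemar's test; the 2x2 counts are hypothetical placeholders, not the study data.
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

# Rows: returned before agreement (no, yes); columns: returned after (no, yes)
table = np.array([[30, 27],
                  [20, 118]])

result = mcnemar(table, exact=False, correction=True)
print(f"statistic = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```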
Conclusions
Lack of documented parent and teacher Vanderbilt assessments remains a barrier to appropriate management of ADHD. Improving the rate of returned assessments is an important outcome for treating ADHD in the primary care setting.