Hepatitis B virus vaccination is currently recommended in Australia for adults at an increased risk of acquiring infection or at high risk of complications from infection. This retrospective cohort study used data from an Australian sentinel surveillance system to assess the proportion of individuals who had a recorded test indicating susceptibility to hepatitis B infection in six priority populations, as well as the proportion who were subsequently vaccinated within six months of being identified as susceptible. Priority populations included in this analysis were people born overseas in a hepatitis B endemic country, people living with HIV, people with a recent hepatitis C infection, gay, bisexual and other men who have sex with men, people who have ever injected drugs, and sex workers. In the overall cohort of 43,335 individuals, 14,140 (33%) were identified as susceptible to hepatitis B, and 5,255 of these (37%) were subsequently vaccinated. Between 26% and 33% of individuals from priority populations were identified as susceptible to hepatitis B infection, and the proportion of these subsequently vaccinated within six months ranged from 28% to 42% across the groups. These findings suggest further efforts are needed to increase the identification and subsequent vaccination of susceptible individuals among priority populations recommended for hepatitis B vaccination, including among people who are already engaged in hepatitis B care.
This study characterizes 2008-2022 FDA advisory committee discussions of new supplemental indication applications that were not approved by FDA. Discussion themes included contextual concerns unique to already-approved drugs, including insights from prior experience and concerns about off-label use, and efficacy and safety concerns also observed for new drugs. These findings highlight advisory committees’ role in transparency of regulatory decision-making, specifically for drugs already authorized for use.
Cereal rye (Secale cereale L.) cover crop and preemergence herbicides are important components of an integrated weed management program for waterhemp [Amaranthus tuberculatus (Moq.) Sauer] and Palmer amaranth (Amaranthus palmeri S. Watson) management in soybean [Glycine max (L.) Merr.]. Accumulating adequate cereal rye biomass for effective suppression of Amaranthus spp. can be challenging in the upper Midwest due to the short window for cereal rye growth in a corn–soybean rotation. Farmers are adopting the planting green system to optimize cereal rye biomass production and weed suppression. This study aimed to evaluate the feasibility of planting soybean green when integrated with preemergence herbicides for the control of Amaranthus spp. under two soybean planting time frames. The study was conducted across 19 site-years in the United States over the 2021 and 2022 growing seasons. Factors included cover crop management practices (“no-till,” “cereal rye early-term,” and “cereal rye plant-green”), soybean planting times (“early” and “late”), and use of preemergence herbicides (“NO PRE” and “YES PRE”). Planting soybean green increased cereal rye biomass production by 33% compared with early termination. Greater cereal rye biomass production when planting green provided a 44% reduction in Amaranthus spp. density compared with no-till. The use of preemergence herbicides also resulted in a 68% reduction in Amaranthus spp. density compared with NO PRE. Greater cereal rye biomass produced when planting green reduced soybean stand, which directly reduced soybean yield in some site-years. Planting soybean green is a feasible management practice to optimize cereal rye biomass production, which, combined with preemergence herbicides, provided effective Amaranthus spp. management. Soybean stand was a key factor in maintaining soybean yields compared with no-till when planting green. 
Farmers should follow best management recommendations for proper planter and equipment setup to ensure effective soybean establishment under high levels of cereal rye biomass when planting green.
Background: Patients undergoing hemodialysis are at high risk for healthcare-associated infections; they are at 100 times the risk of Staphylococcus aureus bloodstream infections (BSI) compared with U.S. adults not on hemodialysis. Prior studies found that nasal decolonization with mupirocin prevented S. aureus BSI among hemodialysis patients. We implemented a nasal decolonization intervention in which patients self-administered povidone-iodine (PVI) at each dialysis session. We aimed to assess: 1) hemodialysis patients’ knowledge of their infection risk and their willingness to take an active role in infection prevention; and 2) the acceptability of the PVI nasal decolonization intervention. Methods: We performed a stepped wedge cluster randomized trial at 16 outpatient hemodialysis centers. Patients were surveyed before starting PVI, 1 month after their center started using PVI, and ~6 months after starting PVI. We used a chi-square test to compare results. Results: 469 patients completed at least 1 survey: 400 pre-intervention, 237 at 1 month, and 201 at 6 months. Overall, 56% of patients thought that their risk of infection was average or below average compared with an average person in the U.S. (Figure). Over 98% agreed with the statement “One of the most important things I can do for my health is to take an active role in my health care.” In the pre-intervention survey, 73% were willing to expend “a lot of effort” to prevent an infection. This proportion was similar (73%) in the 2nd survey but decreased to 63% in the final survey (p < .01). Among 106 patients who reported starting PVI, 85% reported that PVI felt neutral or pleasant, 9.4% reported a side effect, and 79% reported using it during the past 3 dialysis sessions. Among 102 patients who reported using PVI at 6 months, 87% said it felt neutral/pleasant, 3.9% reported a side effect, and 75% reported using it during the past 3 dialysis sessions.
Side effects included nasal dripping, congestion or burning/stinging, unpleasant smell, headache, yellow tears, and minor nose bleeding. Conclusions: Many hemodialysis patients were unaware of their high risk of infection. Although many were willing to expend a lot of effort to prevent an infection, this willingness decreased during an infection prevention intervention. There were few PVI side effects, and most patients stated that PVI felt neutral/pleasant, yet many patients chose not to use PVI. Future research should aim to improve patient education on infection risk and assess barriers to adherence to infection prevention interventions.
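The chi-square comparison of survey proportions described above (e.g., willingness to expend effort across survey waves) can be sketched in a few lines; the table values below are illustrative, not the trial’s data:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]],
    e.g. willing vs. not willing, survey wave 2 vs. survey wave 3."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        expected = row * col / n
        stat += (obs - expected) ** 2 / expected
    return stat  # compare with 3.84, the 5% critical value on 1 df
```

A statistic above 3.84 corresponds to p < .05 on one degree of freedom, matching the kind of comparison reported in the Results.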
A substantial proportion of patients undergoing hemodialysis carry Staphylococcus aureus in their noses, and carriers are at increased risk of S. aureus bloodstream infections. Our pragmatic clinical trial implemented nasal povidone-iodine (PVI) decolonization for the prevention of bloodstream infections in the novel setting of hemodialysis units.
Objective:
We aimed to identify pragmatic strategies for implementing PVI decolonization among patients in outpatient hemodialysis units.
Design:
Qualitative descriptive study.
Setting:
Outpatient hemodialysis units affiliated with five US academic medical centers. Units varied in size, patient demographics, and geographic location.
Interviewees:
Sixty-six interviewees including nurses, hemodialysis technicians, research coordinators, and other personnel.
Methods:
We conducted interviews with personnel affiliated with all five academic medical centers and conducted thematic analysis of transcripts.
Results:
Hemodialysis units had varied success with patient recruitment, but interviewees reported that patients and healthcare personnel (HCP) found PVI decolonization acceptable and feasible. Leadership support, HCP engagement, and tailored patient-focused tools or strategies facilitated patient engagement and PVI implementation. Interviewees reported both patients and HCP sometimes underestimated patients’ infection risks and experienced infection-prevention fatigue. Other HCP barriers included limited staffing and poor staff engagement. Patient barriers included high health burdens, language barriers, memory issues, and lack of social support.
Conclusion:
Our qualitative study suggests that PVI decolonization would be acceptable to patients and clinical personnel, and implementation is feasible for outpatient hemodialysis units. Hemodialysis units could facilitate implementation by engaging unit leaders, patients and personnel, and developing education for patients about their infection risk.
Infections cause substantial morbidity and mortality among patients receiving care in outpatient hemodialysis facilities. We describe comprehensive infection prevention assessments by US public health departments using standardized interview and observation tools. Results demonstrated how facility layouts can undermine infection prevention and that clinical practices often fall short of policies.
The meridional rank conjecture asks whether the bridge number of a knot in $S^3$ is equal to the minimal number of meridians needed to generate the fundamental group of its complement. In this paper, we investigate the analogous conjecture for knotted spheres in $S^4$. Towards this end, we give a construction to produce classical knots with quotients sending meridians to elements of any finite order in Coxeter groups and alternating groups, which detect their meridional ranks. We establish the equality of bridge number and meridional rank for these knots and knotted spheres obtained from them by twist-spinning. On the other hand, we show that the meridional rank of knotted spheres is not additive under connected sum, so that either bridge number also collapses, or meridional rank is not equal to bridge number for knotted spheres.
Victims of electrical injury (EI) often experience injuries to the peripheral nervous system and neuromuscular damage that may diminish motor function, such as flexibility/dexterity. These difficulties may continue after rehabilitation due to the reorganization of muscle afferent projections during peripheral nerve regeneration. Therefore, understanding how patients with a history of thermal burn injuries perform on motoric measures is necessary to explain the impact that neuromuscular damage has on both motor and non-motor tests of cognition. However, no studies have examined the impact of motor functioning on cognition in patients who experienced combined thermal and electrical injuries compared with patients who experienced electrical shock injury alone. This study explored the impact of motor dysfunction and psychiatric distress, measured by depression severity, on psychomotor speed and executive test performances among EI patients with and without thermal burn injuries.
Participants and Methods:
This cross-sectional study consisted of EI patients undergoing an outpatient neuropsychological evaluation, including tests of motor dexterity (Grooved Pegboard [GP]), psychomotor speed (Wechsler Adult Intelligence Scale-IV Coding, Trail Making Test [TMT] Part A), and executive functioning (Stroop Color and Word Test [SCWT] Color-Word trial, TMT Part B). The sample was 83% male and 17% female, 88% White, 3% Black, 5% Hispanic, and 2% other race/ethnicity, with a mean age of 43.9 years (SD=11.36), mean education of 12.9 years (SD=2.05), and mean depression severity of 20.05 (SD=12.59) on the Beck Depression Inventory-II (BDI-II). Exclusion criteria were: 1) injury history of moderate-to-severe head trauma, 2) >2 performance validity test failures, and 3) any amputation of the upper extremity. Regression analyses included GP T-Scores for dominant hand and BDI-II total score as independent variables and neuropsychological normative test data as dependent variables.
Results:
Among validly performing patients with EI (n = 86), regression analyses revealed that GP performance accounted for significant variance (R² = .153–.169) on all neuropsychological measures. Among EI patients with burn injuries (n = 50), regression analyses revealed that GP performance accounted for significant variance (R² = .197–.266) on all neuropsychological measures. Among EI patients without burn injuries (n = 36), analyses revealed that neither GP performance nor BDI-II severity accounted for significant variance across the neurocognitive tests (R² = .056–.142). Furthermore, among EI patients with burn injuries and in the total sample, regression analyses revealed that depression severity negatively predicted GP performance (R² = .099–.130); however, in patients without burn injuries, depression did not predict GP performance (R² = .052).
Conclusions:
Overall, results showed that GP performance is a significant predictor of neurocognitive performance on both motor and non-motor measures in EI patients with burn injuries. Therefore, among EI patients with burn injuries, GP performance may have utility as an early indicator of injury severity, given that it predicts neuropsychological test performance on measures of psychomotor speed and executive functioning. Lastly, depression predicted GP performance within the burn injury sample, illustrating that psychological distress may negatively impact motor functionality.
Up to 90% of adults with an untreated atrial septal defect will be symptomatic by the fourth decade of life, and 30–49% will develop heart failure. Pulmonary arterial hypertension develops in 8–10% of these patients, with a female predominance regardless of age. We aimed to demonstrate that fenestrated closure can be performed safely, with improved outcomes, in patients with decompensated heart failure and atrial septal defect-associated pulmonary arterial hypertension.
Methods:
Transcatheter fenestrated atrial septal defect closures (Occlutech GmbH, Jena, Germany) were performed on a compassionate-use basis in 5 consecutive adult patients with atrial septal defect-associated pulmonary arterial hypertension and severe heart failure with prohibitive surgical mortality risks. Changes in systemic oxygen saturation, 6-minute walk test distance, NYHA class, and echocardiographic and haemodynamic parameters were used as outcome measures.
Results:
All patients were female, with a mean age of 48.8 ± 13.5 years, followed up for a median of 29 months (maximum 64 months). Significant improvements were observed in the 6-minute walk test and oxygen saturation when comparing day 0 with all follow-up time points (B = 1.32, SE = 0.28, t(22.7) = -4.77, p = 0.0001), and in the haemodynamic data, including pulmonary vascular resistance and pulmonary pressure (B = –0.60, SE = 0.22, t(40.2) = 2.74, p = 0.009). All patients showed improvement in right ventricular size and function, along with NYHA class. There were no procedure-related complications.
Conclusion:
Fenestrated atrial septal defect closure is feasible in adults with decompensated heart failure and atrial septal defect-associated pulmonary arterial hypertension. It results in sustained haemodynamic and functional improvement.
Disaster planning and preparedness for a burn mass casualty incident (BMCI) must consider the needs of those who will be directly involved in and support the response to such an event. One aspect of developing a more comprehensive statewide burn disaster program was meeting regionally with healthcare coalitions (HCCs) to identify gaps and deficiencies in care.
Method:
Regularly scheduled (quarterly) HCC meetings are held around the state, linking stakeholders representing local hospitals, health departments, emergency medical services (EMS) agencies, and other interested parties. We used these regional HCC meetings as a platform for focus group research to identify gaps specific to a BMCI and to inform strategy development for a statewide approach. Additionally, we held engagement meetings with the state emergency response network (a state agency that coordinates the movement of ambulances to appropriate destinations) and with the Burn Medical Directors, through which findings from the focus groups were vetted.
Results:
One of the deficiencies identified was a lack of burn-specific wound care dressings to support the initial response. Using the same focus group process, consensus was reached on equipment types and quantities, including a kit for storage. Furthermore, maintenance, supply-replacement, and scene-delivery processes were developed for these kits so that they could augment a BMCI response.
Conclusion:
Focus group feedback reminded us that, outside the world of burn care, many providers report infrequent opportunities to care for patients with burn injuries. Several types of burn-specific dressings can be expensive, and because burn injuries occur infrequently, EMS agencies and rural hospitals alike reported that they were unlikely to stock more than a minimal supply of burn injury materials. Developing supply caches that can be quickly mobilized and deployed to the impacted area addressed one of these deficiencies.
To estimate the incidence, duration, and risk factors for diagnostic delays associated with pertussis.
Design:
We used longitudinal retrospective insurance claims from the Marketscan Commercial Claims and Encounters, Medicare Supplemental (2001–2020), and Multi-State Medicaid (2014–2018) databases.
Setting:
Inpatient, emergency department, and outpatient visits.
Patients:
The study included patients diagnosed with pertussis (identified by International Classification of Diseases [ICD] codes) who received macrolide antibiotic treatment.
Methods:
We estimated the number of visits with pertussis-related symptoms before diagnosis beyond that expected in the absence of diagnostic delays. Using a bootstrapping approach, we estimated the number of visits representing a delay, the number of missed diagnostic opportunities per patient, and the duration of delays. Results were stratified by age groups. We also used a logistic regression model to evaluate potential factors associated with delay.
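The bootstrapping approach described above can be sketched as follows; this is a minimal illustration with synthetic visit counts and a simple excess-over-baseline statistic, not the study’s actual algorithm or data:

```python
import random

def excess_visits(case_counts, baseline_counts):
    """Mean excess of pre-diagnosis symptom visits per patient, over the
    expected count estimated from a baseline (control-window) sample."""
    expected = sum(baseline_counts) / len(baseline_counts)
    return sum(c - expected for c in case_counts) / len(case_counts)

def bootstrap_ci(case_counts, baseline_counts, n_boot=2000, seed=0, alpha=0.05):
    """Percentile-bootstrap confidence interval for the mean number of
    missed diagnostic opportunities per patient."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    stats = []
    for _ in range(n_boot):
        cases = [rng.choice(case_counts) for _ in case_counts]
        base = [rng.choice(baseline_counts) for _ in baseline_counts]
        stats.append(excess_visits(cases, base))
    stats.sort()
    lo = stats[int(alpha / 2 * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return excess_visits(case_counts, baseline_counts), (lo, hi)
```

With synthetic pre-diagnosis counts of [3, 2, 4, 1, 2, 3] visits and baseline counts of [1, 1, 2, 0, 1, 1], the point estimate is 1.5 excess visits per patient, with the bootstrap interval quantifying its uncertainty.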
Results:
We identified 20,828 patients meeting inclusion criteria. On average, patients had almost 2 missed opportunities prior to diagnosis, and delay duration was 12 days. Across age groups, the percentage of patients experiencing a delay ranged from 29.7% to 37.6%. The duration of delays increased considerably with age from an average of 5.6 days for patients aged <2 years to 13.8 days for patients aged ≥18 years. Factors associated with increased risk of delays included emergency department visits, telehealth visits, and recent prescriptions for antibiotics not effective against pertussis.
Conclusions:
Diagnostic delays for pertussis are frequent. More work is needed to decrease diagnostic delays, especially among adults. Earlier case identification may play an important role in the response to outbreaks by facilitating treatment, isolation, and improved contact tracing.
Public health officials have faced resistance in their efforts to promote mask-wearing to counter the spread of COVID-19. One approach to promoting behavior change is to alert people to the fact that a behavior is common (a descriptive norm). However, partisan differences in pandemic mitigation behavior mean that Americans may be especially (in)sensitive to information about behavioral norms depending on the party affiliation of the group in question. In July–August 2020, we tested the effects of providing information to respondents about how many Americans, co-partisans, or out-partisans report wearing masks regularly on both mask-wearing intentions and on the perceived effectiveness of masks. Learning that a majority of Americans report wearing masks regularly increases mask-wearing intentions and perceived effectiveness, though the effects of this information are not distinguishable from other treatments.
Electrical injury (EI) is a significant, multifaceted trauma often with multi-domain cognitive sequelae, even when the expected current path does not pass through the brain. Chronic pain (CP) research suggests pain may affect cognition directly and indirectly by influencing emotional distress which then impacts cognitive functioning. As chronic pain may be critical to understanding EI-related cognitive difficulties, the aims of the current study were: examine the direct and indirect effects of pain on cognition following EI and compare the relationship between pain and cognition in EI and CP populations.
Method:
This cross-sectional study used data from a clinical sample of 50 patients with EI (84.0% male; Mage = 43.7 years) who were administered standardized measures of pain (Pain Patient Profile), depression, and neurocognitive functioning. A CP comparison sample of 93 patients was also included.
Results:
Higher pain levels were associated with poorer attention/processing speed and executive functioning performance among patients with EI. Depression was significantly correlated with pain and mediated the relationship between pain and attention/processing speed in patients with EI. When comparing the patients with EI and CP, the relationship between pain and cognition was similar for both clinical groups.
Conclusions:
Findings indicate that pain impacts mood and cognition in patients with EI, and the influence of pain and its effect on cognition should be considered in the assessment and treatment of patients who have experienced an electrical injury.
To achieve the elimination of the hepatitis C virus (HCV), sustained and sufficient levels of HCV testing are critical. The purpose of this study was to assess trends in testing and evaluate the effectiveness of strategies to diagnose people living with HCV. Data were from 12 primary care clinics in Victoria, Australia, that provide targeted services to people who inject drugs (PWID) alongside general health care. This ecological study spanned 2009–2019 and included analyses of trends in annual numbers of HCV antibody tests among individuals with no previous positive HCV antibody test recorded, and in annual test yield (positive HCV antibody tests/all HCV antibody tests). Generalised linear models estimated the association between count outcomes (HCV antibody tests and positive HCV antibody tests) and time, and a χ2 test assessed the trend in test yield. A total of 44 889 HCV antibody tests were conducted over 2009–2019; test numbers increased 6% annually on average [95% confidence interval (CI) 4–9]. Test yield declined from 21% in 2009 to 9% in 2019 (χ2 P < 0.01). In more recent years (2013–2019), annual test yield remained relatively stable. Modest increases in HCV antibody testing, together with a stable but high test yield within clinics delivering services to PWID, indicate that current testing strategies are resulting in people being diagnosed; however, further increases in the testing of people at risk of HCV or living with HCV may be needed to reach Australia's HCV elimination goals.
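The annual trend estimate above (a generalised linear model of counts against time, typically with a log link) amounts to fitting an exponential growth curve to yearly test counts. A minimal stand-in sketch using least squares on log counts, with synthetic data rather than the study’s model or data:

```python
import math

def annual_growth_rate(years, counts):
    """Fit log(count) = a + b * year by ordinary least squares; exp(b) - 1
    is the average annual proportional change. This mimics the log-link
    GLM interpretation without the Poisson likelihood machinery."""
    n = len(years)
    logs = [math.log(c) for c in counts]
    mean_x = sum(years) / n
    mean_y = sum(logs) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, logs))
             / sum((x - mean_x) ** 2 for x in years))
    return math.exp(slope) - 1

# Synthetic counts growing 6% per year, echoing the reported trend.
counts = [1000 * 1.06 ** t for t in range(11)]
rate = annual_growth_rate(list(range(11)), counts)
```

On these synthetic counts the recovered rate is 6% per year; a real Poisson GLM would additionally supply standard errors for the confidence interval reported in the abstract.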
Quasi-periodic plasmoid formation at the tip of magnetic streamer structures is observed to occur in experiments on the Big Red Ball as well as in simulations of these experiments performed with the extended magnetohydrodynamics code, NIMROD. This plasmoid formation is found to occur on a characteristic time scale dependent on pressure gradients and magnetic curvature in both experiment and simulation. Single mode, or laminar, plasmoids exist when the pressure gradient is modest, but give way to turbulent plasmoid ejection when the system drive is higher, which produces plasmoids of many sizes. However, a critical pressure gradient is also observed, below which plasmoids are never formed. A simple heuristic model of this plasmoid formation process is presented and suggested to be a consequence of a dynamic loss of equilibrium in the high-$\beta$ region of the helmet streamer. This model is capable of explaining the periodicity of plasmoids observed in the experiment and simulations, and produces plasmoid periods of 90 minutes when applied to two-dimensional models of solar streamers with a height of $3R_\odot$. This is consistent with the location and frequency at which periodic plasma blobs have been observed to form by the Large Angle and Spectrometric Coronagraph and Sun Earth Connection Coronal and Heliospheric Investigation instruments.
Critically ill patients requiring extracorporeal membrane oxygenation (ECMO) frequently require interhospital transfer to a center that has ECMO capabilities. Patients receiving ECMO were evaluated to determine whether interhospital transfer was a risk factor for subsequent development of a nosocomial infection.
Design:
Retrospective cohort study.
Setting:
A 425-bed academic tertiary-care hospital.
Patients:
All adult patients who received ECMO for >48 hours between May 2012 and May 2020.
Methods:
The rate of nosocomial infections for patients receiving ECMO was compared between patients who were cannulated at the ECMO center and patients who were cannulated at a hospital without ECMO capabilities and transported to the ECMO center for further care. Additionally, time to infection, organisms responsible for infection, and site of infection were compared.
Results:
In total, 123 patients were included in the analysis. For the primary outcome of nosocomial infection, there was no difference in the number of infections per 1,000 ECMO days (25.4 vs 29.4; P = .03) by univariate analysis. By Cox proportional hazards analysis, transport was not significantly associated with increased infections (hazard ratio, 1.7; 95% confidence interval, 0.8–4.2; P = .20).
Conclusion:
In this study, we did not identify an increased risk of nosocomial infection during subsequent hospitalization. Further studies are needed to identify sources of nosocomial infection in this high-risk population.
To develop a pediatric research agenda focused on pediatric healthcare-associated infections and antimicrobial stewardship topics that will yield the highest impact on child health.
Participants:
The study included 26 geographically diverse adult and pediatric infectious diseases clinicians with expertise in healthcare-associated infection prevention and/or antimicrobial stewardship (topic identification and ranking of priorities), as well as members of the Division of Healthcare Quality and Promotion at the Centers for Disease Control and Prevention (topic identification).
Methods:
Using a modified Delphi approach, expert recommendations were generated through an iterative process for identifying pediatric research priorities in healthcare-associated infection prevention and antimicrobial stewardship. The multistep, 7-month process included a literature review, interactive teleconferences, web-based surveys, and 2 in-person meetings.
Results:
A final list of 12 high-priority research topics was generated across the 2 domains. High-priority healthcare-associated infection topics included judicious testing for Clostridioides difficile infection, chlorhexidine gluconate (CHG) bathing, measuring and preventing hospital-onset bloodstream infection rates, surgical site infection prevention, and surveillance and prevention of multidrug-resistant gram-negative rod infections. Antimicrobial stewardship topics included β-lactam allergy de-labeling, judicious use of perioperative antibiotics, intravenous-to-oral conversion of antimicrobial therapy, developing a patient-level “harm index” for antibiotic exposure, and benchmarking and/or peer comparison of antibiotic use for common inpatient conditions.
Conclusions:
We identified 6 healthcare-associated infection topics and 6 antimicrobial stewardship topics as potentially high-impact targets for pediatric research.
Previous genetic association studies have failed to identify loci robustly associated with sepsis, and there have been no published genetic association studies or polygenic risk score analyses of patients with septic shock, despite evidence suggesting genetic factors may be involved. We systematically collected genotype and clinical outcome data in the context of a randomized controlled trial from patients with septic shock to enrich the presence of disease-associated genetic variants. We performed genome-wide association studies of susceptibility and mortality in septic shock using 493 patients with septic shock and 2442 population controls, and polygenic risk score analysis to assess genetic overlap between septic shock risk/mortality and clinically relevant traits. One variant, rs9489328, located in the AL589740.1 noncoding RNA, was significantly associated with septic shock (p = 1.05 × 10⁻¹⁰); however, it is likely a false positive. We were unable to replicate variants previously reported to be associated (p < 1.00 × 10⁻⁶ in previous scans) with susceptibility to and mortality from sepsis. Polygenic risk scores for hematocrit and granulocyte count were negatively associated with 28-day mortality (p = 3.04 × 10⁻³; p = 2.29 × 10⁻³), and scores for C-reactive protein levels were positively associated with susceptibility to septic shock (p = 1.44 × 10⁻³). These results suggest that common variants of large effect do not influence septic shock susceptibility, mortality, and resolution; however, genetic predispositions to clinically relevant traits are significantly associated with increased susceptibility and mortality in septic individuals.
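A polygenic risk score of the kind analyzed above is, in its basic form, a weighted sum of effect-allele dosages across variants. A minimal sketch with hypothetical variant IDs and effect weights (not from this study):

```python
def polygenic_risk_score(genotype, weights):
    """PRS = sum over variants of (effect-allele dosage) x (effect size).

    `genotype` maps variant ID -> dosage in {0, 1, 2} (number of copies of
    the effect allele); `weights` maps variant ID -> per-allele effect
    estimate taken from a reference GWAS. Variants with no available
    weight are skipped."""
    return sum(weights[v] * dose for v, dose in genotype.items() if v in weights)

# Hypothetical example: two weighted variants, one unweighted.
score = polygenic_risk_score({"rs1": 2, "rs2": 0, "rs3": 1},
                             {"rs1": 0.1, "rs3": -0.05})
```

In practice, scores like those for hematocrit or C-reactive protein are built from thousands of such variants and then tested for association with the outcome (here, septic shock susceptibility or 28-day mortality).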