Racial and ethnic variations in antibiotic utilization are well reported in outpatient settings, but little is known about inpatient settings. Our objective was to describe national inpatient antibiotic utilization among children by race and ethnicity.
Methods:
This study included hospital visit data from the Pediatric Health Information System between 01/01/2022 and 12/31/2022 for patients <20 years. Primary outcomes were the percentage of hospitalization encounters that received an antibiotic and antibiotic days of therapy (DOT) per 1000 patient days. Mixed-effect regression models were used to determine the association of race-ethnicity with outcomes, adjusting for covariates.
Results:
There were 846,530 hospitalizations. Of the children, 45.2% were Non-Hispanic (NH) White, 27.1% were Hispanic, 19.2% were NH Black, 4.5% were NH Other, 3.5% were NH Asian, 0.3% were NH Native Hawaiian/Other Pacific Islander (NHPI), and 0.2% were NH American Indian. Adjusting for covariates, NH Black children had lower odds of receiving antibiotics compared to NH White children (aOR 0.96, 95% CI 0.94–0.97), while NH NHPI children had higher odds of receiving antibiotics (aOR 1.16, 95% CI 1.05–1.29). Children who were Hispanic, NH Asian, NH American Indian, or NH Other received fewer antibiotic DOT compared to NH White children, while NH NHPI children received more antibiotic DOT.
Conclusions:
Antibiotic utilization in children’s hospitals differs by race and ethnicity. Hospitals should assess policies and practices that may contribute to disparities in treatment; antibiotic stewardship programs may play an important role in promoting inpatient pharmacoequity. Additional research is needed to examine individual diagnoses, clinical outcomes, and drivers of variation.
Small-angle X-ray scattering (SAXS) has been widely used as a microstructure characterization technology. In this work, a fully connected dense forward network is applied to inversely retrieve the mean particle size and particle distribution from SAXS data of samples dynamically compressed with high-power lasers and probed with X-ray free electron lasers. The trained network allows automatic acquisition of microstructure information, performing well in predictions on single-species nanoparticles on the theoretical model and in situ experimental data. We evaluate our network by comparing it with other methods, revealing its reliability and efficiency in dynamic experiments, which is of great value for in situ characterization of materials under high-power laser-driven dynamic compression.
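The inverse-retrieval idea above can be sketched as a small fully connected network that maps a discretised SAXS intensity curve to microstructure parameters. The layer sizes, activation, and untrained random weights below are illustrative assumptions, not the authors' architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def init_layer(n_in, n_out):
    # He initialisation, suitable for ReLU layers
    return rng.normal(0, np.sqrt(2 / n_in), (n_in, n_out)), np.zeros(n_out)

# Assumed network: 128 q-bins -> 64 -> 32 -> 2 outputs
# (mean particle size, distribution width)
W1, b1 = init_layer(128, 64)
W2, b2 = init_layer(64, 32)
W3, b3 = init_layer(32, 2)

def forward(iq):
    """Map a 1-D SAXS log-intensity curve to two microstructure parameters."""
    h1 = relu(iq @ W1 + b1)
    h2 = relu(h1 @ W2 + b2)
    return h2 @ W3 + b3

# Toy input: log-intensity of a smooth decaying curve over 128 q-bins,
# standing in for a measured or simulated SAXS profile
iq = np.log1p(1.0 / (1.0 + np.linspace(0.01, 0.5, 128) ** 4))
pred = forward(iq)
print(pred.shape)  # (2,)
```

In practice such a network would be trained on pairs of simulated scattering curves and known size distributions before being applied to experimental data.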
To describe pediatric outpatient visits and antibiotic prescribing during the coronavirus disease 2019 (COVID-19) pandemic.
Design:
An observational, retrospective control study from January 2019 to October 2021.
Setting:
Outpatient clinics, including 27 family medicine clinics, 27 pediatric clinics, and 26 urgent or prompt care clinics.
Patients:
Children aged 0–19 years receiving care in an outpatient setting.
Methods:
Data were extracted from the electronic health record. The COVID-19 era was defined as April 1, 2020, to October 31, 2021. Virtual visits were identified by coded encounter or visit type variables. Visit diagnoses were assigned using a 3-tier classification system based on appropriateness of antibiotic prescribing, and a subanalysis of respiratory visits was performed to compare the COVID-19 era with baseline.
Results:
Through October 2021, we detected an overall sustained reduction of 18.2% in antibiotic prescribing to children. Disproportionate changes occurred in the percentages of antibiotic visits in respiratory visits for children by age, race or ethnicity, practice setting, and prescriber type. Virtual visits were minimal during the study period but did not result in higher rates of antibiotic visits or in-person follow-up visits.
Conclusions:
These findings suggest that reductions in antibiotic prescribing have been sustained despite increases in outpatient visits. However, additional studies are warranted to better understand disproportionate rates of antibiotic visits.
To explore and provide contextual meaning around issues surrounding food insecurity, namely factors influencing food access, as one domain of food security.
Design:
A community-based, qualitative inquiry using semi-structured face-to-face interviews was conducted as part of a larger sequential mixed-methods study.
Setting:
Cayo District, Belize, May 2019–August 2019.
Participants:
Thirty English-speaking individuals (eight males, twenty-two females) between the ages of 18 and 70, with varying family compositions, residing within the Cayo District.
Results:
Participants describe a complex interconnectedness between family- and individual-level barriers to food access. Specifically, family composition, income, education and employment influence individuals’ ability to afford and access food for themselves or their families. Participants also cite challenges with transportation and distance to food sources and educational opportunities as barriers to accessing food.
Conclusion:
These findings provide insight into food security and food access barriers in a middle-income country and suggest avenues for further study and potential interventions. Increased and sustained investment in primary and secondary education, including programmes to support enrollment, should be a priority for decreasing food insecurity. Attention to building public infrastructure may also ease burdens around accessing food.
To describe risk factors associated with inappropriate antibiotic prescribing to children.
Design:
Cross-sectional, retrospective analysis of antibiotic prescribing to children, using Kentucky Medicaid medical and pharmacy claims data, 2017.
Participants:
Population-based sample of pediatric Medicaid patients and providers.
Methods:
Antibiotic prescriptions were identified from pharmacy claims and used to describe patient and provider characteristics. Associated medical claims were identified and linked to assign diagnoses. An existing classification scheme was applied to determine appropriateness of antibiotic prescriptions.
Results:
Overall, 10,787 providers wrote 779,813 antibiotic prescriptions for 328,515 children insured by Kentucky Medicaid in 2017. Moreover, 154,546 (19.8%) of these antibiotic prescriptions were appropriate, 358,026 (45.9%) were potentially appropriate, 163,654 (21.0%) were inappropriate, and 103,587 (13.3%) were not associated with a diagnosis. Half of all providers wrote 12 prescriptions or fewer to Medicaid children. The following child characteristics were associated with inappropriate antibiotic prescribing: residence in a rural area (odds ratio [OR], 1.09; 95% confidence interval [CI], 1.07–1.1), having a visit with an inappropriate prescriber (OR, 4.15; 95% CI, 4.1–4.2), age 0–2 years (OR, 1.39; 95% CI, 1.37–1.41), and presence of a chronic condition (OR, 1.31; 95% CI, 1.28–1.33).
Conclusions:
Inappropriate antibiotic prescribing to Kentucky Medicaid children is common. Provider and patient characteristics associated with inappropriate prescribing differ from those associated with higher volume. Claims data are useful to describe inappropriate use and could be a valuable metric for provider feedback reports. Policies are needed to support analysis and dissemination of antibiotic prescribing reports and should include all provider types and geographic areas.
Post-tonsillectomy bleeding is the most frequent complication of tonsillectomy. Inherited platelet function disorders have an estimated prevalence of 1 per cent. An association between post-tonsillectomy bleeds and undiagnosed inherited platelet function disorders has not previously been investigated.
Objectives
To assess the prevalence of inherited platelet function disorders in a cohort of post-tonsillectomy bleed patients.
Methods
An observational cohort study was conducted using hospital digital records. Platelet function analyser 100 (‘PFA-100’) closure time was tested on post-tonsillectomy bleed patients who presented to hospital.
Results
Between 2013 and 2017, 9 of 91 post-tonsillectomy bleed patients who underwent platelet function analyser 100 testing (9.89 per cent) had positive results. Five patients (5.49 per cent) had undiagnosed inherited platelet function disorders. Four patients had false positive results secondary to a non-steroidal anti-inflammatory drug effect (specificity of 95.3 per cent) proven by repeat testing six weeks later, off medication. The false negative rate was 0 per cent.
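The specificity quoted above can be reconstructed from the counts in this paragraph; the breakdown into 82 true negatives follows from the 91 tested patients, 5 true positives, and 4 false positives, with no false negatives:

```python
# Reconstructing the specificity figure quoted above. Of 91 patients
# tested, 9 were positive (5 true positives, 4 false positives from an
# NSAID effect), leaving 82 true negatives and no false negatives.
true_negatives = 91 - 9   # 82 patients with correctly negative results
false_positives = 4
specificity = true_negatives / (true_negatives + false_positives)
print(round(100 * specificity, 1))  # 95.3
```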
Conclusion
The prevalence of inherited platelet function disorders in our post-tonsillectomy bleed cohort is five-fold higher than in the general population. Platelet function analyser 100 testing when patients present with a post-tonsillectomy bleed allows management of their inherited platelet function disorder.
Obtaining objective, dietary exposure information from individuals is challenging because of the complexity of food consumption patterns and the limitations of self-reporting tools (e.g., FFQ and diet diaries). This hinders research efforts to associate intakes of specific foods or eating patterns with population health outcomes.
Design:
Dietary exposure can be assessed by the measurement of food-derived chemicals in urine samples. We aimed to develop methodologies for urine collection that minimised impact on the day-to-day activities of participants but also yielded samples that were data-rich in terms of targeted biomarker measurements.
Setting:
Urine collection methodologies were developed within home settings.
Participants:
Different cohorts of free-living volunteers.
Results:
Home collection of urine samples using vacuum transfer technology was deemed highly acceptable by volunteers. Statistical analysis of both metabolome and selected dietary exposure biomarkers in spot urine collected and stored using this method showed that they were compositionally similar to urine collected using a standard method with immediate sample freezing. Even without chemical preservatives, samples can be stored under different temperature regimes without any significant impact on the overall urine composition or concentration of forty-six exemplar dietary exposure biomarkers. Importantly, the samples could be posted directly to analytical facilities, without the need for refrigerated transport and involvement of clinical professionals.
Conclusions:
This urine sampling methodology appears to be suitable for routine use and may provide a scalable, cost-effective means to collect urine samples and to assess diet in epidemiological studies.
This paper explores dependencies between operational risks and between operational risks and other risks such as market, credit and insurance risk. The paper starts by setting the regulatory context and then goes into practical aspects of operational risk dependencies. Next, methods of modelling operational risk dependencies are considered with a simulation study exploring the sensitivity of diversification benefits arising from dependency models. The following two sections consider how correlation assumptions may be set, highlighting some generic dependencies between operational risks and with non-operational risks to assist in the assessment of dependencies and correlation assumptions. Supplementary appendices provide further detail on generic dependencies as well as a case study of how business models can lead to operational risks interacting with other risks. Finally, the paper finishes with a literature review of operational risk dependency papers including correlation studies and benchmark reports.
To test the feasibility of using telehealth to support antimicrobial stewardship at Veterans Affairs medical centers (VAMCs) that have limited access to infectious disease-trained specialists.
Design
A prospective quasi-experimental pilot study.
Setting
Two rural VAMCs with acute-care and long-term care units.
Intervention
At each intervention site, medical providers, pharmacists, infection preventionists, staff nurses, and off-site infectious disease physicians formed a videoconference antimicrobial stewardship team (VAST) that met weekly to discuss cases and antimicrobial stewardship-related education.
Methods
Descriptive measures included fidelity of implementation, number of cases discussed, infectious syndromes, types of recommendations, and acceptance rate of recommendations made by the VAST. Qualitative results stemmed from semi-structured interviews with VAST participants at the intervention sites.
Results
Each site adapted the VAST to suit their local needs. On average, sites A and B discussed 3.5 and 3.1 cases per session, respectively. At site A, 98 of 140 cases (70%) were from the acute-care units; at site B, 59 of 119 cases (50%) were from the acute-care units. The most common clinical syndrome discussed was pneumonia or respiratory syndrome (41% and 35% for sites A and B, respectively). Providers implemented most VAST recommendations, with an acceptance rate of 73% (186 of 256 recommendations) and 65% (99 of 153 recommendations) at sites A and B, respectively. Qualitative results based on 24 interviews revealed that participants valued the multidisciplinary aspects of the VAST sessions and felt that it improved their antimicrobial stewardship efforts and patient care.
Conclusions
This pilot study has successfully demonstrated the feasibility of using telehealth to support antimicrobial stewardship at rural VAMCs with limited access to local infectious disease expertise.
In September 2016, an imported case of measles in Edinburgh in a university student resulted in a further 17 confirmed cases during October and November 2016. All cases were genotype D8 and were associated with a virus strain most commonly seen in South East Asia. Twelve of the 18 cases were staff or students at a university in Edinburgh and 17 cases had incomplete or unknown measles, mumps and rubella (MMR) vaccination status. The public health response included mass follow-up of all identified contacts, widespread communications throughout universities in Edinburgh and prompt vaccination clinics at affected campuses. Imported cases of measles pose a significant risk to university student cohorts who may be undervaccinated, include a large number of international students and have a highly mobile population. Public health departments should work closely with universities to promote MMR uptake and put in place mass vaccination plans to prevent rapidly spreading measles outbreaks in higher educational settings in future.
As chemical management options for weeds become increasingly limited due to selection for herbicide resistance, investigation of additional nonchemical tools becomes necessary. Harvest weed seed control (HWSC) is a methodology of weed management that targets and destroys weed seeds that are otherwise dispersed by harvesters following threshing. It is not known whether problem weeds in western Canada retain their seeds in sufficient quantities until harvest at a height suitable for collection. A study was conducted at three sites over 2 yr to determine whether retention and height criteria were met by wild oat, false cleavers, and volunteer canola. Wild oat consistently shed seeds early, but seed retention was variable, averaging 56% at the time of wheat swathing, with continued losses until direct harvest of wheat and fababean. The majority of retained seeds were >45 cm above ground level, suitable for collection. Cleavers seed retention was highly variable by site-year, but generally greater than wild oat. The majority of seed was retained >15 cm above ground level and would be considered collectable. Canola seed typically had >95% retention, with the majority of seed retained >15 cm above ground level. The suitability ranking of the species for management with HWSC was canola > cleavers > wild oat. Efficacy of HWSC systems in western Canada will depend on the target species and site- and year-specific environmental conditions.
Limitations of access have long restricted exploration and investigation of the cavities beneath ice shelves to a small number of drillholes. Studies of sea-ice underwater morphology are limited largely to scientific utilization of submarines. Remotely operated vehicles, tethered to a mother ship by umbilical cable, have been deployed to investigate tidewater-glacier and ice-shelf margins, but their range is often restricted. The development of free-flying autonomous underwater vehicles (AUVs) with ranges of tens to hundreds of kilometres enables extensive missions to take place beneath sea ice and floating ice shelves. Autosub2 is a 3600 kg, 6.7 m long AUV, with a 1600 m operating depth and range of 400 km, based on the earlier Autosub1 which had a 500 m depth limit. A single direct-drive d.c. motor and five-bladed propeller produce speeds of 1–2 m s⁻¹. Rear-mounted rudder and stern-plane control yaw, pitch and depth. The vehicle has three sections. The front and rear sections are free-flooding, built around aluminium extrusion space-frames covered with glass-fibre reinforced plastic panels. The central section has a set of carbon-fibre reinforced plastic pressure vessels. Four tubes contain batteries powering the vehicle. The other three house vehicle-control systems and sensors. The rear section houses subsystems for navigation, control actuation and propulsion and scientific sensors (e.g. digital camera, upward-looking 300 kHz acoustic Doppler current profiler, 200 kHz multibeam receiver). The front section contains forward-looking collision sensor, emergency abort, the homing systems, Argos satellite data and location transmitters and flashing lights for relocation as well as science sensors (e.g. twin conductivity–temperature–depth instruments, multibeam transmitter, sub-bottom profiler, AquaLab water sampler). Payload restrictions mean that a subset of scientific instruments is actually in place on any given dive.
The scientific instruments carried on Autosub are described and examples of observational data collected from each sensor in Arctic or Antarctic waters are given (e.g. of roughness at the underside of floating ice shelves and sea ice).
The development of an economic capital model requires a decision to be made regarding how to aggregate capital requirements for the individual risk factors while taking into account the effects of diversification. Under the Individual Capital Adequacy Standards framework, UK life insurers have commonly adopted a correlation matrix approach due to its simplicity and ease in communication to the stakeholders involved, adjusting the result, where appropriate, to allow for non-linear interactions. The regulatory requirements of Solvency II have been one of the principal drivers leading to an increased use of more sophisticated aggregation techniques in economic capital models. This paper focusses on a simulation-based approach to the aggregation of capital requirements using copulas and proxy models. It describes the practical challenges in parameterising a copula including how allowance may be made for tail dependence. It also covers the challenges associated with fitting and validating a proxy model. In particular, the paper outlines how insurers could test, communicate and justify the choices made through the use of some examples.
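As a minimal sketch of the simulation-based aggregation the paper discusses, the following draws correlated losses through a Gaussian copula with lognormal marginals and compares the diversified 99.5% VaR of the aggregate against the sum of stand-alone VaRs. The correlation matrix, marginal parameters, and number of risk factors are assumptions for illustration only, not figures from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed correlation matrix for three risk factors (illustrative only)
corr = np.array([
    [1.0, 0.3, 0.1],
    [0.3, 1.0, 0.2],
    [0.1, 0.2, 1.0],
])
mu = np.array([0.0, 0.5, 0.2])     # assumed lognormal location parameters
sigma = np.array([1.0, 0.8, 1.2])  # assumed lognormal scale parameters

n_sims = 200_000
# Gaussian copula: correlate standard normals via the Cholesky factor,
# then push through each lognormal marginal (exp of an affine map).
z = rng.standard_normal((n_sims, 3)) @ np.linalg.cholesky(corr).T
losses = np.exp(mu + sigma * z)

# Diversified capital: 99.5% VaR of the aggregate loss distribution
var_total = np.quantile(losses.sum(axis=1), 0.995)
# Undiversified comparison: sum of stand-alone 99.5% VaRs
var_standalone = np.quantile(losses, 0.995, axis=0).sum()

diversification_benefit = 1.0 - var_total / var_standalone
print(0.0 < diversification_benefit < 1.0)  # True: correlations below 1
```

A full economic capital model would replace the lognormal marginals with fitted or proxy-model loss distributions and would need the copula parameterisation (including any tail dependence) validated, as the paper describes.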
Among extant crinoids, the feather stars are the most diverse and occupy the greatest bathymetric range, being especially common in reef environments. Feather stars possess a variety of morphological, behavioral and physiological traits that have been hypothesized to be critical to their success, especially in their ability to cope with predation. However, knowledge of their predators is exceptionally scant, consisting primarily of circumstantial evidence of attacks by fishes. This study explores whether regular echinoids, recently shown to consume stalked crinoids, also consume feather stars. Aquarium observations indicate that regular echinoids find feather stars palatable, including feather stars known to be distasteful to fish, and that regular echinoids can capture and eat live feather stars, including those known to swim. Gut-content analyses of the echinoid Araeosoma fenestratum (Thomson, 1872), which is commonly observed with large populations of the feather star Koehlermetra porrecta (Carpenter, 1888) in video transects from marine canyons off the coast of France, revealed elements of feather stars in the guts of 6 of 13 individuals. The high proportion of crinoid material (up to 90%), and the presence of articulated crinoid skeletal elements in the gut of A. fenestratum, suggest that these echinoids consumed at least some live crinoids, although they may have also ingested some postmortem remains found in the sediment. Additionally, photographic evidence from the northeast Atlantic suggests that another regular echinoid, Cidaris cidaris (Linnaeus, 1758), preys on feather stars. Thus, in spite of the broad suite of antipredatory adaptations, feather stars are today subject to predation by regular echinoids and may have been since the Mesozoic, when this group of crinoids first appeared.
This paper seeks to establish good practice in setting inputs for operational risk models for banks, insurers and other financial service firms. It reviews Basel, Solvency II and other regulatory requirements as well as publicly available literature on operational risk modelling. It recommends a combination of historic loss data and scenario analysis for modelling of individual risks, setting out issues with these data, and outlining good practice for loss data collection and scenario analysis. It recommends the use of expert judgement for setting correlations, and addresses information requirements for risk mitigation allowances and capital allocation, before briefly covering Bayesian network methods for modelling operational risks.
The Cycadales are a group of significant global conservation concern and have the highest extinction risk of all seed plants. Understanding the synchronisation of reproductive phenology of Cycadales may be useful for conservation by enabling the targeting of pollen and seed collection from wild populations and identifying the window of fertilisation to aid in the cultivation of Cycadales. Phenological data for 11 species of Zamia were gathered from herbarium specimens. Four phenological characters were coded with monthly character states. DNA was isolated and sequenced for 26S, CAB, NEEDLY, matK and rbcL, and a simultaneous phylogenetic analysis of phenology and DNA sequence data was carried out. Three major clades were recovered: a Caribbean clade, a Central American clade and a South American clade. Eight species showed statistically significant synchronisation in microsporangiate and ovulate phenological phases, indicating the time of fertilisation. Close reproductive synchronisation was consistently observed throughout the Caribbean clade (statistically significant in four of five species) but was less consistent in the Central American clade (statistically significant in one of two species) and South American clade (statistically significant in three of four species). Ultimately, phenology is shown to be a potential driver of speciation in some clades of Zamia and in others to be a potential barrier to hybridisation.
The deep interiors of cold, degenerate stars consist of a mixture of elements, either because of primordial inhomogeneities or because of incomplete nuclear burning. However, most existing calculations for the cooling of such bodies (subsequent to any nuclear burning) assume that the only source of luminosity is the heat content of the star. An additional (and potentially much larger) energy source is available if the elements have limited mutual solubility below some temperature. The resulting differentiation and gravitational settling can dramatically decrease the rate of cooling, enhance the number of (potentially) observable low luminosity bodies, and may deplete the atmosphere of heavy elements (if substantial mixing between the atmosphere and deep interior occurs). The observational evidence for these phenomena is equivocal at present.
Semiconvection is the name given to a situation that arises in the evolution of large-mass main-sequence stars and lower-mass horizontal-branch stars, in which a layer forms where almost all of the heat flux is transported by radiation, but where slow convection is required to redistribute a stably stratified solute (e.g. helium). For a general discussion on semiconvection, see Spiegel (1969). The main problem is to determine the correct relationship between the solute distribution and the temperature gradient in the inhomogeneous layer. Ledoux (1947) and Schwarzschild and Harm (1958) have proposed two very different prescriptions. Since theory and experiment indicate that the Schwarzschild-Harm (SH) prescription is correct for the onset of convection (in the form of overstability), Spiegel (1969) has proposed that SH are correct. However, neither observation (oceanographic or laboratory) nor theory precludes a substantial deviation from the SH prescription as applied to finite amplitude semiconvection. The observational evidence is only readily applicable to stars if Pr ≳ 1, where Pr is the Prandtl number of the fluid. In fact, Pr ≪ 1 in stars. It is shown below that if one could have stars in which Pr ≳ 1, then the SH prescription would probably be wrong, and the Ledoux criterion might then be nearer to being correct. In the real situation (Pr ≪ 1), observations or experiments are lacking and theory alone does not provide an unequivocal answer to the problem. This paper does not attempt a complete discussion of the problem, and a more detailed report will be submitted elsewhere. Some aspects of the problem have already been discussed in the context of the giant planets (Stevenson and Salpeter 1977).
According to Oort (1965), the mass density in the solar neighbourhood (inferred from the gravity component normal to the galactic plane) is between 50% and 150% greater than the mass density inferred from non-dwarf stars. One possible explanation for the “missing mass” is an overabundance of faint M-dwarfs (Weistrop 1972), but present indications are that this overabundance is either small (Weistrop 1976; Sanduleak 1976) or non-existent (Faber et al. 1976; Eggen 1976). Nevertheless, Salpeter’s initial mass function (Salpeter 1955) suggests that the total mass may be dominated by low mass stars, including masses M ≤ 0.08 M⊙, which never undergo significant hydrogen burning.