We conducted a quantitative analysis of the microbial burden and prevalence of epidemiologically important pathogens (EIP) found on long-term care facility (LTCF) environmental surfaces.
Methods:
Microbiological samples were collected using Rodac plates (25 cm²/plate) from resident rooms and common areas in five LTCFs. EIP were defined as MRSA, VRE, C. difficile and multidrug-resistant (MDR) Gram-negative rods (GNRs).
Results:
Rooms of residents with reported colonization had much greater EIP counts per Rodac plate (8.32 CFU, 95% CI 8.05, 8.60) than rooms of non-colonized residents (0.78 CFU, 95% CI 0.70, 0.86). Sixty-five percent of the resident rooms and 50% of the common areas were positive for at least one EIP. When a resident was labeled by the facility as colonized with an EIP, we recovered that EIP in only 30% of the rooms. MRSA was the most common EIP recovered, followed by C. difficile and MDR-GNR.
Discussion:
We found frequent environmental contamination with EIP in LTCFs. Colonization status of a resident was a strong predictor of higher levels of EIP being recovered from his/her room.
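The mean counts and confidence intervals reported in this abstract can be reproduced from raw plate counts with a short calculation. The following sketch uses entirely hypothetical Rodac counts (not the study's data) to show the normal-approximation interval and the conversion from a 25 cm² plate to CFU per 100 cm².

```python
import numpy as np
from scipy import stats

PLATE_AREA_CM2 = 25.0  # Rodac plate area stated in the Methods

def mean_ci(counts, conf=0.95):
    """Mean CFU per plate with a normal-approximation confidence interval."""
    counts = np.asarray(counts, dtype=float)
    m = counts.mean()
    se = counts.std(ddof=1) / np.sqrt(len(counts))
    z = stats.norm.ppf(0.5 + conf / 2)
    return m, (m - z * se, m + z * se)

# Hypothetical plate counts from rooms of colonized vs. non-colonized residents
colonized = np.array([6, 9, 12, 7, 8, 10, 5, 11])
non_colonized = np.array([0, 1, 2, 0, 1, 0, 1, 1])

for label, counts in [("colonized", colonized), ("non-colonized", non_colonized)]:
    m, (lo, hi) = mean_ci(counts)
    per_100cm2 = m * 100 / PLATE_AREA_CM2  # rescale the 25 cm² plate count to CFU/100 cm²
    print(f"{label}: {m:.2f} CFU/plate (95% CI {lo:.2f}, {hi:.2f}) ≈ {per_100cm2:.1f} CFU/100 cm²")
```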
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible to laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves, providing clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves carries most of its information in the 2–4 kHz frequency band, outside the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to that of full third-generation detectors at a fraction of the cost. Such sensitivity changes the expected event rate for detection of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year, and potentially allows for the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
Ceramic clays are characterized by a method that stems from the IL/MA procedure proposed by Keeling (1958a, b) but which is more rapid to perform and more basic in concept. Thermogravimetric methods are used to estimate both OH water and interlayer plus physically adsorbed water, and these are plotted against each other in a manner similar to the IL/MA plot. The composition of the clay can be characterized uniquely on such a diagram by estimating either the quartz or the free silica content of a sample, together with its silt and <2 μm fractions, and plotting, for each on a 100% clay basis, the OH water and the interlayer plus physically adsorbed water. Simple methods are given for assessing these factors. The determination of interlayer water content could form the basis of a method of plant control of clay composition, in a way similar to that proposed by Keeling using moisture adsorption determinations.
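As an illustration of the plotting procedure described above, the following sketch rescales measured water contents to a 100% clay basis before plotting OH water against interlayer plus adsorbed water. The numbers are hypothetical, and taking the clay fraction as everything that is not quartz (or free silica) and silt is an assumption made for the example, not a restatement of the original correction.

```python
def to_clay_basis(value_pct, non_clay_pct):
    """Rescale a whole-sample percentage to a 100% clay basis,
    assuming the clay fraction is (100 - non_clay_pct)%."""
    clay_fraction = (100.0 - non_clay_pct) / 100.0
    return value_pct / clay_fraction

sample = {
    "oh_water_pct": 5.2,          # structural (OH) water from thermogravimetry
    "interlayer_water_pct": 7.8,  # interlayer + physically adsorbed water
    "quartz_pct": 20.0,           # estimated quartz (or free silica) content
    "silt_pct": 10.0,             # silt fraction
}

non_clay = sample["quartz_pct"] + sample["silt_pct"]
x = to_clay_basis(sample["interlayer_water_pct"], non_clay)
y = to_clay_basis(sample["oh_water_pct"], non_clay)
print(f"Diagram point: interlayer water {x:.1f}%, OH water {y:.1f}% (100% clay basis)")
```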
On 2 March 2016, several small en échelon tabular icebergs calved from the seaward front of the McMurdo Ice Shelf, and a previously inactive rift widened and propagated by ~3 km, ~25% of its previous length, setting the stage for the future calving of a ~14 km² iceberg. Within 24 h of these events, all remaining land-fast sea ice that had been stabilizing the ice shelf broke up. The events were witnessed by time-lapse cameras at nearby Scott Base, and put into context using nearby seismic and automatic weather station data, satellite imagery and subsequent ground observation. Although the exact trigger of the calving and rifting cannot be identified definitively, seismic records reveal superimposed sets of both long-period (>10 s) sea swell propagating into McMurdo Sound from storm sources beyond Antarctica, and high-energy, locally sourced, short-period (<10 s) sea swell, in the 4 days before the fast-ice break-up and associated ice-shelf calving and rifting. This suggests that sea swell should be studied further as a proximal cause of ice-shelf calving and rifting; if proven, it suggests that ice-shelf stability is teleconnected with far-field storm conditions at lower latitudes, adding a global dimension to the physics of ice-shelf break-up.
Background: The diagnosis of a sports-related concussion is often dependent on the athlete self-reporting their symptoms. It has been suggested that improving youth athletes' knowledge of and attitudes toward concussion may increase self-reporting behaviour. The objective of this study was to determine whether a novel Concussion-U educational program improves knowledge of and attitudes about concussion among a cohort of elite male Bantam and Midget AAA hockey players. Methods: Fifty-seven male Bantam and Midget AAA-level hockey players (mean age = 14.52 ± 1.13 years) were recruited from the local community. Each participant completed a modified version of the Rosenbaum Concussion Knowledge and Attitudes Survey–Student Version immediately before and after a Concussion-U educational presentation. Follow-up sessions were arranged 4 to 6 months after the presentation and assessed retention of knowledge and attitude changes. Results: Forty-three players completed all three surveys. Concussion knowledge and attitude scores significantly (p < 0.01) increased from pre- to post-presentation, by 12.79% and 8.41%, respectively. At long-term follow-up, knowledge levels remained significantly (p < 0.01) higher than baseline, by 8.49%. Mean attitude scores were also higher at follow-up; however, this increase was not statistically significant. Conclusions: A Concussion-U educational program led to an immediate improvement in concussion knowledge and attitudes among elite male Bantam and Midget AAA hockey players. Increased knowledge was maintained at long-term follow-up, but improved attitude was not. Future studies should investigate whether similar educational programs influence symptom reporting and concussion incidence. In addition, they should focus on how to maintain improved concussion attitudes.
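The pre/post comparison described above is a repeated-measures design, so a minimal analysis is a paired t-test on each player's survey scores. The sketch below uses simulated scores (the study's actual data and scoring scheme are not reproduced here) purely to show the calculation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_players = 43  # number completing all surveys, as reported above

# Hypothetical knowledge scores before and immediately after the presentation
pre = rng.normal(70, 8, n_players)
post = pre + rng.normal(9, 5, n_players)  # simulate an average improvement

t, p = stats.ttest_rel(post, pre)
pct_change = 100 * (post.mean() - pre.mean()) / pre.mean()
print(f"mean change = {pct_change:.1f}%, paired t = {t:.2f}, p = {p:.4f}")
```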
Background: Research has suggested that female athletes have a higher incidence of concussion compared to their male counterparts. As such, programs designed to improve knowledge of and attitudes toward concussion should target this high-risk population. Previous work demonstrated the effect of a novel Concussion-U educational presentation on knowledge and attitudes of concussion amongst male Bantam and Midget AAA hockey players. The objective of this study was to determine whether the same presentation was effective in improving knowledge and attitudes of concussion in a cohort of elite female hockey players. Methods: Twenty-six elite female high-school-aged (14–17 years) hockey players from the province of New Brunswick consented to participate in the study. Each participant completed a modified version of Rosenbaum and Arnett’s Concussion Knowledge and Attitudes Survey immediately before and after a Concussion-U educational presentation. Results were compared across the two time-points to assess the effectiveness of the presentation. Results: Concussion knowledge and attitude scores significantly (p < 0.001) increased from pre-presentation to post-presentation, by 12.5% and 13.4%, respectively. Conclusions: A Concussion-U educational presentation resulted in increased knowledge and improved attitudes towards concussion in elite female hockey players. Future research should examine the long-term retention of these improvements.
The first experimental measurements of intense (${\sim}7\times 10^{19}~\mathrm{W~cm^{-2}}$) laser-driven terahertz (THz) radiation from a solid target preheated by an intense pulse of laser-accelerated protons are reported. The total energy of the THz radiation is found to decrease by approximately a factor of 2 compared to a cold-target reference. This is attributed to an increase in the scale length of the preformed plasma, driven by proton heating, at the front surface of the target, where the THz radiation is generated. The results show the importance of controlling the preplasma scale length for THz production.
The objective of this study was to estimate the sensitivity and specificity of a culture method and a polymerase chain reaction (PCR) method for detection of two Campylobacter species: C. jejuni and C. coli. Data were collected during a 3-year survey of UK broiler flocks, and consisted of parallel sampling of caeca from 436 batches of birds by both PCR and culture. Batches were stratified by season (summer/non-summer) and by whether they were the first depopulation of the flock, resulting in four sub-populations. A Bayesian approach in the absence of a gold standard was adopted, and the sensitivity and specificity of PCR and culture for each Campylobacter species were estimated, along with the true C. jejuni and C. coli prevalence in each sub-population. Results indicated that the sensitivity of the culture method was higher than that of PCR in detecting both species when the samples were derived from populations infected with at most one species of Campylobacter. However, for a mixed population, the sensitivity of culture for detecting both C. jejuni and C. coli was reduced, while PCR was potentially able to detect both species, although the total probability of correctly identifying at least one species by PCR was similar to that of the culture method.
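A standard formulation for estimating sensitivity and specificity without a gold standard is the Hui–Walter latent-class model, in which the two tests are cross-classified within each sub-population and assumed conditionally independent given true infection status. The sketch below is one way to express that structure in PyMC for a single Campylobacter species; the counts and priors are hypothetical and are not the authors' model code.

```python
import numpy as np
import pymc as pm

# Hypothetical cross-classified counts per sub-population:
# columns are (PCR+/culture+, PCR+/culture-, PCR-/culture+, PCR-/culture-)
counts = np.array([
    [60, 5, 15, 40],
    [35, 4, 10, 51],
    [50, 6, 12, 42],
    [30, 3,  9, 64],
])
n = counts.sum(axis=1)

with pm.Model() as hui_walter:
    se_pcr = pm.Beta("se_pcr", 2, 1)   # weak priors favouring Se/Sp > 0.5,
    sp_pcr = pm.Beta("sp_pcr", 2, 1)   # chosen here only to anchor identifiability
    se_cul = pm.Beta("se_cul", 2, 1)
    sp_cul = pm.Beta("sp_cul", 2, 1)
    prev = pm.Beta("prev", 1, 1, shape=4)  # true prevalence in each sub-population

    # Cell probabilities assuming conditional independence of PCR and culture
    p_pp = prev * se_pcr * se_cul + (1 - prev) * (1 - sp_pcr) * (1 - sp_cul)
    p_pn = prev * se_pcr * (1 - se_cul) + (1 - prev) * (1 - sp_pcr) * sp_cul
    p_np = prev * (1 - se_pcr) * se_cul + (1 - prev) * sp_pcr * (1 - sp_cul)
    p_nn = prev * (1 - se_pcr) * (1 - se_cul) + (1 - prev) * sp_pcr * sp_cul
    probs = pm.math.stack([p_pp, p_pn, p_np, p_nn], axis=1)

    pm.Multinomial("obs", n=n, p=probs, observed=counts)
    trace = pm.sample(2000, tune=1000, target_accept=0.9)
```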
During 2007–2009 a UK-wide, 3-year stratified randomized survey of UK chicken broiler flocks was conducted to estimate the prevalence of Campylobacter-infected batches of birds at slaughter. Thirty-seven abattoirs, processing 88·3% of the total UK slaughter throughput, were recruited at the beginning of the survey. Of the 1174 slaughter batches sampled, 79·2% were found to be colonized with Campylobacter, the majority of isolates being C. jejuni. Previous partial depopulation of the flock [odds ratio (OR) 5·21], slaughter in the summer months (categorized as June, July and August; OR 14·27) or autumn months (categorized as September, October and November; OR 1·70), increasing bird age (40–41 days, OR 3·18; 42–45 days, OR 3·56; ⩾46 days, OR 13·43) and higher recent mortality level in the flock (1·00–1·49% mortality, OR 1·57; ⩾1·49% mortality, OR 2·74) were all identified as significant risk factors for Campylobacter colonization of the birds at slaughter. A time in transit to the slaughterhouse of more than 2·5 h was identified as a protective factor (OR 0·52).
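Odds ratios like those reported above are typically obtained from a logistic regression of batch colonization status on the candidate risk factors, with OR = exp(coefficient). The sketch below shows that calculation with statsmodels on synthetic batch-level data; the variable names and values are placeholders rather than the survey dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1174  # number of slaughter batches sampled, as reported above

# Synthetic batch-level data standing in for the survey variables
df = pd.DataFrame({
    "colonized": rng.integers(0, 2, n),
    "partial_depop": rng.integers(0, 2, n),
    "summer": rng.integers(0, 2, n),
    "age_days": rng.integers(35, 50, n),
    "mortality_pct": rng.uniform(0.2, 3.0, n),
    "transit_over_2_5h": rng.integers(0, 2, n),
})

fit = smf.logit(
    "colonized ~ partial_depop + summer + age_days + mortality_pct + transit_over_2_5h",
    data=df,
).fit()
print(np.exp(fit.params))       # odds ratios
print(np.exp(fit.conf_int()))   # 95% CIs for the odds ratios
```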
In the UK, contemporary estimates of dietary Fe intakes rely upon food Fe content data from the 1980s or before. Moreover, there has been speculation that the natural Fe content of foods has fallen over time, predominantly due to changes in agricultural practices. Therefore, we re-analysed common plant-based foods of the UK diet for their Fe content (the ‘2000s analyses’) and compared the values with the most recent published values (the ‘1980s analyses’) and the much older published values (the ‘1930s analyses’), the latter two being from different editions of the McCance and Widdowson food tables. Overall, there was remarkable consistency between analytical data for foods spanning the 70 years. There was a marginal, but significant, apparent decrease in natural food Fe content from the 1930s to the 1980s/2000s. Whether this represents a true difference or analytical error between the eras is unclear, and how it could translate into differences in intake requires clarification. However, fortificant Fe levels (and fortificant Fe intake based upon linked national data) did appear to have increased between the 1980s and 2000s, and deserve further attention in light of recent potential concerns over the long-term safety and effectiveness of fortificant Fe. In conclusion, the overall Fe content of plant-based foods is largely consistent between the 1930s and 2000s, with any fall in natural dietary Fe content offset or even surpassed by a rise in fortificant Fe, the long-term effects of which are uncertain.
A baseline survey on the prevalence of Campylobacter spp. in broiler flocks and Campylobacter spp. on broiler carcases in the UK was performed in 2008 in accordance with Commission Decision 2007/516/EC. Pooled caecal contents from each randomly selected slaughter batch, and neck and breast skin from a single carcase were examined for Campylobacter spp. The prevalence of Campylobacter in the caeca of broiler batches was 75·8% (303/400) compared to 87·3% (349/400) on broiler carcases. Overall, 27·3% of the carcases were found to be highly contaminated with Campylobacter (⩾1000 c.f.u./g). Slaughter in the summer months (June, July, August) [odds ratio (OR) 3·50], previous partial depopulation of the flock (OR 3·37), and an increased mortality at 14 days (⩾1·25% to <1·75%) (OR 2·54) were identified as significant risk factors for the most heavily Campylobacter-contaminated carcases. Four poultry companies and farm location were also found to be significantly associated with highly contaminated carcases.
Despite its patient numbers and scope, ENT surgery is under-represented in most undergraduate medical curricula.
Method:
An online questionnaire was e-mailed, at National Health Service trust level, to 3544 newly qualified doctors from 30 UK medical schools. Undergraduate ENT exposure, confidence and educational value were measured on a Likert scale.
Results:
We received 444 eligible responses. The mean undergraduate ENT exposure was 3.4 days of pre-clinical teaching plus 5.0 days of ENT departmental experience. However, 15.8 per cent of respondents reported no formal departmental ENT experience, and 65.8 per cent would have liked further undergraduate experience. Teaching modalities with a lower perceived educational value were generally offered more frequently than those with a higher perceived educational value. Graduates felt significantly less confident with ENT history-taking, examination and management, compared with their cardiology clinical competencies (p < 0.001).
Conclusion:
These results highlight the lack of UK ENT undergraduate education, and the significant effect this has on junior doctors’ clinical confidence. In addition, commonly used teaching methods may not be optimally effective.