For the past 25 years, European badgers (Meles meles) have been subject to culling in Britain in attempts to limit the spread of tuberculosis (TB) to cattle. As part of a far-reaching evaluation of the effectiveness and acceptability of badger culling as a TB control measure, this paper assesses one aspect of the welfare of badger populations subjected to culling: the killing of breeding females, which risks leaving their unweaned cubs to starve in the den. To avoid this possibility, a three-month closed season was adopted, running from 1st February to 30th April, based on the best available estimates of the timing of birth and weaning in British badgers. During May 1999–2003, when a total of 4705 adult badgers were culled, field teams failed to capture 12 unweaned litters when their mothers were despatched. In 31 other cases, lactating females were culled but litters of almost-weaned cubs were also caught and despatched at the same dens, usually within a day of capture of the mother. The number of unweaned cubs missed by culling teams — estimated at approximately nine per year on average — was dramatically lower than that projected by a badger welfare lobby group. Our data suggest that the closed season is effective in reducing the suffering of unweaned cubs in badger populations subject to culling, and we recommend that this measure be maintained should badger culling form a component of any future TB control policy.
For over 25 years, European badgers (Meles meles) have been subject to culling in Britain in attempts to limit the spread of tuberculosis (TB) to cattle. As part of a far-reaching evaluation of the effectiveness and acceptability of badger culling as a TB control measure, this paper assesses one aspect of the welfare of badger populations subjected to culling: the risk of badgers confined to cage traps prior to despatch becoming injured as a result of rubbing or biting on the cage. In a large-scale field trial, 88% of badgers received no detectable injuries as a result of being confined in the trap. Of those that were injured, 72% received only minor skin abrasions. A minority (1.8% of the total) acquired damage to the teeth or jaws that may have caused serious pain. Although trap rounds were commenced in the early morning, badgers were no more likely to sustain injuries when they remained in traps until later in the day. Coating of cage traps, intended to give the wire mesh a smoother surface, was associated with a reduction in the incidence of minor skin abrasions, although it may have slightly increased the frequency of less common but more serious abrasions. Modification of the door design reduced tooth damage. Traps will be further modified if appropriate. However, all aspects of the conduct of trapping operations must balance badger welfare with concerns for the health and safety of field staff.
Cognitive impairment is common in individuals presenting to alcohol and other drug (AOD) settings and the presence of biopsychosocial complexity and health inequities can complicate the experience of symptoms and access to treatment services. A challenge for neuropsychologists in these settings is to evaluate the likely individual contribution of these factors to cognition when providing an opinion regarding diagnoses such as acquired brain injury (ABI). This study therefore aimed to identify predictors of cognitive functioning in AOD clients attending for neuropsychological assessment.
Methods:
Clinical data from 200 clients with AOD histories who attended for assessment between 2014 and 2018 were analysed, and a series of multiple regressions was conducted to explore predictors of cognitive impairment, including demographic, diagnostic, substance use, medication, and mental health variables.
Results:
Regression modelling identified age, gender, years of education, age of first use, days of abstinence, sedative load, emotional distress and diagnoses of ABI and developmental disorders as contributing to aspects of neuropsychological functioning. Significant models were obtained for verbal intellectual functioning (Adj R2 = 0.19), nonverbal intellectual functioning (Adj R2 = 0.10), information processing speed (Adj R2 = 0.20), working memory (Adj R2 = 0.05), verbal recall (Adj R2 = 0.08), visual recall (Adj R2 = 0.22), divided attention (Adj R2 = 0.14), and cognitive inhibition (Adj R2 = 0.07).
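To make the modelling approach concrete, below is a minimal sketch, not the authors' code, of fitting one such multiple regression and reporting its adjusted R². The data file and column names (e.g. verbal_iq, sedative_load) are hypothetical placeholders for the variables named in the abstract.

```python
# Hypothetical sketch of one of the multiple regressions described above,
# using statsmodels; dataset and column names are assumptions for illustration.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("aod_clients.csv")  # hypothetical file with 200 clients

# Predict verbal intellectual functioning from the predictors named in the abstract.
model = smf.ols(
    "verbal_iq ~ age + C(gender) + years_education + age_first_use "
    "+ days_abstinent + sedative_load + emotional_distress "
    "+ C(abi_diagnosis) + C(developmental_disorder)",
    data=df,
).fit()

print(model.summary())                              # coefficients and p-values
print("Adj R2:", round(model.rsquared_adj, 2))      # cf. the reported Adj R2 values
```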
Conclusions:
These findings highlight the importance of careful provision of diagnoses in clients with AOD histories who have high levels of unmet clinical needs. They demonstrate the interaction of premorbid and potentially modifiable comorbid factors such as emotional distress and prescription medication on cognition. Ensuring that modifiable risk factors for cognitive impairment are managed may reduce experiences of cognitive impairment and improve diagnostic clarity.
The objectives of this study were to develop and refine EMPOWER (Enhancing and Mobilizing the POtential for Wellness and Resilience), a brief manualized cognitive-behavioral, acceptance-based intervention for surrogate decision-makers of critically ill patients and to evaluate its preliminary feasibility, acceptability, and promise in improving surrogates’ mental health and patient outcomes.
Method
Part 1 involved obtaining qualitative stakeholder feedback from 5 bereaved surrogates and 10 critical care and mental health clinicians. Stakeholders were provided with the manual and prompted for feedback on its content, format, and language. Feedback was organized and incorporated into the manual, which was then re-circulated until consensus was reached. In Part 2, surrogates of critically ill patients admitted to an intensive care unit (ICU) who reported moderate anxiety or close attachment were enrolled in an open trial of EMPOWER. Surrogates completed six 15–20 min modules, totaling 1.5–2 h. Surrogates were administered measures of peritraumatic distress, experiential avoidance, prolonged grief, distress tolerance, anxiety, and depression at pre-intervention, post-intervention, and at 1-month and 3-month follow-up assessments.
Results
Part 1 resulted in changes to the EMPOWER manual, including reducing jargon, improving navigability, making EMPOWER applicable for a range of illness scenarios, rearranging the modules, and adding further instructions and psychoeducation. Part 2 findings suggested that EMPOWER is feasible, with 100% of participants completing all modules. The acceptability of EMPOWER appeared strong, with high ratings of effectiveness and helpfulness (M = 8/10). Results showed immediate post-intervention improvements in anxiety (d = −0.41), peritraumatic distress (d = −0.24), and experiential avoidance (d = −0.23). At the 3-month follow-up assessments, surrogates exhibited improvements in prolonged grief symptoms (d = −0.94), depression (d = −0.23), anxiety (d = −0.29), and experiential avoidance (d = −0.30).
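For readers unfamiliar with the effect sizes quoted above, the sketch below shows one common way to compute a pre/post Cohen's d; the abstract does not specify the exact formula used, and the scores are invented for illustration.

```python
# Illustrative pre/post effect size (Cohen's d with pooled SD); not the
# trial's analysis code, and the input scores below are made up.
import numpy as np

def cohens_d(pre, post):
    """Cohen's d for pre vs. post scores using the pooled standard deviation."""
    pre, post = np.asarray(pre, float), np.asarray(post, float)
    pooled_sd = np.sqrt((pre.var(ddof=1) + post.var(ddof=1)) / 2)
    return (post.mean() - pre.mean()) / pooled_sd

# Hypothetical anxiety scores for a handful of surrogates:
print(round(cohens_d([12, 15, 9, 14], [10, 13, 8, 12]), 2))  # negative = improvement
```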
Significance of results
Preliminary data suggest that EMPOWER is feasible, acceptable, and associated with notable improvements in psychological symptoms among surrogates. Future research should examine EMPOWER with a larger sample in a randomized controlled trial.
Introduction: Point of care ultrasound (PoCUS) is an established tool in the initial management of patients with undifferentiated hypotension in the emergency department (ED). While PoCUS protocols have been shown to improve early diagnostic accuracy, there is little published evidence for any mortality benefit. We report the findings from our international multicenter randomized controlled trial, assessing the impact of a PoCUS protocol on survival and key clinical outcomes. Methods: Recruitment occurred at 7 centres in North America (4) and South Africa (3). Scans were performed by PoCUS-trained physicians. Screening at triage identified patients (SBP<100 mmHg or shock index >1), who were randomized to PoCUS or control (standard care, no PoCUS) groups. Demographics, clinical details and study findings were collected prospectively. Initial and secondary diagnoses were recorded at 0 and 60 minutes, with ultrasound performed in the PoCUS group prior to secondary assessment. The primary outcome measure was 30-day/discharge mortality. Secondary outcome measures included diagnostic accuracy, changes in vital signs, acid-base status, and length of stay. Categorical data were analyzed using Fisher's exact test, and continuous data using Student's t-test and multi-level log-regression (GraphPad/SPSS). Final chart review was blinded to initial impressions and PoCUS findings. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. There was no difference between groups for the primary outcome of mortality; PoCUS 32/129 (24.8%; 95% CI 14.3-35.3%) vs. Control 32/129 (24.8%; 95% CI 14.3-35.3%); RR 1.00 (95% CI 0.869 to 1.15; p=1.00). There were no differences in the secondary outcomes of ICU and total length of stay. Our sample size has a power of 0.80 (α:0.05) for a moderate effect size. Other secondary outcomes are reported separately. Conclusion: This is the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients. We did not find any mortality or length of stay benefit with the use of a PoCUS protocol, though a larger study is required to confirm these findings. While PoCUS may have diagnostic benefits, these may not translate into a survival benefit.
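The primary-outcome comparison quoted above can be re-derived in outline as follows. This is an illustrative re-computation, not the trial's analysis code; the confidence-interval method shown (log-scale normal approximation) is one common choice and may differ from the method used in the published analysis.

```python
# Illustrative re-computation of the mortality comparison: relative risk,
# an approximate 95% CI on the log scale, and Fisher's exact p-value.
import math
from scipy.stats import fisher_exact

deaths_pocus, n_pocus = 32, 129     # figures quoted in the abstract
deaths_ctrl,  n_ctrl  = 32, 129

rr = (deaths_pocus / n_pocus) / (deaths_ctrl / n_ctrl)
se_log_rr = math.sqrt(1/deaths_pocus - 1/n_pocus + 1/deaths_ctrl - 1/n_ctrl)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)

table = [[deaths_pocus, n_pocus - deaths_pocus],
         [deaths_ctrl,  n_ctrl - deaths_ctrl]]
_, p = fisher_exact(table)

print(f"RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f}); Fisher's exact p={p:.2f}")
# -> RR 1.00 and p=1.00 as reported; the CI from this approximation may not
#    match the published interval exactly.
```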
Introduction: In Ottawa, STEMI patients are transported directly for percutaneous coronary intervention (PCI) by advanced care paramedics (ACPs), primary care paramedics (PCPs), or transferred from a PCP to an ACP crew (ACP-intercept). PCPs have a limited skill set to address complications during transport. The objective of this study was to determine what clinically important events (CIEs) occurred in STEMI patients transported for primary PCI via a PCP crew, and what proportion of such events could only be treated by ACP protocols. Methods: We conducted a health record review of STEMI patients transported for primary PCI from Jan 1, 2011-Dec 21, 2015. Ottawa has a single PCI center and its EMS system employs both PCP and ACP paramedics. We identified consecutive STEMI bypass patients transported by PCP-only and ACP-intercept crews using the dispatch database. A data extraction form was piloted and used to extract patient demographics, transport times, primary outcomes (CIEs and interventions performed during transport), and secondary outcomes (hospital diagnosis and mortality). CIEs were reviewed by two investigators to determine whether they would be treated differently under ACP protocols. We present descriptive statistics. Results: We identified 967 STEMI bypass cases, of which 214 (118 PCP-only and 96 ACP-intercept) met all inclusion criteria. Characteristics were: mean age 61.4 years, 78% male, 31.8% anterior and 44.4% inferior infarcts, mean response time 6 min, total paramedic contact time 29 min, and, in cases of ACP-intercept, 7 min of PCP-only contact time. A CIE occurred in 127 (59%) of cases: SBP<90 mmHg 26.2%, HR<60 30.4%, HR>100 20.6%, malignant arrhythmias 7.5%, altered mental status 6.5%, airway intervention 2.3%; 2 patients (0.9%) arrested, and both survived. Of the CIEs identified, 54 (42.5%) could be addressed differently by ACP vs PCP protocols (25.2% of total cases). The majority related to fluid boluses for hypotension (44 cases; 35% of CIEs). The rate of ACP intervention for CIEs within the ACP-intercept group was 51.6%. There were 6 in-hospital deaths (2.8%), with no difference by transport crew type. Conclusion: CIEs are common in STEMI bypass patients; however, a smaller proportion of these CIEs would be addressed differently by ACP protocols compared to PCP protocols. The vast majority of CIEs appeared to be transient and of limited clinical significance.
Introduction: Point of Care Ultrasound (PoCUS) protocols are commonly used to guide resuscitation for emergency department (ED) patients with undifferentiated non-traumatic hypotension. While PoCUS has been shown to improve early diagnosis, there is minimal evidence for any outcome benefit. We completed an international multicenter randomized controlled trial (RCT) to assess the impact of a PoCUS protocol on key resuscitation markers in this group. We report diagnostic impact and mortality elsewhere. Methods: The SHoC-ED1 study compared the addition of PoCUS to standard care within the first hour in the treatment of adult patients presenting with undifferentiated hypotension (SBP<100 mmHg or a Shock Index >1.0) with a control group that did not receive PoCUS. Scans were performed by PoCUS-trained physicians. Four North American and three South African sites participated in the study. Resuscitation outcomes analyzed included volume of fluid administered in the ED, changes in shock index (SI), modified early warning score (MEWS), venous acid-base balance, and lactate, at one and four hours. Comparisons utilized a t-test as well as stratified binomial log-regression to assess for any significant improvement in resuscitation among the outcomes. Our sample size was powered at 0.80 (α:0.05) for a moderate effect size. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. There was no significant difference in mean total volume of fluid received between the control (1658 ml; 95% CI 1365-1950) and PoCUS groups (1609 ml; 95% CI 1385-1832; p=0.79). Significant improvements were seen in SI, MEWS, lactate and bicarbonate with resuscitation in both the PoCUS and control groups; however, there was no difference between groups. Conclusion: SHoC-ED1 is the first RCT to compare PoCUS to standard of care in hypotensive ED patients. No significant difference in fluid used or markers of resuscitation was found when comparing the use of a PoCUS protocol to standard of care in the resuscitation of patients with undifferentiated hypotension.
Introduction: Point of care ultrasonography (PoCUS) is an established tool in the initial management of hypotensive patients in the emergency department (ED). It has been shown to rule out certain shock etiologies and improve diagnostic certainty; however, evidence on benefit in the management of hypotensive patients is limited. We report the findings from our international multicenter RCT assessing the impact of a PoCUS protocol on diagnostic accuracy, as well as other key outcomes including mortality, which are reported elsewhere. Methods: Recruitment occurred at 4 North American and 3 Southern African sites. Screening at triage identified patients (SBP<100 mmHg or shock index >1), who were randomized to either PoCUS or control groups. Scans were performed by PoCUS-trained physicians. Demographics, clinical details and findings were collected prospectively. Initial and secondary diagnoses were recorded at 0 and 60 minutes, with ultrasound performed in the PoCUS group prior to secondary assessment. Final chart review was blinded to initial impressions and PoCUS findings. Categorical data were analyzed using Fisher's exact test (two-tailed). Our sample size was powered at 0.80 (α:0.05) for a moderate effect size. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. The perceived shock category changed more frequently in the PoCUS group: 20/127 (15.7%) vs. control 7/125 (5.6%); RR 2.81 (95% CI 1.23 to 6.42; p=0.0134). There was no significant difference in change of diagnostic impression between groups: PoCUS 39/123 (31.7%) vs control 34/124 (27.4%); RR 1.16 (95% CI 0.786 to 1.70; p=0.4879). There was no significant difference in the rate of correct category of shock between PoCUS (118/127; 93%) and control (113/122; 93%); RR 1.00 (95% CI 0.936 to 1.08; p=1.00), or for correct diagnosis: PoCUS 90/127 (70%) vs control 86/122 (70%); RR 0.987 (95% CI 0.671 to 1.45; p=1.00). Conclusion: This is the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients. We found that the use of PoCUS did change physicians’ perceived shock category. PoCUS did not improve diagnostic accuracy for category of shock or diagnosis.
Civilian suicide rates vary by occupation in ways related to occupational stress exposure. Comparable military research finds suicide rates elevated in combat arms occupations. However, no research has evaluated variation in this pattern by deployment history, the indicator of occupational stress widely considered responsible for the recent rise in the military suicide rate.
Method
The joint associations of Army occupation and deployment history in predicting suicides were analysed in an administrative dataset for the 729 337 male enlisted Regular Army soldiers in the US Army between 2004 and 2009.
Results
There were 496 suicides over the study period (22.4/100 000 person-years). Only two occupational categories, both in combat arms, had significantly elevated suicide rates: infantrymen (37.2/100 000 person-years) and combat engineers (38.2/100 000 person-years). However, the suicide rates in these two categories were significantly lower when currently deployed (30.6/100 000 person-years) than never deployed or previously deployed (41.2–39.1/100 000 person-years), whereas the suicide rate of other soldiers was significantly higher when currently deployed and previously deployed (20.2–22.4/100 000 person-years) than never deployed (14.5/100 000 person-years), resulting in the adjusted suicide rate of infantrymen and combat engineers being most elevated when never deployed [odds ratio (OR) 2.9, 95% confidence interval (CI) 2.1–4.1], less so when previously deployed (OR 1.6, 95% CI 1.1–2.1), and not at all when currently deployed (OR 1.2, 95% CI 0.8–1.8). Adjustment for a differential ‘healthy warrior effect’ cannot explain this variation in the relative suicide rates of never-deployed infantrymen and combat engineers by deployment status.
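As a back-of-envelope check of how the quoted crude rate relates to counts and follow-up time (the total person-years are implied rather than reported in the abstract), the arithmetic is simply events divided by person-years, scaled to 100 000:

```python
# Illustrative arithmetic only; person-years are inferred from the reported
# rate, not taken from the study itself.
suicides = 496
rate_per_100k = 22.4
person_years = suicides / rate_per_100k * 100_000
print(f"Implied follow-up: ~{person_years/1e6:.1f} million person-years")  # ~2.2 million

def rate(events, person_years):
    """Crude rate per 100,000 person-years, as used for the subgroup rates above."""
    return events / person_years * 100_000
```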
Conclusions
Efforts are needed to elucidate the causal mechanisms underlying this interaction to guide preventive interventions for soldiers at high suicide risk.
The Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS) has found that the proportional elevation in the US Army enlisted soldier suicide rate during deployment (compared with the never-deployed or previously deployed) is significantly higher among women than men, raising the possibility of gender differences in the adverse psychological effects of deployment.
Method
Person-month survival models based on a consolidated administrative database for active duty enlisted Regular Army soldiers in 2004–2009 (n = 975 057) were used to characterize the gender × deployment interaction predicting suicide. Four explanatory hypotheses were explored involving the proportion of females in each soldier's occupation, the proportion of same-gender soldiers in each soldier's unit, whether the soldier reported sexual assault victimization in the previous 12 months, and the soldier's pre-deployment history of treated mental/behavioral disorders.
Results
The suicide rate of currently deployed women (14.0/100 000 person-years) was 3.1–3.5 times the rates of other (i.e. never-deployed/previously deployed) women. The suicide rate of currently deployed men (22.6/100 000 person-years) was 0.9–1.2 times the rates of other men. The adjusted (for time trends, sociodemographics, and Army career variables) female:male odds ratio comparing the suicide rates of currently deployed v. other women v. men was 2.8 (95% confidence interval 1.1–6.8), became 2.4 after excluding soldiers with Direct Combat Arms occupations, and remained elevated (in the range 1.9–2.8) after adjusting for the hypothesized explanatory variables.
Conclusions
These results are valuable in excluding otherwise plausible hypotheses for the elevated suicide rate of deployed women and point to the importance of expanding future research on the psychological challenges of deployment for women.
During improved oil recovery (IOR), gas may be introduced into a porous reservoir filled with surfactant solution in order to form foam. A model for the evolution of the resulting foam front known as ‘pressure-driven growth’ is analysed. An asymptotic solution of this model for long times is derived that shows that foam can propagate indefinitely into the reservoir without gravity override. Moreover, ‘pressure-driven growth’ is shown to correspond to a special case of the more general ‘viscous froth’ model. In particular, it is a singular limit of the viscous froth, corresponding to the elimination of a surface tension term, permitting sharp corners and kinks in the predicted shape of the front. Sharp corners tend to develop from concave regions of the front. The principal solution of interest has a convex front, however, so that although this solution itself has no sharp corners (except for some kinks that develop spuriously owing to errors in a numerical scheme), it is found nevertheless to exhibit milder singularities in front curvature, as the long-time asymptotic analytical solution makes clear. Numerical schemes for the evolving front shape which perform robustly (avoiding the development of spurious kinks) are also developed. Generalisations of this solution to geologically heterogeneous reservoirs should exhibit concavities and/or sharp corner singularities as an inherent part of their evolution: propagation of fronts containing such ‘inherent’ singularities can be readily incorporated into these numerical schemes.
The US Army suicide rate has increased sharply in recent years. Identifying significant predictors of Army suicides in Army and Department of Defense (DoD) administrative records might help focus prevention efforts and guide intervention content. Previous studies of administrative data, although documenting significant predictors, were based on limited samples and models. A career history perspective is used here to develop more textured models.
Method
The analysis was carried out as part of the Historical Administrative Data Study (HADS) of the Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS). De-identified data were combined across numerous Army and DoD administrative data systems for all Regular Army soldiers on active duty in 2004–2009. Multivariate associations of sociodemographics and Army career variables with suicide were examined in subgroups defined by time in service, rank and deployment history.
Results
Several novel results were found that could have intervention implications. The most notable of these were significantly elevated suicide rates (69.6–80.0 suicides per 100 000 person-years compared with 18.5 suicides per 100 000 person-years in the total Army) among enlisted soldiers deployed either during their first year of service or with less than expected (based on time in service) junior enlisted rank; a substantially greater rise in suicide among women than men during deployment; and a protective effect of marriage against suicide only during deployment.
Conclusions
A career history approach produces several actionable insights missed in less textured analyses of administrative data predictors. Expansion of analyses to a richer set of predictors might help refine understanding of intervention implications.
The oncogenic potential of human papillomaviruses (HPV) is well known in the context of cervical carcinoma; however, their role in the development of oesophageal squamous cell carcinoma (OSCC) is less clear. We aimed to determine the extent of the association between HPV infection and OSCC. A comprehensive literature search found 132 studies addressing HPV and OSCC in human cases, and a meta-analysis was performed using a random-effects model. There was evidence of an increased risk of OSCC in patients with HPV infection [odds ratio (OR) 2·69, 95% confidence interval (CI) 2·05–3·54]. The prevalence of HPV in OSCC was 24·8%. There was an increased risk associated with HPV-16 infection (OR 2·35, 95% CI 1·73–3·19). Subgroup analyses showed geographical variation, with Asia (OR 2·94, 95% CI 2·16–4·00), and particularly China (OR 2·85, 95% CI 2·05–3·96), being high-risk areas. Our results confirm an increased prevalence of HPV infection in OSCC cases.
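The pooled estimates above come from a random-effects model. The sketch below shows a standard DerSimonian–Laird pooling of study-level odds ratios; the three input studies are invented for illustration, whereas the actual meta-analysis combined 132 studies.

```python
# Minimal DerSimonian-Laird random-effects pooling of log odds ratios;
# illustrative only, with made-up study data.
import numpy as np

def pooled_or_random_effects(log_or, var):
    """Return pooled OR and 95% CI from per-study log-ORs and their variances."""
    log_or, var = np.asarray(log_or, float), np.asarray(var, float)
    w = 1.0 / var                                    # fixed-effect weights
    mu_fixed = np.sum(w * log_or) / np.sum(w)
    q = np.sum(w * (log_or - mu_fixed) ** 2)         # Cochran's Q
    k = len(log_or)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_star = 1.0 / (var + tau2)                      # random-effects weights
    mu = np.sum(w_star * log_or) / np.sum(w_star)
    se = 1.0 / np.sqrt(np.sum(w_star))
    return np.exp(mu), np.exp(mu - 1.96 * se), np.exp(mu + 1.96 * se)

# Three hypothetical studies: (log-OR, variance of log-OR)
print(pooled_or_random_effects([0.9, 1.1, 0.7], [0.05, 0.08, 0.12]))
```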
It has been postulated that aging is the consequence of an accelerated accumulation of somatic DNA mutations and that subsequent errors in the primary structure of proteins ultimately reach levels sufficient to affect organismal functions. The technical limitations of detecting somatic changes and the lack of insight into the minimum level of erroneous proteins needed to cause an error catastrophe have hampered firm conclusions on these theories. In this study, we performed whole-genome sequencing of DNA from whole blood of two pairs of monozygotic (MZ) twins, 40 and 100 years old, on two independent next-generation sequencing (NGS) platforms (Illumina and Complete Genomics). Potentially discordant single-base substitutions supported by both platforms were validated extensively by Sanger, Roche 454, and Ion Torrent sequencing. We demonstrate that the genomes of the two twin pairs are germ-line identical between co-twins, and that the genomes of the 100-year-old MZ twins are discerned by eight confirmed somatic single-base substitutions, five of which are within introns. Putative somatic variation between the 40-year-old twins was not confirmed in the validation phase. We conclude from this systematic effort that by using two independent NGS platforms, somatic single nucleotide substitutions can be detected, and that a century of life did not result in a large number of detectable somatic mutations in blood. The low number of somatic variants observed by using two NGS platforms might provide a framework for detecting disease-related somatic variants in phenotypically discordant MZ twins.
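Conceptually, candidate somatic substitutions of the kind described above are positions where the co-twins' variant calls disagree on both platforms. The sketch below illustrates that idea with toy call sets; it is not the authors' pipeline, and the site, sample names, and data structures are assumptions.

```python
# Conceptual sketch: flag candidate somatic single-base substitutions as sites
# where co-twins' calls differ, supported by BOTH sequencing platforms.
# Calls are modelled as sets of (chrom, pos, ref, alt) tuples (hypothetical data).

def discordant_sites(twin_a_calls, twin_b_calls):
    """Sites called in one co-twin but not the other (symmetric difference)."""
    return twin_a_calls ^ twin_b_calls

illumina = {"twinA": {("chr1", 1_234_567, "A", "G")}, "twinB": set()}
cg       = {"twinA": {("chr1", 1_234_567, "A", "G")}, "twinB": set()}

candidates = (discordant_sites(illumina["twinA"], illumina["twinB"])
              & discordant_sites(cg["twinA"], cg["twinB"]))
print(candidates)  # platform-concordant candidates would then go to Sanger/454/Ion Torrent validation
```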
In this article we present the protocol of the Birmingham Registry for Twin Heritability Studies (BiRTHS), which aims to establish a long-term prospective twin registry with twins identified from the antenatal period and subjected to detailed follow-up. We plan to investigate the concordance in anthropometrics and early childhood phenotypes between 66 monozygotic and 154 dizygotic twin pairs in the first 2 years of recruitment. In this project we plan to determine the relative contributions of heritability and environment to fetal growth, birth size, growth in infancy and development up to 2 years of age in an ethnically mixed population. Twins will be assessed with the Griffiths Mental Development Scales, which will enable us to obtain detailed information on development. As maternal depression may have an effect on the twins' neurodevelopment, the Edinburgh Postnatal Depression Scale will be used at various stages during pregnancy and after delivery to assess maternal depressive symptoms. The increasing prevalence of obesity in both adults and children has raised concerns about the effect of maternal obesity in pregnancy on fetal growth. The prospective study design gives us the opportunity to obtain data on maternal nutrition (reflected by body mass index) and ante- and postnatal growth and development of twins.
The protozoan parasite Toxoplasma gondii is prevalent worldwide and can infect a remarkably wide range of hosts, despite felids being the only definitive host. As cats play a major role in transmission to secondary mammalian hosts, the interaction between cats and these hosts should be a major factor determining final prevalence in the secondary host. This study investigates the prevalence of T. gondii in a natural population of Apodemus sylvaticus collected from an area with low cat density (<2·5 cats/km2). Despite this, a surprisingly high prevalence of 40·78% (95% CI: 34·07%–47·79%) was observed. A comparable level of prevalence was observed in a previously published study using the same approaches, in which a prevalence of 59% (95% CI: 50·13%–67·87%) was found in a natural population of Mus domesticus from an area with high cat density (>500 cats/km2). Detection of infected foetuses from pregnant dams in both populations suggests that congenital transmission may enable persistence of infection in the absence of cats. The prevalence of the related parasite, Neospora caninum, was found to be low in both populations (A. sylvaticus: 3·39% (95% CI: 0·12%–6·66%); M. domesticus: 3·08% (95% CI: 0·11%–6·05%)). These results suggest that cat density may have a lower than expected effect on final prevalence in these ecosystems.
The Health Protection Agency/QSurveillance national surveillance system utilizes QSurveillance®, a recently developed general practitioner database covering over 23 million people in the UK. We describe the spread of the first wave of the influenza A(H1N1) 2009 pandemic using data on consultations for influenza-like illness (ILI), respiratory illness and prescribing for influenza from 3400 contributing general practices. Daily data, provided from 27 April 2009 to 28 January 2010, were used to give a timely overview for those managing the pandemic nationally and locally. The first wave particularly affected London and the West Midlands, with a peak in ILI in week 30. Children aged between 1 and 15 years had consistently high consultation rates for ILI. Daily ILI rates were used for modelling national weekly case estimates. The system enabled ‘real-time’ monitoring of the pandemic down to small geographical areas, linking morbidity and prescribing for influenza and other respiratory illnesses.
The methanol multi-beam (MMB) survey has produced the largest and most complete catalogue of Galactic 6.7-GHz methanol masers to date. 6.7-GHz methanol masers are exclusively associated with high-mass star formation, and as such provide invaluable insight into the Galactic distribution and properties of high-mass star formation regions. I present the statistical properties of the MMB catalogue and, through the calculation of kinematic distances, investigate the resolution of distance ambiguities and explore the Galactic distribution.
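Kinematic distances of the kind mentioned above follow from a Galactic rotation curve. The sketch below is a hedged illustration, not the survey's pipeline: it assumes a flat rotation curve with typical values of R0 and V0 (not necessarily those used in the paper) and shows the near/far distance ambiguity referred to in the abstract.

```python
# Illustrative kinematic-distance calculation for an inner-Galaxy source,
# assuming a flat rotation curve; R0 and V0 are assumed representative values.
import math

R0 = 8.5    # kpc, Sun-Galactic-centre distance (assumption)
V0 = 220.0  # km/s, circular speed at the Sun (assumption)

def kinematic_distances(glon_deg, v_lsr):
    """Return (near, far) heliocentric distances in kpc from Galactic
    longitude (deg) and LSR velocity (km/s), for a flat rotation curve."""
    l = math.radians(glon_deg)
    # Galactocentric radius from v_lsr = V0*sin(l)*(R0/R - 1)
    R = R0 * V0 * math.sin(l) / (V0 * math.sin(l) + v_lsr)
    disc = R**2 - (R0 * math.sin(l))**2
    if disc < 0:
        raise ValueError("no real solution for these inputs")
    root = math.sqrt(disc)
    return R0 * math.cos(l) - root, R0 * math.cos(l) + root  # the distance ambiguity

print(kinematic_distances(30.0, 40.0))  # e.g. a maser at l=30 deg, v_lsr=40 km/s
```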
The development of a stock of Corcyra cephalonica (Stnt.) from Burma was followed carefully on a diet of wheatfeed, glycerol and yeast at constant temperatures ranging from 15 to 37°C and humidities from 15 to 90% RH. Two other stocks, from Nigeria and Malawi, were also studied under a few conditions. Limits for complete development from egg hatch to adult emergence were about 17 and 35°C at 70% RH. At 15°C, all larvae died early in development, but at 37·5°C a few managed to pupate. Highest survival and most rapid development occurred at 30–32·5°C and 70% RH. Development was completed in the range 15–80% RH, but few adults emerged at 15% RH and none at 90% RH unless a mould-inhibitor was present in the food. No second-generation larvae were obtained from adults reared and kept at 20°C and 70% RH. Egg period was influenced by temperature but not by humidity in the range 20–80% RH. Eggs hatched at temperatures from 17·5 to 32·5°C. Hatch was adversely affected by low humidity, and very few hatched at 20% RH. Considerable variation in the rate of egg hatch between the three stocks may have been due to differences in the length of time each stock had been reared in the laboratory. Cold tolerance of eggs of the Nigerian stock was low. All eggs died at 10°C after a seven-day exposure, and at 15°C, although a few 0–1-day-old eggs exposed for 14 days hatched, none completed development to the adult stage. Although older eggs were slightly more cold-tolerant than younger ones at 10°C, they were less so at 15°C. Adult males tended to emerge earlier and live longer than unmated females. Adults of the recently collected Malawi stock were heavier and lived longer than those of the Burma stock that had been reared in the laboratory for many generations.