Intensified cover cropping practices are increasingly viewed as an herbicide resistance management tool, but a clear distinction between reactive and proactive resistance management performance targets is needed. We evaluated two proactive performance targets for integrating cover cropping tactics: (1) facilitation of reduced herbicide inputs, and (2) reduced herbicide selection pressure. We conducted corn (Zea mays L.) and soybean [Glycine max (L.) Merr.] field experiments in Pennsylvania and Delaware using synthetic weed seedbanks of horseweed [Conyza canadensis (L.) Cronquist] and smooth pigweed (Amaranthus hybridus L.) to assess winter- and summer-annual population dynamics, respectively. The effect of alternative cover crops was evaluated across a range of herbicide inputs. Cover crop biomass production ranged from 2,000 to 8,500 kg ha-1 in corn and 3,000 to 5,500 kg ha-1 in soybean. Experimental results demonstrated that herbicide-based tactics were the primary drivers of total weed biomass production, with cover cropping tactics providing an additive weed-suppression benefit. Substituting cover crops for PRE or POST herbicide programs did not reduce total weed control levels or cash crop yields, but did result in lower net returns due to higher input costs. Cover cropping tactics significantly reduced C. canadensis populations in three of four cover crop treatments and decreased the number of large rosettes (>7.6 cm diameter) at the time of pre-plant herbicide exposure. Substituting cover crops for PRE herbicides increased selection pressure on POST herbicides but reduced the number of large individuals (>10 cm) at POST applications. Collectively, our findings suggest that cover crops can reduce the intensity of selection pressure on POST herbicides, but the magnitude of the effect varies with weed life-history traits. Additional work is needed to describe proactive resistance management concepts and performance targets for integrating cover crops so that producers can apply these concepts in site-specific, within-field management practices.
Fluoroquinolones (FQs) and extended-spectrum cephalosporins (ESCs) are associated with a higher risk of Clostridioides difficile infection (CDI). Decreasing unnecessary use of FQs and ESCs is a goal of antimicrobial stewardship. A better understanding of how prescribers perceive the risks and benefits of FQs and ESCs is needed.
We conducted interviews with clinicians from 4 hospitals. Interviews elicited respondent perceptions about the risk of ESCs, FQs, and CDI. Interviews were audio recorded, transcribed, and analyzed using a flexible coding approach.
Interviews were conducted with 64 respondents (38 physicians, 7 nurses, 6 advanced practice providers, and 13 pharmacists). ESCs and FQs were perceived to have many benefits, including infrequent dosing, breadth of coverage, and greater patient adherence after hospital discharge. Prescribers stated that it was easy to make decisions about these drugs, so they were especially appealing to use under time pressure. They described having difficulty discontinuing these drugs when prescribed by others, due to inertia and fear. Prescribers were skeptical about targeting specific drugs as a stewardship approach and felt that the risk of a negative outcome from undertreatment of a suspected bacterial infection was a higher priority than the prevention of CDI.
Prescribers in this study perceived many advantages to using ESCs and FQs, especially under conditions of time pressure and uncertainty. In making decisions about these drugs, prescribers balance risk and benefit, and they believed that the risk of CDI was acceptable compared with the risk of undertreatment.
Introduction: Digital distraction is being integrated into pediatric pain care, but its efficacy is currently unknown. We conducted a systematic review to determine the effect of digital technology distraction on pain and distress in children experiencing acutely painful conditions or medical procedures.
Methods: We searched eight online databases (MEDLINE, Embase, Cochrane Library, CINAHL, PsycINFO, IEEE Xplore, Ei Compendex, Web of Science) and grey literature sources, scanned reference lists, and contacted experts for quantitative studies in which digital technologies were used as distraction for acutely painful conditions or procedures in children. Study selection was performed by two independent reviewers with consensus. One reviewer extracted relevant study data and another verified it for accuracy. Risk of bias within studies and the certainty of the body of evidence were appraised independently in duplicate, with the final appraisal determined by consensus. The primary outcomes of interest were child pain and distress.
Results: Of 3247 unique records identified by the search, we included 106 studies (n = 7820) that reported on digital technology distractors (e.g., virtual reality, video games) used during common procedures (e.g., venipuncture, minor dental procedures, burn treatments). We located no studies reporting on painful conditions. For painful procedures, digital distraction resulted in a modest but clinically important reduction in self-reported pain (SMD -0.48, 95% CI -0.66 to -0.29; 46 RCTs, n = 3200), observer-reported pain (SMD -0.68, 95% CI -0.91 to -0.45; 17 RCTs, n = 1199), behavioural pain (SMD -0.57, 95% CI -0.94 to -0.19; 19 RCTs, n = 1173), self-reported distress (SMD -0.49, 95% CI -0.70 to -0.27; 19 RCTs, n = 1818), observer-reported distress (SMD -0.47, 95% CI -0.77 to -0.17; 10 RCTs, n = 826), and behavioural distress (SMD -0.35, 95% CI -0.59 to -0.12; 17 RCTs, n = 1264) compared with usual care. Few studies directly compared different distractors or provided subgroup data to inform applicability.
Conclusion: Digital distraction provides modest pain and distress reduction for children undergoing painful procedures; its superiority over non-digital distractors is not established. Healthcare providers and parents should strongly consider using distraction as a pain-reduction strategy for children and teens during common painful procedures (e.g., needle pokes, dental fillings). Context, child preference, and availability should inform the choice of distractor.
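For readers unfamiliar with the standardized mean differences (SMDs) pooled above, the following is a minimal sketch of how a single study's SMD is computed, using Hedges' g. The group summaries are hypothetical illustrations, not data from any included trial.

```python
import math

# Hedges' g: Cohen's d with a small-sample correction. Group summaries
# below are hypothetical, not taken from any trial in the review.
def smd(mean_tx, sd_tx, n_tx, mean_ctrl, sd_ctrl, n_ctrl):
    pooled_sd = math.sqrt(((n_tx - 1) * sd_tx**2 + (n_ctrl - 1) * sd_ctrl**2)
                          / (n_tx + n_ctrl - 2))
    d = (mean_tx - mean_ctrl) / pooled_sd           # Cohen's d
    j = 1 - 3 / (4 * (n_tx + n_ctrl) - 9)           # small-sample correction
    return d * j

# Pain scores (0-10 scale): distraction group vs. usual care.
print(smd(3.1, 2.0, 40, 4.2, 2.2, 40))  # negative SMD favours distraction
```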
Background: Since January 1, 2016, 2,358 people have died from opioid poisoning in Alberta. Buprenorphine/naloxone (bup/nal) is the recommended first-line treatment for opioid use disorder (OUD), and this treatment can be initiated in emergency departments and urgent care centres (EDs).
Aim Statement: This project aims to spread a quality improvement intervention to all 107 adult EDs in Alberta by March 31, 2020. The intervention supports clinicians to initiate bup/nal for eligible individuals and provide rapid referrals to OUD treatment clinics.
Measures & Design: Local ED teams were identified (administrators, clinical nurse educators, physicians and, where available, pharmacists and social workers). Local teams were supported by a provincial project team (project manager, consultant, and five physician leads) through a multi-faceted implementation process using provincial order sets, clinician education products, and patient-facing information. We used administrative ED and pharmacy data to track the number of visits where bup/nal was given in the ED, and whether discharged patients continued to fill any opioid agonist treatment (OAT) prescription 30 days after their index ED visit. OUD clinics reported the number of referrals received from EDs and the number attending their first appointment. Patient safety event reports were tracked to identify any unintended negative impacts.
Evaluation/Results: We report data from May 15, 2018 (program start) to September 30, 2019. Forty-nine EDs (46% of 107) implemented the program and 22 (45% of 49) reported evaluation data. There were 5,385 opioid-related visits to reporting ED sites after program adoption. Bup/nal was given during 832 ED visits (663 unique patients): 7 visits in the first quarter the program operated, 55 in the second, 74 in the third, 143 in the fourth, 294 in the fifth, and 255 in the sixth. Among 505 unique discharged patients with 30-day follow-up data available, 319 (63%) continued to fill an OAT prescription after receiving bup/nal in the ED. Sixteen (70%) of 23 community clinics provided data. EDs referred patients to these clinics 440 times, and 236 referrals (54%) attended their first follow-up appointment. Available data may under-report program impact. Five patient safety events were reported, with no harm or minimal harm to the patient.
Discussion/Impact: Results demonstrate effective spread and uptake of a standardized, provincial, ED-based early medical intervention program for patients who live with OUD.
The curves recommended for calibrating radiocarbon (14C) dates into absolute dates have been updated. For calibrating atmospheric samples from the Northern Hemisphere, the new curve is called IntCal20. It is accompanied by the associated curves SHCal20 for the Southern Hemisphere and Marine20 for marine samples. In this “companion article” we discuss the advances and developments that have led to improvements in the updated curves and highlight some issues of relevance for the general readership. In particular, the dendrochronologically based part of the curve has seen a significant increase in data, with single-year resolution for certain time ranges, and now extends back to 13,910 cal BP. Beyond the tree rings, the new curve is based upon an updated combination of marine corals, speleothems, macrofossils, and varved sediments, and now reaches back to 55,000 cal BP. Alongside these data advances, we have developed a new, bespoke statistical curve-construction methodology to allow better incorporation of the diverse constituent records and to produce a more robust curve with uncertainties. Combined, these data and methodological advances offer the potential for significant new insight into our past. We discuss some implications for the user, such as the dating of the Santorini eruption, and some consequences of the new curve for Paleolithic archaeology.
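As background on how such a curve is used, here is a minimal sketch of the basic calibration step: weight each candidate calendar age by how well the curve's predicted 14C age matches a measured date, given both the lab error and the curve's own uncertainty. The three-column table is a tiny synthetic stand-in, not actual IntCal20 values.

```python
import numpy as np

# Synthetic stand-in for a calibration-curve excerpt:
# columns are calendar age (cal BP), 14C age (BP), curve sigma.
curve = np.array([
    [3000, 2850, 15],
    [3010, 2870, 15],
    [3020, 2880, 16],
    [3030, 2905, 16],
    [3040, 2915, 17],
])
measured, lab_err = 2880.0, 20.0   # a hypothetical measured 14C date

total_var = lab_err**2 + curve[:, 2]**2
like = np.exp(-0.5 * (measured - curve[:, 1])**2 / total_var) / np.sqrt(total_var)
post = like / like.sum()           # normalized weight per calendar age
for cal, p in zip(curve[:, 0], post):
    print(f"{cal:.0f} cal BP: {p:.2f}")
```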
Increased impulsivity is a diagnostic feature of mania in bipolar disorder (BD). However, it is unclear whether increased impulsivity is also a trait feature of BD and therefore present in remission. Trait impulsivity can also be construed as a personality dimension, but the relationship between personality and impulsivity in BD has not been explored. The aim of this study was to examine the relationship of impulsivity to clinical status and personality characteristics in patients with BD.
We measured impulsivity using the Barratt Impulsiveness Scale (BIS-11) and personality dimensions using the Eysenck Personality Questionnaire (EPQ) in 106 BD patients and demographically matched healthy volunteers. Clinical symptoms were assessed in all participants using the Clinical Global Impressions Scale, the Montgomery-Asberg Depression Rating Scale and the Young Mania Rating Scale. Based on their clinical status, patients were divided into remitted (n = 36), subsyndromal (n = 25) and syndromal (n = 45) groups.
There were no differences in BIS-11 and EPQ scores between remitted patients and healthy subjects. Impulsivity, Neuroticism and Psychoticism scores were increased in subsyndromal and syndromal patients. Within the BD group, total BIS-11 score was predicted mainly by symptom severity, followed by Psychoticism and Neuroticism scores.
Increased impulsivity may not be a trait feature of BD. Symptom severity is the most significant determinant of impulsivity measures even in subsyndromal patients.
Before October 2012 there was no service-level agreement for psychiatry cover at Whiston Hospital, an acute trust in the UK. The crisis team would visit on a goodwill basis to assess patients. This changed when a Liaison Psychiatry (LP) service was commissioned to provide 24-hour cover, Monday to Sunday, for the adult Emergency Department (ED).
To quantify waiting times for psychiatric assessment, comparing the new LP service (intervention group) with its predecessor (control). The null hypothesis was that waiting times in the control and intervention groups were the same.
The authors prospectively collected data on all referrals received by the LP service in its first three months of operation (n = 305) and retrospectively collected data on a random sample of 50 patients referred from the ED in the same months of 2011 (control).
The median time from referral to psychiatric assessment in the control group was 162.5 minutes [IQR 130–330]; the mean time was 246.16 minutes [95% CI 180 to 312]. Following the introduction of the LP service, the median time from referral to psychiatric assessment was 30 minutes [IQR 15–90]; the mean time was 79.63 minutes [95% CI 65 to 93]. When the two samples were compared using an independent t-test, the difference was significant (p < 0.002).
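To illustrate the comparison reported above, here is a sketch of an independent-samples t-test on waiting times. The samples are simulated to roughly match the reported group means; they are not the audit data, and we use Welch's variant rather than assuming equal variances.

```python
import numpy as np
from scipy import stats

# Simulated referral-to-assessment waiting times (minutes), shaped to
# approximate the reported means; not the original audit data.
rng = np.random.default_rng(0)
control = rng.exponential(246, size=50)        # pre-LP service, mean ~246 min
intervention = rng.exponential(80, size=305)   # LP service, mean ~80 min

# Welch's t-test (does not assume equal variances between groups).
t, p = stats.ttest_ind(control, intervention, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")
```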
The new LP service has decreased the median wait for a psychiatric assessment by 132.5 minutes. The team currently sees 82% of referrals within 60 minutes. This improves patient safety and encourages appropriate and timely discharge.
The Department of Health in the UK wants the National Health Service to make £20 billion of efficiency savings by 2015 for reinvestment.
In the UK, general hospitals use paper records, which are then scanned to create electronic records, while psychiatric hospitals require information to be typed directly into their electronic records; the two sets of electronic records are not accessible to each other.
Liaison psychiatry assessments therefore require a written entry in the medical notes and a second entry, including a full psychiatric history, typed into the psychiatric electronic patient record.
This duplication of typed information was consuming a considerable amount of the team's time and resources, which could instead have been spent with patients.
To identify how much time staff spend typing information into the psychiatric electronic patient records.
For the preceding three months, we electronically checked the amount of time spent typing information into the electronic records after every liaison psychiatry assessment, and from this obtained a weekly average.
On average, about 36 to 40 hours were spent every week typing information into the electronic records.
Liaison psychiatry should dispense with the requirement for information to be duplicated in the electronic patient records and should instead scan in the written entry made in the medical notes.
This should lead to a saving of about £50,000, enough to employ an additional member of staff, since the time freed each week is roughly one full-time post.
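A rough reconstruction of the arithmetic behind this figure, on our assumptions rather than the authors' (a 38-hour typing week, per the result above, and a fully loaded staff cost of about £25 per hour):

```latex
38\,\frac{\text{hours}}{\text{week}} \times 52\,\frac{\text{weeks}}{\text{year}}
\times \pounds 25/\text{hour} \approx \pounds 49{,}400 \approx \pounds 50{,}000
\text{ per year}
```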
Neurocognitive impairments robustly predict functional outcome. However, heterogeneity in neurocognition is common within diagnostic groups, and data-driven analyses reveal homogeneous neurocognitive subgroups cutting across diagnostic boundaries.
To determine whether data-driven neurocognitive subgroups of young people with emerging mental disorders are associated with 3-year functional course.
Model-based cluster analysis was applied to neurocognitive test scores across nine domains from 629 young people accessing mental health clinics. Cluster groups were compared on demographic, clinical and substance-use measures. Mixed-effects models explored associations between cluster-group membership and socio-occupational functioning (using the Social and Occupational Functioning Assessment Scale) over 3 years, adjusted for gender, premorbid IQ, level of education, depressive, positive, negative and manic symptoms, and diagnosis of a primary psychotic disorder.
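As a sketch of the model-based clustering approach described above, the snippet below fits Gaussian mixture models over standardized domain scores and selects the number of clusters by BIC, a standard instantiation of model-based cluster analysis. The data are simulated (629 cases by nine domains, shaped to echo the reported subgroup sizes), not the clinic sample.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Simulated z-scored neurocognitive profiles: three latent groups of
# decreasing mean performance (not the study's actual data).
rng = np.random.default_rng(1)
X = np.vstack([
    rng.normal(0.0, 1.0, size=(243, 9)),    # "normal range"
    rng.normal(-1.0, 1.0, size=(252, 9)),   # "intermediate impairment"
    rng.normal(-2.5, 1.0, size=(134, 9)),   # "global impairment"
])

# Fit mixtures with 1-5 components; lower BIC indicates a better model.
fits = {k: GaussianMixture(n_components=k, random_state=0).fit(X)
        for k in range(1, 6)}
best_k = min(fits, key=lambda k: fits[k].bic(X))
print(best_k, np.bincount(fits[best_k].predict(X)))  # chosen k, cluster sizes
```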
Cluster analysis of neurocognitive test scores derived three subgroups described as ‘normal range’ (n = 243, 38.6%), ‘intermediate impairment’ (n = 252, 40.1%), and ‘global impairment’ (n = 134, 21.3%). The major mental disorder categories (depressive, anxiety, bipolar, psychotic and other) were represented in each neurocognitive subgroup. The global impairment subgroup had lower functioning throughout the 3 years of follow-up; however, neither the global impairment (B = 0.26, 95% CI −0.67 to 1.20; P = 0.581) nor the intermediate impairment (B = 0.46, 95% CI −0.26 to 1.19; P = 0.211) subgroup differed from the normal range subgroup in its rate of change in functioning over time.
Neurocognitive impairment may follow a continuum of severity across the major syndrome-based mental disorders, with data-driven neurocognitive subgroups predictive of functional course. Of note, the global impairment subgroup had longstanding functional impairment despite continuing engagement with clinical services.
Duchenne muscular dystrophy is associated with progressive cardiorespiratory failure, including left ventricular dysfunction.
Methods and Results:
Males with a probable or definite diagnosis of Duchenne muscular dystrophy, diagnosed between 1 January 1982 and 31 December 2011, were identified from the Muscular Dystrophy Surveillance, Tracking, and Research Network database. Two non-mutually exclusive groups were created: patients with ≥2 echocardiograms and non-invasive positive pressure ventilation-compliant patients with ≥1 recorded ejection fraction. Quantitative left ventricular dysfunction was defined as an ejection fraction <55%. Qualitative dysfunction was defined as mild, moderate, or severe. Progression of quantitative left ventricular dysfunction was modelled as a continuous time-varying outcome. Change in qualitative left ventricular function was assessed by the percentage of patients within each category at each age. Forty-one percent (n = 403) had ≥2 ejection fractions, comprising 998 qualitative assessments, with a mean age at first echo of 10.8 ± 4.6 years and an average first ejection fraction of 63.1 ± 12.6%. Mean age at first echo with an ejection fraction <55% was 15.2 ± 3.9 years. Thirty-five percent (140/403) were non-invasive positive pressure ventilation-compliant and had ejection fraction information. The estimated rate of decline in ejection fraction from the first ejection fraction was 1.6% per year, and initiation of non-invasive positive pressure ventilation did not change this rate.
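One common way to model a continuously declining, repeatedly measured outcome like this is a random-intercept mixed-effects model; the sketch below is a hypothetical illustration on simulated data (shaped to echo the reported ~63% baseline and ~1.6%-per-year decline), not the surveillance dataset or the authors' exact model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate repeated ejection-fraction (EF) measurements per patient.
rng = np.random.default_rng(0)
rows = []
for pid in range(100):
    baseline = rng.normal(63, 8)                      # EF at ~age 10
    for age in rng.uniform(8, 25, size=rng.integers(2, 6)):
        ef = baseline - 1.6 * (age - 10) + rng.normal(0, 3)
        rows.append({"pid": pid, "age": age, "ef": ef})
df = pd.DataFrame(rows)

# Random intercept per patient; fixed slope on age gives the decline rate.
model = smf.mixedlm("ef ~ age", df, groups=df["pid"]).fit()
print(model.params["age"])   # estimated annual change in EF (expect ~ -1.6)
```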
In our cohort, we observed that left ventricular function in patients with Duchenne muscular dystrophy declined over time, independent of non-invasive positive pressure ventilation use. Future studies are needed to examine the impact of respiratory support on cardiac function.
Healthcare personnel (HCP) were recruited to provide serum samples, which were tested for antibodies against Ebola or Lassa virus to evaluate for asymptomatic seroconversion.
From 2014 to 2016, 4 patients with Ebola virus disease (EVD) and 1 patient with Lassa fever (LF) were treated in the Serious Communicable Diseases Unit (SCDU) at Emory University Hospital. Strict infection control and clinical biosafety practices were implemented to prevent nosocomial transmission of EVD or LF to HCP.
All personnel who entered the SCDU, who were required to measure their temperatures and complete a symptom questionnaire twice daily, were eligible.
No employee developed symptomatic EVD or LF. EVD and LF antibody studies were performed on serum samples from 42 HCP. The 6 participants who had received investigational vaccination with a chimpanzee adenovirus type 3-vectored Ebola glycoprotein vaccine had high antibody titers to Ebola glycoprotein, but none had a response to Ebola nucleoprotein or VP40, or to LF antigens.
Patients infected with filoviruses and arenaviruses can be managed successfully without causing occupation-related symptomatic or asymptomatic infections. Meticulous attention to infection control and clinical biosafety practices by highly motivated, trained staff is critical to the safe care of patients with an infection from a special pathogen.
Intermittent explosive disorder (IED) is characterised by impulsive anger attacks that vary greatly across individuals in severity and consequence. Understanding IED subtypes has been limited by lack of large, general population datasets including assessment of IED. Using the 17-country World Mental Health surveys dataset, this study examined whether behavioural subtypes of IED are associated with differing patterns of comorbidity, suicidality and functional impairment.
IED was assessed using the Composite International Diagnostic Interview in the World Mental Health surveys (n = 45 266). Five behavioural subtypes were created based on the type of anger attack. Logistic regression assessed the associations of these subtypes with lifetime comorbidity, lifetime suicidality and 12-month functional impairment.
The lifetime prevalence of IED in all countries was 0.8% (s.e.: 0.0). The two subtypes involving anger attacks that harmed people (‘hurt people only’ and ‘destroy property and hurt people’), collectively comprising 73% of those with IED, were characterised by high rates of externalising comorbid disorders. The remaining three subtypes, involving anger attacks that destroyed property only, destroyed property and threatened people, or threatened people only, were characterised by higher rates of internalising than externalising comorbid disorders. Suicidal behaviour did not vary across the five behavioural subtypes but was higher among those with (v. those without) comorbid disorders, and among those who perpetrated more violent assaults.
The most common IED behavioural subtypes in these general population samples are associated with high rates of externalising disorders. This contrasts with the findings from clinical studies of IED, which observe a preponderance of internalising disorder comorbidity. This disparity in findings across population and clinical studies, together with the marked heterogeneity that characterises the diagnostic entity of IED, suggests that the disorder requires much greater research attention.
For life insurers in the United Kingdom (UK), the risk margin is one of the most controversial aspects of the Solvency II regime which came into force in 2016.
The risk margin is the difference between the technical provisions and the best estimate liabilities. The technical provisions are intended to be market-consistent, and so are defined as the amount required to be paid to transfer the business to another undertaking. In practice, the technical provisions cannot be directly calculated, and so the risk margin must be determined using a proxy method; the method chosen for Solvency II is known as the cost-of-capital method.
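For reference, the cost-of-capital proxy works by projecting the Solvency Capital Requirement (SCR) over the run-off of the business, charging a fixed cost-of-capital rate (6% under the current Solvency II rules), and discounting at risk-free rates:

```latex
\mathrm{RM} \;=\; \mathrm{CoC} \cdot \sum_{t \ge 0}
\frac{\mathrm{SCR}(t)}{\bigl(1 + r(t+1)\bigr)^{t+1}},
\qquad \mathrm{CoC} = 6\%
```

Because the sum discounts a long run-off of SCRs at risk-free rates, the result is naturally sensitive to interest rates, which is relevant to the criticism discussed below.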
Following the implementation of Solvency II, the risk margin came under considerable criticism for being too large and too sensitive to interest rate movements. These criticisms are particularly valid for annuity business in the UK – such business is of great significance to the system for retirement provision. A further criticism is that mitigation of the impact of the risk margin has led to an increase in reinsurance of longevity risks, particularly to overseas reinsurers.
This criticism has led to political interest, and the risk margin was a major element of the Treasury Committee inquiry into EU Insurance Regulation.
The working party was set up in response to this criticism. Our brief is to consider both the overall purpose of the risk margin for life insurers and solutions to the current problems, having regard to the possibility of post-Brexit flexibility.
We have concluded that a risk margin in some form is necessary, although its size depends on the level of security desired, and so is primarily a political question.
We have reviewed possible alternatives to the current risk margin, both within the existing cost-of-capital methodology and across a wide range of other approaches.
We believe that requirements for the risk margin will depend on future circumstances, in particular relating to Brexit, and we have identified a number of possible changes to methodology which should be considered, depending on circumstances.
Disturbed sleep and activity are prominent features of bipolar disorder type I (BP-I). However, the relationship of sleep and activity characteristics to brain structure and behavior in euthymic BP-I patients and their non-BP-I relatives is unknown. Additionally, underlying genetic relationships between these traits have not been investigated.
Relationships of sleep and activity phenotypes, assessed using actigraphy, with structural neuroimaging (brain) and cognitive and temperament (behavior) phenotypes were investigated in 558 euthymic individuals from multi-generational pedigrees, each including at least one member with BP-I. Genetic correlations underlying actigraphy-brain and actigraphy-behavior associations were assessed, and bivariate linkage analysis was conducted for trait pairs with evidence of shared genetic influences.
More physical activity and longer awake time were significantly associated with increased brain volumes and cortical thickness, better performance on neurocognitive measures of long-term memory and executive function, and less extreme scores on measures of temperament (impulsivity, cyclothymia). These associations did not differ between BP-I patients and their non-BP-I relatives. For nine activity-brain or activity-behavior pairs there was evidence for shared genetic influence (genetic correlations); of these pairs, a suggestive bivariate quantitative trait locus on chromosome 7 for wake duration and verbal working memory was identified.
Our findings indicate that increased physical activity and more adequate sleep are associated with increased brain size, better cognitive function and more stable temperament in BP-I patients and their non-BP-I relatives. Additionally, we found evidence for pleiotropy of several actigraphy-behavior and actigraphy-brain phenotypes, suggesting a shared genetic basis for these traits.
We conducted a systematic review and network meta-analysis to determine the comparative efficacy of antibiotics used to control bovine respiratory disease (BRD) in beef cattle on feedlots. The information sources for the review were: MEDLINE®, MEDLINE In-Process and MEDLINE® Daily, AGRICOLA, Epub Ahead of Print, Cambridge Agricultural and Biological Index, Science Citation Index, Conference Proceedings Citation Index – Science, the proceedings of the American Association of Bovine Practitioners and the World Buiatrics Congress, and the United States Food and Drug Administration Freedom of Information New Animal Drug Application summaries. The eligible population was weaned beef cattle raised in intensive systems. The interventions of interest were injectable antibiotics used at the time the cattle arrived at the feedlot. The outcome of interest was the diagnosis of BRD within 45 days of arrival at the feedlot. The network meta-analysis included data from 46 studies and 167 study arms identified in the review. The results suggest that macrolides are the most effective antibiotics for the reduction of BRD incidence. Injectable oxytetracycline effectively controlled BRD compared with no antibiotics; however, it was less effective than macrolide treatment. Because oxytetracycline is already commonly used to prevent, control, and treat BRD in groups of feedlot cattle, the use of injectable oxytetracycline for BRD control might have advantages from an antibiotic stewardship perspective.
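To illustrate the kind of evidence synthesis involved, the sketch below shows the inverse-variance pooling that underlies a network meta-analysis, reduced to a single pairwise comparison (for instance injectable oxytetracycline versus no antibiotic). All counts are hypothetical; the review's actual network model is considerably more complex.

```python
import math

# Hypothetical per-study 2x2 counts: (BRD cases treated, n treated,
# BRD cases control, n control). Not data from the review.
studies = [
    (12, 100, 30, 100),
    (8, 80, 22, 85),
    (15, 120, 35, 110),
]

weights, effects = [], []
for a, n1, c, n2 in studies:
    b, d = n1 - a, n2 - c
    log_or = math.log((a * d) / (b * c))      # per-study log odds ratio
    var = 1 / a + 1 / b + 1 / c + 1 / d       # Woolf variance estimate
    effects.append(log_or)
    weights.append(1 / var)

# Fixed-effect inverse-variance pooled estimate across studies.
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
print(f"pooled OR = {math.exp(pooled):.2f}")  # < 1 favours treatment
```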
Vaccination against putative causal organisms is a frequently used and preferred approach to controlling the bovine respiratory disease complex (BRD) because it reduces the need for antibiotic use. Because approximately 90% of feedlots in the USA use vaccines and approximately 90% of beef cattle receive them, information about their comparative efficacy would be useful for selecting a vaccine. We conducted a systematic review and network meta-analysis of studies assessing the comparative efficacy of vaccines to control BRD when administered to beef cattle at or near their arrival at the feedlot. We searched MEDLINE, MEDLINE In-Process, MEDLINE Daily, Epub Ahead of Print, AGRICOLA, Cambridge Agricultural and Biological Index, Science Citation Index, and Conference Proceedings Citation Index – Science, and hand-searched the conference proceedings of the American Association of Bovine Practitioners and the World Buiatrics Congress. We found 53 studies that reported BRD morbidity within 45 days of feedlot arrival. The largest connected network of studies, which involved 17 vaccine protocols from 14 studies, was included in the meta-analysis. Consistent with previous reviews, we found little compelling evidence that vaccines used at or near arrival at the feedlot reduce the incidence of BRD diagnosis.
Early detection and intervention strategies in patients at clinical high-risk (CHR) for syndromal psychosis have the potential to contain the morbidity of schizophrenia and similar conditions. However, research criteria that have relied on severity and number of positive symptoms are limited in their specificity and risk high false-positive rates. Our objective was to examine the degree to which measures of recency of onset or intensification of positive symptoms [a.k.a., new or worsening (NOW) symptoms] contribute to predictive capacity.
We recruited 109 help-seeking individuals whose symptoms met criteria for the Progression Subtype of the Attenuated Positive Symptom Psychosis-Risk Syndrome, as defined by the Structured Interview for Psychosis-Risk Syndromes, and followed them every three months for two years or until the onset of syndromal psychosis.
Forty-one (40.6%) of 101 participants meeting CHR criteria developed a syndromal psychotic disorder [mostly (80.5%) schizophrenia], with half converting within 142 days (interquartile range: 69–410 days). Patients with more NOW symptoms were more likely to convert (converters: 3.63 ± 0.89; non-converters: 2.90 ± 1.27; p = 0.001). Patients with stable attenuated positive symptoms were less likely to convert than those with NOW symptoms. New symptoms in isolation, but not worsening symptoms, also predicted conversion.
Results suggest that the severity and number of attenuated positive symptoms are less predictive of conversion to syndromal psychosis than the timing of their emergence and intensification. These findings also suggest that the earliest phase of psychotic illness involves a rapid, dynamic process, beginning before the syndromal first episode, with potentially substantial implications for CHR research and understanding the neurobiology of psychosis.
Viral pneumonia is an important cause of death and morbidity among infants worldwide. Transmission of non-influenza respiratory viruses in households can inform preventive interventions but has not been well characterised in South Asia. From April 2011 to April 2012, household members of pregnant women enrolled in a randomised trial of influenza vaccine in rural Nepal were surveyed weekly for respiratory illness until 180 days after birth. Nasal swabs from symptomatic individuals were tested by polymerase chain reaction for respiratory viruses. A transmission event was defined as a secondary case of the same virus within 14 days of initial infection within a household. From 555 households, 825 initial viral illness episodes occurred, resulting in 79 transmission events. The overall incidence of transmission was 1.14 events per 100 person-weeks. Transmission incidence was associated with an index case aged 1–4 years (incidence rate ratio (IRR) 2.35; 95% confidence interval (CI) 1.40–3.96), coinfection as the initial infection (IRR 1.94; 95% CI 1.05–3.61) and no electricity in the household (IRR 2.70; 95% CI 1.41–5.00). Preventive interventions targeting preschool-age children in households in resource-limited settings may decrease the risk of transmission to vulnerable household members, such as young infants.
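For readers unfamiliar with person-time incidence and incidence rate ratios, the snippet below back-computes the implied follow-up time from the figures reported above and shows how an IRR compares rates between exposure groups. The group counts in the IRR example are hypothetical, chosen only to land near the reported scale.

```python
# Implied follow-up, assuming incidence = events per 100 person-weeks
# of household observation (the abstract does not state the denominator).
events = 79
rate_per_100_pw = 1.14
person_weeks = events / rate_per_100_pw * 100
print(f"implied follow-up: {person_weeks:.0f} person-weeks")  # ~6,930

# An incidence rate ratio compares event rates between exposure groups,
# e.g. index cases aged 1-4 years vs. other ages (counts hypothetical).
irr = (40 / 1_700) / (39 / 3_900)
print(f"IRR = {irr:.2f}")   # ~2.35, matching the reported IRR's scale
```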