A growing number of Australians are experiencing challenges accessing and affording healthy food due to climate-related disasters, global supply chain disruptions, and rapid inflation affecting the cost of healthy food(1). There is limited understanding of how participation in community-based food cooperatives can address these challenges and improve food security and dietary intake. This study investigated the motivations for joining, and the impact of participation in, a community-based food cooperative called Box Divvy on self-reported food security status and intake of fruits and vegetables among a sample of Australian adults. A cross-sectional online survey was conducted among Box Divvy members that measured sociodemographic characteristics, motivations for joining, self-reported fruit and vegetable intake (serves/week), and food insecurity status (USDA 6-item short form(2)) before and while using Box Divvy. Participants were classified as food secure, or as experiencing marginal, moderate, or severe food insecurity. Logistic regression assessed demographic predictors and self-reported change in food security status, and ANOVA examined changes in dietary intake before joining and while using Box Divvy. Of participants (n = 2764; 37% aged 35–44 years, 83% European ethnicity, 92% New South Wales residents), most joined Box Divvy to support local farmers (87.3%) and to save money on healthy foods (70.6%). Around half of respondents (50.8%) reported experiencing food insecurity before joining Box Divvy (24.5% marginal, 18.4% moderate, 7.9% severe food insecurity). Univariate logistic regression identified age, household structure, and income as significant predictors of food insecurity (p < 0.001). Participants experiencing food insecurity reported significantly lower fruit and vegetable consumption before joining Box Divvy than those who were food secure (p < 0.001). While using Box Divvy, 28.2% of participants reported experiencing food insecurity (16.6% marginal, 9.6% moderate, 2.1% severe food insecurity). The odds of food insecurity while using Box Divvy were 62% lower than before joining (OR: 0.38; 95% CI 0.34–0.43; p < 0.001). On average, participants reported that their fruit intake increased by 2.5 ± 5.6 serves/week (p < 0.001) and their vegetable intake by 3.3 ± 5.7 serves/week (p < 0.001). The mean increase was significantly greater among the moderately food insecure (fruit mean difference 3.2 ± 6.5 serves/week; vegetable mean difference 3.9 ± 6.9 serves/week) and severely food insecure groups (fruit mean difference 4.4 ± 6.9 serves/week; vegetable mean difference 5.5 ± 7.7 serves/week; p < 0.001). Participation in Box Divvy significantly improved self-reported food security status and fruit and vegetable intake among a large sample of Australian adults. Notably, fruit and vegetable intake increased significantly among those experiencing moderate and severe food insecurity. This underscores the potential of community-based food cooperatives to improve food security and promote healthier eating habits among Australian adults, especially households experiencing food insecurity.
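As a minimal illustration of the before/while comparison reported above, the sketch below fits a logistic regression of food-insecurity status on survey period and exponentiates the coefficient into an odds ratio. The data layout and column names are assumptions for demonstration, not the study's actual data or exact model.

```python
# Minimal sketch (hypothetical data layout): odds ratio of food insecurity
# "while using" vs. "before joining" the cooperative via logistic regression.
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Long format: one row per participant per period; food_insecure is 0/1,
# period is 0 = before joining, 1 = while using Box Divvy.
df = pd.DataFrame({
    "food_insecure": [1, 1, 0, 0, 1, 0, 0, 0],
    "period":        [0, 0, 0, 0, 1, 1, 1, 1],
})

X = sm.add_constant(df["period"])
fit = sm.Logit(df["food_insecure"], X).fit(disp=0)

or_period = np.exp(fit.params["period"])            # odds ratio for period
ci_low, ci_high = np.exp(fit.conf_int().loc["period"])
print(f"OR: {or_period:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```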
Metabolic enzymes are the catalysts that drive the biochemical reactions essential for sustaining life. Many of these enzymes are tightly regulated by feedback mechanisms. To fully understand their roles and modulation, it is crucial to investigate the relationship between their structure, catalytic mechanism, and function. In this perspective, using three examples from our studies on Mycobacterium tuberculosis (Mtb) isocitrate lyase and related proteins, we highlight how an integrated approach combining structural, activity, and biophysical data provides insights into their biological functions. These examples underscore the importance of employing fast-fail experiments at the early stages of a research project, emphasise the value of complementary techniques in validating findings, and demonstrate how in vitro data combined with chemical, biochemical, and physiological knowledge can lead to a broader understanding of metabolic adaptations in pathogenic bacteria. Finally, we address unexplored questions in Mtb metabolism and discuss how we are expanding our approach to include microbiological and bioanalytical techniques to further our understanding. Such an integrated and interdisciplinary strategy has the potential to uncover novel regulatory mechanisms and identify new therapeutic opportunities for the eradication of tuberculosis. The approach can also be broadly applied to investigate other biochemical networks and complex biological systems.
Chronic musculoskeletal pain is associated with neurobiological, physiological, and cellular measures. Importantly, we have previously demonstrated that a biobehavioral and psychosocial resilience index appears to have a protective relationship with the same biomarkers. Less is known regarding the relationships between chronic musculoskeletal pain, protective factors, and brain aging. This study investigates the relationships between clinical pain, a resilience index, and brain age. We hypothesized that higher reported chronic pain would correlate with older-appearing brains, and that the resilience index would attenuate the strength of the relationship between chronic pain and brain age.
Participants and Methods:
Participants were drawn from an ongoing observational multisite study and included adults with chronic pain who also reported knee pain (N = 135; age = 58.3 ± 8.1; 64% female; 49% non-Hispanic Black, 51% non-Hispanic White; education Mdn = some college; income level Mdn = $30,000–$40,000; MoCA M = 24.27 ± 3.49). Measures included the Graded Chronic Pain Scale (GCPS) characteristic pain intensity (CPI) and disability scores, total number of pain body sites, and a cognitive screening (MoCA). The resilience index consisted of validated biobehavioral (e.g., smoking, waist/hip ratio, and active coping) and psychosocial measures (e.g., optimism, positive affect, negative affect, perceived stress, and social support). T1-weighted MRI data were obtained. Surface area metrics were calculated in FreeSurfer using the Human Connectome Project's multi-modal cortical parcellation scheme. We calculated brain age in R using previously validated and trained machine learning models. Chronological age was subtracted from predicted brain age to generate a brain age gap (BAG), with higher BAG scores indicating that predicted age is older than chronological age. Three parallel hierarchical regression models (each containing one of the three pain measures), each with three blocks, were performed to assess the relationships between chronic pain and the resilience index in relation to BAG, adjusting for covariates. For each model, Block 1 entered the covariates, Block 2 entered a pain score, and Block 3 entered the resilience index.
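A schematic of the BAG computation and the three-block structure described above might look like the following. The file name, covariate set, and column names are assumptions, and the study's own brain-age models were fit in R; this Python sketch only illustrates the block-wise R² comparison.

```python
# Illustrative sketch (hypothetical data): brain age gap and a three-block
# hierarchical regression, reporting the R^2 change at each block.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pain_brain_age.csv")  # hypothetical file

# BAG: predicted brain age minus chronological age; positive values mean
# the brain appears older than its chronological age.
df["bag"] = df["predicted_brain_age"] - df["chronological_age"]

blocks = [
    "bag ~ sex + site + education_income",                     # Block 1: covariates
    "bag ~ sex + site + education_income + cpi",               # Block 2: + pain score
    "bag ~ sex + site + education_income + cpi + resilience",  # Block 3: + resilience
]
fits = [smf.ols(formula, data=df).fit() for formula in blocks]
for prev, cur in zip(fits, fits[1:]):
    print(f"R2 change = {cur.rsquared - prev.rsquared:.3f}")
```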
Results:
GCPS CPI (R2 change = 0.033, p = 0.027) and GCPS disability (R2 change = 0.038, p = 0.017) significantly predicted BAG beyond the effects of the covariates, but total pain sites (p = 0.865) did not. The resilience index was negatively correlated with BAG and was a significant predictor in all three models (p < 0.05). With the resilience index added in Block 3, both the GCPS CPI (p = 0.067) and GCPS disability (p = 0.066) measures were no longer significant in their respective models. Additionally, higher education/income (p = 0.016) and study site (p = 0.031) were also significant predictors of BAG.
Conclusions:
In this sample, higher reported chronic pain correlated with older-appearing brains, and higher resilience attenuated this relationship. The biobehavioral and psychosocial resilience index was associated with younger-appearing brains. While our data are cross-sectional, the findings are encouraging in suggesting that interventions targeting both chronic pain and biobehavioral and psychosocial factors (e.g., coping strategies, positive and negative affect, smoking, and social support) might buffer brain aging. Future directions include assessing whether chronic pain and resilience factors can predict brain aging over time.
Although offspring of women exposed to childhood trauma exhibit elevated rates of psychopathology, many children demonstrate resilience to these intergenerational impacts. Among the variety of factors that likely contribute to resilience, epigenetic processes have been suggested to play an important role. The current study used a prospective design to test the novel hypothesis that offspring epigenetic aging – a measure of methylation differences that are associated with infant health outcomes – moderates the relationship between maternal exposure to childhood adversity and offspring symptomatology. Maternal childhood adversity was self-reported during pregnancy via the ACEs survey and the CTQ, which assessed total childhood trauma as well as maltreatment subtypes (i.e., emotional, physical, and sexual abuse). Offspring blood samples were collected at or shortly after birth and assayed on a DNA methylation microarray, and offspring symptomatology was assessed with the CBCL/1.5–5 when offspring were 2–4 years old. Results indicated that maternal childhood trauma, particularly sexual abuse, was predictive of offspring symptoms (ps = 0.003–0.03). However, the associations between maternal sexual abuse and offspring symptomatology were significantly attenuated in offspring with accelerated epigenetic aging. These findings further our understanding of how epigenetic processes may contribute to and attenuate the intergenerational link between stress and psychopathology.
Globally, burns are responsible for around 11 million injuries and 180 000 burn-related deaths yearly. Unfortunately, 9 of 10 burn injuries and deaths happen in low- and middle-income countries (LMICs) such as Pakistan. One in three people admitted to hospital with burn injuries dies within three weeks, and survivors face serious lifelong physical, emotional and psychosocial problems. These may include anxiety, depression, post-traumatic stress disorder, increased mortality and social disintegration. This study aims to evaluate whether implementation of a culturally adapted multidisciplinary rehabilitation programme for burn survivors is clinically effective, cost-effective, sustainable and scalable across Pakistan.
Objectives
- To understand the lived experiences of burn survivors, families, and other stakeholders, including the experience of care and the impact of burns
- To work together with key stakeholders (such as burn survivors and family members) to adapt a culturally appropriate, affordable burn rehabilitation programme
- To undertake social media campaigns to promote burn prevention and risk assessment in communities, workplaces/industries/households; improve first aid; and address burn-related stigma
- To work with policy makers/parliamentarians to develop national guidelines for burns care and prevention in Pakistan
Methods
There are six work-packages (WPs). WP1 will co-adapt a culturally appropriate burn care and rehabilitation programme. WP2 will develop and implement a national burn registry based on WHO's initiative. WP3 is a cluster randomised controlled trial to determine the programme's clinical and cost-effectiveness in Pakistan. WP4 will evaluate social media campaigns to promote burn prevention and reduce stigma. WP5 involves working with key stakeholders on burns-related care and policy, and WP6 will build sustainable capacity and capability for burns treatment and rehabilitation.
Results
A clinically and cost-effective burn care quality and rehabilitation programme may have huge potential to save lives and contribute health and socio-economic benefits for patients, families, and the healthcare system in Pakistan. The nationwide implementation and the involvement of burn centres across all provinces offer an excellent opportunity to overcome the problems of burn care access experienced in LMICs.
Conclusions
To date, burns prevention, care and rehabilitation have not received sufficient attention in policy initiatives in Pakistan and other LMICs. This study is an excellent opportunity to evaluate culturally adapted burn care and rehabilitation programmes that can be implemented across LMICs. We will disseminate our findings widely, using a variety of approaches, supported by our stakeholder and patient advisory groups.
The National Institutes of Health launched the NIH Centers for Accelerated Innovation and the Research Evaluation and Commercialization Hubs programs to develop approaches and strategies to promote academic entrepreneurship and translate research discoveries into products and tools to help patients. The two programs collectively funded 11 sites at individual research institutions or consortia of institutions around the United States. Sites provided funding, project management, and coaching to funded investigators and commercialization education programs open to their research communities.
Methods:
We implemented an evaluation program that included longitudinal tracking of funded technology development projects and commercialization outcomes; interviews with site teams, funded investigators, and relevant institutional and innovation ecosystem stakeholders; and analysis and review of administrative data.
Results:
As of May 2021, interim results for 366 funded projects show that the technologies have received nearly $1.7 billion in follow-on funding to date. There were 88 start-ups formed, a 40% Small Business Innovation Research/Small Business Technology Transfer application success rate, and 17 licenses with small and large businesses. Twelve technologies are currently in clinical testing and three are on the market.
Conclusions:
Best practices used by the sites included milestone-based project management by leadership teams, external advisory boards that evaluated funding applications for commercial as well as scientific merit, sustained engagement with the academic community to shift attitudes about commercialization, application processes synced with education programs, and the provision of project managers with private-sector product development expertise to coach funded investigators.
To assess the relationship between food insecurity, sleep quality, and days with mental and physical health issues among college students.
Design:
An online survey was administered. Food insecurity was assessed using the ten-item Adult Food Security Survey Module. Sleep was measured using the nineteen-item Pittsburgh Sleep Quality Index (PSQI). Mental health and physical health were measured using three items from the Healthy Days Core Module. Multivariate logistic regression was conducted to assess the relationship between food insecurity, sleep quality, and days with poor mental and physical health.
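For context, the sketch below shows how raw scores on the ten-item Adult Food Security Survey Module are commonly mapped to status categories, using the standard USDA cut-points (0–2 high or marginal, 3–5 low, 6–10 very low food security). Whether this study applied exactly these thresholds is an assumption.

```python
# Hedged sketch: standard USDA thresholds for the 10-item adult module.
# The exact classification used in this particular study is assumed.
def usda_adult_status(raw_score: int) -> str:
    """Map a 0-10 affirmative-response count to a food security category."""
    if not 0 <= raw_score <= 10:
        raise ValueError("raw score must be between 0 and 10")
    if raw_score <= 2:
        return "high or marginal food security (food secure)"
    if raw_score <= 5:
        return "low food security"
    return "very low food security"

print(usda_adult_status(4))  # "low food security" -> counted as food insecure
```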
Setting:
Twenty-two higher education institutions.
Participants:
College students (n 17 686) enrolled at one of twenty-two participating universities.
Results:
Compared with food-secure students, those classified as food insecure (43·4 %) had higher PSQI scores indicating poorer sleep quality (P < 0·0001) and reported more days with poor mental (P < 0·0001) and physical (P < 0·0001) health as well as days when mental and physical health prevented them from completing daily activities (P < 0·0001). Food-insecure students had higher adjusted odds of having poor sleep quality (adjusted OR (AOR): 1·13; 95 % CI 1·12, 1·14), days with poor physical health (AOR: 1·01; 95 % CI 1·01, 1·02), days with poor mental health (AOR: 1·03; 95 % CI 1·02, 1·03) and days when poor mental or physical health prevented them from completing daily activities (AOR: 1·03; 95 % CI 1·02, 1·04).
Conclusions:
College students report high rates of food insecurity, which are associated with poorer mental and physical health and poorer sleep quality. Multi-level policy changes and campus wellness programmes are needed to prevent food insecurity and improve student health-related outcomes.
South Africa (SA) is a developing country with an ageing population. Adequate nutrition and physical activity (PA) protect against the loss of muscle mass and physical function, both of which are important components of sarcopenia. This study aimed to measure the prevalence of sarcopenia in older black SA women and investigate its associations with PA and protein intake.
Materials and Methods
Older black SA women (age: 68 years (range 60–85); n = 122) completed sociodemographic questionnaires, 24 h urine collection (to estimate protein intake), venous blood sampling (hs-C-reactive protein (hs-CRP) and ferritin), functional tests (grip strength, 3 m timed-up-and-go (TUG), 10 m walk test) and PA monitoring (activPAL). Dual-energy x-ray absorptiometry whole-body scans assessed fat and fat-free soft tissue mass (FFSTM).
Results
According to the European Working Group on Sarcopenia in Older People (EWGSOP2) criteria, 2.5% (n = 3) had confirmed sarcopenia of low severity based on normal physical function. Of the total cohort, 9% (n = 11) had low grip strength, 22.1% (n = 27) had a low appendicular skeletal muscle index (ASMI), and no women had low TUG (s) or gait speed (m/s). Higher ASMI was associated with lower hs-CRP (Rho = -0.209; p = 0.05) and with higher ferritin (Rho = 0.252; p = 0.019), grip strength (kg; Rho = 0.223; p = 0.015), and gait speed (m/s; Rho = 0.180; p = 0.050). Mean protein intake was 41.8 g/day, or 0.51 g/kg of body mass/day. Higher total protein intake (g/24 h) was associated with higher FFSTM (kg) and ASMI (p < 0.001). PA outcomes were not correlated with FFSTM or ASMI (p > 0.05); however, there were strong positive correlations of TUG (s) and gait speed (m/s) with time spent (1) stepping per day (min) and (2) at a high cadence (> 100 steps/min) (all p < 0.01). Daily step count was 7137 ± 3233 (mean ± SD), with 97.9 ± 38.7 min of total time spent stepping and 12.6 ± 16.8 min spent stepping at a high cadence (> 100 steps/min). Of note, 13.9% (n = 17) of women were completing > 10,000 steps/day.
Discussion
Based on the EWGSOP2 criteria, there is a low prevalence of sarcopenia in older black SA women, explained by the maintenance of strength and physical function that was directly related to PA, especially PA performed at higher intensities. In contrast, low muscle mass was relatively prevalent (22.1%) and was associated with low dietary protein intake rather than PA. Notably, it may be important to review the cut-points of the EWGSOP2 criteria to make them specific to older SA women from disadvantaged communities.
Osteoporosis was not considered a public health concern in black South African (SA) women until recently, when the prevalence of vertebral fractures was reported to be 9.1% in black compared with 5.0% in white SA women. Accordingly, this study aimed to measure the bone mineral density (BMD) of older black SA women and to investigate its association with risk factors for osteoporosis, including strength, muscle and fat mass, dietary intake and objectively measured physical activity (PA).
Methods and materials
Older black SA women (age: 68 years (range 60–85); n = 122) completed sociodemographic and quantitative food frequency questionnaires (QFFQ), fasting venous blood sampling (25-hydroxycholecalciferol: vitamin D-25), 24 h urine collection (to estimate protein intake), grip strength testing and PA monitoring (activPAL). Dual-energy x-ray absorptiometry (DXA) scans of the hip (femoral neck and total) and lumbar spine determined BMD, and whole-body scans measured fat and fat-free soft tissue mass (FFSTM). WHO classifications were used to determine osteopenia (T-score -2.5 to -1) and osteoporosis (T-score < -2.5).
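The WHO cut-points quoted above translate directly into a small classification rule; a minimal sketch follows, encoding exactly the bands stated in the Methods.

```python
# Direct encoding of the WHO T-score bands stated in the Methods:
# osteopenia for T-scores from -2.5 to -1, osteoporosis below -2.5.
def who_bmd_category(t_score: float) -> str:
    if t_score < -2.5:
        return "osteoporosis"
    if t_score <= -1.0:
        return "osteopenia"
    return "normal"

print(who_bmd_category(-1.8))  # "osteopenia"
```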
Results
At the lumbar spine, 34.4% of the women (n = 42) had osteopenia and 19.7% (n = 24) had osteoporosis. At the left femoral neck, 32% (n = 40) had osteopenia and 13.1% (n = 16) had osteoporosis. Total left hip BMD indicated osteopenia in 27.9% (n = 34) and osteoporosis in 13.1% (n = 16) of participants. Multinomial regression revealed no differences in age (y) or frequency of falls in the past year among the groups (p = 0.727). Compared with those with normal BMD, participants with osteoporosis at the femoral neck and lumbar spine were shorter, weighed less and had a lower body mass index (BMI) (all p < 0.05). When adjusted for height, the osteoporotic group (femoral neck and lumbar spine) had lower trunk fat (% whole body), FFSTM (kg) and grip strength (kg) compared with those with normal BMD (p < 0.05). Only protein intake (g; 24 h urine analyses) was lower in women with osteoporosis (all sites) compared with those with normal BMD. Fat, carbohydrate and micronutrient intakes (relative to total daily energy intake) and vitamin D concentrations were not associated with BMD (all sites). Daily step count and stepping time (min) were inversely associated with BMI (p < 0.05), but not with BMD (all sites; p > 0.05).
Discussion
A high prevalence of osteopenia and osteoporosis was evident at the lumbar spine and hip in older black SA women. This study highlights the importance of strength, body composition, and protein intake in maintaining BMD and preventing the development of osteoporosis in older women.
We argue that understanding of autism can be strengthened by increasing involvement of autistic individuals as researchers and by exploring cascading impacts of early sensory, perceptual, attentional, and motor atypicalities on social and communicative developmental trajectories. Participatory action research that includes diverse participants or researchers may help combat stigma while expanding research foci to better address autistic people's needs.
A national need is to prepare for and respond to accidental or intentional disasters categorized as chemical, biological, radiological, nuclear, or explosive (CBRNE). These incidents require specific subject-matter expertise, yet have commonalities. We identify 7 core elements comprising CBRNE science that require integration for effective preparedness planning and public health and medical response and recovery. These core elements are (1) basic and clinical sciences, (2) modeling and systems management, (3) planning, (4) response and incident management, (5) recovery and resilience, (6) lessons learned, and (7) continuous improvement. A key feature is the ability of relevant subject matter experts to integrate information into response operations. We propose the CBRNE medical operations science support expert as a professional who (1) understands that CBRNE incidents require an integrated systems approach, (2) understands the key functions and contributions of CBRNE science practitioners, (3) helps direct strategic and tactical CBRNE planning and responses through first-hand experience, and (4) provides advice to senior decision-makers managing response activities. Recognition of both CBRNE science as a distinct competency and the establishment of the CBRNE medical operations science support expert informs the public of the enormous progress made, broadcasts opportunities for new talent, and enhances the sophistication and analytic expertise of senior managers planning for and responding to CBRNE incidents.
Legionnaires’ disease (LD) incidence in the USA has quadrupled since 2000. Health departments must detect LD outbreaks quickly to identify and remediate sources. We tested the performance of a system to prospectively detect simulated LD outbreaks in Allegheny County, Pennsylvania, USA. We generated three simulated LD outbreaks based on published outbreaks. After verifying no significant clusters existed in surveillance data during 2014–2016, we embedded simulated outbreak-associated cases into 2016, assigning simulated residences and report dates. We mimicked daily analyses in 2016 using the prospective space-time permutation scan statistic to detect clusters of ⩽30 and ⩽180 days using 365-day and 730-day baseline periods, respectively. We used recurrence interval (RI) thresholds of ⩾20, ⩾100 and ⩾365 days to define significant signals. We calculated sensitivity, specificity and positive and negative predictive values for daily analyses, separately for each embedded outbreak. Two large, simulated cooling tower-associated outbreaks were detected. As the RI threshold was increased, sensitivity and negative predictive value decreased, while positive predictive value and specificity increased. A small, simulated potable water-associated outbreak was not detected. Use of an RI threshold of ⩾100 days minimised time-to-detection while maximising positive predictive value. Health departments should consider using this system to detect community-acquired LD outbreaks.
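A compact sketch of the day-level performance calculation described above follows. It assumes each day is labeled with outbreak truth (was an embedded outbreak detectable?) and signal status (did the scan statistic exceed the recurrence-interval threshold?), and that every denominator is nonzero.

```python
# Sketch: day-level sensitivity, specificity, PPV and NPV for outbreak
# detection signals. Hypothetical labels; assumes nonzero denominators.
def daily_metrics(truth: list[bool], signal: list[bool]) -> dict[str, float]:
    tp = sum(t and s for t, s in zip(truth, signal))
    fp = sum(s and not t for t, s in zip(truth, signal))
    fn = sum(t and not s for t, s in zip(truth, signal))
    tn = sum(not t and not s for t, s in zip(truth, signal))
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

print(daily_metrics([True, True, False, False], [True, False, False, False]))
```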
Allanite phenocrysts and co-existing glass from the perlitic obsidian of Sandy Braes have been analysed for nine rare earths (RE), uranium, and thorium by instrumental neutron activation analysis and for the major elements by electron microprobe. The chondrite-normalized RE plot for the allanite shows a steep slope with a negative Eu anomaly. Allanite/glass partition coefficients show a smooth variation with ionic radius (except for Eu), the variation spanning two orders of magnitude. The partitioning behaviour, which is distinct from that shown by the RE in sphene, apatite, and zircon, can be explained by the allanite structure. The pronounced affinity of the light RE for allanite makes this an important mineral in considerations of RE concentrations during the evolution of granitic liquids.
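As a reminder of the underlying arithmetic, a mineral/glass partition coefficient is simply the ratio of an element's concentration in the phenocryst to its concentration in the coexisting glass. The sketch below uses placeholder concentrations, not the paper's measurements.

```python
# Partition coefficient D = C(mineral) / C(glass) for each rare earth.
# Concentrations below are illustrative placeholders (ppm), not real data.
allanite_ppm = {"La": 9000.0, "Ce": 12000.0, "Sm": 1500.0, "Eu": 80.0}
glass_ppm = {"La": 40.0, "Ce": 70.0, "Sm": 8.0, "Eu": 1.0}

d_allanite_glass = {el: allanite_ppm[el] / glass_ppm[el] for el in allanite_ppm}
for el, d in d_allanite_glass.items():
    print(f"D({el}) = {d:.0f}")  # light RE partition strongly into allanite
```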
Whole-rock, minor element, rare earth, and electron microprobe data are presented for basaltic lavas from the western Kangerdlugssuaq area of East Greenland. Samples were obtained from Professor W. A. Deer's 1936 collection at Triangular Nunataks and Gardiner Plateau, and additional material was obtained by sampling moraines on the surface of Kangerdlugssuaq Glacier. Both undersaturated and tholeiitic lavas are present at the Triangular Nunataks locality, but the glacier suite is dominantly tholeiitic. The tholeiitic suite is less evolved than tholeiites from the Scoresby Sund area. Undersaturated lavas show enrichment in light rare earth elements, and tholeiitic lavas show flat chondrite-normalized patterns. Tholeiites from the Gardiner Plateau show no Eu anomaly but others show a slight negative Eu anomaly. Chemical data and considerations of regional geology are consistent with Cox's (1980) model of flood basalt vulcanism.
Black, opaque grains of a spinel whose composition is (Mg2TiO4) 85.8, (MgFe2O4) 0.4, (FeFe2O4) 13.8 (mole %) coexist with a MgAl2O4 spinel and geikielite in a periclase-forsterite marble that has been thermally metamorphosed against an alkalic ultramafic intrusion of Caledonian age in the Kangerdlugssuaq region of East Greenland. The spinel appears to be the closest recorded approach to the end-member Mg2TiO4 among natural rocks, and to be part of a solid-solution series extending across the join Mg2TiO4–MgFe2O4–FeFe2O4, the existence of which has not previously been reported. The composition of the series appears to be controlled by the fO2 that prevails during metamorphism.
Coinfection with human immunodeficiency virus (HIV) and viral hepatitis is associated with high morbidity and mortality in the absence of clinical management, making identification of these cases crucial. We examined characteristics of HIV and viral hepatitis coinfections by using surveillance data from 15 US states and two cities. Each jurisdiction used an automated deterministic matching method to link surveillance data for persons with reported acute and chronic hepatitis B virus (HBV) or hepatitis C virus (HCV) infections to persons reported with HIV infection. Of the 504 398 persons living with diagnosed HIV infection at the end of 2014, 2.0% were coinfected with HBV and 6.7% were coinfected with HCV. Of the 269 884 persons ever reported with HBV, 5.2% were reported with HIV. Of the 1 093 050 persons ever reported with HCV, 4.3% were reported with HIV. A greater proportion of persons coinfected with HIV and HBV were males and blacks/African Americans, compared with those with HIV monoinfection. Persons who inject drugs represented a greater proportion of those coinfected with HIV and HCV, compared with those with HIV monoinfection. Matching HIV and viral hepatitis surveillance data highlights epidemiological characteristics of persons coinfected and can be used to routinely monitor health status and guide state and national public health interventions.
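A deterministic match of the kind described links records only on exact agreement of a chosen key. The sketch below is a hypothetical illustration: the file names, fields, and composite key are assumptions, and real linkages use jurisdiction-specific identifiers and matching rules.

```python
# Hedged sketch of deterministic record linkage between an HIV registry and
# a viral hepatitis registry via exact agreement on a composite key.
import pandas as pd

hiv = pd.read_csv("hiv_registry.csv")        # hypothetical files with
hep = pd.read_csv("hepatitis_registry.csv")  # string-typed demographic fields

for df in (hiv, hep):
    # Composite key: name, date of birth and sex must all agree exactly.
    df["key"] = (df["last_name"].str.upper() + "|" +
                 df["first_name"].str.upper() + "|" +
                 df["dob"] + "|" + df["sex"])

coinfected = hiv.merge(hep, on="key", suffixes=("_hiv", "_hep"))
print(f"{len(coinfected)} persons matched in both registries")
```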
Water cultures were significantly more sensitive than concurrently collected swab cultures (n=2,147 each) in detecting Legionella pneumophila within a Veterans Affairs healthcare system. Sensitivity for water versus swab cultures was 90% versus 30% overall, 83% versus 48% during a nosocomial Legionnaires’ disease outbreak, and 93% versus 22% post outbreak.
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Parental education was somewhat longer for fathers of DZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and 2000 or later (0.11 years, 95% CI [0.00, 0.22]), compared with fathers of MZ twins. The results show that the years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
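A sketch of the kind of linear model implied above, estimating the MZ-DZ difference in education years separately by sex, might look like the following; the pooled-data file and column names are assumptions, and the published analysis likely included additional cohort adjustments.

```python
# Sketch: regress education years on zygosity (MZ vs. DZ) by sex,
# adjusting for birth cohort. Hypothetical pooled dataset and columns.
import pandas as pd
import statsmodels.formula.api as smf

twins = pd.read_csv("pooled_twin_data.csv")
twins["mz"] = (twins["zygosity"] == "MZ").astype(int)

for sex, grp in twins.groupby("sex"):
    fit = smf.ols("education_years ~ mz + birth_cohort", data=grp).fit()
    lo, hi = fit.conf_int().loc["mz"]
    print(f"{sex}: MZ-DZ difference = {fit.params['mz']:.2f} years "
          f"(95% CI [{lo:.2f}, {hi:.2f}])")
```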
As part of further investigations into three linked haemorrhagic fever with renal syndrome (HFRS) cases in Wales and England, 21 rats from a breeding colony in Cherwell and three rats from a household in Cheltenham were screened for hantavirus. Hantavirus RNA was detected in the lungs and/or kidney of 17/21 (81%) of the Cherwell rats tested, a higher proportion than previously detected by blood testing alone (7/21, 33%), and in the kidneys of all three Cheltenham rats. The partial L gene sequences obtained from 10 of the Cherwell rats and the three Cheltenham rats were identical to each other and to the previously reported UK Cherwell strain. Seoul hantavirus (SEOV) RNA was detected in the heart, kidney, lung, salivary gland and spleen (but not the liver) of an individual rat from the Cherwell colony suspected of being the source of SEOV. Sera from 20/20 of the Cherwell rats and the two associated HFRS cases had high levels of SEOV-specific antibodies (by virus neutralisation). The high prevalence of SEOV at both sites and the moderately severe disease in the pet rat owners suggest that SEOV in pet rats poses a greater public health risk than previously considered.