Yellow and knotroot foxtail are two common weed species infesting turfgrass and pastures in the southeastern United States. The two species share morphological similarities and are frequently misidentified by weed managers, leading to confusion in herbicide selection. Greenhouse research was conducted to evaluate the response of yellow and knotroot foxtail to several turfgrass herbicides: pinoxaden (35 and 70 g ai ha−1), sethoxydim (316 and 520 g ai ha−1), thiencarbazone + dicamba + iodosulfuron (230 g ai ha−1), nicosulfuron + rimsulfuron (562.8 g ai ha−1), metribuzin (395 g ai ha−1), sulfentrazone (330 g ai ha−1), sulfentrazone + imazethapyr (504 g ai ha−1), and imazaquin (550 g ai ha−1). All treatments controlled yellow foxtail >87%, with more than 90% reduction in biomass. By comparison, only sulfentrazone alone controlled knotroot foxtail 90% and completely reduced aboveground biomass. Sethoxydim (520 g ai ha−1), metribuzin, and imazaquin controlled knotroot foxtail >70% at 28 d after application. In a rate response evaluation, nonlinear regression showed that yellow foxtail was approximately 8 times more susceptible to pinoxaden and 2 times more susceptible to sethoxydim than knotroot foxtail, based on log(WR50) values, where WR50 is the herbicide rate causing a 50% reduction in fresh weight. Our research indicates that knotroot foxtail is more difficult to control across a range of herbicides, making differentiation of these two species important before herbicides are applied.
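The relative susceptibility figures above come from a rate-response (dose-response) fit. As a rough illustration only, the sketch below fits a three-parameter log-logistic curve to fresh-weight data and extracts WR50, the rate giving a 50% reduction in fresh weight; the data values, starting guesses and parameter names are assumptions, not the study's actual model or measurements.

```python
# Hedged sketch: fit a three-parameter log-logistic dose-response curve to
# fresh weight (as % of the untreated check) and report WR50.
# All numbers below are placeholders, not data from the study.
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(rate, upper, wr50, slope):
    # Fresh weight declines from `upper` toward 0 as the rate increases.
    return upper / (1.0 + (rate / wr50) ** slope)

rates = np.array([0, 8.75, 17.5, 35, 70, 140], dtype=float)   # g ai ha-1
fresh_wt = np.array([100, 92, 71, 38, 15, 6], dtype=float)    # % of check

eps = 1e-6   # avoid 0 ** slope at the untreated rate
popt, _ = curve_fit(log_logistic, rates + eps, fresh_wt,
                    p0=[100.0, 30.0, 2.0], maxfev=10000)
upper, wr50, slope = popt
print(f"Estimated WR50 ~ {wr50:.1f} g ai ha-1")
# Comparing two species, the ratio of their WR50 estimates gives the relative
# susceptibility (e.g. the ~8-fold difference reported above).
```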
Faecal examinations for helminth eggs were performed on 1869 people from two riverside localities, Vientiane Municipality and Saravane Province, along the Mekong River, Laos. To obtain adult flukes, 42 people positive for small trematode eggs (Opisthorchis viverrini, heterophyid, or lecithodendriid eggs) were treated with a 20–30 mg kg−1 single dose of praziquantel and purged. Diarrhoeic stools were then collected from 36 people (18 in each area) and searched for helminth parasites using stereomicroscopes. Faecal examinations revealed positive rates for small trematode eggs of 53.3% and 70.8% (average 65.2%) in Vientiane and Saravane Province, respectively. Infections with O. viverrini and six species of intestinal flukes were found, namely, Haplorchis taichui, H. pumilio, H. yokogawai, Centrocestus caninus, Prosthodendrium molenkampi, and Phaneropsolus bonnei. The total number of flukes collected and the proportion of fluke species recovered were markedly different in the two localities; in Vientiane, 1041 O. viverrini (57.8 per person) and 615 others (34.2 per person), whereas in Saravane, 395 O. viverrini (21.9 per person) and 155207 others (8622.6 per person). Five people from Saravane harboured no O. viverrini but numerous heterophyid and/or lecithodendriid flukes. The results indicate that O. viverrini and several species of heterophyid and lecithodendriid flukes are endemic in these two riverside localities, and suggest that the intensity of infection and the relative proportion of fluke species vary by locality along the Mekong River basin.
Early neurodevelopmental deviations, such as abnormal cortical folding patterns, are a candidate biomarker for major depressive disorder (MDD). Previous studies on patterns of abnormal cortical gyrification in MDD have provided valuable insights; however, the findings on cortical folding are controversial.
Objectives
We aimed to investigate the association of MDD with the local gyrification index (LGI) in each cortical region at the whole-brain level and the association of the LGI with clinical characteristics of MDD, including recurrence, remission status, illness duration, severity of depression, and medication status of patients with MDD.
Methods
We obtained T1-weighted images of 234 patients with MDD and 215 healthy controls (HCs). LGI values were automatically calculated using the FreeSurfer software according to the Desikan–Killiany atlas. LGI values from 66 cortical regions in the bilateral hemispheres were analyzed. We compared the LGI values between the MDD and HC groups using analysis of covariance (ANCOVA), with age, sex, and years of education as covariates. The association between clinical characteristics and LGI values was investigated in the MDD group.
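For readers unfamiliar with the region-wise group comparison described above, the sketch below shows how such an ANCOVA and its effect size (Cohen's f, as reported in the Results) could be computed for a single cortical region; the DataFrame and column names (lgi, group, age, sex, edu_years) are assumptions for illustration, not the authors' pipeline.

```python
# Hedged sketch: ANCOVA comparing LGI between MDD and HC for one region,
# with age, sex and years of education as covariates, plus Cohen's f.
# Column names are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

def ancova_effect_size(df: pd.DataFrame):
    model = smf.ols("lgi ~ C(group) + age + C(sex) + edu_years", data=df).fit()
    table = sm.stats.anova_lm(model, typ=2)
    ss_group = table.loc["C(group)", "sum_sq"]
    ss_resid = table.loc["Residual", "sum_sq"]
    eta_p2 = ss_group / (ss_group + ss_resid)      # partial eta squared
    cohens_f = np.sqrt(eta_p2 / (1.0 - eta_p2))    # Cohen's f
    p_value = table.loc["C(group)", "PR(>F)"]
    return cohens_f, p_value
```

Running this region by region, with correction for multiple comparisons, yields region-wise statistics of the kind quoted in the Results.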
Results
Compared with HCs, patients with MDD showed significantly decreased LGI values in the cortical regions, including the bilateral ventrolateral and dorsolateral prefrontal cortices, medial and lateral orbitofrontal cortices, insula, right rostral anterior cingulate cortex, and several temporal and parietal regions, with the highest effect size in the left pars triangularis (Cohen’s f = 0.361; P = 1.78 × 10−13). As for the association of clinical characteristics with LGIs within the MDD group, recurrence and longer illness duration of MDD were associated with increased gyrification in several occipital and temporal regions, which showed no significant difference in LGIs between MDD and HC groups.
Conclusions
Considering that the aforementioned cortical regions are involved in emotion regulation, abnormal cortical folding patterns in such regions may be associated with the dysfunction of emotion regulation-related neural circuits, which may lead to MDD. These findings suggest that LGI may be a relatively stable neuroimaging marker associated with the trait of MDD predisposition.
Homeless shelter residents and staff may be at higher risk of SARS-CoV-2 infection. However, SARS-CoV-2 infection estimates in this population have been reliant on cross-sectional or outbreak investigation data. We conducted routine surveillance and outbreak testing in 23 homeless shelters in King County, Washington, to estimate the occurrence of laboratory-confirmed SARS-CoV-2 infection and risk factors during 1 January 2020–31 May 2021. Symptom surveys and nasal swabs were collected for SARS-CoV-2 testing by RT-PCR for residents aged ≥3 months and staff. We collected 12,915 specimens from 2,930 unique participants. We identified 4.74 (95% CI 4.00–5.58) SARS-CoV-2 infections per 100 individuals (residents: 4.96, 95% CI 4.12–5.91; staff: 3.86, 95% CI 2.43–5.79). Most infections were asymptomatic at the time of detection (74%) and detected during routine surveillance (73%). Outbreak testing yielded higher test positivity than routine surveillance (2.7% versus 0.9%). Among those infected, residents were less likely to report symptoms than staff. Participants who were vaccinated against seasonal influenza and were current smokers had lower odds of having an infection detected. Active surveillance that includes SARS-CoV-2 testing of all persons is essential in ascertaining the true burden of SARS-CoV-2 infections among residents and staff of congregate settings.
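The cumulative infection estimate above is, at its core, a proportion with a confidence interval. As a rough illustration only, the sketch below computes infections per 100 individuals with a Wilson 95% CI; the counts are placeholders, and the study's own interval method (for example, any adjustment for repeated testing within shelters) may differ.

```python
# Hedged sketch: cumulative SARS-CoV-2 infections per 100 participants with a
# 95% confidence interval. Counts are illustrative placeholders.
from statsmodels.stats.proportion import proportion_confint

infected, participants = 139, 2930        # hypothetical counts
low, high = proportion_confint(infected, participants, alpha=0.05,
                               method="wilson")
rate = 100 * infected / participants
print(f"{rate:.2f} per 100 (95% CI {100 * low:.2f}-{100 * high:.2f})")
```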
Ventilator-capable skilled nursing facilities (vSNFs) are critical to the epidemiology and control of antibiotic-resistant organisms. During an infection prevention intervention to control carbapenem-resistant Enterobacterales (CRE), we conducted a qualitative study to characterize vSNF healthcare personnel beliefs and experiences regarding infection control measures.
Design:
A qualitative study involving semistructured interviews.
Setting:
One vSNF in the Chicago, Illinois, metropolitan region.
Participants:
The study included 17 healthcare personnel representing management, nursing, and nursing assistants.
Methods:
We used face-to-face, semistructured interviews to measure healthcare personnel experiences with infection control measures at the midpoint of a 2-year quality improvement project.
Results:
Healthcare personnel characterized their facility as a home-like environment, yet they recognized that it is a setting where germs were ‘invisible’ and potentially ‘threatening.’ Healthcare personnel described elaborate self-protection measures to avoid acquisition or transfer of germs to their own household. Healthcare personnel were motivated to implement infection control measures to protect residents, but many identified structural barriers such as understaffing and time constraints, and some reported persistent preference for soap and water.
Conclusions:
Healthcare personnel in vSNFs, from management to frontline staff, understood germ theory and the significance of multidrug-resistant organism transmission. However, their ability to implement infection control measures was hampered by resource limitations and mixed beliefs regarding the effectiveness of infection control measures. Self-protection from acquiring multidrug-resistant organisms was a strong motivator for healthcare personnel both outside and inside the workplace, and it could explain variation in adherence to infection control measures, such as higher hand hygiene adherence after resident care than before resident care.
To investigate the association between parity and the risk of incident dementia in women.
Methods
We pooled baseline and follow-up data for community-dwelling women aged 60 or older from six population-based, prospective cohort studies from four European and two Asian countries. We investigated the association between parity and incident dementia using Cox proportional hazards regression models adjusted for age, educational level, hypertension, diabetes mellitus and cohort, with additional analysis by dementia subtype (Alzheimer dementia (AD) and non-Alzheimer dementia (NAD)).
Results
Of 9756 women dementia-free at baseline, 7010 completed one or more follow-up assessments. The mean follow-up duration was 5.4 ± 3.1 years and dementia developed in 550 participants. The number of parities was associated with the risk of incident dementia (hazard ratio (HR) = 1.07, 95% confidence interval (CI) = 1.02–1.13). Grand multiparity (five or more parities) increased the risk of dementia by 30% compared to 1–4 parities (HR = 1.30, 95% CI = 1.02–1.67). The risk of NAD increased by 12% for every parity (HR = 1.12, 95% CI = 1.02–1.23) and by 60% for grand multiparity (HR = 1.60, 95% CI = 1.00–2.55), but the risk of AD was not significantly associated with parity.
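The hazard ratios above come from the Cox proportional hazards models described in the Methods. The sketch below shows one way such a model could be fitted with the lifelines library; the file name, column names and dummy-coding of cohort are assumptions for illustration, not the authors' code.

```python
# Hedged sketch: Cox proportional hazards model of incident dementia on
# parity, adjusted for age, education, hypertension, diabetes and cohort.
# File and column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("pooled_cohorts.csv")   # hypothetical pooled data set
df = pd.get_dummies(df, columns=["cohort"], drop_first=True, dtype=float)

cph = CoxPHFitter()
cph.fit(df, duration_col="follow_up_years", event_col="dementia")
hr = cph.summary.loc["parity",
                     ["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]]
print(hr)   # hazard ratio per additional parity with its 95% CI
```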
Conclusions
Grand multiparity is a significant risk factor for dementia in women. This may have particularly important implications for women in low and middle-income countries where the fertility rate and prevalence of grand multiparity are high.
To investigate differences in visual pattern memory between first-episode, treatment-naive patients with deficit and nondeficit schizophrenia.
Methods:
A total of 199 first-episode, treatment-naive patients with schizophrenia and 148 controls were recruited. The Schedule for the Deficit Syndrome (SDS) was used to categorize the patients into deficit or nondeficit subtypes. The Pattern Recognition Memory (PRM) task was used to test the immediate and delayed modes of visual pattern memory. The Positive and Negative Syndrome Scale (PANSS) was used to assess the severity of patients' symptoms.
Results:
Percent correct in both the immediate and delayed modes of the PRM was significantly lower, and response latency significantly longer, in both patient subtypes than in controls. There was no significant difference in immediate-mode PRM performance between deficit and nondeficit patients [(86.49 ± 15.34) vs. (87.28 ± 16.00), P = 0.960]. In the delayed mode, however, the impairment was more severe in patients with deficit schizophrenia [percent correct: (63.10 ± 19.17) vs. (70.69 ± 15.34), P < 0.001; latency: 5086.80 ± 7528.54 vs. 3527.40 ± 3649.08, P = 0.024]. PRM performance showed no significant correlation with the negative symptoms of deficit schizophrenia.
Conclusion:
There were significant differences in both the immediate and delayed modes of the PRM between patients and controls. The difference between first-episode, treatment-naive deficit and nondeficit schizophrenia was confined to the delayed mode of the PRM and showed no correlation with primary negative symptoms. Deficit schizophrenia is a subtype of schizophrenia with a distinct pattern of cognitive impairment.
To evaluate the upper airway morphology changes associated with ageing in adult Chinese patients with obstructive sleep apnoea.
Methods
A total of 124 male patients diagnosed with obstructive sleep apnoea by overnight polysomnography, who underwent upper airway computed tomography, were enrolled. The linear dimensions, cross-sectional area and volume of the upper airway region and the surrounding bony frame were measured. The association between ageing and upper airway morphology was analysed.
Results
Soft palate length, minimum cross-sectional area of the retroglossal region, lateral dimensions at the minimum cross-sectional area of the retropalatal and retroglossal regions, nasopharyngeal volume, and average cross-sectional area of the nasopharyngeal region were found to significantly increase with ageing in all patients, while the upper airway shape flattened with ageing. The volume of the retropalatal region increased with ageing among the patients with a body mass index of less than 24 kg/m2. The volume of the parapharyngeal fat pad increased with ageing among patients with a body mass index greater than 28 kg/m2.
Conclusion
A number of dimensional, cross-sectional and volumetric parameters of the pharynx increased with age, indicating that non-anatomical factors may play a more important role in the pathogenesis of obstructive sleep apnoea in aged patients.
Recently, we found that in ovo feeding of l-leucine (l-Leu) afforded thermotolerance, stimulated lipid metabolism and modified amino acid metabolism in male broiler chicks. However, the effects of in ovo feeding of l-Leu on thermoregulation and growth performance of broilers until marketing age are still unknown. In this study, we investigated the effects of in ovo feeding of l-Leu on body weight (BW) gain under control thermoneutral temperature or chronic heat stress. We measured changes in body temperature, food intake and organ weight, as well as amino acid metabolism and plasma metabolites, under acute and chronic heat stress in broilers. A total of 168 fertilized Chunky broiler eggs were randomly divided into two treatment groups. The eggs were in ovo fed with l-Leu (34.5 µmol/500 µl per egg) or sterile water (500 µl/egg) during incubation. After hatching, male broilers were selected and assigned to seven to nine replicates (one bird per replicate) in each group for heat challenge experiments. Broilers (29- or 30-day-old) were exposed to acute heat stress (30 ± 1°C) for 120 min or to chronic cyclic and continued heat stress (over 30 ± 1°C; ages 15 to 44 days). In ovo feeding of l-Leu caused a significant suppression of the rise in body temperature without affecting food intake, plasma triacylglycerol, non-esterified fatty acids, ketone bodies, glucose, lactic acid or thyroid hormones under acute heat stress. Daily body temperature was significantly increased by l-Leu in ovo feeding under chronic heat stress. Interestingly, in ovo feeding of l-Leu caused a significantly higher daily BW gain compared with that of the control group under chronic heat stress. Moreover, some essential amino acids, including Leu and isoleucine, were significantly increased in the liver and decreased in the plasma by l-Leu in ovo feeding under acute heat stress. These results suggest that l-Leu in ovo feeding affords thermotolerance to broilers under acute heat stress up to marketing age, mainly through changes in amino acid metabolism.
The effect of hot streaks from a gas turbine combustor on the thermodynamic load of internally air-cooled nozzle guide vanes (NGVs) and shrouds has been numerically investigated under flight conditions. The study follows two steps: one for the high-fidelity 60° combustor sector with ten simplified NGVs and three thermocouples attached; and the other for the NGV sectors, where each sector consists of one high-fidelity NGV (probe NGV) and nine dummy NGVs. The first step identifies which NGV has the highest thermal load and provides the inlet flow boundary conditions for the second step. In the second step, the flow fields and thermal loads of the probe NGVs are resolved in detail.
With the systematically validated physical models, the two-phase flowfield of the combustor–NGV sector has been successfully simulated. The predicted mean and maximum temperature at the combustor sector exit are in excellent agreement with the experimental data, which provides a solid basis for the hot-streak effect investigation. The results indicate that the second NGV, counting from the left when looking upstream, has the highest thermal load. Its maximum surface temperature is 8.4% higher than that for the same NGV but with the mean inlet boundary conditions, and 14.1% higher than that of the ninth NGV. The finding is consistent with the field-observed NGV damage pattern. To extend the service life of these vulnerable NGVs, some protection methods should be considered.
Antenna-pattern measurements obtained from a double-metal supra-terahertz-frequency (supra-THz) quantum cascade laser (QCL) are presented. The QCL is mounted within a mechanically micro-machined waveguide cavity containing dual diagonal feedhorns. Operating in continuous-wave mode at 3.5 THz, and at an ambient temperature of ~60 K, QCL emission has been directed via the feedhorns to a supra-THz detector mounted on a multi-axis linear scanner. Comparison of simulated and measured far-field antenna patterns shows an excellent degree of correlation between beamwidth (full-width-half-maximum) and sidelobe content and a very substantial improvement when compared with unmounted devices. Additionally, a single output has been used to successfully illuminate and demonstrate an optical breadboard arrangement associated with a future supra-THz Earth observation space-borne payload. Our novel device has therefore provided a valuable demonstration of the effectiveness of supra-THz diagonal feedhorns and QCL devices for future space-borne ultra-high-frequency Earth-observing heterodyne radiometers.
The present study aimed to identify the factors that affect immediate (within 24 h after farrowing onset) postnatal piglet mortality in litters with hyperprolific sows, and investigate their associations with behaviour of postpartum sows in two different farrowing housing systems. A total of 30 sows were housed in: (1) CRATE (n=15): the farrowing crate closed (0.80×2.20 m) within a pen (2.50×1.70 m), and (2) OPEN (n=15): the farrowing crate open (0.80×2.20×1.80 m) within a pen (2.50×2.40 m) with a provision of 20 l of hay in a rack. A total of 518 live born piglets, produced from the 30 sows, were used for data analyses during the first 24 h after the onset of parturition (T24). Behavioural observations of the sows were assessed via video analyses during T24. Total and crushed piglet mortality rates were higher in OPEN compared with CRATE (P<0.01, for both). During T24, the OPEN sows tended to show higher frequency of postural changes (P=0.07) and duration of standing (P=0.10), and showed higher frequencies of bar-biting (P<0.05) and piglet trapping (P<0.01), when compared with the CRATE sows. During T24, the mortality rates caused by crushing were correlated with the piglet trapping event (r=0.93, P<0.0001), postural changes (r=0.37, P<0.01), duration of standing (r=0.32, P<0.01) and frequency of bar-biting behaviour (r=0.51, P<0.01) of the sows (n=30). In conclusion, immediate postnatal piglet mortality, mainly due to crushing, may be associated with potential increases in frequency of postural changes, duration of standing and incidence of piglet trapping in postpartum sows in the open crate system with large litters.
Multiple human immunodeficiency virus (HIV)-1 genotypes in China were first discovered in Yunnan Province before disseminating throughout the country. As the HIV-1 epidemic continues to expand in Yunnan, genetic characteristics and transmitted drug resistance (TDR) should be further investigated among the recently infected population. Among 2828 HIV-positive samples newly reported in the first quarter of 2014, 347 were identified as recent infections with the BED capture enzyme immunoassay (BED-CEIA). Of them, 291 were successfully genotyped and identified as circulating recombinant form (CRF)08_BC (47.4%), unique recombinant forms (URFs) (18.2%), CRF01_AE (15.8%), CRF07_BC (14.4%), subtype C (2.7%), CRF55_01B (0.7%), subtype B (0.3%) and CRF64_BC (0.3%). CRF08_BC and CRF01_AE were the predominant genotypes among heterosexual and homosexual infections, respectively. CRF08_BC, URFs, CRF01_AE and CRF07_BC expanded with higher prevalence in central and eastern Yunnan. The most recent common ancestors of CRF01_AE, CRF07_BC and CRF08_BC dated back to 1983.1, 1992.1 and 1989.5, respectively. The effective population sizes (EPS) for CRF01_AE and CRF07_BC increased exponentially during 1991–1999 and 1994–1999, respectively. The EPS for CRF08_BC underwent two exponential growth phases in 1994–1998 and 2001–2002. Lastly, TDR-associated mutations were identified in 1.8% of individuals. These findings not only enhance our understanding of HIV-1 evolution in Yunnan but also have implications for vaccine design and patient management strategies.
This study aims to investigate the climate–malaria associations in nine cities selected from malaria high-risk areas in China. Daily reports of malaria cases in Anhui, Henan, and Yunnan Provinces for 2005–2012 were obtained from the Chinese Center for Disease Control and Prevention. Generalized estimating equation models were used to quantify the city-specific climate–malaria associations. Multivariate random-effects meta-regression analyses were used to pool the city-specific effects. An inverted-U-shaped curve relationship was observed between temperatures, average relative humidity, and malaria. A 1 °C increase of maximum temperature (Tmax) resulted in 6·7% (95% CI 4·6–8·8%) to 15·8% (95% CI 14·1–17·4%) increase of malaria, with corresponding lags ranging from 7 to 45 days. For minimum temperature (Tmin), the effect estimates peaked at lag 0 to 40 days, ranging from 5·3% (95% CI 4·4–6·2%) to 17·9% (95% CI 15·6–20·1%). Malaria is more sensitive to Tmin in cool climates and Tmax in warm climates. The duration of lag effect in a cool climate zone is longer than that in a warm climate zone. Lagged effects did not vanish after an epidemic season but waned gradually in the following 2–3 warm seasons. A warming climate may potentially increase the risk of malaria resurgence in China.
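The lag-specific percent changes above come from city-specific generalized estimating equation (GEE) models whose coefficients were then pooled by meta-regression. The sketch below shows one plausible form of such a city-specific Poisson GEE with a lagged maximum-temperature term; the file, column names, lag construction and clustering unit are assumptions, and the published models included additional terms.

```python
# Hedged sketch: city-specific Poisson GEE of daily malaria counts on lagged
# maximum temperature and relative humidity. Names and the lag choice are
# illustrative assumptions, not the authors' specification.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("city_daily_malaria.csv", parse_dates=["date"])  # hypothetical
df["tmax_lag"] = df["tmax"].shift(30)              # e.g. a 30-day lag of Tmax
df["month"] = df["date"].dt.to_period("M").astype(str)
df = df.dropna()

model = smf.gee("cases ~ tmax_lag + rel_humidity", groups="month", data=df,
                family=sm.families.Poisson(),
                cov_struct=sm.cov_struct.Exchangeable()).fit()
beta = model.params["tmax_lag"]
print(f"Change in malaria per 1 degC of Tmax: {100 * (np.exp(beta) - 1):.1f}%")
```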
Seed shape (SS) affects the yield and appearance of soybean seeds significantly. However, little detailed information has been reported about the quantitative trait loci (QTL) affecting SS, especially SS components such as seed length (SL), seed width (SW) and seed thickness (ST), and their mutual ratios of length-to-width (SLW), length-to-thickness (SLT) and width-to-thickness (SWT). The aim of the present study was to identify QTL underlying SS components using 129 recombinant inbred lines derived from a cross between Dongnong46 and L-100. Phenotypic data were collected from this population after it was grown across nine environments. A total of 213 simple sequence repeat markers were used to construct the genetic linkage map, which covered approximately 3623·39 cM, with an average distance of 17·01 cM between markers. Five QTL were identified as being associated with SL, five with SW, three with ST, four with SLW, two with SLT and three with SWT. These QTL could explain 1·46–22·16% of the phenotypic variation in SS component traits. QTL detected in more than six of the tested environments included three for SL, two for SW, one for ST, two for SLW and one for SLT. These QTL have great potential value for marker-assisted selection of SS in soybean seeds.
Omics research has indicated that heat shock protein 70 (HSP70) is a potential biomarker of meat quality. However, the specific changes and the potential role of HSP70 in postmortem meat quality development need to be further defined. In this study, Arbor Acres broiler chickens (n=126) were randomly categorized into three treatment groups of unstressed control (C), 0.5-h transport (T) and subsequent water shower spray following transport (T/W). Each treatment consisted of six replicates with seven birds each. The birds were transported according to a designed protocol. The pectoralis major (PM) muscles of the transport-stressed broilers were categorized as normal and pale, soft and exudative (PSE)-like muscle samples according to L* and pH24 h values to test the expression and location of HSP70. Results revealed that the activities of plasma creatine kinase and lactate dehydrogenase increased significantly (P<0.05) in normal and PSE-like muscle samples after transportation. The mRNA expression of HSP70 in normal muscle samples increased significantly (P<0.05) compared with that in the controls after stress. The protein expression of HSP70 increased significantly in normal muscle samples and decreased significantly (P<0.05) in PSE-like muscles. Immuno-fluorescence showed that HSP70 was present in the cytoplasm and on surface membranes of PM muscle cells in the normal samples following stress. Meanwhile, HSP70 was present on the surface membranes and extracellular matrix but was barely visible in the cytoplasm of the PSE-like samples. Principal component analysis showed high correlations between HSP70 and meat quality and stress indicators. In conclusion, this research suggests that the variation in HSP70 expression may provide a novel insight into the pathways underlying meat quality development.
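The relationship between HSP70 and the quality and stress indicators reported above was summarized with principal component analysis. The sketch below illustrates one way such a PCA could be run on standardized measurements; the file and trait names are assumptions chosen for illustration, not the study's data set.

```python
# Hedged sketch: PCA over HSP70 expression, meat quality traits and stress
# indicators to see which variables load (co-vary) together. Names are
# hypothetical placeholders.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("broiler_measurements.csv")        # hypothetical file
traits = ["hsp70_mrna", "hsp70_protein", "L_star", "pH24h",
          "drip_loss", "creatine_kinase", "lactate_dehydrogenase"]
X = StandardScaler().fit_transform(df[traits])

pca = PCA(n_components=2).fit(X)
loadings = pd.DataFrame(pca.components_.T, index=traits, columns=["PC1", "PC2"])
print(loadings.round(2))                 # traits loading together co-vary
print(pca.explained_variance_ratio_)     # variance explained by PC1 and PC2
```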
This study aimed to evaluate subjective symptom changes in obstructive sleep apnoea hypopnea syndrome patients following nasal surgery, and to explore treatment efficacy in improving patient quality of life.
Methods:
Patients with nasal blockage accompanied by habitual snoring were stratified into four groups. Their subjective symptoms were evaluated before and after nasal surgery.
Results:
There was a significant decrease in the nasal blockage symptom visual analogue scale, Epworth Sleepiness Scale, Snore Outcomes Survey, Spouse/Bed Partners Survey and Sino-Nasal Outcome Test 20 scores for all patients at six months after surgery. The visual analogue scale score for subjective olfactory function was significantly improved in the severe obstructive sleep apnoea hypopnea syndrome patient group.
Conclusion:
Nasal surgery can effectively improve the subjective symptoms of patients with simple snoring accompanied by nasal blockage and of patients with obstructive sleep apnoea hypopnea syndrome, thus improving their quality of life.
A nationwide population-based cohort was used to examine the association between the severity of liver cirrhosis and the risk of mortality from oral cancer.
Methods:
The cohort consisted of 3583 patients with oral cancer treated by surgery between 2008 and 2011 in Taiwan. They were grouped on the basis of normal liver function (n = 3471), cirrhosis without decompensation (n = 72) and cirrhosis with decompensation (n = 40). The primary endpoint was mortality. Hazard ratios of death were also determined.
Results:
The mortality rates in the respective groups were 14.8 per cent, 20.8 per cent and 37.5 per cent at one year (p < 0.001). The adjusted hazard ratios of death at one year for each group compared to the normal group were 2.01 (p = 0.021) for cirrhotic patients without decompensation, 4.84 (p < 0.001) for those with decompensation and 2.65 (p < 0.001) for those receiving chemotherapy.
Conclusion:
Liver cirrhosis can be used to predict one-year mortality in oral cancer patients. Chemotherapy should be used with caution and underlying co-morbidities should be managed in cirrhotic patients to reduce mortality risk.
Pathogens utilize type III secretion systems to deliver effector proteins, which facilitate bacterial infections. The Escherichia coli type III secretion system 2 (ETT2), which plays a crucial role in bacterial virulence, is present in the majority of E. coli strains, although ETT2 has undergone widespread mutational attrition. We investigated the distribution and characteristics of ETT2 in avian pathogenic E. coli (APEC) isolates and identified five different ETT2 isoforms, including intact ETT2, in 57·6% (141/245) of the isolates. The ETT2 locus was present in the predominant APEC serotypes O78, O2 and O1. All of the ETT2 loci in the serotype O78 isolates were degenerate, whereas an intact ETT2 locus was mostly present in O1 and O2 serotype strains, which belong to phylogenetic groups B2 and D, respectively. Interestingly, a putative second type III secretion-associated locus (eip locus) was present only in the isolates with an intact ETT2. Moreover, ETT2 was more widely distributed in APEC isolates and exhibited more isoforms compared to ETT2 in human extraintestinal pathogenic E. coli, suggesting that APEC might be a potential risk to human health. However, there was no distinct correlation between ETT2 and other virulence factors in APEC.
Residual feed intake (RFI), defined as the difference between an animal’s actual feed intake and expected feed intake over a specific period, is a heritable indicator of feed conversion efficiency in dairy cows. Research has shown that a lower RFI could improve the profitability of milk production. This study explored variation in RFI by comparing differences in body size, milk performance, feeding behavior, and serum metabolites in 29 Holstein cows in mid-lactation. The cows were selected from a total of 84 animals based on their RFI following feedlot tests. Selected cows were ranked into high RFI (RFI >1 SD above the mean, n=14) and low RFI (RFI <1 SD below the mean, n=15) groups. The low RFI cows (more efficient) consumed 1.59 kg/day less dry matter than the high RFI group (P<0.01), while producing nearly equal amounts of 4% fat-corrected milk. The milk : feed ratio was higher for the low RFI group than for the high RFI group (P<0.05). The levels of milk protein (P<0.01), total solids (P<0.05), and nonfat solids (P<0.05) were also higher for the low RFI group, whereas milk urea nitrogen was lower (P<0.01). The daily feeding duration was shorter for the low RFI group than for the high RFI group (P<0.01). No significant differences were found in levels of glucose, β-hydroxybutyrate, prolactin, insulin, IGF-1, growth hormone or ghrelin, but the level of neuropeptide Y was higher (P<0.01) and levels of leptin and non-esterified fatty acid (P<0.05) were lower for the low RFI group than for the high RFI group. There were substantial differences between cows with different RFI, which might affect the efficiency of milk protein metabolism and fat mobilization.
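RFI is the residual from a regression of actual intake on energy sinks, so cows above or below the fitted line eat more or less than expected for their output. The sketch below illustrates that computation and the ±1 SD ranking used to form the groups described above; the file, column names and choice of regressors are assumptions rather than the authors' exact model.

```python
# Hedged sketch: residual feed intake (RFI) as the residual of a regression of
# dry-matter intake on energy sinks, with +/- 1 SD grouping.
# File, column names and regressors are illustrative assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("lactation_records.csv")   # hypothetical records
# dmi: dry-matter intake (kg/d); fcm: 4% fat-corrected milk (kg/d);
# mbw: metabolic body weight (kg^0.75); bw_change: body-weight change (kg/d)
fit = smf.ols("dmi ~ fcm + mbw + bw_change", data=df).fit()
df["rfi"] = fit.resid                        # actual minus expected intake

sd = df["rfi"].std()
high_rfi = df[df["rfi"] > sd]    # less efficient: eats more than expected
low_rfi = df[df["rfi"] < -sd]    # more efficient: eats less than expected
print(len(high_rfi), len(low_rfi))
```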