Precariously housed individuals are exposed to multiple adverse factors that negatively impact neurocognitive functioning. Additionally, this population experiences poor life outcomes, such as impaired psychosocial functioning. Neurocognitive functioning plays an important role in psychosocial functioning and may be especially critical for precariously housed individuals, who face numerous barriers in their daily lives. However, few studies have explicitly examined the cognitive determinants of functional outcomes in this population. Cognitive intraindividual variability (IIV) involves the study of within-person differences in neurocognitive functioning and has been used as a marker of frontal system pathology. Increased IIV has been associated with worse cognitive performance, cognitive decline, and poorer everyday functioning. Hence, IIV may add to the predictive utility of commonly used neuropsychological measures and may serve as an emergent predictor of poor outcomes in at-risk populations. The objective of the current study was to examine IIV as a unique index of the neurocognitive contributions to functional outcomes within a large sample of precariously housed individuals. It was hypothesized that greater IIV would be associated with poorer current (i.e., baseline) and long-term (i.e., up to 12 years) psychosocial functioning.
Participants and Methods:
Four hundred and thirty-seven adults were recruited from single-room occupancy hotels located in the Downtown Eastside of Vancouver, Canada (mean age = 44 years; 78% male) between November 2008 and November 2021. Baseline neurocognitive functioning was assessed at study enrolment. Scores from the Social and Occupational Functioning Assessment Scale (SOFAS), the Role Functioning Scale (RFS), and the physical component score (PCS) and mental component score (MCS) of the 36-Item Short Form Survey Instrument were obtained at participants’ baseline assessments and at their last available follow-up assessment to represent baseline and long-term psychosocial functioning, respectively. Using an established formula, an index of IIV was derived from a battery of standardized tests that broadly assessed verbal learning and memory, sustained attention, mental flexibility, and cognitive control. A series of multiple linear regressions was conducted to predict baseline and long-term social and role functioning (averaged across SOFAS and RFS scores), and PCS and MCS scores, from IIV. In each of the models, we also included common predictors of functioning, including a global cognitive composite score, age, and years of education.
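The abstract cites an established formula without reproducing it. A common operationalization in the IIV literature is the intraindividual standard deviation (iSD) of each participant's standardized scores across the test battery; the sketch below assumes that formulation, and all scores and names are hypothetical.

```python
import numpy as np

def iiv_index(t_scores: np.ndarray) -> np.ndarray:
    """Intraindividual standard deviation (iSD) across a test battery.

    t_scores: (n_participants, n_tests) array of standardized T-scores
    (mean 50, SD 10). A higher iSD means more within-person dispersion
    across tests; this is one common IIV formulation, assumed here.
    """
    return np.std(t_scores, axis=1, ddof=1)

# Hypothetical example: 3 participants x 5 tests
scores = np.array([
    [50, 52, 49, 51, 50],   # low dispersion -> low IIV
    [60, 35, 55, 42, 58],   # high dispersion -> high IIV
    [45, 47, 44, 46, 45],
])
print(iiv_index(scores))
```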
Results:
The IIV index and the global composite score did not explain a significant proportion of the variance in baseline and long-term social and role functioning (p > .05). However, IIV was a significant predictor of baseline (B = -3.84, p = .021) and long-term (B = -3.58, p = .037) PCS scores, but not MCS scores (p > .05). The global composite score did not predict baseline or long-term PCS scores.
Conclusions:
IIV significantly predicted baseline and long-term physical functioning, but not mental functioning or social and role functioning, suggesting that IIV may be a sensitive marker for limitations in everyday functioning due to physical health problems in precariously housed individuals. Critically, the present study is the first to show that IIV may be a useful index for predicting poor long-term health-related outcomes in this population, beyond what traditional neuropsychological measures capture.
Despite advances in cancer genomics and the increased use of genomic medicine, metastatic cancer is still mostly an incurable and fatal disease. With diminishing returns from traditional drug discovery strategies and high clinical failure rates, more emphasis is being placed on alternative drug discovery platforms, such as ex vivo approaches. Ex vivo approaches aim to embed biological relevance and inter-patient variability at an earlier stage of drug discovery, and to offer more precise treatment stratification for patients. Moreover, these techniques have a high potential to offer personalised therapies to patients, complementing and enhancing genomic medicine. Although an array of approaches is available to researchers, only a minority of techniques have made it through to direct patient treatment within robust clinical trials. Within this review, we discuss the current challenges to ex vivo approaches within clinical practice and summarise the contemporary literature which has directed patient treatment. Finally, we map out how ex vivo approaches could transition from a small-scale, predominantly research-based technology to a robust and validated predictive tool. In future, these pre-clinical approaches may be integrated into clinical cancer pathways to assist in the personalisation of therapy choices and to hopefully improve patient experiences and outcomes.
Substantial progress has been made in the standardization of nomenclature for paediatric and congenital cardiac care. In 1936, Maude Abbott published her Atlas of Congenital Cardiac Disease, which was the first formal attempt to classify congenital heart disease. The International Paediatric and Congenital Cardiac Code (IPCCC) is now utilized worldwide and has most recently become the paediatric and congenital cardiac component of the Eleventh Revision of the International Classification of Diseases (ICD-11). The most recent publication of the IPCCC was in 2017. This manuscript provides an updated 2021 version of the IPCCC.
The International Society for Nomenclature of Paediatric and Congenital Heart Disease (ISNPCHD), in collaboration with the World Health Organization (WHO), developed the paediatric and congenital cardiac nomenclature that is now within the Eleventh Revision of the International Classification of Diseases (ICD-11). This unification of the IPCCC and ICD-11, the IPCCC ICD-11 Nomenclature, marks the first time that the clinical nomenclature and the administrative nomenclature for paediatric and congenital cardiac care have been harmonized. The congenital cardiac component of ICD-11 grew from 29 congenital cardiac codes in ICD-9 and 73 congenital cardiac codes in ICD-10 to 318 codes submitted by ISNPCHD through 2018 for incorporation into ICD-11. After these 318 terms were incorporated into ICD-11 in 2018, the WHO ICD-11 team added an additional 49 terms, some of which are acceptable legacy terms from ICD-10, while others provide greater granularity than the ISNPCHD originally thought acceptable. Thus, the total number of paediatric and congenital cardiac terms in ICD-11 is 367. In this manuscript, we describe and review the terminology, hierarchy, and definitions of the IPCCC ICD-11 Nomenclature. This article, therefore, presents a global system of nomenclature for paediatric and congenital cardiac care that unifies clinical and administrative nomenclature.
The members of ISNPCHD realize that the nomenclature published in this manuscript will continue to evolve. The version of the IPCCC that was published in 2017 has evolved and changed, and it is now replaced by this 2021 version. In the future, ISNPCHD will again publish updated versions of IPCCC, as IPCCC continues to evolve.
We observed an overall increase in the use of third- and fourth-generation cephalosporins after fluoroquinolone preauthorization was implemented. We examined the change in specific third- and fourth-generation cephalosporin use, and we sought to determine whether there was a consequent change in non-susceptibility of select Gram-negative bacterial isolates to these antibiotics.
Design:
Retrospective quasi-experimental study.
Setting:
Academic hospital.
Intervention:
Fluoroquinolone preauthorization was implemented in the hospital in October 2005. We used interrupted time series (ITS) Poisson regression models to examine trends in monthly rates of ceftriaxone, ceftazidime, and cefepime use and trends in yearly rates of nonsusceptible isolates (NSIs) of select Gram-negative bacteria before (1998–2004) and after (2006–2016) fluoroquinolone preauthorization was implemented.
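As a rough illustration of the interrupted time series approach described above, the sketch below fits a segmented Poisson regression with level-change and slope-change terms to synthetic monthly antibiotic-use counts. All variable names, values, and the intervention month are hypothetical, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical monthly data: antibiotic use counts with a patient-days offset.
rng = np.random.default_rng(0)
n = 120
df = pd.DataFrame({
    "time": np.arange(n),                       # months since study start
    "post": (np.arange(n) >= 60).astype(int),   # 1 after preauthorization
    "patient_days": rng.integers(9000, 11000, n),
})
df["time_after"] = np.maximum(0, df["time"] - 60)  # slope-change term
df["ddd"] = rng.poisson(50, n)                     # defined daily doses

# Segmented Poisson regression: level change (post), slope change
# (time_after), and log(patient-days) as the exposure offset.
model = smf.glm(
    "ddd ~ time + post + time_after",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["patient_days"]),
).fit()
print(np.exp(model.params))  # rate ratios (RR), as reported in the abstract
```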
Results:
Rates of use of ceftriaxone and cefepime increased after fluoroquinolone preauthorization was implemented (ceftriaxone RR, 1.002; 95% CI, 1.002–1.003; P < .0001; cefepime RR, 1.003; 95% CI, 1.001–1.004; P = .0006), but ceftazidime use continued to decline (RR, 0.991; 95% CI, 0.990–0.992; P < .0001). Rates of ceftazidime and cefepime NSIs of Pseudomonas aeruginosa (ceftazidime RR, 0.937; 95% CI, 0.910–0.965; P < .0001; cefepime RR, 0.937; 95% CI, 0.912–0.963; P < .0001) declined after fluoroquinolone preauthorization was implemented. Rates of ceftazidime and cefepime NSIs of Enterobacter cloacae (ceftazidime RR, 1.116; 95% CI, 1.078–1.154; P < .0001; cefepime RR, 1.198; 95% CI, 1.112–1.291; P < .0001) and cefepime NSIs of Acinetobacter baumannii (RR, 1.169; 95% CI, 1.081–1.263; P < .0001) were increasing before fluoroquinolone preauthorization was implemented but became stable thereafter: E. cloacae (ceftazidime RR, 0.987; 95% CI, 0.948–1.028; P = .531; cefepime RR, 0.990; 95% CI, 0.962–1.018; P = .461) and A. baumannii (cefepime RR, 0.972; 95% CI, 0.939–1.006; P = .100).
Conclusions:
Fluoroquinolone preauthorization may increase use of unrestricted third- and fourth-generation cephalosporins; however, we did not observe increased antimicrobial resistance to these agents, especially among clinically important Gram-negative bacteria known for hospital-acquired infections.
Colleges and universities around the world engaged diverse strategies during the COVID-19 pandemic. Baylor University, a community of ~22,700 individuals, was one of the institutions that resumed and sustained operations. The key strategy was the establishment of multidisciplinary teams to develop mitigation strategies and priority areas for action. This population-based team approach, along with implementation of a “Swiss Cheese” risk mitigation model, allowed small clusters to be rapidly addressed through testing, surveillance, tracing, isolation, and quarantine. These efforts were supported by health protocols including face coverings, social distancing, and compliance monitoring. As a result, activities were sustained from August 1 to December 8, 2020. There were 62,970 COVID-19 tests conducted, with 1435 people testing positive, for a positivity rate of 2.28%. A total of 1670 COVID-19 cases were identified, including 235 through self-report. The mean number of tests per week was 3500, with approximately 80 of these positive (~11/d). More than 60 student tracers were trained, with over 120 personnel available to contact trace, a ratio of 1 per 400 university members. The successes and lessons learned provide a framework and pathway for similar institutions to mitigate the ongoing impacts of COVID-19 and sustain operations during a global pandemic.
Our ability to reliably use radiocarbon (14C) dates of mollusk shells to estimate calendar ages may depend on the feeding preference and habitat of a particular species and the geology of the region. Gastropods that feed by scraping are prone to incorporating carbon from the substrate into their shells, as evidenced by studies comparing the radiocarbon dates of shells and flesh from different species on different substrates (Dye 1994; Hogg et al. 1998). Limpet shells (Patella sp.) are commonly found in prehistoric midden deposits in the British Isles and elsewhere; however, these shells have largely been avoided for radiocarbon dating in regions of limestone outcrops. Results from limpets (Patella vulgata) collected alive on limestone and volcanic substrates on the coasts of Ireland indicate that the shells were formed in equilibrium with the seawater, with no significant 14C offsets. Limpets collected from the east coast of Northern Ireland have elevated 14C due to the output of the Sellafield nuclear fuel reprocessing plant. In all locations, the flesh was depleted in 14C compared with the shells. The results have important consequences for radiocarbon dating of midden deposits, as well as of the bones of humans and animals that fed on the limpets.
To define the optimal thromboprophylaxis strategy after stent implantation in superior or total cavopulmonary connections.
Background:
Stent thrombosis is a rare complication of intravascular stenting, with a perceived higher risk in single-ventricle patients.
Methods:
All patients who underwent stent implantation within superior or total cavopulmonary connections (caval vein, innominate vein, Fontan, or branch pulmonary arteries) were included. The cohort was divided into aspirin therapy alone versus advanced anticoagulation, including warfarin, enoxaparin, heparin, or clopidogrel. The primary endpoint was in-stent or downstream thrombus; secondary endpoints included bleeding complications.
Results:
A total of 58 patients with single-ventricle circulation underwent 72 stent implantations. Of these, 14 stents (19%) were implanted post-superior cavopulmonary connection and 58 (81%) post-total cavopulmonary connection. Indications for stenting included vessel/conduit stenosis (67%), external compression (18%), and thrombotic occlusion (15%). Advanced anticoagulation was prescribed for 32 (44%) patients and aspirin for 40 (56%) patients. Median follow-up was 1.1 (25th–75th percentile, 0.5–2.6) years. Echocardiograms were available in 71 patients (99%), and advanced imaging in 44 patients (61%). Thrombosis was present in two patients on advanced anticoagulation (6.3%) and in none of the patients on aspirin (p = 0.187). Both patients with in-stent thrombus underwent initial stenting due to occlusive left pulmonary artery thrombus acutely post-superior cavopulmonary connection. There were seven (22%) significant bleeding complications with advanced anticoagulation and none with aspirin (p < 0.001).
Conclusions:
Antithrombotic strategy does not appear to affect rates of in-stent thrombus in single-ventricle circulations. Aspirin alone may be sufficient for most patients undergoing stent implantation, while pre-existing thrombus may warrant advanced anticoagulation.
Objectives: Studies of neurocognitively elite older adults, termed SuperAgers, have identified clinical predictors and neurobiological indicators of resilience against age-related neurocognitive decline. Despite rising rates of older persons living with HIV (PLWH), SuperAging (SA) in PLWH remains undefined. We aimed to establish neuropsychological criteria for SA in PLWH and examined clinically relevant correlates of SA. Methods: A total of 734 PLWH and 123 HIV-uninfected participants between 50 and 64 years of age underwent neuropsychological and neuromedical evaluations. SA was defined as demographically corrected (i.e., sex, race/ethnicity, education) global neurocognitive performance within normal range for 25-year-olds. Remaining participants were labeled cognitively normal (CN) or impaired (CI) based on actual age. Chi-square and analysis of variance tests examined HIV group differences in neurocognitive status and demographics. Within PLWH, neurocognitive status differences were tested on HIV disease characteristics, medical comorbidities, and everyday functioning. Multinomial logistic regression explored independent predictors of neurocognitive status. Results: Neurocognitive status rates and demographic characteristics differed between PLWH (SA=17%; CN=38%; CI=45%) and HIV-uninfected participants (SA=35%; CN=55%; CI=11%). In PLWH, neurocognitive groups were comparable on demographic and HIV disease characteristics. Younger age, higher verbal IQ, absence of diabetes, fewer depressive symptoms, and lifetime cannabis use disorder increased the likelihood of SA. SA participants reported greater independence in everyday functioning, higher employment, and better health-related quality of life than non-SA participants. Conclusions: Despite the combined neurological risk of aging and HIV, youthful neurocognitive performance is possible for older PLWH. SA relates to improved real-world functioning and may be better explained by cognitive reserve and maintenance of cardiometabolic and mental health than by HIV disease severity. Future research investigating biomarker and lifestyle (e.g., physical activity) correlates of SA may help identify modifiable neuroprotective factors against HIV-related neurobiological aging. (JINS, 2019, 25, 507–519)
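The SA criterion above amounts to norming a participant's global score against 25-year-olds rather than against their actual age. A minimal sketch of that classification logic, assuming T-scores and an illustrative "within normal range" cut-point (the abstract does not give the exact threshold, so the value below is an assumption):

```python
def classify_aging(t_age25: float, t_actual: float, cutoff: float = 40.0) -> str:
    """Hypothetical SA/CN/CI classification sketch.

    t_age25:  global T-score normed as if the participant were 25.
    t_actual: global T-score normed against the participant's actual age.
    cutoff:   assumed 'within normal range' threshold; not specified
              in the abstract.
    """
    if t_age25 >= cutoff:
        return "SA"   # SuperAger: normal even by 25-year-old norms
    return "CN" if t_actual >= cutoff else "CI"

print(classify_aging(t_age25=46.0, t_actual=52.0))  # -> "SA"
print(classify_aging(t_age25=38.0, t_actual=44.0))  # -> "CN"
```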
Hospitalized patients placed in isolation due to a carrier state or infection with resistant or highly communicable organisms report higher rates of anxiety and loneliness and have fewer physician encounters, room entries, and vital sign records. We hypothesized that isolation status might adversely impact patient experience as reported through Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) surveys, particularly regarding communication.
Design
Retrospective analysis of HCAHPS survey results over 5 years.
Setting
A 1,165-bed, tertiary-care, academic medical center.
Patients
Patients on any type of isolation for at least 50% of their stay were the exposure group. Those never in isolation served as controls.
Methods
Multivariable logistic regression, adjusting for age, race, gender, payer, severity of illness, length of stay, and clinical service, was used to examine associations between isolation status and “top-box” experience scores. Dose response to an increasing percentage of days in isolation was also analyzed.
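A minimal sketch of the modelling approach described in the Methods, fitted to synthetic data; the covariates, effect sizes, and sample size below are hypothetical, not the study's.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical patient-level data; in the study, the exposure was isolation
# for >=50% of the stay and the outcome was a "top-box" (highest) rating.
rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "isolation": rng.integers(0, 2, n),
    "age": rng.normal(60, 15, n),
    "female": rng.integers(0, 2, n),
    "los": rng.gamma(2.0, 3.0, n),        # length of stay, days
    "severity": rng.integers(1, 5, n),    # illness severity class
})
# Simulate a modest negative isolation effect for illustration only.
logit_p = -0.3 - 0.25 * df["isolation"] + 0.01 * (df["age"] - 60)
df["top_box"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("top_box ~ isolation + age + female + los + C(severity)",
                  data=df).fit()
print(np.exp(model.params))  # adjusted odds ratios (aOR)
```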
Results
Patients in isolation reported worse experience, primarily with staff responsiveness (help toileting 63% vs 51%; adjusted odds ratio [aOR], 0.77; P = .0009) and overall care (rate hospital 80% vs 73%; aOR, 0.78; P < .0001), but they reported similar experience in other domains. No dose-response effect was observed.
Conclusion
Isolated patients do not report a worse experience for most aspects of provider communication, which are regarded as among the most important elements for safety and quality of care. However, patients in isolation had worse experiences with staff responsiveness for time-sensitive needs. The absence of a dose-response effect suggests that isolation status may be a marker for other factors, such as illness severity. Regardless, hospitals should emphasize timely staff response for this population.
The development of laser wakefield accelerators (LWFA) over the past several years has led to an interest in very compact sources of X-ray radiation, such as “table-top” free electron lasers. However, conventional undulators based on permanent magnets imply large system sizes. In this work, we assess the possibilities for the use of novel mini-undulators in conjunction with an LWFA so that the dimensions of the undulator become comparable with the acceleration distances for LWFA experiments (i.e., centimeters). A prototype undulator produced by laser machining of permanent magnets is described for this application, and the emission characteristics and limitations of such a system are determined. Preliminary electron propagation and X-ray emission measurements were taken with an LWFA electron beam at the University of Michigan.
In an effort to optimize patient outcomes, considerable attention is being devoted to identifying patient characteristics associated with major depressive disorder (MDD) and its responsiveness to treatment. Reward and punishment sensitivity are two such characteristics. In the current study, we extend this work by evaluating whether early change in these sensitivities is associated with response to antidepressant treatment for MDD.
Methods
Participants included 210 patients with MDD, who were treated with 8 weeks of escitalopram, and 112 healthy comparison participants. Of the original 210 patients, 90 non-responders received adjunctive aripiprazole for an additional 8 weeks. Symptoms of depression and anhedonia were assessed at the beginning of treatment and 8 weeks later in both samples. Reward and punishment sensitivity were assessed using the BIS/BAS scales, measured at the initiation of treatment and 2 weeks later.
Results
Individuals with MDD exhibited higher punishment sensitivity and lower reward sensitivity compared with healthy comparison participants. Change in reward sensitivity during the first 2 weeks of treatment was associated with improved depressive symptoms and anhedonia following 8 weeks of treatment with escitalopram. Similarly, improvement in reward responsiveness during the first 2 weeks of adjunctive therapy with aripiprazole was associated with fewer symptoms of depression at post-treatment.
Conclusions
Findings highlight the predictive utility of early change in reward sensitivity during antidepressant treatment for major depression. In a clinical setting, a lack of change in early reward processing may signal a need to modify a patient's treatment plan with alternative or augmented treatment approaches.
The Meat Standards Australia (MSA) grading scheme has the ability to predict beef eating quality for each ‘cut×cooking method combination’ from animal and carcass traits such as sex, age, breed, marbling, hot carcass weight and fatness, ageing time, etc. Following MSA testing protocols, a total of 22 different muscles, cooked by four different cooking methods and to three different degrees of doneness, were tasted by over 19 000 consumers from Northern Ireland, Poland, Ireland, France and Australia. Consumers scored the sensory characteristics (tenderness, flavour liking, juiciness and overall liking) and then allocated samples to one of four quality grades: unsatisfactory, good-every-day, better-than-every-day and premium. We observed that 26% of the beef was unsatisfactory. As previously reported, 68% of samples were allocated to the correct quality grades using the MSA grading scheme. Furthermore, only 7% of the beef unsatisfactory to consumers was misclassified as acceptable. Overall, we concluded that an MSA-like grading scheme could be used to predict beef eating quality and hence underpin commercial brands or labels in a number of European countries, and possibly the whole of Europe. In addition, such an eating quality guarantee system may allow the implementation of an MSA genetic index to improve eating quality through genetics as well as through management. Finally, such an eating quality guarantee system is likely to generate economic benefits to be shared along the beef supply chain from farmers to retailers, as consumers are willing to pay more for a better quality product.
Accurately quantifying a consumer’s willingness to pay (WTP) for beef of different eating qualities is intrinsically linked to the development of eating-quality-based meat grading systems, and therefore the delivery of consistent, quality beef to the consumer. Following Australian MSA (Meat Standards Australia) testing protocols, over 19 000 consumers from Northern Ireland, Poland, Ireland, France and Australia were asked to detail their willingness to pay for beef from one of four categories that best described the sample: unsatisfactory, good-every-day, better-than-every-day or premium quality. These figures were subsequently converted to a proportion relative to the good-every-day category (P-WTP) to allow comparison between different currencies and time periods. Consumers also answered a short demographic questionnaire. Consumer P-WTP was found to be remarkably consistent between different demographic groups. After quality grade, by far the greatest influence on P-WTP was country of origin. This difference could not be explained by the other demographic factors examined in this study, such as occupation, gender, frequency of consumption and the importance of beef in the diet. Therefore, we can conclude that the P-WTP for beef is highly transferable between different consumer groups, but not countries.
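The P-WTP conversion described above is a simple normalization: each grade's raw WTP is expressed relative to the good-every-day figure, which makes scores comparable across currencies and time periods. A worked example with hypothetical currency amounts:

```python
# Hypothetical raw WTP figures (any currency unit). P-WTP expresses each
# grade as a percentage of the good-every-day category.
wtp = {"unsatisfactory": 5.0, "good-every-day": 10.0,
       "better-than-every-day": 15.0, "premium": 20.0}
p_wtp = {grade: 100 * value / wtp["good-every-day"]
         for grade, value in wtp.items()}
print(p_wtp)  # {'unsatisfactory': 50.0, ..., 'premium': 200.0}
```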
We report the discovery in the Greenland ice sheet of a discrete layer of free nanodiamonds (NDs) in very high abundances, implying most likely either an unprecedented influx of extraterrestrial (ET) material or a cosmic impact event that occurred after the last glacial episode. From that layer, we extracted n-diamonds and hexagonal diamonds (lonsdaleite), an accepted ET impact indicator, at abundances of up to about 5×10⁶ times background levels in adjacent younger and older ice. The NDs in the concentrated layer are rounded, suggesting they most likely formed during a cosmic impact through some process similar to carbon-vapor deposition or high-explosive detonation. This morphology has not been reported previously in cosmic material, but has been observed in terrestrial impact material. This is the first highly enriched, discrete layer of NDs observed in glacial ice anywhere, and its presence indicates that ice caps are important archives of ET events of varying magnitudes. Using a preliminary ice chronology based on oxygen isotopes and dust stratigraphy, the ND-rich layer appears to be coeval with ND abundance peaks reported at numerous North American sites in a sedimentary layer, the Younger Dryas boundary layer (YDB), dating to 12.9 ± 0.1 ka. However, more investigation is needed to confirm this association.
Objectives: The present study examined differences in neurocognitive outcomes among non-Hispanic Black and White stroke survivors using the NIH Toolbox-Cognition Battery (NIHTB-CB), and investigated the roles of healthcare variables in explaining racial differences in neurocognitive outcomes post-stroke. Methods: One hundred seventy adults (91 Black; 79 White) who participated in a multisite study were included (age: M=56.4; SD=12.6; education: M=13.7; SD=2.5; 50% male; years post-stroke: 1–18; stroke type: 72% ischemic, 28% hemorrhagic). Neurocognitive function was assessed with the NIHTB-CB, using demographically corrected norms. Participants completed measures of socio-demographic characteristics, health literacy, and healthcare use and access. Stroke severity was assessed with the Modified Rankin Scale. Results: An independent-samples t test indicated that Blacks showed more neurocognitive impairment (NIHTB-CB Fluid Composite T-score: M=37.63; SD=11.67) than Whites (Fluid T-score: M=42.59; SD=11.54; p=.006). This difference remained significant after adjusting for reading level (NIHTB-CB Oral Reading), and when stratified by stroke severity. Blacks also scored lower on health literacy, reported differences in insurance type, and reported decreased confidence in the doctors treating them. Multivariable models adjusting for reading level and injury severity showed that health literacy and insurance type were statistically significant predictors of the Fluid cognitive composite (p<.001 and p=.02, respectively) and significantly mediated racial differences in neurocognitive impairment. Conclusions: We replicated prior work showing that Blacks are at increased risk for poorer neurocognitive outcomes post-stroke than Whites. Health literacy and insurance type might be important modifiable factors influencing these differences. (JINS, 2017, 23, 640–652)
The beef industry must become more responsive to the changing market place and consumer demands. An essential part of this is quantifying a consumer’s perception of the eating quality of beef and their willingness to pay for that quality, across a broad range of demographics. Over 19 000 consumers from Northern Ireland, Poland, Ireland and France each tasted seven beef samples and scored them for tenderness, juiciness, flavour liking and overall liking. These scores were weighted and combined to create a fifth score, termed the Meat Quality 4 score (MQ4) (0.3×tenderness, 0.1×juiciness, 0.3×flavour liking and 0.3×overall liking). They also allocated the beef samples into one of four quality grades that best described the sample: unsatisfactory, good-every-day, better-than-every-day or premium. After the completion of the tasting panel, consumers were then asked to detail, in their own currency, their willingness to pay for these four categories, which was subsequently converted to a proportion relative to the good-every-day category (P-WTP). Consumers also answered a short demographic questionnaire. The four sensory scores, the MQ4 score and the P-WTP were analysed separately, as dependent variables in linear mixed effects models. The answers from the demographic questionnaire were included in the model as fixed effects. Overall, there were only small differences in consumer scores and P-WTP between demographic groups. Consumers who preferred their beef cooked medium or well-done scored beef higher, except in Poland, where the opposite trend was found. This may be because Polish consumers were more likely to prefer their beef cooked well-done, but samples were cooked medium for this group. There was a small positive relationship with the importance of beef in the diet, increasing sensory scores by about 4% in Poland and Northern Ireland. Men also scored beef about 2% higher than women for most sensory scores in most countries. In most countries, consumers were willing to pay between 150 and 200% more for premium beef, and there was a 50% penalty in value for unsatisfactory beef. After quality grade, by far the greatest influence on P-WTP was country of origin. Consumer age also had a small negative relationship with P-WTP. The results indicate that a single quality score could reliably describe the eating quality experienced by all consumers. In addition, if reliable quality information is delivered to consumers, they will pay more for better quality beef, which would add value to the beef industry and encourage improvements in quality.
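For clarity, the MQ4 weighting given in the abstract can be written as a one-line function; the example scores below are hypothetical:

```python
def mq4(tenderness: float, juiciness: float, flavour: float, overall: float) -> float:
    """Meat Quality 4 (MQ4) composite, with the weights given in the
    abstract: 0.3*tenderness + 0.1*juiciness + 0.3*flavour liking
    + 0.3*overall liking."""
    return 0.3 * tenderness + 0.1 * juiciness + 0.3 * flavour + 0.3 * overall

print(mq4(70, 65, 72, 71))  # hypothetical 0-100 consumer scores -> 70.4
```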
Four field studies were conducted in 2004 to evaluate corn tolerance, weed control, grain yield, and net returns in glufosinate-resistant (GUR), glyphosate-resistant (GYR), imidazolinone-tolerant (IT), and nontransgenic (NT) corn with various herbicide systems. No significant differences between hybrid systems were observed for weed control. Limited corn injury (<5%) was observed for all herbicide treatments. A single early POST (EPOST) system without S-metolachlor and sequential POST over-the-top (POT) herbicide systems, averaged over corn hybrids and PRE and late POST-directed (LAYBY) herbicide options, provided 93 and 99% control of goosegrass, respectively, and at least 83 and 97% control of Texas panicum, respectively. A single EPOST system without S-metolachlor, averaged over corn hybrids and LAYBY treatment options, provided at least 88% control of large crabgrass. When averaged over corn hybrid and PRE herbicide options, a sequential POT herbicide system alone provided at least 98, 99, 98, and 100% control of large crabgrass, morningglory species, Palmer amaranth, and common lambsquarters, respectively. The addition of ametryn at LAYBY to a single EPOST system without S-metolachlor was beneficial for improving control of morningglory species, common lambsquarters, and Palmer amaranth, depending on location. However, the observed increases (7 percentage points or less) are likely of limited biological significance. Grain yield was variable between hybrids and locations because of environmental differences. Consequently, net returns for each hybrid system within a location were also variable. Any POT system with or without ametryn at LAYBY, averaged over corn hybrid and PRE herbicide options, provided at least 101, 97, 92, and 92% yield protection at Clayton, Kinston, Lewiston, and Rocky Mount, NC, respectively. Net returns were maximized with treatments that provided excellent weed control with minimal inputs.
Field studies were conducted near Clayton, Goldsboro, Kinston, and Rocky Mount, NC, in 2003 to evaluate weed control and cotton response to postemergence (POST) treatments of glufosinate applied alone or in tank mixtures with S-metolachlor, pyrithiobac, or trifloxysulfuron. Late-season control of common lambsquarters, common ragweed, entireleaf morningglory, ivyleaf morningglory, jimsonweed, pitted morningglory, purple nutsedge, and sicklepod with glufosinate early postemergence (EPOST) was ≥90%. The addition of S-metolachlor to glufosinate EPOST improved control of all weeds except sicklepod, ivyleaf morningglory, and entireleaf morningglory. When applied POST, glufosinate provided ≥90% late-season control of common lambsquarters, common ragweed, entireleaf morningglory, ivyleaf morningglory, jimsonweed, large crabgrass, pitted morningglory, purple nutsedge, and sicklepod. Control of goosegrass and Palmer amaranth was 81 and 84%, respectively. When pyrithiobac or trifloxysulfuron was added in POST tank mixtures, control of Palmer amaranth improved 6 and 9 percentage points, respectively. Control of goosegrass remained near 80% regardless of the herbicide treatment used. The addition of a late post-directed (LAYBY) tank mixture of glufosinate plus prometryn provided ≥88% late-season control of all weeds. Reduced control of goosegrass and Palmer amaranth was observed with the LAYBY tank mixture of glufosinate plus MSMA when compared to other LAYBY tank mixtures. Cotton lint yields in plots receiving any herbicide application were significantly higher than in plots receiving no herbicide application for all application timings. Cotton lint yields were ≥740 kg/ha where an EPOST treatment was applied and ≥680 kg/ha when a POST herbicide was applied. Cotton lint yields were at least 200 kg/ha greater on plots receiving a LAYBY application when compared to plots where no LAYBY treatment was applied.
Quantifying consumer responses to beef across a broad range of demographics, nationalities and cooking methods is vitally important for any system evaluating beef eating quality. On the basis of previous work, it was expected that consumer scores would be highly accurate in determining quality grades for beef, thereby providing evidence that such a technique could be used to form the basis of an eating quality grading system for beef. Following the Australian MSA (Meat Standards Australia) testing protocols, over 19 000 consumers from Northern Ireland, Poland, Ireland, France and Australia tasted cooked beef samples, then allocated them to a quality grade: unsatisfactory, good-every-day, better-than-every-day and premium. The consumers also scored beef samples for tenderness, juiciness, flavour-liking and overall-liking. The beef was sourced from all countries involved in the study and cooked by four different cooking methods and to three different degrees of doneness, with each experimental group in the study consisting of a single cooking doneness within a cooking method for each country. For each experimental group, and for the data set as a whole, a linear discriminant function was calculated, using the four sensory scores to predict the quality grade. This process was repeated using two conglomerate scores derived from weighting and combining the consumer sensory scores for tenderness, juiciness, flavour-liking and overall-liking: the original meat quality 4 score (oMQ4) (0.4, 0.1, 0.2, 0.3) and the current meat quality 4 score (cMQ4) (0.3, 0.1, 0.3, 0.3). From the results of these analyses, the optimal weightings of the sensory scores to generate an ‘ideal meat quality 4 score (MQ4)’ for each country were calculated, and the MQ4 values that reflected the boundaries between the four quality grades were determined. The oMQ4 weightings were far more accurate in categorising European meat samples than the cMQ4 weightings, highlighting that tenderness is more important than flavour to the consumer when determining quality. The accuracy of the discriminant analysis in predicting the consumer-scored quality grades was similar across all consumer groups, at 68%, and similar to previously reported values. These results demonstrate that this technique, as used in the MSA system, could be used to predict consumer assessment of beef eating quality and therefore to underpin a commercial eating quality guarantee for all European consumers.
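A minimal sketch of the linear discriminant analysis described above, fitted to synthetic sensory scores; all data below are hypothetical, and accuracy on synthetic data will differ from the roughly 68% the abstract reports on real consumer data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical consumer data: four sensory scores (tenderness, juiciness,
# flavour-liking, overall-liking; each 0-100) and the quality grade the
# consumer assigned (0=unsatisfactory ... 3=premium).
rng = np.random.default_rng(42)
n = 1000
grades = rng.integers(0, 4, n)
centres = np.array([30, 50, 70, 85])[grades]      # grade-dependent mean score
X = centres[:, None] + rng.normal(0, 10, (n, 4))  # four noisy sensory scores
y = grades

# Linear discriminant function mapping sensory scores to quality grade.
lda = LinearDiscriminantAnalysis().fit(X, y)
print(f"in-sample accuracy: {lda.score(X, y):.2f}")
```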
Bartonellae are blood- and vector-borne Gram-negative bacteria, recognized as emerging pathogens. Whole-blood samples were collected from 58 free-ranging lions (Panthera leo) in South Africa and 17 cheetahs (Acinonyx jubatus) from Namibia. Blood samples were also collected from 11 cheetahs (more than once for some of them) at the San Diego Wildlife Safari Park. Bacteria were isolated from the blood of three (5%) lions, one (6%) Namibian cheetah and eight (73%) cheetahs from California. The lion Bartonella isolates were identified as B. henselae (two isolates) and B. koehlerae subsp. koehlerae. The Namibian cheetah strain was close to, but distinct from, isolates from North American wild felids and clustered between B. henselae and B. koehlerae; it should be considered a new subspecies of B. koehlerae. All the Californian semi-captive cheetah isolates were different from B. henselae, B. koehlerae subsp. koehlerae, and the Namibian cheetah isolate. They were also distinct from the strains isolated from Californian mountain lions (Felis concolor) and clustered with strains of B. koehlerae subsp. bothieri isolated from free-ranging bobcats (Lynx rufus) in California. Therefore, it is likely that these captive cheetahs became infected by an indigenous strain for which bobcats are the natural reservoir.