The Australian SKA Pathfinder (ASKAP) offers powerful new capabilities for studying the polarised and magnetised Universe at radio wavelengths. In this paper, we introduce the Polarisation Sky Survey of the Universe’s Magnetism (POSSUM), a groundbreaking survey with three primary objectives: (1) to create a comprehensive Faraday rotation measure (RM) grid of up to one million compact extragalactic sources across the southern ∼50 per cent of the sky (20,630 deg²); (2) to map the intrinsic polarisation and RM properties of a wide range of discrete extragalactic and Galactic objects over the same area; and (3) to contribute interferometric data with excellent surface brightness sensitivity, which can be combined with single-dish data to study the diffuse Galactic interstellar medium. Observations for the full POSSUM survey commenced in May 2023 and are expected to conclude by mid-2028. POSSUM will achieve an RM grid density of around 30–50 RMs per square degree with a median measurement uncertainty of ∼1 rad m−2. The survey operates primarily over a frequency range of 800–1088 MHz, with an angular resolution of 20″ and a typical RMS sensitivity in Stokes Q or U of 18 μJy beam−1. Additionally, the survey will be supplemented by similar observations covering 1296–1440 MHz over 38 per cent of the sky. POSSUM will enable the discovery and detailed investigation of magnetised phenomena in a wide range of cosmic environments, including the intergalactic medium and cosmic web, galaxy clusters and groups, active galactic nuclei and radio galaxies, the Magellanic System and other nearby galaxies, galaxy halos and the circumgalactic medium, and the magnetic structure of the Milky Way across a very wide range of scales, as well as the interplay between these components. This paper reviews the current science case developed by the POSSUM Collaboration and provides an overview of POSSUM’s observations, data processing, outputs, and its complementarity with other radio and multi-wavelength surveys, including future work with the SKA.
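For context on the quantities quoted above, the textbook Faraday rotation relations that underpin an RM grid are reproduced below; these are standard definitions rather than anything specific to the POSSUM pipeline.

```latex
% Standard Faraday rotation relations (textbook definitions, not POSSUM-specific).
% The observed polarisation angle rotates linearly with wavelength squared:
\chi(\lambda^{2}) = \chi_{0} + \mathrm{RM}\,\lambda^{2}
% The rotation measure integrates the electron density n_e (cm^{-3}) and the
% line-of-sight magnetic field B_\parallel (\mu G) along the path length L (pc):
\mathrm{RM} = 0.812 \int_{0}^{L} n_{e}\, B_{\parallel}\, \mathrm{d}l
  \quad [\mathrm{rad\,m^{-2}}]
```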
We examine the optical counterparts of the 1829 neutral hydrogen (H I) detections in three pilot fields in the Widefield ASKAP L-band Legacy All-sky Blind surveY (WALLABY) using data from the Dark Energy Spectroscopic Instrument (DESI) Legacy Imaging Surveys DR10. We find that 17 per cent (315) of the detections are optically low surface brightness galaxies (LSBGs; mean g-band surface brightness within 1 Re of > 23 mag arcsec−2) and 3 per cent (55) are optically ‘dark’. We find that the gas-rich WALLABY LSBGs have low star formation efficiencies, and have stellar masses spanning five orders of magnitude, which highlights the diversity of properties across our sample. 75 per cent of the LSBGs and all of the dark H I sources had not been catalogued prior to WALLABY. We examine the optically dark sample of the WALLABY pilot survey to verify the fidelity of the catalogue and investigate the implications for the full survey for identifying dark H I sources. We assess the H I detections without optical counterparts and identify 38 which pass further reliability tests. Of these, we find that 13 show signatures of tidal interactions. The remaining 25 detections have no obvious tidal origin, so are candidates for isolated galaxies with high H I masses, but low stellar masses and star-formation rates. Deeper H I and optical follow-up observations are required to verify the true nature of these dark sources.
Australian children fall short of national dietary guidelines with only 63 % consuming adequate fruit and 10 % enough vegetables. Before school care operates as part of Out of School Hours Care (OSHC) services and provides opportunities to address poor dietary habits in children. The aim of this study was to describe the food and beverages provided in before school care and to explore how service-level factors influence food provision.
Design:
A cross-sectional study was conducted in OSHC services. Each service’s before school care session was visited twice between March and June 2021. Direct observation was used to capture food and beverage provision and child and staff behaviour during breakfast. Interviews with staff collected information on service characteristics. Foods were categorised using the Australian Dietary Guidelines, and frequencies were calculated. Fisher’s exact test was used to compare food provision with service characteristics.
Setting:
The before school care of OSHC services in New South Wales, Australia.
Participants:
Twenty-five OSHC services.
Results:
Fruit was provided on 22 % (n 11) of days and vegetables on 12 % (n 6). Services with nutrition policies containing specific language on food provision (i.e. measurable) were more likely to provide fruit compared with those with policies using non-specific language (P = 0·027). Services that reported receiving training in healthy eating provided more vegetables than those that had not received training (P = 0·037).
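For readers unfamiliar with the test, a minimal sketch of the kind of Fisher’s exact comparison reported above is shown below; the 2×2 counts are illustrative placeholders, not the study data.

```python
# Minimal sketch of a Fisher's exact comparison of fruit provision by policy type.
# The counts below are illustrative placeholders, NOT the observed study data.
from scipy.stats import fisher_exact

#                      provided fruit   did not provide fruit
specific_policy     = [8,               4]
non_specific_policy = [3,               10]

odds_ratio, p_value = fisher_exact([specific_policy, non_specific_policy])
print(f"odds ratio = {odds_ratio:.2f}, exact P = {p_value:.3f}")
```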
Conclusions:
Before school care can be supported to improve food provision through staff professional development and advocating to regulatory bodies for increased specificity requirements in the nutrition policies of service providers.
North Carolina growers have long struggled to control Italian ryegrass, and recent research has confirmed that some Italian ryegrass biotypes have become resistant to nicosulfuron, glyphosate, clethodim, and paraquat. Integrating alternative management strategies is crucial to effectively control such biotypes. The objectives of this study were to evaluate Italian ryegrass control with cover crops and fall-applied residual herbicides and to investigate cover crop injury from residual herbicides. This study was conducted during the fall/winter of 2021–22 in Salisbury, NC, and the fall/winter of 2021–22 and 2022–23 in Clayton, NC. The study was designed as a 3 × 5 split-plot in which the main plot consisted of three cover crop treatments (no-cover, cereal rye (Secale cereale L.) at 80 kg ha−1, and crimson clover (Trifolium incarnatum L.) at 18 kg ha−1), and the subplots consisted of five residual herbicide treatments (S-metolachlor, flumioxazin, metribuzin, pyroxasulfone, and nontreated). In the 2021–22 season at Clayton, metribuzin injured cereal rye and crimson clover 65% and 55%, respectively. However, metribuzin injured both cover crops ≤6% in 2022–23. Flumioxazin resulted in unacceptable crimson clover injury of 50% and 38% in 2021–22 and 2022–23 in Clayton and 40% in Salisbury, respectively. Without preemergence herbicides, cereal rye controlled Italian ryegrass by 85% and 61% at 24 wk after planting in 2021–22 and 2022–23 in Clayton and 82% in Salisbury, respectively. In 2021–22, Italian ryegrass seed production was lowest in cereal rye plots at both locations, except when it was treated with metribuzin. For example, in Salisbury, cereal rye plus metribuzin resulted in 39,324 seeds m−2, compared with ≤4,386 seeds m−2 from all other cereal rye treatments. In 2022–23, Italian ryegrass seed production in cereal rye was lower when either metribuzin or pyroxasulfone was used preemergence (2,670 and 1,299 seeds m−2, respectively) compared with cereal rye that did not receive an herbicide treatment (5,600 seeds m−2).
Herbicide drift to sensitive crops can result in significant injury, yield loss, and even crop destruction. When pesticide drift is reported to the Georgia Department of Agriculture (GDA), tissue samples are collected and analyzed for residues. Seven field studies were conducted in 2020 and 2021 in cooperation with the GDA to evaluate the effect of (1) time interval between simulated drift event and sampling, (2) low-dose herbicide rates, and (3) the sample collection methods on detecting herbicide residues in cotton (Gossypium hirsutum L.) foliage. Simulated drift rates of 2,4-D, dicamba, and imazapyr were applied to non-tolerant cotton in the 8- to 9-leaf stage with plant samples collected at 7 or 21 d after treatment (DAT). During collection, plant sampling consisted of removing entire plants or removing new growth occurring after the 7-leaf stage. Visual cotton injury from 2,4-D reached 43% to 75% at 0.001 and 0.004 kg ae ha−1, respectively; for dicamba, it was 9% to 41% at 0.003 or 0.014 kg ae ha−1, respectively; and for imazapyr, it was 1% to 74% with 0.004 and 0.03 kg ae ha−1 rates, respectively. Yield loss was observed with both rates of 2,4-D (11% to 51%) and with the high rate of imazapyr (52%); dicamba did not influence yield. Herbicide residues were detected in 88%, 88%, and 69% of samples collected from plants treated with 2,4-D, dicamba, and imazapyr, respectively, at 7 DAT compared with 25%, 16%, and 22% when samples were collected at 21 DAT, highlighting the importance of sampling quickly after a drift event. Although the interval between drift event and sampling, drift rate, and sampling method can all influence residue detection for 2,4-D, dicamba, and imazapyr, the factor with the greatest influence is the amount of time between drift and sample collection.
Although food insecurity affects a significant proportion of young children in New Zealand (NZ)(1), evidence of its association with dietary intake and sociodemographic characteristics in this population is lacking. This study aims to assess the household food security status of young NZ children and its association with energy and nutrient intake and sociodemographic factors. This study included 289 caregiver and child (1-3 years old) dyads from the same household in either Auckland, Wellington, or Dunedin, NZ. Household food security status was determined using a validated and NZ-specific eight-item questionnaire(2). Usual dietary intake was determined from two 24-hour food recalls, using the multiple source method(3). The prevalence of inadequate nutrient intake was assessed using the Estimated Average Requirement (EAR) cut-point method and full probability approach. Sociodemographic factors (i.e., socioeconomic status, ethnicity, caregiver education, employment status, household size and structure) were collected from questionnaires. Linear regression models were used to estimate associations with statistical significance set at p <0.05. Over 30% of participants had experienced food insecurity in the past 12 months. Of all eight indicator statements, “the variety of foods we are able to eat is limited by a lack of money,” had the highest proportion of participants responding “often” or “sometimes” (35.8%). Moderately food insecure children exhibited higher fat and saturated fat intakes, consuming 3.0 (0.2, 5.8) g/day more fat, and 2.0 (0.6, 3.5) g/day more saturated fat compared to food secure children (p<0.05). Severely food insecure children had lower g/kg/day protein intake compared to food secure children (p<0.05). In comparison to food secure children, moderately and severely food insecure children had lower fibre intake, consuming 1.6 (2.8, 0.3) g/day and 2.6 (4.0, 1.2) g/day less fibre, respectively. Severely food insecure children had the highest prevalence of inadequate calcium (7.0%) and vitamin C (9.3%) intakes, compared with food secure children [prevalence of inadequate intakes: calcium (2.3%) and vitamin C (2.8%)]. Household food insecurity was more common in those of Māori or Pacific ethnicity; living in areas of high deprivation; having a caregiver who was younger, not in paid employment, or had low educational attainment; living with ≥2 other children in the household; and living in a sole-parent household. Food insecure young NZ children consume a diet that exhibits lower nutritional quality in certain measures compared to their food-secure counterparts. Food insecurity was associated with various sociodemographic factors that are closely linked with poverty or low income. As such, there is an urgent need for poverty mitigation initiatives to safeguard vulnerable young children from the adverse consequences of food insecurity.
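As a rough illustration of the EAR cut-point method named above (a sketch under assumed values, not the study’s code), the prevalence of inadequate intake is simply the proportion of usual intakes that fall below the Estimated Average Requirement:

```python
# Sketch of the EAR cut-point method: prevalence of inadequacy is the share of
# children whose usual intake falls below the Estimated Average Requirement.
# The intake values and EAR below are placeholders, not the study data.
import numpy as np

def prevalence_inadequate(usual_intakes: np.ndarray, ear: float) -> float:
    """Proportion of the group with usual intake below the EAR."""
    return float(np.mean(usual_intakes < ear))

calcium_mg_day = np.array([420.0, 510.0, 350.0, 600.0, 330.0, 480.0])  # made-up usual intakes
print(prevalence_inadequate(calcium_mg_day, ear=360.0))  # EAR value is a placeholder
```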
The National Environmental Isotope Facility (NEIF) Radiocarbon Laboratory at the Scottish Universities Environmental Research Centre (SUERC) performs radiocarbon measurement of a wide range of sample matrices for applications in environmental research. Radiocarbon is applied to palaeoenvironmental, palaeoceanographic, and palaeoclimatic investigations, as well as work to understand the source, fate, turnover, and age of carbon in the modern carbon cycle. The NEIF Radiocarbon Laboratory supports users in the development and deployment of novel sampling techniques and laboratory approaches. Here, we give an overview of methods and procedures used by the laboratory to support the field collection, laboratory processing, and measurement of samples. This includes in-house development of novel and/or specialized methods and approaches, such as field collection of CO2 and CH4, hydropyrolysis, and ramped oxidation. The sample types covered include organic remains (e.g., plant material, peat, wood, charcoal, proteins), carbonates (e.g., speleothems, foraminifera, mollusc shell, travertine), waters (dissolved organic and inorganic carbon), gases (CO2 and CH4), soils and sediments (including sub-fractions).
Therapeutics targeting frontotemporal dementia (FTD) are entering clinical trials. There are challenges to conducting these studies, including the relative rarity of the disease. Remote assessment tools could increase access to clinical research and pave the way for decentralized clinical trials. We developed the ALLFTD Mobile App, a smartphone application that includes assessments of cognition, speech/language, and motor functioning. The objectives were to determine the feasibility and acceptability of collecting remote smartphone data in a multicenter FTD research study and evaluate the reliability and validity of the smartphone cognitive and motor measures.
Participants and Methods:
A diagnostically mixed sample of 207 participants with FTD or from familial FTD kindreds (CDR®+NACC-FTLD=0 [n=91]; CDR®+NACC-FTLD=0.5 [n=39]; CDR®+NACC-FTLD>1 [n=39]; unknown [n=38]) were asked to remotely complete a battery of tests on their smartphones three times over two weeks. Measures included five executive functioning (EF) tests, an adaptive memory test, and participant experience surveys. A subset completed smartphone tests of balance at home (n=31) and a finger tapping test (FTT) in the clinic (n=11). We analyzed adherence (percentage of available measures that were completed) and user experience. We evaluated Spearman-Brown split-half reliability (100 iterations) using the first available assessment for each participant. We assessed test-retest reliability across all available assessments by estimating intraclass correlation coefficients (ICC). To investigate construct validity, we fit regression models testing the association of the smartphone measures with gold-standard neuropsychological outcomes (UDS3-EF composite [Staffaroni et al., 2021], CVLT3-Brief Form [CVLT3-BF] Immediate Recall, mechanical FTT), measures of disease severity (CDR®+NACC-FTLD Box Score & Progressive Supranuclear Palsy Rating Scale [PSPRS]), and regional gray matter volumes (cognitive tests only).
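The reliability analyses named above can be illustrated with a short sketch; the synthetic data, variable names, and the pingouin call are assumptions for illustration and are not the ALLFTD analysis pipeline.

```python
# Illustrative sketch of split-half reliability with the Spearman-Brown correction
# over random item splits, plus a pointer to an ICC for test-retest reliability.
# Synthetic data and names are hypothetical, not the ALLFTD pipeline.
import numpy as np

rng = np.random.default_rng(0)

def split_half_reliability(item_scores: np.ndarray, n_iter: int = 100) -> float:
    """Mean Spearman-Brown-corrected split-half correlation over random splits.
    item_scores: participants x items array of item-level scores."""
    n_items = item_scores.shape[1]
    corrected = []
    for _ in range(n_iter):
        idx = rng.permutation(n_items)
        half_a = item_scores[:, idx[: n_items // 2]].mean(axis=1)
        half_b = item_scores[:, idx[n_items // 2:]].mean(axis=1)
        r = np.corrcoef(half_a, half_b)[0, 1]
        corrected.append(2 * r / (1 + r))  # Spearman-Brown prophecy formula
    return float(np.mean(corrected))

# Tiny synthetic demo: 50 participants x 20 items sharing a common "ability" signal.
demo = rng.normal(size=(50, 1)) + rng.normal(scale=0.5, size=(50, 20))
print(round(split_half_reliability(demo), 2))

# Test-retest reliability across sessions could be estimated with an ICC, e.g.
# pingouin.intraclass_corr(data=long_df, targets="participant",
#                          raters="session", ratings="score")
```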
Results:
Participants completed 70% of tasks. Most reported that the instructions were understandable (93%), considered the time commitment acceptable (97%), and were willing to complete additional assessments (98%). Split-half reliability was excellent for the executive functioning tests (r’s=0.93-0.99) and good for the memory test (r=0.78). Test-retest reliabilities ranged from acceptable to excellent for the cognitive tasks (ICC: 0.70-0.96) and were excellent for the balance test (ICC=0.97) and good for the FTT (ICC=0.89). Smartphone EF measures were strongly associated with the UDS3-EF composite (β's=0.6-0.8, all p<.001), and the memory test was strongly correlated with total immediate recall on the CVLT3-BF (β=0.7, p<.001). Smartphone FTT was associated with mechanical FTT (β=0.9, p=.02), and greater acceleration on the balance test was associated with more motor features (β=0.6, p=.02). Worse performance on all cognitive tests was associated with greater disease severity (β's=0.5-0.7, all p<.001). Poorer performance on the smartphone EF tasks was associated with smaller frontoparietal/subcortical volume (β's=0.4-0.6, all p<.015), and worse memory scores were associated with smaller hippocampal volume (β=0.5, p<.001).
Conclusions:
These results suggest remote digital data collection of cognitive and motor functioning in FTD research is feasible and acceptable. These findings also support the reliability and validity of unsupervised ALLFTD Mobile App cognitive tests and provide preliminary support for the motor measures, although further study in larger samples is required.
Among individuals with Parkinson’s disease (PD), visual hallucinations (VH) and mild cognitive impairment (MCI) are highly prevalent and often co-occur. Atrophy in similar brain regions [e.g. cholinergic basal forebrain (BF) nuclei] as well as specific cognitive difficulties (e.g. posterior-cortical abilities such as semantic fluency and visuoperception) have been associated with the presentation of each symptom type. While there are separate lines of evidence implicating BF volume in MCI and VH, no study to date has examined BF integrity in patients with concurrent MCI and VH symptomatology. Furthermore, no prior studies examining BF integrity in MCI and VH have accounted for the potential confounding effects of dopaminergic medications, which are known to exacerbate both symptom types. The aims of this study were to bridge these two bodies of literature by determining the common neural substrate of PD-VH and PD-MCI (with an emphasis on the BF), to examine the confounding effects of dopaminergic pharmacotherapy, and to examine whether nondopaminergic “posterior” cognitive abilities differ between PD-MCI patients with versus without VH.
Participants and Methods:
This study used a clinical chart review and MRI data to examine associations between BF volume and the presence of VH and MCI in a large group (n=296) of advanced PD patients (∼10 years disease duration), covarying for the effect of dopaminergic therapy. A two-way ANCOVA was run on total and regional BF volumes (i.e., total BF volume and four nuclei: Ch4, Ch4p, Ch1-2, and Ch3) using VH and MCI as independent variables, while covarying for dopaminergic medication. Using Mann-Whitney U tests, we compared the performance of individuals with MCI-VH with that of individuals with MCI-noVH on tasks of semantic verbal fluency and visuoperceptual skills (e.g., judgement of line orientation, object decision, and silhouettes).
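A minimal sketch of the two-way ANCOVA described above is given below; the synthetic data, column names, and the use of a dose-style medication covariate are assumptions for illustration, not the study’s actual variables or code.

```python
# Hedged sketch of a two-way ANCOVA: BF volume ~ VH x MCI, adjusting for a
# dopaminergic-medication covariate. Data and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(1)
n = 296
df = pd.DataFrame({
    "VH": rng.integers(0, 2, n),      # visual hallucinations present (0/1)
    "MCI": rng.integers(0, 2, n),     # mild cognitive impairment present (0/1)
    "LEDD": rng.normal(800, 250, n),  # assumed dopaminergic dose covariate
})
df["bf_volume"] = 3.0 - 0.2 * df["MCI"] - 0.1 * df["VH"] + rng.normal(0, 0.3, n)

model = smf.ols("bf_volume ~ C(VH) * C(MCI) + LEDD", data=df).fit()
print(anova_lm(model, typ=2))  # Type II SS: VH, MCI, their interaction, covariate
```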
Results:
There were two major findings: (1) atrophy of the Ch4 region in the BF was associated with MCI with VH while Ch1-2 was associated with MCI regardless of VH status, and (2) patients with both MCI and VH had poorer performance than individuals with MCI without VH on tasks measuring object recognition but not on tasks of visuospatial perception or semantic verbal fluency. These results remained stable regardless of whether or not dopaminergic medication was included in the model.
Conclusions:
PD is a heterogeneous disease with different subtypes reflecting both dopaminergic and cholinergic dysfunction. Our findings suggest further dissociations within the cholinergic system. First, atrophy in Ch4, which projects to the cortical mantle, was preferentially associated with VH symptoms and object-based visuoperception deficits. This is consistent with proposals that VH are real-world manifestations of visuoperceptual deficits. Second, Ch1-2 atrophy, which projects primarily to the hippocampus, was associated with MCI regardless of VH. Future research will extend this work to other cognitive abilities such as memory, to analyses of brain networks that implicate the BF, and to the investigation of the relationship between anti-cholinergic medications and symptom presentation in PD.
Our institution sought to evaluate our antimicrobial stewardship empiric treatment recommendations for Salmonella. Results from 36 isolates demonstrated reduced susceptibilities to fluoroquinolones with 1 isolate susceptible only to ceftriaxone. Analysis supports the current recommendation of empiric ceftriaxone therapy for severe infection and updated recommendation for sulfamethoxazole-trimethoprim in non-severe infections.
Sample materials such as sediments and soils contain complex mixtures of different carbon-containing compounds. These bulk samples can be split into individual fractions, based on the temperature of thermal decomposition of their components. When coupled with radiocarbon (14C) measurement of the isolated fractions, this approach offers the advantage of directly investigating the residence time, turnover time, source, or age of the different components within a mixed sample, providing important insights to better understand the cycling of carbon in the environment. Several laboratories have previously reported different approaches to separate radiocarbon samples based on temperature in what is a growing area of interest within the research community. Here, we report the design and operation of a new ramped oxidation facility for separation of sample carbon on the basis of thermal resistance at the NEIF Radiocarbon Laboratory in East Kilbride, UK. Our new instrumentation shares some characteristics with the previously-reported systems applying ramped oxidation and/or ramped pyrolysis for radiocarbon measurement, but also has several differences which we describe and discuss. We also present the results of a thorough program of testing of the new system, which demonstrates both the reproducibility of the thermograms generated during sample combustion, and the reliability of the radiocarbon measurements obtained on individual sample fractions. This is achieved through quantification of the radiocarbon background and analysis of multiple standards of known 14C content during standard operation of the instrumentation.
Background: ALS is a progressive neurodegenerative disease without a cure and limited treatment options. Edaravone, a free radical scavenger, was shown to slow disease progression in a select group of patients with ALS over 6 months; however, the effect on survival was not investigated in randomized trials. The objective of this study is to describe real-world survival effectiveness over a longer timeframe. Methods: This retrospective cohort study included patients with ALS across Canada with symptom onset up to three years. Those with a minimum 6-month edaravone exposure between 2017 and 2022 were enrolled in the interventional arm, and those without formed the control arm. The primary outcome of tracheostomy-free survival was compared between the two groups, accounting for age, sex, ALS-disease progression rate, disease duration, pulmonary vital capacity, bulbar ALS-onset, and presence of frontotemporal dementia or C9ORF72 mutation using inverse propensity treatment weights. Results: 182 patients with mean ± SD age 60±11 years were enrolled in the edaravone arm and 860 in the control arm (mean ± SD age 63±12 years). Mean ± SD time from onset to edaravone initiation was 18±10 months. Tracheostomy-free survival will be calculated. Conclusions: This study will provide evidence for edaravone effectiveness on tracheostomy-free survival in patients with ALS.
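As background on the weighting approach named above, the sketch below shows one common way to construct stabilised inverse-probability-of-treatment weights from a propensity model; the variable names and the scikit-learn/lifelines choices are assumptions for illustration, not the study’s analysis code.

```python
# Illustrative sketch of inverse probability of treatment weighting (IPTW);
# column names are hypothetical and this is not the study's analysis code.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def stabilised_iptw(df: pd.DataFrame, treatment: str, covariates: list[str]) -> pd.Series:
    """Stabilised IPTW weights from a logistic propensity-score model."""
    ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df[treatment])
    ps = ps_model.predict_proba(df[covariates])[:, 1]  # P(treated | covariates)
    p_treated = df[treatment].mean()                   # marginal treatment probability
    weights = np.where(df[treatment] == 1, p_treated / ps, (1 - p_treated) / (1 - ps))
    return pd.Series(weights, index=df.index)

# The weighted cohort can then be compared on tracheostomy-free survival, e.g. with
# a weighted Kaplan-Meier estimate or a Cox model (lifelines' CoxPHFitter accepts a
# weights_col argument).
```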
Cole crops including broccoli and collard contribute more than $119 million to Georgia’s farm gate value yearly. To ensure maximum profitability, these crops must be planted into weed-free fields. Glyphosate is a tool often used to help achieve this goal because of its broad-spectrum activity on weeds coupled with the knowledge that it poses no threat to the succeeding crop when used as directed. However, recent research suggests that with certain soil textures and production systems, the residual soil activity of glyphosate may damage some crops. Therefore, field experiments were conducted in fall 2019 and 2020 to evaluate transplanted broccoli and collard response to glyphosate applied preplant onto bare soil and to determine what practical mitigation measures could be implemented to reduce crop injury. Herbicide treatments consisted of 0, 2.5, or 5 kg ae ha−1 glyphosate applied preplant followed by 1) no mitigation measure, 2) tillage, 3) irrigation, or 4) tillage and irrigation prior to transplanting broccoli and collard by hand. When no mitigation was implemented, the residual activity of glyphosate at 2.5 and 5.0 kg ae ha−1 resulted in 43% to 71% and 79% to 93% injury to broccoli and collard transplants, respectively. This resulted in a 35% to 50% reduction in broccoli marketable head weights and a 63% to 71% reduction in collard leaf weights. Irrigation reduced visible damage by 28% to 48%, whereas tillage reduced injury by 43% to 76%, for both crops. Irrigation alleviated yield losses for broccoli, but only tillage eliminated yield loss for both crops. Care must be taken when transplanting broccoli and collard into a field recently treated with glyphosate at rates ≥2.5 kg ae ha−1. Its residual activity can damage transplants, with injury levels influenced by glyphosate rate and by tillage or irrigation after application and prior to planting.
We assessed patterns of enteric infections caused by 14 pathogens in a longitudinal cohort study of sequelae in British Columbia (BC), Canada, 2005–2014. Our population cohort of 5.8 million individuals was followed for an average of 7.5 years/person; during this time, 40 523 individuals experienced 42 308 incident laboratory-confirmed, provincially reported enteric infections (96.4 incident infections per 100 000 person-years). Most individuals (38 882/40 523; 96%) had only one infection, but 4% had multiple concurrent infections or more than one infection across the study. Among individuals with more than one infection, the pathogens and combinations occurring most frequently per individual matched the pathogens occurring most frequently in the BC population. An additional 298 557 new fee-for-service physician visits and hospitalisations for enteric infections that did not coincide with a reported enteric infection also occurred; some of these may represent unreported enteric infections. Our findings demonstrate that sequelae risk analyses should explore the possible impacts of multiple infections, and that estimating risk for individuals who may have had an unreported enteric infection is warranted.
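As a quick arithmetic check of the headline rate above (using the rounded cohort size and mean follow-up quoted in the abstract, which explains the small difference from the published 96.4):

```python
# Back-of-the-envelope check of the reported incidence rate, using the rounded
# figures quoted in the abstract (hence the slight difference from 96.4).
cohort_size = 5_800_000        # individuals in the BC population cohort
mean_follow_up_years = 7.5     # average follow-up per person
incident_infections = 42_308   # laboratory-confirmed, provincially reported

person_years = cohort_size * mean_follow_up_years
rate_per_100k = incident_infections / person_years * 100_000
print(f"{rate_per_100k:.1f} incident infections per 100 000 person-years")  # ~97.3
```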
The diagnosis of neurodegenerative and psychiatric disorders (NPDs) in primary care can suffer from inefficiencies resulting in misdiagnoses and delayed diagnosis, limiting effective treatment options. The development of speech- and language-based profiling biomarkers could aid in achieving earlier motor diagnosis in Parkinson’s disease (PD), for instance, or more accurate diagnosis of clinically similar or late-presenting NPDs.
Objectives
RHAPSODY aims to investigate the feasibility of remotely administering a battery of speech tasks to elicit continuous narrative speech across a range of NPDs. The project also aims to determine the feasibility of using acoustic and linguistic biomarkers from speech data to support the clinical assessment and disambiguation of common NPDs.
Methods
All participants (n=250) will take part in a single virtual telemedicine video conference with a researcher in which they are screened and complete a battery of speech tasks, in addition to cohort-specific screening measures. Over the following month, participants will be asked to complete a series of short, self-administered speech assessments via a smartphone application.
Results
The speech tasks will be audio-recorded and analysed on Novoic’s technology platform. These objectives will be assessed using measures including the average length of elicited speech per task, intra- and inter-subject variance, differences in linguistic patterns, and response rates to the speech assessments.
Conclusions
The analyses could help to identify and validate speech-derived clinical biomarkers to support clinicians in detecting and disambiguating between NPDs with heterogeneous presentations. This should further support earlier intervention, improved treatment options and improved quality of life.
Disclosure
In terms of significant financial interest and relationships, it is emphasised that the private organisation Novoic, who aim to develop speech algorithms for diagnostic use, is the study’s sponsor and employees or former employees of this company comprise
Little is known about Se intakes and status in very young New Zealand children. However, Se intakes below recommendations and lower Se status compared with international studies have been reported in New Zealand (particularly South Island) adults. The Baby-Led Introduction to SolidS (BLISS) randomised controlled trial compared a modified version of baby-led weaning (infants feed themselves rather than being spoon-fed) with traditional spoon-feeding (Control). Weighed 3-d diet records were collected and plasma Se concentration was measured using inductively coupled plasma mass spectrometry (ICP-MS). In total, 101 (BLISS n 50, Control n 51) 12-month-old toddlers provided complete data. The odds of Se intake below the estimated average requirement (EAR) did not differ between BLISS and Control (OR: 0·89; 95 % CI 0·39, 2·03), and there was no difference in mean plasma Se concentration between groups (0·04 μmol/l; 95 % CI −0·03, 0·11). In an adjusted model, consuming breast milk was associated with lower plasma Se concentrations (–0·12 μmol/l; 95 % CI −0·19, −0·04). Of the food groups other than infant milk (breast milk or infant formula), ‘breads and cereals’ contributed the most to Se intakes (12 % of intake). In conclusion, Se intakes and plasma Se concentrations of 12-month-old New Zealand toddlers did not differ between those who had followed a baby-led approach to complementary feeding and those who had followed traditional spoon-feeding. However, more than half of the toddlers had Se intakes below the EAR.
The tolerance of cereal rye to eight herbicides registered for use in wheat, at two rates, was evaluated for potential labeling in cereal rye to expand limited chemical weed control options. Across five site-years, halauxifen-methyl + florasulam, pyroxsulam, and thifensulfuron-methyl + tribenuron-methyl applied at a 2X rate to cereal rye at Zadoks (Z) 13 caused less than 15% injury and had no impact on cereal rye density. These herbicides at the 2X rate reduced cereal rye heights 11% at 10 days after treatment (DAT), with rye recovering by 31 DAT; cereal rye heights were not reduced with these herbicides at their 1X rate. In contrast, significant injury was observed with the 1X rate of mesosulfuron-methyl (45%), pinoxaden (27%), and pinoxaden + fenoxaprop-P-ethyl (30%) applied postemergence; early-season height was reduced 19% to 26%. Residual herbicide pyroxasulfone applied as a delayed preemergence at Z 10 and flumioxazin + pyroxasulfone applied at Z 11 caused 27% to 28% and 16% to 47% injury, respectively, when the 1X rate was activated by rainfall within 2 d of application. These residual herbicides reduced cereal rye height and density up to 35% and 40%, respectively. Cereal rye grain yield was not influenced by herbicide or rate applied.
Background: Visual impairment can affect 70% of individuals who have experienced a stroke. Identification and remediation of visual impairments can improve overall function and perceived quality of life. Our project aimed to improve visual assessment and timely intervention for patients with post-stroke visual impairment (PSVI). Methods: We conducted a quality improvement initiative to create a standardized screening and referral process for patients with PSVI to access an orthoptist. Post-stroke visual impairment was identified using the Visual Screen Assessment (VISA) tool. Patients filled out a VFQ-25 questionnaire before and after orthoptic assessment, and differences between scores were evaluated. Results: Eighteen patients completed the VFQ-25 both before and after orthoptic assessment. Of the vision-related constructs, there was a significant improvement in reported outcomes for general vision (M=56.9, SD=30.7; M=48.6, SD=16.0), p=0.002, peripheral vision (M=88.3, SD=16; M=75, SD=23.1), p=0.027, ocular pain (M=97.2, SD=6.9; M=87.5, SD=21.4), p=0.022, near activities (M=82.4, SD=24.1; M=67.8, SD=25.6), p<0.001, social functioning (M=90.2, SD=19; M=78.5, SD=29.3), p=0.019, mental health (M=84.0, SD=25.9; M=70.5, SD=31.2), p=0.017, and role difficulties (M=84.7, SD=26.3; M=67.4, SD=37.9), p=0.005. Conclusions: Orthoptic assessment for those with PSVI significantly improved perceived quality of life in numerous vision-related constructs, suggesting it is a valuable part of a patient’s post-stroke recovery.
Georgia vegetable growers produce more than 27% of the nation’s fresh-market cucumbers. To maximize yields and profit, fields must be weed-free at planting. Limitations with current burndown herbicide options motivated academic, industry, and U.S. Department of Agriculture partners to search for new tools to assist growers. One possibility, glufosinate, controls many common and troublesome weeds, but its influence on cucumber development through residual activity when applied before or at planting is not understood. Thus, four different studies were each conducted two to four times from 2017 to 2020 to determine 1) transplant cucumber response to preplant glufosinate applications as influenced by rate, overhead irrigation, and interval between application and planting; and 2) seeded cucumber response to preemergence (PRE) glufosinate applications as influenced by rate, overhead irrigation, and planting depth. Glufosinate applied at 330, 660, 980, and 1,640 g ai ha−1 the day before transplanting caused 11% to 53% injury on sandy, low organic matter soils. Cucumber vine lengths and plant biomass were reduced up to 28% and 46%, respectively, with the three highest rates. Early-season yield (harvests 1 to 4) was reduced 31% to 60% with glufosinate at 660 to 1,640 g ha−1, with similar trends observed for total yield (11 to 13 harvests). Irrigation (0.75 cm) after application and before transplanting reduced injury to less than 21%, eliminated vine length and biomass suppression except at the highest rate, and eliminated yield loss. Extending the interval between glufosinate application and transplanting from 1 to 4 d was not beneficial, and further extending the interval to 7 d significantly reduced injury only half of the time. When applied PRE to seeded cucumber, with data combined across locations, glufosinate caused less than 7% injury even at 1,640 g ha−1. Seeded plant vine lengths, biomass, and marketable yield were not influenced by the PRE application, and neither irrigation nor planting depth influenced seeded crop response to glufosinate.