In pasture-based dairy production systems, identifying the appropriate stocking rate (SR; cows/ha) for the farm's grass growth is a key strategic decision driving the overall farm business. This paper investigates a number of scenarios examining the effects of SR (2–3 cows/ha, in 0.25-unit changes), annual nitrogen (N) fertilizer application rate (0–300 kg N/ha, in 50 kg/ha changes), soil type (a heavy and a free-draining soil) and agroclimatic location (south and northeast of Ireland) across 16 years on pasture growth and forage self-sufficiency, using the pasture-based herd dynamic milk model merged with the Moorepark St Gilles grass growth model. The modelled outputs were grass growth, grass dry matter intake, silage harvested and offered, overall farm forage self-sufficiency and N surplus. The model outputs calculated that annual grass yield increased from 9436 kg DM/ha/year when 0 kg N/ha/year was applied to 14 996 kg DM/ha/year when 300 kg N/ha/year was applied, with an average N response of 18.4 kg DM/kg N applied (range 9.9–27.7 kg DM/kg N applied). Systems stocked at 2.5 cows/ha and applying 250–300 kg N fertilizer/ha/year were self-sufficient for forage. As N input was reduced below 250 kg N/ha/year, farm forage self-sufficiency declined, as did farm N surplus. The results showed that a reduction in N fertilizer application of 50 kg/ha/year will require a reduction in SR of 0.18 cows/ha to maintain self-sufficiency (R2 = 0.90).
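The headline figures above can be checked with simple arithmetic; a minimal sketch, using only the values reported in the abstract (the 0.18 cows/ha per 50 kg N/ha trade-off is the paper's fitted relationship, R2 = 0.90, reproduced here purely as a back-of-envelope illustration):

```python
# Values from the abstract; this is an illustrative check, not the model itself.
yield_0n = 9436       # kg DM/ha/year at 0 kg N/ha/year
yield_300n = 14996    # kg DM/ha/year at 300 kg N/ha/year

# Simple end-point N response (kg DM per kg N applied); the paper's mean of
# 18.4 averages over all scenarios, so the end-point estimate is only close.
n_response = (yield_300n - yield_0n) / 300
print(round(n_response, 1))  # ~18.5, near the reported mean response of 18.4

def sr_adjustment(delta_n: float) -> float:
    """Stocking-rate change (cows/ha) needed to maintain forage
    self-sufficiency when annual N fertilizer changes by delta_n kg/ha,
    using the reported 0.18 cows/ha per 50 kg N/ha relationship."""
    return 0.18 * delta_n / 50.0

# Cutting N by 100 kg/ha/year implies dropping SR by about 0.36 cows/ha.
print(round(sr_adjustment(-100), 2))
```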
If current food consumption patterns continue, the agriculture sector must provide significantly more food in the coming years from the available land area. Some livestock systems engage in feed–food competition when arable land is used to grow livestock feed rather than food crops, reducing the global supply of food. There is a growing argument that, to meet future food demands sustainably, feed–food competition must be minimized. To this end, we evaluated the effectiveness of two refined metrics for quantifying feed–food competition in three livestock systems in Ireland: dairy and its associated beef, suckler beef and pig production. The metrics are the edible protein conversion ratio (EPCR) and the land-use ratio (LUR). The EPCR compares the amount of human digestible protein (HDP) in livestock feed against the amount of HDP the livestock produce, measuring how efficiently the system produces HDP. The LUR, by contrast, compares the potential HDP from a crop system grown on the land used to produce the livestock's feed against the HDP the livestock system produced. In both metrics, a value <1 indicates an efficient system. The EPCR values indicate that the dairy beef (0.22) and suckler beef (0.29) systems are efficient producers, whereas pig production (1.51) is inefficient. The LUR values indicate that only dairy beef (0.58) is a net positive producer of HDP from the land used for its feed, with crop production producing more HDP than suckler beef (1.34) and pig production (1.73). Consequently, the LUR can be deemed more suitable for representing feed–food competition in livestock production.
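Both metrics reduce to simple ratios with a shared efficiency threshold; a minimal sketch of that logic (function names and the example input quantities are hypothetical, chosen only to reproduce ratios of the same form as those reported):

```python
def epcr(feed_hdp_kg: float, output_hdp_kg: float) -> float:
    """Edible protein conversion ratio: human-digestible protein (HDP)
    in the livestock feed divided by HDP the system produces."""
    return feed_hdp_kg / output_hdp_kg

def lur(potential_crop_hdp_kg: float, output_hdp_kg: float) -> float:
    """Land-use ratio: HDP a crop system could yield on the land used
    to grow the feed, divided by HDP the livestock system produces."""
    return potential_crop_hdp_kg / output_hdp_kg

def is_efficient(ratio: float) -> bool:
    """For both metrics, a value < 1 indicates a net producer of HDP."""
    return ratio < 1.0

# Hypothetical quantities (kg HDP), scaled to mimic the reported ratios:
print(is_efficient(epcr(22.0, 100.0)))   # ratio 0.22 -> efficient
print(is_efficient(lur(134.0, 100.0)))   # ratio 1.34 -> inefficient
```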
Early-life nutrition plays a key role in establishing healthy lifestyles and preventing chronic disease. This study aimed to (1) explore healthcare professionals’ (HCP) opinions on the acceptability of and factors influencing the delivery of interventions to promote healthy infant feeding behaviours within primary care and (2) identify proposed barriers/enablers to delivering such interventions during vaccination visits, to inform the development of a childhood obesity prevention intervention.
Design:
A qualitative study design was employed using semi-structured telephone interviews. Data were analysed using qualitative content analysis; findings were also mapped to the Theoretical Framework of Acceptability (TFA).
Setting:
Primary care in Ireland.
Participants:
Twenty-one primary care-based HCP: five practice nurses, seven general practitioners, three public health nurses, three community dietitians and three community medical officers.
Results:
The acceptability of delivering interventions to promote healthy infant feeding within primary care is influenced by the availability of resources, HCP’s roles and priorities, and factors relating to communication and relationships between HCP and parents. Proposed barriers and enablers to delivering interventions within vaccination visits include time constraints v. opportunistic access, existing relationships and trust between parents and practice nurses, and potential communication issues. Barriers/enablers mapped to TFA constructs of Affective Attitude, Perceived Effectiveness and Self-Efficacy.
Conclusions:
This study provides valuable insight into HCP perspectives on delivering prevention-focused infant feeding interventions within primary care settings. While promising, factors such as coordination and clarity of HCP roles and resource allocation need to be addressed to ensure the acceptability of interventions to the HCP involved in their delivery.
A 4-year (2010–2013) plot study was undertaken to evaluate the effect of nitrogen (N) fertilizer rate (0, 60, 120, 196 and 240 kg N/ha/year) on seasonal responses and species persistency in frequently and tightly grazed (⩽4 cm) grass-only (GO) and grass–white clover (GWc) swards. Increasing N application rate increased herbage removed and pre-grazing sward height. Cows frequently grazed the GWc tighter than the GO. Increasing N rate reduced clover content, especially during the warmest months of the year, though less so at rates up to 120 kg N/ha/year. The GWc had greater amounts of herbage removed than GO in the May–September period, but the effect diminished as N rate increased. Cumulative herbage removed from GWc was greater than from GO swards receiving the same N rate, and herbage quality was better in GWc than GO. Such effects were reduced as swards aged and with increasing N rate. It was concluded that under frequent and tight grazing management: (1) clover inclusion increased annual herbage removed; (2) herbage removed from GWc swards receiving no N was the same as from the GO sward receiving 240 kg N/ha, and was greater for the 240 GWc swards than the 240 GO swards; (3) clover inclusion benefits were mainly from summer onwards; (4) the management strategy applied in the current experiment may be capable of alleviating the detrimental effect of N fertilizer on clover, up to a point between 60 and 120 kg N/ha.
The current experiment was undertaken to investigate the effect of including white clover (Trifolium repens L.; WC) in perennial ryegrass (Lolium perenne L.; PRG) swards (PRG/WC) receiving 250 kg nitrogen (N) per hectare (ha) per year compared with PRG-only swards receiving 250 kg N/ha/year, in an intensive grass-based spring-calving dairy production scenario. Forty spring-calving cows were allocated to graze either a PRG/WC or a PRG sward (n = 20) from 6 February to 31 October 2012. Fresh herbage was offered daily (17 kg dry matter (DM)/cow) supplemented with concentrate in times of herbage deficit (total supplementation 507 kg/cow). Pre-grazing herbage mass (HM), sward WC content and milk production were measured for the duration of the experiment. Herbage DM intake was estimated in May, July and September. Pre-grazing HM (±s.e.) was similar (1467 ± 173·1 kg DM/ha) for both treatments, as was cumulative herbage production (14 158 ± 769 kg DM/ha). Average WC content of the PRG/WC swards was 236 ± 30 g/kg DM. The PRG/WC cows had greater average daily milk yield and milk solids yield from June onwards. Cumulative milk yield and milk solids yield were greater for the PRG/WC cows compared with the PRG cows (5048 and 4789 ± 34·3 kg milk yield/cow, and 400 and 388 ± 1·87 kg milk solids/cow, respectively). Cows had similar DM intake in all measurement periods (15·1 ± 0·42 kg DM/cow/day). In conclusion, including WC in N-fertilized PRG swards increased milk production from cows grazing the PRG/WC swards compared with PRG, particularly in the second half of the lactation.
To measure the trends in traditional marine food intake and serum vitamin D levels in Alaska Native women of childbearing age (20–29 years old) from the 1960s to the present.
Design
We measured a biomarker of traditional food intake, the δ15N value, and vitamin D level, as 25-hydroxycholecalciferol (25(OH)D3) concentration, in 100 serum samples from 20–29-year-old women archived in the Alaska Area Specimen Bank, selecting twenty-five per decade from the 1960s to the 1990s. We compared these with measurements of red-blood-cell δ15N values and serum 25(OH)D3 concentrations from 20–29-year-old women from the same region collected during the 2000s and 2010s in a Center for Alaska Native Health Research study.
Setting
The Yukon Kuskokwim Delta region of south-west Alaska.
Subjects
Alaska Native women (n 319) aged 20–29 years at the time of specimen collection.
Results
Intake of traditional marine foods, as measured by serum δ15N values, decreased significantly each decade from the 1960s through the 1990s, then remained constant from the 1990s through the present (F(5,306) = 77·4, P < 0·0001). Serum vitamin D concentrations also decreased from the 1960s to the present (F(4,162) = 26·1, P < 0·0001).
Conclusions
Consumption of traditional marine foods by young Alaska Native women dropped significantly between the 1960s and the 1990s and was associated with a significant decline in serum vitamin D concentrations. Studies are needed to evaluate the promotion of traditional marine foods and routine vitamin D supplementation during pregnancy for this population.
To evaluate the clinical effectiveness and cost-effectiveness of Access to Psychological Services Ireland (APSI), a primary care adult psychology service.
Methods
A repeated measures design was used to evaluate the clinical outcomes of service users who completed an intervention. Psychological distress, depressive symptomatology and anxiety symptomatology were measured using the Clinical Outcomes in Routine Evaluation–Outcome Measure (CORE-OM), the Patient Health Questionnaire-9 (PHQ-9) and the Generalised Anxiety Disorder-7 (GAD-7), respectively. Self-reported health and economic outcomes were measured using the EQ-5D-3L and the Eco-Psy, respectively.
Results
A total of 381 adults were assessed as suitable for an APSI intervention, with 198 (52%) of these completing at least one intervention. Significant reductions in psychological distress were observed for completers of guided self-help and brief cognitive behavioural therapy, with service users also showing significant reductions in anxiety and depressive symptomatology. Reliable and clinically significant change on the CORE-OM was observed for 67.9% of treatment completers. Service users reported significant improvements in their health status but did not show changes in their health service usage in the 3-month follow-up period.
Conclusions
APSI provided an accessible service model that was clinically effective in managing a range of mild to moderate mental health difficulties. The cost-effectiveness of the service model may be enhanced by offering a wider range of high-throughput interventions and by increasing the treatment completion rate.
To validate and evaluate a short answer question paper and objective structured clinical examination. Validity and effect on overall performance were considered.
Methods:
Students completed a voluntary short answer question paper during their otolaryngology attachment. Short answer question paper results were collated and compared with the essay examination and the new end-of-year objective structured clinical examination.
Results:
The study comprised 160 students. Questions were validated for internal consistency (Cronbach's alpha = 0.76). Correlations were determined for: short answer question paper and essay results (r = 0.477), short answer question paper and objective structured clinical examination results (r = 0.355), and objective structured clinical examination and essay results (r = 0.292). On unpaired t-tests comparing the short answer question paper group and non-short answer question paper group, essay results were 1.2 marks higher (p = 0.45) and the objective structured clinical examination results were 0.09 marks lower (p = 0.74) in the short answer question paper group.
Conclusion:
Two new valid summative assessments of student ability have been introduced, which contribute to an enhanced programme of assessment to drive student learning.
The superheating that usually occurs when a solid is melted by volumetric heating can produce irregular solid–liquid interfaces. Such interfaces can be visualised in ice, where they are sometimes known as Tyndall stars. This paper describes some of the experimental observations of Tyndall stars and a mathematical model for the early stages of their evolution. The modelling is complicated by the strong crystalline anisotropy, which results in an anisotropic kinetic undercooling at the interface; it leads to an interesting class of free boundary problems that treat the melt region as infinitesimally thin.
In agricultural production systems, nitrogen (N) losses to the environment can occur through nitrous oxide (N2O) emissions and nitrate (NO3−) leaching. The objectives of the present study were to evaluate: (1) whether urine excreted by non-lactating dairy cows pulse-dosed with dicyandiamide (DCD) and applied to lysimeters reduced N2O-N emissions and NO3−-N leaching on two soil types; and (2) whether urine + DCD would increase herbage production over winter. Lysimeters were used to measure N2O emissions and NO3−-N leaching. The soils used were a free-draining acid brown earth of sandy loam to loam texture (termed free-draining) and a poorly drained silt loam gley (termed poorly drained). Grass plots were established on the free-draining soil to measure herbage production. The N loading rate of the urine + DCD was 508 kg N/ha and that of the urine without DCD (urine only) was 451 kg N/ha. Total NO3−-N leaching losses from the free-draining and poorly drained soils were reduced from 100 and 81 kg NO3−-N/ha on the urine-only treatment, respectively, to 9 and 11·6 kg NO3−-N/ha on the urine + DCD treatment, respectively. Total N2O-N emissions from the free-draining and poorly drained soils were reduced significantly from 13·6 and 12·1 kg N2O-N/ha on the urine-only treatment, respectively, to 2·23 and 5·24 kg N2O-N/ha on the urine + DCD treatment, respectively. Applying urine with DCD to pastures inhibited the nitrification process for up to 56 days after treatment application. In the current experiment, there was no significant effect on spring herbage production when urine + DCD was applied to grass plots. Therefore, feeding DCD to dairy cows to deliver DCD directly in urine patches was shown to be an effective mitigation strategy to reduce NO3−-N leaching and N2O-N emissions, but did not appear to increase spring herbage production.
Historically, American Indian/Alaska Native (AI/AN) populations have suffered excess morbidity and mortality from influenza. We investigated the risk factors for death from 2009 pandemic influenza A(H1N1) in persons residing in five states with substantial AI/AN populations. We conducted a case-control investigation using pandemic influenza fatalities from 2009 in Alaska, Arizona, New Mexico, Oklahoma and Wyoming. Controls were outpatients with influenza. We reviewed medical records and interviewed case proxies and controls. We used multiple imputation to predict missing data and multivariable conditional logistic regression to determine risk factors. We included 145 fatal cases and 236 controls; 22% of cases were AI/AN. Risk factors (P < 0·05) included: older age [adjusted matched odds ratio (mOR) 3·2, for >45 years vs. <18 years], pre-existing medical conditions (mOR 7·1), smoking (mOR 3·0), delayed receipt of antivirals (mOR 6·5), and barriers to healthcare access (mOR 5·3). AI/AN race was not significantly associated with death. The increased influenza mortality in AI/AN individuals was due to factors other than racial status. Prevention of influenza deaths should focus on modifiable factors (smoking, early antiviral use, access to care) and identifying high-risk persons for immunization and prompt medical attention.
Because of the discretionary nature of voluntary food fortification in the European Union, there is a need to monitor fortification practices and consumption of fortified foods in order to assess the efficacy and safety of such additions on an ongoing basis. The present study aimed to investigate the nutritional impact of changes in voluntary fortification practices in adults aged 18–64 years using dietary intake data from two nationally representative cross-sectional food consumption surveys, the North/South Ireland Food Consumption Survey (NSIFCS) (1997–9) and the National Adult Nutrition Survey (NANS) (2008–10). The supply of fortified foods increased between 1997–9 and 2008–10, resulting in a higher proportion of adults consuming fortified foods (from 67 to 82 %) and a greater contribution to mean daily energy intake (from 4·6 to 8·4 %). The overall nutrient profile of fortified foods consumed remained favourable, i.e. higher in starch and dietary fibre and lower in fat and saturated fat, with polyunsaturated fat, sugars and Na in proportion to energy. Women, particularly those of childbearing age, remained the key beneficiaries of voluntary fortification practices in Ireland. Continued voluntary fortification of foods has increased folic acid protection against neural tube defect-affected pregnancy and maintained the beneficial impact on the adequacy of Fe intake. Increased consumption of fortified foods did not contribute to an increased risk of intakes exceeding the tolerable upper intake level for any micronutrient. Recent increases in voluntary fortification of foods in Ireland have made a favourable nutritional impact on the diets of adults and have not contributed to an increased risk of adverse effects.