Although radiocarbon accelerator mass spectrometry (14C-AMS) is an important tool for establishing soil chronologies, its application is challenging because of the complex nature of soil samples. In the present study, chemical extraction methodologies were tested to obtain the most representative age of Amazonian soil deposition by 14C-AMS. We performed acid hydrolysis with different numbers of extractions, as well as treatments combining acids and bases, and compared quartered and non-quartered samples. The ages of the soil organic matter (SOM) fractions were compared with the ages of naturally buried charcoal samples at similar depths. The results showed that the age of the non-hydrolyzable inert fraction of the soil was closer to the age of the charcoal and older than that of the humin fraction. It was also observed that the quartering process can influence the outcome, since dating of the humin fraction showed variable results. Our results provide information about the most suitable method for 14C-AMS dating of soil samples in paleoenvironmental reconstruction studies.
With a growing global population and climate change, achieving food security is a pressing challenge(1). Vertical farming has the potential to support local food production and security. In the UK population, females and younger adults appear to be particularly vulnerable to micronutrient shortfalls from food sources alone, with intakes of micronutrients including zinc and iron falling below the recommended daily intake(2). As a Total Controlled Environment Agriculture (TCEA) system, vertical farming employs hydroponics using a nutrient solution, which offers opportunities to modulate nutrient uptake and thus influence plant mineral and vitamin composition(3).
In this study we aimed to determine the suitability of different crop types for soilless agronomic biofortification with zinc and iron.
We investigated the effect of adding 20 ppm (+20 mg L−1) of zinc (ZnSO4) or iron (Fe-EDTA) to the nutrient solution on growth and nutritional components in pea microgreens, kale microgreens and kale baby leaf plants. Growth conditions were kept identical across treatments, with a photoperiod of 18 h d−1, a temperature of 20–22°C and relative humidity of 70–80%. Plant growth, mineral composition, glucosinolate content and protein content were evaluated. Results were analysed using ANOVA (p<0.05, Tukey’s test).
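As a purely illustrative sketch of the analysis described above, the following Python code runs a one-way ANOVA followed by Tukey's HSD test at p<0.05; the treatment labels, sample sizes and tissue-zinc values are hypothetical placeholders, not data from this study.

```python
# Hedged sketch: one-way ANOVA followed by Tukey's HSD (alpha = 0.05).
# Column names and values below are illustrative, not the study's dataset.
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical tissue-zinc measurements (mg/kg dry weight) per treatment
df = pd.DataFrame({
    "treatment": ["control"] * 6 + ["zinc"] * 6 + ["iron"] * 6,
    "tissue_zn": [55, 58, 60, 57, 54, 59,
                  170, 182, 175, 168, 179, 173,
                  57, 61, 56, 58, 60, 55],
})

groups = [g["tissue_zn"].values for _, g in df.groupby("treatment")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

# Pairwise comparisons with Tukey's HSD
tukey = pairwise_tukeyhsd(endog=df["tissue_zn"], groups=df["treatment"], alpha=0.05)
print(tukey.summary())
```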
Higher amounts of zinc in the nutrient solution resulted in significantly higher levels of zinc in all three crops (p<0.05), with increases of 205% in pea microgreens, 264% in baby leaf kale and 217% in kale microgreens compared with control plants. Higher amounts of iron in the nutrient solution resulted in significantly higher levels of iron only in pea microgreens, with an increase of 38% (p<0.05). Neither dosing regimen negatively influenced overall crop performance.
These results suggest that all three crops are suitable for soilless biofortification with zinc and iron, although pea microgreens were the only crop to show a significant increase in iron upon iron dosing.
Objective:
To evaluate the association between provider-ordered viral testing and antibiotic treatment practices among children discharged from an ED or hospitalized with an acute respiratory infection (ARI).
Design:
Active, prospective ARI surveillance study from November 2017 to February 2020.
Setting:
Pediatric hospital and emergency department in Nashville, Tennessee.
Participants:
Children aged 30 days to 17 years seeking medical care for fever and/or respiratory symptoms.
Methods:
Antibiotics prescribed during the child’s ED visit or administered during hospitalization were categorized as (1) none administered, (2) narrow-spectrum or (3) broad-spectrum. Setting-specific models were built using unconditional polytomous logistic regression with robust sandwich estimators to estimate adjusted odds ratios (aORs) and 95% confidence intervals for the associations of provider-ordered viral testing (ie, tested versus not tested) and viral test result (ie, positive test versus not tested and negative test versus not tested) with the three-level antibiotic administration outcome.
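For illustration only, the sketch below fits an unconditional polytomous (multinomial) logistic regression with a heteroscedasticity-robust "sandwich" covariance in Python, assuming statsmodels' MNLogit accepts a robust covariance via cov_type; the variable coding, covariates and simulated data are assumptions for the example and do not reproduce the study's models.

```python
# Hedged sketch: multinomial logistic regression with robust covariance,
# loosely mirroring the modelling approach described above.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    # 0 = not tested, 1 = tested-negative, 2 = tested-positive (illustrative coding)
    "viral_test": rng.integers(0, 3, n),
    "age_years": rng.uniform(0.1, 17, n),
})
# Outcome: 0 = no antibiotic, 1 = narrow-spectrum, 2 = broad-spectrum (illustrative)
df["antibiotic"] = rng.integers(0, 3, n)

X = sm.add_constant(
    pd.get_dummies(df["viral_test"], prefix="test", drop_first=True)
      .join(df["age_years"])
      .astype(float)
)
model = sm.MNLogit(df["antibiotic"], X)
# Robust (sandwich) covariance requested via cov_type="HC0"
result = model.fit(cov_type="HC0", disp=False)
print(result.summary())
# Exponentiated coefficients give adjusted odds ratios
print(np.exp(result.params))
```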
Results:
4,107 children were enrolled and tested, of whom 2,616 (64%) were seen in the ED and 1,491 (36%) were hospitalized. In the ED, children who received a provider-ordered viral test had 25% lower odds (aOR: 0.75; 95% CI: 0.54, 0.98) of receiving a narrow-spectrum antibiotic during their visit compared with those without testing. In the inpatient setting, children with a negative provider-ordered viral test had 57% higher odds (aOR: 1.57; 95% CI: 1.01, 2.44) of being administered a broad-spectrum antibiotic compared with children without testing.
Conclusions:
In our study, the impact of provider-ordered viral testing on antibiotic practices differed by setting. Additional studies evaluating the influence of viral testing on antibiotic stewardship and antibiotic prescribing practices are needed.
Social connection is associated with better health, including reduced risk of dementia. Personality traits are also linked to cognitive outcomes; neuroticism, in particular, is associated with increased risk of dementia. Personality traits and social connection are also associated with each other. Taken together, this evidence suggests that the potential impacts of neuroticism and social connection on cognitive outcomes may be linked. However, very few studies have simultaneously examined the relationships between personality, social connection and health.
Research objective:
We tested the association between neuroticism and cognitive measures while exploring the potential mediating roles of aspects of social connection (loneliness and social isolation).
Method:
We conducted a cross-sectional study with a secondary analysis of the Canadian Longitudinal Study on Aging (CLSA) Comprehensive Cohort, a sample of Canadians aged 45 to 85 years at baseline. We used only self-reported data collected at the first follow-up, between 2015 and 2018 (n = 27,765). We used structural equation modelling to assess the association between neuroticism (exposure) and six cognitive measures (Rey Auditory Verbal Learning Test immediate recall and delayed recall, Animal Fluency Test, Mental Alternation Test, Controlled Oral Word Association Test and Stroop Test interference ratio), with direct and indirect effects (through social isolation and loneliness). We included age, education and hearing in the models and stratified all analyses by sex (females, n = 14,133; males, n = 13,632).
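As a rough, simplified stand-in for the structural equation models described above, the sketch below estimates direct and indirect (mediated) effects with a product-of-coefficients approach using ordinary least squares; the variable names, covariate set and simulated data are hypothetical, and the study's full SEM with six cognitive outcomes is not reproduced here.

```python
# Hedged sketch: simplified product-of-coefficients mediation analysis.
# Data and variable names are illustrative placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({"neuroticism": rng.normal(size=n),
                   "age": rng.uniform(45, 85, n)})
# Simulated mediator and outcome with plausible signs (illustrative only)
df["loneliness"] = 0.4 * df["neuroticism"] + rng.normal(size=n)
df["memory_score"] = -0.2 * df["neuroticism"] - 0.3 * df["loneliness"] + rng.normal(size=n)

# Path a: exposure -> mediator; paths b and c': mediator and exposure -> outcome
m_a = smf.ols("loneliness ~ neuroticism + age", data=df).fit()
m_bc = smf.ols("memory_score ~ neuroticism + loneliness + age", data=df).fit()

a = m_a.params["neuroticism"]
b = m_bc.params["loneliness"]
direct = m_bc.params["neuroticism"]
indirect = a * b
print(f"direct effect (c'): {direct:.3f}")
print(f"indirect effect (a*b): {indirect:.3f}")
print(f"total effect: {direct + indirect:.3f}")
```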
Preliminary results of the ongoing study:
We found positive, statistically significant associations between neuroticism and social isolation (p<0.05) and loneliness (p<0.05), for both males and females. We also found inverse, statistically significant associations between neuroticism and all cognitive measures (p<0.05), except the Stroop Test interference ratio. In these models, there was consistent evidence of indirect effects (through social isolation and loneliness) and, in some cases, evidence of direct effects. We found sex differences in the model results.
Conclusion:
Our findings suggest that the association between neuroticism and cognitive outcomes may be mediated by aspects of social connection and differ by sex. Understanding if and how modifiable risk factors mediate the association between personality and cognitive outcomes would help develop and target intervention strategies that improve social connection and brain health.
Objective:
To compare the agreement and cost of two recall methods for estimating children’s minimum dietary diversity (MDD).
Design:
We assessed each child’s dietary intake on two consecutive days: an observation on day one, followed by two recall methods (list-based recall and multiple-pass recall) administered in random order by different enumerators at two different times on day two. We compared the estimated MDD prevalence using survey-weighted linear probability models following a two one-sided tests (TOST) equivalence-testing approach. We also estimated the cost-effectiveness of the two methods.
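The following minimal sketch illustrates a TOST equivalence check for a difference in MDD prevalence between a recall method and the observation; it uses unweighted proportions, and the counts and the ±10 percentage-point equivalence margin are assumptions for the example rather than the study's survey-weighted models.

```python
# Hedged sketch: two one-sided tests (TOST) for equivalence of two proportions.
# Counts and margin are illustrative, and survey weights are ignored for simplicity.
import numpy as np
from scipy import stats

def tost_proportions(x1, n1, x2, n2, margin=0.10, alpha=0.05):
    """TOST for equivalence of two proportions within +/- margin."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = np.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    # One-sided z-tests against the lower and upper equivalence bounds
    z_lower = (diff + margin) / se
    z_upper = (diff - margin) / se
    p_lower = 1 - stats.norm.cdf(z_lower)   # H0: diff <= -margin
    p_upper = stats.norm.cdf(z_upper)       # H0: diff >= +margin
    p_tost = max(p_lower, p_upper)
    return diff, p_tost, p_tost < alpha

# Hypothetical example: recall-based vs observed MDD prevalence
diff, p, equivalent = tost_proportions(x1=410, n1=636, x2=395, n2=636)
print(f"difference = {diff:.3f}, TOST p = {p:.4f}, equivalent: {equivalent}")
```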
Setting:
Cambodia (Kampong Thom, Siem Reap, Battambang, and Pursat provinces) and Zambia (Chipata, Katete, Lundazi, Nyimba, and Petauke districts).
Participants:
Children aged 6–23 months: 636 in Cambodia and 608 in Zambia.
Results:
MDD estimations from both recall methods were equivalent to the observation in Cambodia but not in Zambia. Both methods were equivalent to the observation in capturing most food groups. Both methods were highly sensitive, although the multiple-pass method accurately classified a higher proportion of children meeting MDD than the list-based method in both countries. Both methods were highly specific in Cambodia but only moderately so in Zambia. Cost-effectiveness was better for the list-based recall method in both countries.
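As a small illustration of the sensitivity and specificity comparisons summarised above, the sketch below computes both measures for a recall method against the observation as the reference standard; the classification arrays are simulated placeholders, not study data.

```python
# Hedged sketch: sensitivity and specificity of a recall method against the
# direct observation (True = child met MDD). Arrays are simulated.
import numpy as np

rng = np.random.default_rng(2)
observed = rng.random(600) < 0.6              # reference classification from observation
recall = np.where(rng.random(600) < 0.9,      # recall mostly agrees with observation,
                  observed, ~observed)        # with ~10% misclassification

tp = np.sum(recall & observed)
tn = np.sum(~recall & ~observed)
fp = np.sum(recall & ~observed)
fn = np.sum(~recall & observed)

sensitivity = tp / (tp + fn)   # proportion of MDD-meeting children correctly identified
specificity = tn / (tn + fp)   # proportion of non-MDD children correctly classified
print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```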
Conclusion:
The two recall methods estimated MDD and most other infant and young child feeding indicators equivalently in Cambodia but not in Zambia, compared to the observation. The list-based method produced slightly more accurate estimates of MDD at the population level, took less time to administer and was less costly to implement.
Phase three trials of the monoclonal antibodies lecanemab and donanemab, which target brain amyloid, have reported statistically significant differences in clinical end-points in early Alzheimer's disease. These drugs are already in use in some countries and are going through the regulatory approval process for use in the UK. Concerns have been raised about the ability of healthcare systems, including those in the UK, to deliver these treatments, considering the resources required for their administration and monitoring.
Aims
To estimate the scale of real-world demand for monoclonal antibodies for Alzheimer's disease in the UK.
Method
We used anonymised patient record databases from two National Health Service trusts for the year 2019 to collect clinical, demographic, cognitive and neuroimaging data for people referred to these services. Eligibility for treatment was assessed using the inclusion criteria from the clinical trials of donanemab and lecanemab, with consideration given to diagnosis, cognitive performance, cerebrovascular disease and willingness to receive treatment.
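As an illustrative sketch of this kind of eligibility screening, the code below applies a set of trial-style criteria to a small patient-record table; the column names, cognitive-score range and example records are hypothetical and are not the donanemab or lecanemab trial criteria.

```python
# Hedged sketch: filtering a record table on diagnosis, cognitive performance,
# cerebrovascular disease and willingness to receive treatment. All values
# and thresholds are illustrative placeholders.
import pandas as pd

records = pd.DataFrame({
    "diagnosis": ["MCI", "mild_AD", "moderate_AD", "mild_AD"],
    "mmse": [27, 23, 15, 24],                       # hypothetical cognitive scores
    "significant_cvd": [False, False, True, False], # cerebrovascular disease flag
    "willing_to_treat": [True, True, True, False],
})

eligible = records[
    records["diagnosis"].isin(["MCI", "mild_AD"])
    & records["mmse"].between(22, 30)
    & ~records["significant_cvd"]
    & records["willing_to_treat"]
]
print(f"{len(eligible)} of {len(records)} records meet the illustrative criteria")
```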
Results
We examined the records of 82 386 people referred to services covering around 2.2 million people. After applying the trial criteria, we estimate that a maximum of 906 people per year would start treatment with monoclonal antibodies in the two services, equating to 30 200 people if extrapolated nationally.
Conclusions
Monoclonal antibody treatments for Alzheimer's disease are likely to present a significant delivery challenge for healthcare services in terms of the neuroimaging and treatment administration required. The data provided here allow health services to understand the potential demand and plan accordingly.
Primary surgical resection remains the mainstay of management in locally advanced differentiated thyroid cancer. Tyrosine kinase inhibitors have recently shown promising results in patients with recurrent locally advanced differentiated thyroid cancer. This study discusses four patients with locally advanced differentiated thyroid cancer managed with tyrosine kinase inhibitors prior to surgery in the ‘neoadjuvant’ setting.
Method
Prospective data collection through a local thyroid database from February 2016 identified four patients with locally advanced differentiated thyroid cancer who were unsuitable for primary surgical resection and were commenced on neoadjuvant tyrosine kinase inhibitor therapy.
Results
All cases had T4a disease at presentation. Three cases tolerated tyrosine kinase inhibitor therapy for more than 14 months, while the fourth case failed to tolerate treatment at 1 month. All patients subsequently underwent total thyroidectomy to facilitate adjuvant radioactive iodine treatment. Disease-specific survival currently remains at 100 per cent (follow-up range, 29–75 months).
Conclusion
Neoadjuvant tyrosine kinase inhibitors in locally advanced differentiated thyroid cancer can be effective in reducing primary tumour extent to potentially facilitate a more limited surgical resection for local disease control.
Major Depressive Disorder (MDD) is prevalent, often chronic, and requires ongoing monitoring of symptoms to track response to treatment and identify early indicators of relapse. Remote Measurement Technologies (RMT) provide an exciting opportunity to transform the measurement and management of MDD, via data collected from inbuilt smartphone sensors and wearable devices alongside app-based questionnaires and tasks.
Objectives
To describe the amount of data collected during a multimodal longitudinal RMT study in an MDD population.
Methods
RADAR-MDD is a multi-centre, prospective observational cohort study. People with a history of MDD were provided with a wrist-worn wearable device and several apps designed to (a) collect data from smartphone sensors and (b) deliver questionnaires, speech tasks and cognitive assessments, and were followed up for a maximum of 2 years.
Results
A total of 623 individuals with a history of MDD were enrolled in the study, with 80% completion rates for primary outcome assessments across all timepoints. 79.8% of participants took part for the maximum time available and 20.2% withdrew prematurely. Data availability across RMT data types varied depending on the source of the data and the participant burden for each data type. We found no evidence of an association between the severity of depression symptoms at baseline and data availability. 110 participants had >50% data available across all data types and were thus able to contribute to multiparametric analyses.
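The sketch below illustrates how the ">50% data available across all data types" criterion might be computed from a participant-by-data-type availability table; the table, its columns and the simulated availability values are hypothetical.

```python
# Hedged sketch: counting participants with > 50% availability in every RMT
# data type. The availability table is simulated, not study data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
availability = pd.DataFrame(
    rng.uniform(0, 1, size=(623, 4)),
    columns=["wearable", "phone_sensors", "questionnaires", "speech_tasks"],
)
complete_enough = (availability > 0.5).all(axis=1)
print(f"{complete_enough.sum()} of {len(availability)} participants have > 50% "
      "availability across all data types")
```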
Conclusions
RADAR-MDD is the largest multimodal RMT study in the field of mental health. Here, we have shown that collecting RMT data from a clinical population is feasible.
Monoclonal antibody therapeutics to treat coronavirus disease (COVID-19) have been authorized by the US Food and Drug Administration under Emergency Use Authorization (EUA). Many barriers exist when deploying a novel therapeutic during an ongoing pandemic, and it is critical to assess what is needed to incorporate monoclonal antibody infusions into pandemic response activities. We examined the monoclonal antibody infusion site process during the COVID-19 pandemic and conducted a descriptive analysis using data from 3 sites at medical centers in the United States supported by the National Disaster Medical System. Monoclonal antibody implementation success factors included engagement with local medical providers, therapy batch preparation, placing the infusion center in proximity to emergency services, and creating procedures resilient to EUA changes. Infusion process challenges included confirming patient severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) positivity, strained staffing, scheduling, and pharmacy coordination. Infusion sites are effective when integrated into pre-existing pandemic response ecosystems and can be implemented with limited staff and physical resources.
The Variables and Slow Transients Survey (VAST) on the Australian Square Kilometre Array Pathfinder (ASKAP) is designed to detect highly variable and transient radio sources on timescales from 5 s to $\sim\!5$ yr. In this paper, we present the survey description, observation strategy and initial results from the VAST Phase I Pilot Survey. This pilot survey consists of $\sim\!162$ h of observations conducted at a central frequency of 888 MHz between 2019 August and 2020 August, with a typical rms sensitivity of $0.24\ \mathrm{mJy\ beam}^{-1}$ and angular resolution of $12-20$ arcseconds. There are 113 fields, each of which was observed for 12 min integration time, with between 5 and 13 repeats, with cadences between 1 day and 8 months. The total area of the pilot survey footprint is 5 131 square degrees, covering six distinct regions of the sky. An initial search of two of these regions, totalling 1 646 square degrees, revealed 28 highly variable and/or transient sources. Seven of these are known pulsars, including the millisecond pulsar J2039–5617. Another seven are stars, four of which have no previously reported radio detection (SCR J0533–4257, LEHPM 2-783, UCAC3 89–412162 and 2MASS J22414436–6119311). Of the remaining 14 sources, two are active galactic nuclei, six are associated with galaxies and the other six have no multi-wavelength counterparts and are yet to be identified.
The Rapid ASKAP Continuum Survey (RACS) is the first large-area survey to be conducted with the full 36-antenna Australian Square Kilometre Array Pathfinder (ASKAP) telescope. RACS will provide a shallow model of the ASKAP sky that will aid the calibration of future deep ASKAP surveys. RACS will cover the whole sky visible from the ASKAP site in Western Australia and will cover the full ASKAP band of 700–1800 MHz. The RACS images are generally deeper than the existing NRAO VLA Sky Survey and Sydney University Molonglo Sky Survey radio surveys and have better spatial resolution. All RACS survey products will be public, including radio images (with $\sim$ 15 arcsec resolution) and catalogues of about three million source components with spectral index and polarisation information. In this paper, we present a description of the RACS survey and the first data release of 903 images covering the sky south of declination $+41^\circ$ made over a 288-MHz band centred at 887.5 MHz.
Over the past decade, a growing interest has developed in the archaeology, palaeontology, and palaeoenvironments of the Arabian Peninsula. It is now clear that hominins repeatedly dispersed into Arabia, notably during pluvial interglacial periods when much of the peninsula was characterised by a semiarid grassland environment. During the intervening glacial phases, however, grasslands were replaced with arid and hyperarid deserts. These millennial-scale climatic fluctuations have subjected bones and fossils to a dramatic suite of environmental conditions, affecting their fossilisation and preservation. Yet, as relatively few palaeontological assemblages have been reported from the Pleistocene of Arabia, our understanding of the preservational pathways that skeletal elements can take in these types of environments is lacking. Here, we report the first widespread taxonomic and taphonomic assessment of Arabian fossil deposits. Novel fossil fauna are described, and overall the fauna are consistent with a well-watered semiarid grassland environment. Likewise, the taphonomic results suggest that bones were deposited under more humid conditions than are present in the region today. However, the fossils often exhibit significant attrition, which obscures and fragments most finds; this is likely tied to wind abrasion, insolation and salt weathering following fossilisation and exhumation, processes particularly prevalent in desert environments.
This article involved a broad search of the applied sciences for milestone technologies that we deem the most significant innovations applied by the North American pork industry during the past 10 to 12 years. Several innovations shifted the trajectory of improvement or resolved significant production limitations. Each is being integrated into practice, with the exception of gene editing technology, which is undergoing the federal approval process. Advances in molecular genomics have been applied to gene editing for control of porcine reproductive and respiratory syndrome and to identify piglet genome contributions from each parent. Post-cervical artificial insemination technology is not novel, but it is now used extensively to accelerate the rate of genetic progress. A milestone was achieved with the discovery that dietary essential fatty acids during lactation were limiting reproduction. Their provision resulted in a dose-related response for pregnancy, pregnancy maintenance and litter size, especially in maturing sows, and ultimately resolved seasonal infertility. The benefit of segregated early weaning (12 to 14 days of age) was realized for specific pathogen removal in genetic nucleus and multiplication herds. Application was premature for commercial practice, as piglet mortality and morbidity increased. Early weaning impairs intestinal barrier and mucosal innate immune development, which coincides with diminished resilience to pathogens and viability later in life. Two important milestones were achieved to improve precision nutrition for growing pigs. The first involved the updated publication of the National Research Council nutrient requirements for pigs, a collaboration between scientists from America and Canada. Precision nutrition advanced further when ingredient descriptions for metabolically available amino acids and net energy (by source plant) became a private-sector nutrition product. The past decade also led to fortuitous discoveries of health-improving components in ingredients (xylanase, soybeans). Finally, two technologies converged to facilitate timely detection of multiple pathogens in a population: oral fluid sampling and polymerase chain reaction (PCR) for pathogen analysis. Most critical diseases in North America are now routinely monitored by oral fluid sampling and prepared for analysis using PCR methods.
Objective
To evaluate the long-term efficacy of deutetrabenazine in patients with tardive dyskinesia (TD) by examining response rates from baseline in Abnormal Involuntary Movement Scale (AIMS) scores. Preliminary results of the responder analysis are reported here.
Background
In the 12-week ARM-TD and AIM-TD studies, the odds of response to deutetrabenazine treatment were higher than the odds of response to placebo at all response levels, and there were low rates of overall adverse events and discontinuations associated with deutetrabenazine.
Method
Patients with TD who completed ARM-TD or AIM-TD were included in this open-label, single-arm extension study, in which all patients restarted/started deutetrabenazine 12 mg/day, titrating up to a maximum total daily dose of 48 mg/day based on dyskinesia control and tolerability. The study comprised a 6-week titration period and a long-term maintenance phase. The cumulative proportion of AIMS responders from baseline was assessed. Response was defined as a percent improvement from baseline for each patient, from 10% to 90% in 10% increments. AIMS scores were assessed by local site ratings for this analysis.
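As an illustration of the responder definition above, the sketch below computes the proportion of patients reaching each 10% improvement threshold from baseline AIMS scores; the scores are simulated placeholders, not trial data.

```python
# Hedged sketch: cumulative proportion of responders at 10%-90% improvement
# thresholds. Baseline and follow-up AIMS totals are simulated.
import numpy as np

rng = np.random.default_rng(3)
baseline = rng.integers(6, 20, size=150).astype(float)    # AIMS total at baseline
followup = baseline * rng.uniform(0.2, 1.1, size=150)     # AIMS total at follow-up

pct_improvement = (baseline - followup) / baseline * 100

for threshold in range(10, 100, 10):
    responders = np.mean(pct_improvement >= threshold)
    print(f">= {threshold}% improvement: {responders:.0%} of patients")
```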
Results
343 patients enrolled in the extension study (111 patients received placebo in the parent study and 232 patients received deutetrabenazine). At Week 54 (n=145; total daily dose [mean ± standard error]: 38.1 ± 0.9 mg), 63% of patients receiving deutetrabenazine achieved ≥30% response, 48% of patients achieved ≥50% response, and 26% achieved ≥70% response. At Week 80 (n=66; total daily dose: 38.6 ± 1.1 mg), 76% of patients achieved ≥30% response, 59% of patients achieved ≥50% response, and 36% achieved ≥70% response. Treatment was generally well tolerated.
Conclusions
Patients who received long-term treatment with deutetrabenazine achieved response rates higher than those observed in positive short-term studies, indicating clinically meaningful long-term treatment benefit.
Presented at: American Academy of Neurology Annual Meeting; April 21–27, 2018, Los Angeles, California, USA.
Funding Acknowledgements: This study was supported by Teva Pharmaceuticals, Petach Tikva, Israel.
Objective
To evaluate the long-term safety and tolerability of deutetrabenazine in patients with tardive dyskinesia (TD) at 2 years.
Background
In the 12-week ARM-TD and AIM-TD studies, deutetrabenazine showed clinically significant improvements in Abnormal Involuntary Movement Scale scores compared with placebo, and there were low rates of overall adverse events (AEs) and discontinuations associated with deutetrabenazine.
Method
Patients who completed ARM-TD or AIM-TD were included in this open-label, single-arm extension study, in which all patients restarted/started deutetrabenazine 12 mg/day, titrating up to a maximum total daily dose of 48 mg/day based on dyskinesia control and tolerability. The study comprised a 6-week titration period and a long-term maintenance phase. Safety measures included incidence of AEs, serious AEs (SAEs), and AEs leading to withdrawal, dose reduction, or dose suspension. Exposure-adjusted incidence rates (EAIRs; incidence/patient-years) were used to compare AE frequencies for long-term treatment with those for short-term treatment (ARM-TD and AIM-TD). This analysis reports results up to 2 years (Week 106).
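For clarity, the sketch below shows the exposure-adjusted incidence rate calculation defined above (incidence divided by patient-years of exposure); the event count is hypothetical, while the 331.4 patient-years figure is taken from the results reported below.

```python
# Hedged sketch: exposure-adjusted incidence rate (EAIR).
# The event count is illustrative; patient-years come from the reported results.
def exposure_adjusted_incidence_rate(n_events: int, patient_years: float) -> float:
    """EAIR = incidence / patient-years of exposure."""
    return n_events / patient_years

# Example: 30 patients with an adverse event over 331.4 patient-years of exposure
print(f"EAIR = {exposure_adjusted_incidence_rate(30, 331.4):.3f} per patient-year")
```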
Results
343 patients were enrolled (111 patients received placebo in the parent study and 232 received deutetrabenazine). There were 331.4 patient-years of exposure in this analysis. Through Week 106, EAIRs of AEs were comparable to or lower than those observed with short-term deutetrabenazine and placebo, including AEs of interest (akathisia/restlessness [long-term EAIR: 0.02; short-term EAIR range: 0–0.25], anxiety [0.09; 0.13–0.21], depression [0.09; 0.04–0.13], diarrhea [0.06; 0.06–0.34], parkinsonism [0.01; 0–0.08], somnolence/sedation [0.09; 0.06–0.81], and suicidality [0.02; 0–0.13]). The frequency of SAEs (EAIR 0.15) was similar to those observed with short-term placebo (0.33) and deutetrabenazine (range 0.06–0.33) treatment. AEs leading to withdrawal (0.08), dose reduction (0.17), and dose suspension (0.06) were uncommon.
Conclusions
These results confirm the safety outcomes seen in the ARM-TD and AIM-TD parent studies, demonstrating that deutetrabenazine is well tolerated for long-term use in TD patients.
Presented at: American Academy of Neurology Annual Meeting; April 21–27, 2018, Los Angeles, California, USA.
Funding Acknowledgements: This study was supported by Teva Pharmaceuticals, Petach Tikva, Israel.
Under current Australian industry pre-slaughter guidelines, lambs may be off feed for up to 48 h before slaughter. The purpose of this study was to examine what proportion of circulating metabolites at slaughter is due to stress and feed deprivation and whether this response differs between Merino and Terminal genotypes. In addition, the effect of feed deprivation on carcass weight and meat quality was examined. Jugular blood samples were collected from 88 Merino and Terminal sired lambs at rest and at slaughter following 24, 36 and 48 h of feed deprivation, and plasma was analysed for glucose, lactate, non-esterified fatty acids (NEFA) and β-hydroxybutyrate (BHOB). From the same carcasses, hot carcass weight (HCWT) was measured, as well as a suite of meat quality traits such as M. longissimus lumborum (loin) and M. semitendinosus pH at 24 h post-mortem. Loin samples were also analysed for intramuscular fat content and Warner–Bratzler shear force. Merino sired lambs had a higher NEFA response than Terminal sired lambs at slaughter after 24, 36 and 48 h of feed deprivation, with NEFA levels up to 35% higher than previously reported in the same animals at rest under animal house conditions, whereas the BHOB response to feed deprivation was not affected by sire type (P>0.05) and was similar to that previously reported at rest. In addition to the metabolic effects, increasing feed deprivation from 36 h was associated with a 3% reduction in HCWT and dressing percentage, as well as increased ultimate pH in the M. semitendinosus of Merino sired lambs. Findings from this study demonstrate that Merino and Terminal sired lambs differ in their metabolic response to feed deprivation under commercial slaughter conditions. In addition, commercial feed deprivation appears to have a negative effect on ultimate pH and carcass weight and warrants further investigation.
The aim of this study was to examine the metabolic response to feed deprivation of up to 48 h in low and high yielding lamb genotypes. It was hypothesised that Terminal sired lambs would have decreased plasma glucose and increased plasma non-esterified fatty acid (NEFA) and β-hydroxybutyrate (BHOB) concentrations in response to feed deprivation compared with Merino sired lambs. In addition, it was hypothesised that the metabolic changes due to feed deprivation would be greater in progeny of sires with breeding values for greater growth, muscling and leanness. Eighty-nine lambs (45 ewes, 44 wethers) from Merino dams with Merino or Terminal sires with a range of Australian Sheep Breeding Values (ASBVs) for post-weaning weight (PWT), post-weaning eye muscle depth and post-weaning fat depth (PFAT) were used in this experiment. Blood samples were collected via jugular cannulas every 6 h from 0 to 48 h of feed deprivation for the determination of plasma glucose, NEFA, BHOB and lactate concentrations. From 12 to 48 h of feed deprivation, plasma glucose concentration decreased (P < 0.05) by 25%, from 4.04 ± 0.032 mmol/l to 3.04 ± 0.032 mmol/l. From 6 h, NEFA concentration increased (P < 0.05) from 0.15 ± 0.021 mmol/l by almost 10-fold to 1.34 ± 0.021 mmol/l at 48 h of feed deprivation. Feed deprivation also influenced BHOB concentrations, which increased (P < 0.05) from 0.15 ± 0.010 mmol/l at 12 h to 0.52 ± 0.010 mmol/l at 48 h. Merino sired lambs had an 8% greater reduction in glucose and 29% and 10% higher NEFA and BHOB responses, respectively, compared with Terminal sired lambs (P < 0.05). In Merino sired lambs, increasing PWT was also associated with an increase in glucose and a decline in NEFA and BHOB concentrations (P < 0.05). In Terminal sired lambs, increasing PFAT was associated with an increase in glucose and a decline in NEFA concentration (P < 0.05). Contrary to the hypothesis, Merino sired lambs showed the greatest metabolic response to fasting, especially with regard to fat metabolism.