Targeting the glutamatergic system is posited as a potentially novel therapeutic strategy for psychotic disorders. While studies in patients indicate that antipsychotic medication reduces brain glutamatergic measures, such studies cannot disambiguate illness-related clinical changes from direct drug effects.
Aims
To address this, we investigated the effects of a dopamine D2 receptor partial agonist (aripiprazole) and a dopamine D2 receptor antagonist (amisulpride) on glutamatergic metabolites in the anterior cingulate cortex (ACC), striatum and thalamus in healthy controls.
Method
A double-blind, within-subject, cross-over, placebo-controlled study with two arms (n = 25 per arm) was conducted. Healthy volunteers received either aripiprazole (up to 10 mg/day) or amisulpride (up to 400 mg/day) for 7 days, plus a corresponding 7-day period of placebo treatment, in a pseudo-randomised order. Proton magnetic resonance spectroscopy (1H-MRS) was used to measure glutamatergic metabolite levels at three time points: baseline, after 1 week of drug and after 1 week of placebo. Values were analysed as a combined measure across the ACC, striatum and thalamus.
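For readers unfamiliar with this kind of within-subject analysis, the sketch below (not the authors' code; all data simulated) shows a linear mixed model with a random intercept per participant, pooling the three regions, in which the 'condition' coefficient plays the role of the reported β:

```python
# Minimal sketch of a within-subject, cross-over analysis of Glx levels.
# Variable names and simulated values are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 25
df = pd.DataFrame({
    "participant": np.repeat(np.arange(n), 6),
    "condition": np.tile(np.repeat(["placebo", "drug"], 3), n),
    "region": np.tile(["ACC", "striatum", "thalamus"], 2 * n),
})
subject_effect = rng.normal(0, 1.5, n)[df["participant"]]
df["glx"] = 8.0 + 0.55 * (df["condition"] == "drug") + subject_effect \
    + rng.normal(0, 0.8, len(df))

# Random intercept per participant captures the cross-over design.
fit = smf.mixedlm("glx ~ condition + region", data=df,
                  groups=df["participant"]).fit()
print(fit.summary())
```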
Results
Aripiprazole significantly increased glutamate + glutamine (Glx) levels compared with placebo (β = 0.55, 95% CI [0.15, 0.95], P = 0.007). At baseline, the mean Glx level was 8.14 institutional units (s.d. = 2.15); following aripiprazole treatment, the mean Glx level was 8.16 institutional units (s.d. = 2.40), compared with 7.61 institutional units (s.d. = 2.36) for placebo. This effect remained significant after adjusting for plasma levels of the parent drug and its active metabolite. Amisulpride was associated with an increase in Glx that did not reach statistical significance.
Conclusions
One week of aripiprazole administration in healthy participants altered brain Glx levels as compared with placebo administration. These findings provide novel insights into the relationship between antipsychotic treatment and brain metabolites in a healthy participant cohort.
The sustainability of dialectical behaviour therapy (DBT) programmes, and the factors that potentially influence it, have received little attention from researchers. In this article, we review the literature on the sustainability of DBT programmes in outpatient settings. We also seek to advance the limited knowledge on this topic by reporting on the sustainability of DBT programmes delivered by teams that trained via a coordinated implementation approach in Ireland. As part of this perspective piece, we conducted a systematic literature search, which identified four studies reporting on DBT programme sustainability. All four reported on programmes delivered by teams that had received training as per the DBT Intensive Training Model. We summarise the findings of these studies and consider the effect of introducing a coordinated implementation approach in Ireland on DBT programme sustainability.
Background: Following craniotomy, there is widespread agreement that post-operative neurological impairments require specialized assessment to evaluate fitness to drive. However, for patients who have had a craniotomy and do not have neurological deficits or known seizures, there is less consensus on when return to driving is safe. In this study, we aim to review existing guidelines regarding driving after craniotomy and to assess current practices for post-craniotomy recommendations in Canada. Methods: Our study has three components: 1) a systematic review of existing guidelines for return to driving after a cranial procedure; 2) a review of primary evidence (cohort studies) regarding seizure risk following craniotomy, depending on the underlying pathology; 3) an online questionnaire distributed to Canadian neurosurgeons through the Canadian Neurosurgery Research Collaborative (CNRC) network. Results: Our systematic review identified various sets of guidelines for driving after a craniotomy. For instance, the UK Driver and Vehicle Licensing Agency codifies specific guidelines for return to driving that vary by underlying pathology. These guidelines draw on large cohort studies measuring the occurrence of post-operative seizures after craniotomy for a variety of conditions. The questionnaire is currently being distributed to Canadian neurosurgeons. Conclusions: Our study lays the groundwork for the development of Canadian guidelines for return to driving post-craniotomy.
Research on proactive and reactive aggression has identified covariates unique to each function of aggression, but hypothesized correlates have often not been tested while accounting for developmental change in, or the overlap between, the two types of aggression. The present study examines the unique developmental trajectories of proactive and reactive aggression over adolescence and young adulthood and tests these trajectories’ associations with key covariates: callous–unemotional (CU) traits, impulsivity, and internalizing emotions. In a sample of 1,211 justice-involved males (ages 15–22), quadratic growth models (i.e., intercepts, linear slopes, and quadratic slopes) of each type of aggression were regressed onto quadratic growth models of the covariates while controlling for the other type of aggression. After accounting for the level of reactive aggression, the level of proactive aggression was predicted by the level of CU traits. However, change in proactive aggression over time was not related to change in any covariate. After accounting for proactive aggression, reactive aggression was predicted by impulsivity, both at the initial level and in change over time. Results support the view that proactive and reactive aggression are distinct constructs with separate developmental trajectories and distinct covariates.
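The study itself used parallel-process latent growth models; as a simplified stand-in, the sketch below (simulated data, illustrative variable names) shows the basic quadratic growth-curve idea with a mixed-effects model and a time-varying covariate:

```python
# Simplified quadratic growth model: random intercept and linear age slope
# per person, fixed quadratic age term, plus a time-varying covariate.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n, waves = 300, 5
age = np.tile(np.linspace(0.0, 7.0, waves), n)   # years since age 15
pid = np.repeat(np.arange(n), waves)
impulsivity = rng.normal(0, 1, n)[pid] + rng.normal(0, 0.3, n * waves)
aggression = rng.normal(5, 1, n)[pid] + 0.4 * age - 0.05 * age**2 \
    + 0.6 * impulsivity + rng.normal(0, 1, n * waves)

df = pd.DataFrame({"pid": pid, "age": age, "age2": age**2,
                   "impulsivity": impulsivity, "aggression": aggression})
fit = smf.mixedlm("aggression ~ age + age2 + impulsivity", data=df,
                  groups=df["pid"], re_formula="~age").fit()
print(fit.summary())
```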
Primary surgical resection remains the mainstay of management in locally advanced differentiated thyroid cancer. Tyrosine kinase inhibitors have recently shown promising results in patients with recurrent locally advanced differentiated thyroid cancer. This study describes four patients with locally advanced differentiated thyroid cancer managed with tyrosine kinase inhibitors given before surgery, in the ‘neoadjuvant’ setting.
Method
Prospective data collection through a local thyroid database from February 2016 identified four patients with locally advanced differentiated thyroid cancer who were unsuitable for primary surgical resection and were commenced on neoadjuvant tyrosine kinase inhibitor therapy.
Results
All cases had T4a disease at presentation. Three cases tolerated tyrosine kinase inhibitor therapy for more than 14 months, while the fourth failed to tolerate treatment beyond 1 month. All patients subsequently underwent total thyroidectomy to facilitate adjuvant radioactive iodine treatment. Disease-specific survival currently remains at 100 per cent (follow-up range, 29–75 months).
Conclusion
Neoadjuvant tyrosine kinase inhibitors in locally advanced differentiated thyroid cancer can be effective in reducing primary tumour extent to potentially facilitate a more limited surgical resection for local disease control.
Optimum nutrition plays a major role in the achievement and maintenance of good health. The Nutrition Society of the UK and Ireland and the Sabri Ülker Foundation, a charity based in Türkiye focused on improving public health, combined forces to highlight this important subject. A hybrid conference was held in Istanbul, with over 4000 delegates from sixty-two countries joining the proceedings live online in addition to those attending in person. The primary purpose was to inspire healthcare professionals and nutrition policy makers to better consider the role of nutrition in their interactions with patients and the public at large, in order to reduce the prevalence of non-communicable diseases such as obesity and type 2 diabetes. The event provided an opportunity to share and learn from different approaches in the UK, Türkiye and Finland, highlighting initiatives to strengthen research in the nutritional sciences and the translation of that research into nutrition policy. The presenters provided evidence of the links between nutrition and disease risk and emphasised the importance of minimising risk and implementing early treatment of diet-related disease. Suggestions included improving health literacy and strengthening policies to improve the quality of food production and dietary behaviour. A multidisciplinary approach is needed whereby governments, the food industry, non-governmental groups and consumer groups collaborate to develop evidence-based recommendations and appropriate joined-up policies that do not widen inequalities. This summary of the proceedings will serve as a gateway for those seeking additional information on nutrition and health across the globe.
We examine a Query Theory account of risky choice framing effects: when risky choices are framed as a gain, people are generally risk averse but, when an equivalent choice is framed as a loss, people are risk seeking. Consistent with Query Theory, frames affected the structure of participants’ arguments: gain-frame participants listed arguments favoring the certain option earlier and more often than loss-frame participants. These argumentative shifts mediated framing effects; manipulating participants’ initial arguments attenuated them. While emotions, as measured by the PANAS, were related to frames but not to choices, an exploratory text analysis of the affective valence of arguments was related to both. Compared to loss-frame participants, gain-frame participants expressed more positive sentiment towards the certain option than the risky option. This relative-sentiment index predicted choices by itself but not when included alongside the structure of arguments. Further, manipulating initial arguments did not significantly affect participants’ relative sentiment. Risky choice frames thus alter both the structure and the emotional valence of participants’ internal arguments before choices change.
On soils dominated by high proportions of clay and organic matter, soil acidity and poor nutrient use efficiency have a major impact on output potential. Due to the inherent chemical properties of these soils, reducing soil acidity and the prevalence of undesirable metallic cations poses challenges. As a result, these soils have a large capacity for phosphorus (P) fixation, which reduces plant P availability. Limestone (CaCO3 or MgCO3) is applied to agricultural soils to counteract soil acidity and reduce P fixation. The current study investigates the effects of four contrasting annual P application rates (0, 50, 100 and 150 kg P/ha), split (50:50) between spring and summer, across soils with a range of pH values from a previous liming trial. The effect of soil pH range and P treatment rate on seasonal herbage growth and herbage P concentration was investigated over three years, and soil nutrient status was also monitored. Soil pH had a significant impact on the rate of mineralization and soil P concentration at each site. A soil pH of 6.2 was associated with a 1.8 mg/l increase in soil test P. Annual P application was necessary to maintain sufficient herbage P concentration for animal dietary requirements (0.35% DM); however, there was no effect of P application or liming rate on herbage productivity across the three sites, as all sites possessed sufficient soil P reserves. The current experiment has shown that, despite optimal soil fertility status, ensuring sufficient plant-available P remains a problem on these particular soils.
Dietary pattern analysis is typically based on dimension reduction and summarises the diet with a small number of scores. We assess ‘joint and individual variance explained’ (JIVE) as a method for extracting dietary patterns from longitudinal data that highlights elements of the diet that are associated over time. The Auckland Birthweight Collaborative Study, in which participants completed an FFQ at ages 3·5 (n 549), 7 (n 591) and 11 (n 617), is used as an example. Data from each time point are projected onto the directions of shared variability produced by JIVE to yield dietary patterns and scores. We assess the ability of the scores to predict future BMI and blood pressure measurements of the participants and make a comparison with principal component analysis (PCA) performed separately at each time point. The diet could be summarised with three JIVE patterns. The patterns were interpretable, with the same interpretation across age groups: a vegetable and whole grain pattern, a sweets and meats pattern and a cereal v. sweet drinks pattern. The first two PCA-derived patterns were similar across age groups and similar to the first two JIVE patterns. The interpretation of the third PCA pattern changed across age groups. Scores produced by the two techniques were similarly effective in predicting future BMI and blood pressure. We conclude that when data from the same participants at multiple ages are available, JIVE provides an advantage over PCA by extracting patterns with a common interpretation across age groups.
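The published JIVE algorithm iterates between joint and individual estimation; the sketch below (simulated data, illustrative dimensions) shows only the core projection idea — recovering participant-level directions of variation shared across the age-specific FFQ blocks, so that pattern loadings carry a common interpretation across ages:

```python
# One-pass SVD illustration in the spirit of JIVE, on simulated FFQ blocks.
import numpy as np

rng = np.random.default_rng(2)
n_children, n_foods, r_joint = 200, 50, 3

# Shared participant-level pattern scores drive part of every block.
shared_scores = rng.standard_normal((n_children, r_joint))
blocks = []
for _ in range(3):  # one block per age: 3.5, 7 and 11 years
    loadings = rng.standard_normal((r_joint, n_foods))
    b = shared_scores @ loadings + 0.5 * rng.standard_normal((n_children, n_foods))
    blocks.append(b - b.mean(axis=0))  # column-centre each block

# Joint structure: leading left singular vectors of the concatenated blocks.
U, s, _ = np.linalg.svd(np.hstack(blocks), full_matrices=False)
joint_scores = U[:, :r_joint] * s[:r_joint]  # per-child scores, common to all ages

# Age-specific loadings share an interpretation because they use the same U.
for t, b in enumerate(blocks):
    print(f"age block {t}: loadings shape {(U[:, :r_joint].T @ b).shape}")
```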
Although the DSM-5 was adopted in 2013, the validity of the new substance use disorder (SUD) diagnosis and craving criterion has not been investigated systematically across substances.
Methods
Adults (N = 588) who engaged in binge drinking or illicit drug use and endorsed at least one DSM-5 SUD criterion were included. DSM-5 SUD criteria were assessed for alcohol, tobacco, cannabis, cocaine, heroin, and opioids. Craving was considered positive if “wanted to use so badly that could not think of anything else” (severe craving) or “felt a very strong desire or urge to use” (moderate craving) was endorsed. Baseline information on substance-related variables and psychopathology was collected, and electronic daily assessment queried substance use for the following 90 days. For each substance, logistic regression estimated the association between craving and validators, i.e. variables expected to be related to craving/SUD, and whether association with the validators differed for DSM-5 SUD diagnosed with craving as a criterion v. without.
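As a minimal sketch of the per-substance analysis described here (simulated data; variable names are illustrative, not the study's), a logistic regression of subsequent use on baseline craving can be fitted separately for each substance:

```python
# Per-substance logistic regression: 90-day subsequent use ~ baseline craving.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
rows = []
for substance in ["alcohol", "tobacco", "cannabis", "cocaine", "heroin", "opioids"]:
    craving = rng.integers(0, 2, 100)
    logit_p = -0.5 + 1.4 * craving          # craving raises odds of later use
    use = rng.random(100) < 1 / (1 + np.exp(-logit_p))
    rows.append(pd.DataFrame({"substance": substance, "craving": craving,
                              "subsequent_use": use.astype(int)}))
df = pd.concat(rows, ignore_index=True)

for substance, sub in df.groupby("substance"):
    fit = smf.logit("subsequent_use ~ craving", data=sub).fit(disp=False)
    print(f"{substance}: OR = {np.exp(fit.params['craving']):.1f}")
```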
Results
Across substances, craving was associated with most baseline validators (p values<0.05); neither moderate nor severe craving consistently showed greater associations. Baseline craving predicted subsequent use [odds ratios (OR): 4.2 (alcohol) – 234.3 (heroin); p's ⩽ 0.0001], with stronger associations for moderate than severe craving (p's < 0.05). Baseline DSM-5 SUD showed stronger associations with subsequent use when diagnosed with craving than without (p's < 0.05).
Conclusion
The DSM-5 craving criterion, as operationalized in this study, is valid. Including craving improves the validity and clinical relevance of DSM-5 SUD diagnoses, since craving may drive impaired control over use and the development and maintenance of SUD.
Enrichment of the heavy rare earth elements (HREE) in carbonatites is rare as carbonatite petrogenesis favours the light (L)REE. We describe HREE enrichment in fenitized phonolite breccia, focusing on small satellite occurrences 1–2 km from the Songwe Hill carbonatite, Malawi. Within the breccia groundmass, a HREE-bearing mineral assemblage comprises xenotime, zircon, anatase/rutile and minor huttonite/thorite, as well as fluorite and apatite.
A genetic link between HREE mineralization and carbonatite emplacement is indicated by the presence of Sr-bearing carbonate veins, carbonatite xenoliths and extensive fenitization. We propose that the HREE are retained in hydrothermal fluids which are residually derived from a carbonatite after precipitation of LREE minerals. Brecciation provides a focusing conduit for such fluids, enabling HREE transport and xenotime precipitation in the fenite. Continued fluid–rock interaction leads to dissolution of HREE-bearing minerals and further precipitation of xenotime and huttonite/thorite.
At a maximum Y content of 3100 µg g⁻¹, HREE concentrations in the presented example are not sufficient to constitute ore, but the similar composition and texture of these rocks to other cases of carbonatite-related HREE enrichment suggest that all form via a common mechanism linked to fenitization. Precipitation of HREE minerals only occurs where a pre-existing structure provides a focusing conduit for fenitizing fluids, reducing fluid–country-rock interaction. Enrichment of HREE and Th in fenite breccia serves as an indicator of fluid expulsion from a carbonatite, and may indicate the presence of LREE mineralization within the source carbonatite body at depth.
Soil acidity and poor nutrient use efficiency are major limiting factors for output potential on heavy soils: soils dominated by high proportions of clay and organic matter, with impeded drainage and high buffering capacity, located in high-rainfall areas. Lime is applied to counteract these limiting factors and in turn improve agricultural output and productivity. The current study investigates the effects of two commonly used lime products applied at three treatment rates across three distinct sites: ground lime (7.5, 5 and 2.5 tonne/ha) and granulated lime (7.5, 2.5 and 1.5 tonne/ha). The ability of each lime product and treatment rate to counteract soil acidity, increase nutrient availability and influence soil physical structure was assessed over time. On average across sites, 1 tonne/ha of lime increased soil pH by 0.15 pH units for ground lime and 0.21 pH units for granulated lime. Site 3 experienced the greatest change in soil pH in comparison with the other two sites, largely due to its lower clay content and cation exchange capacity. Per unit reduction in soil acidity, granulated lime was 5.7 times more expensive than ground lime. The high treatment rate showed the greatest reduction in soil acidity and in aluminium and iron concentrations as a mean across all sites. Morgan's soil test phosphorus concentration increased across all sites, with treatment rate having no effect on the rate of increase. There was evidence of reduced soil compaction, and lime application showed no negative effect on soil physical structure.
Objective:
To assess the relationship between food insecurity, sleep quality, and days with mental and physical health issues among college students.
Design:
An online survey was administered. Food insecurity was assessed using the ten-item Adult Food Security Survey Module. Sleep was measured using the nineteen-item Pittsburgh Sleep Quality Index (PSQI). Mental health and physical health were measured using three items from the Healthy Days Core Module. Multivariate logistic regression was conducted to assess the relationship between food insecurity, sleep quality, and days with poor mental and physical health.
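To illustrate the kind of adjusted model described here (simulated data; the cut-offs are conventions, and the covariates are illustrative stand-ins for the survey's adjustment set), a sketch follows. PSQI global score > 5 is the conventional poor-sleep cut-off, and a raw score of ≥ 3 on the 10-item adult module indicates food insecurity under USDA scoring:

```python
# Adjusted logistic regression: poor sleep ~ food insecurity + covariates.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 2000
df = pd.DataFrame({
    "afssm_raw": rng.integers(0, 11, n),    # 10-item adult module raw score
    "psqi_total": rng.integers(0, 22, n),   # PSQI global score (0-21)
    "age": rng.integers(18, 30, n),
    "gender": rng.choice(["female", "male", "other"], n),
})
df["food_insecure"] = (df["afssm_raw"] >= 3).astype(int)
df["poor_sleep"] = (df["psqi_total"] > 5).astype(int)

fit = smf.logit("poor_sleep ~ food_insecure + age + C(gender)",
                data=df).fit(disp=False)
print(np.exp(fit.params))  # adjusted odds ratios
```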
Setting:
Twenty-two higher education institutions.
Participants:
College students (n 17 686) enrolled at one of twenty-two participating universities.
Results:
Compared with food-secure students, those classified as food insecure (43·4 %) had higher PSQI scores indicating poorer sleep quality (P < 0·0001) and reported more days with poor mental (P < 0·0001) and physical (P < 0·0001) health as well as days when mental and physical health prevented them from completing daily activities (P < 0·0001). Food-insecure students had higher adjusted odds of having poor sleep quality (adjusted OR (AOR): 1·13; 95 % CI 1·12, 1·14), days with poor physical health (AOR: 1·01; 95 % CI 1·01, 1·02), days with poor mental health (AOR: 1·03; 95 % CI 1·02, 1·03) and days when poor mental or physical health prevented them from completing daily activities (AOR: 1·03; 95 % CI 1·02, 1·04).
Conclusions:
College students report high levels of food insecurity, which is associated with poorer sleep quality and more days of poor mental and physical health. Multi-level policy changes and campus wellness programmes are needed to prevent food insecurity and improve student health-related outcomes.
Maintaining nutritional adequacy contributes to successful ageing. B vitamins involved in one-carbon metabolism regulation (folate, riboflavin, vitamins B6 and B12) are critical nutrients contributing to homocysteine and epigenetic regulation. Although cross-sectional B vitamin intake in ageing populations is well characterised, longitudinal changes are infrequently reported. This systematic review explores age-related changes in dietary adequacy of folate, riboflavin, vitamins B6 and B12 in community-dwelling older adults (≥65 years at follow-up). Following Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, databases (MEDLINE, Embase, BIOSIS, CINAHL) were systematically screened, yielding 1579 records; eight studies were included (n 3119 participants, 2–25 years of follow-up). Quality assessment (modified Newcastle–Ottawa quality scale) rated all studies as moderate–high quality. The estimated average requirement cut-point method was used to estimate the baseline and follow-up population prevalence of dietary inadequacy. Riboflavin (seven studies, n 1953) inadequacy progressively increased with age; the prevalence of inadequacy increased from baseline by up to 22·6 % in males and 9·3 % in females. Dietary folate adequacy (three studies, n 2321) improved in two studies (by up to 22·4 %), but the third showed increasing (8·1 %) inadequacy. Evidence was similarly limited (two studies each) and inconsistent for vitamins B6 (n 559; −9·9 to 47·9 %) and B12 (n 1410; −4·6 to 7·2 %). This review emphasises the scarcity of evidence regarding changes in micronutrient intake with age, highlighting the need for improved reporting of longitudinal changes in nutrient intake that can better direct micronutrient recommendations for older adults. This review was registered with PROSPERO (CRD42018104364).
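For readers unfamiliar with the estimated average requirement (EAR) cut-point method mentioned above, a minimal sketch follows: the prevalence of dietary inadequacy is simply the proportion of usual intakes falling below the EAR. The EAR and the intake distributions here are illustrative values, not the review's data:

```python
# EAR cut-point method: prevalence of inadequacy = share of intakes < EAR.
import numpy as np

def prevalence_inadequate(usual_intakes, ear):
    """Share of the population whose usual intake falls below the EAR."""
    return float(np.mean(np.asarray(usual_intakes) < ear))

rng = np.random.default_rng(5)
baseline = rng.normal(1.4, 0.4, 500)   # riboflavin intakes, mg/day (simulated)
follow_up = baseline - 0.15            # illustrative age-related decline

ear = 1.1                              # illustrative adult EAR, mg/day
print(f"baseline inadequacy:  {prevalence_inadequate(baseline, ear):.1%}")
print(f"follow-up inadequacy: {prevalence_inadequate(follow_up, ear):.1%}")
```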
Acute cannabis administration can produce transient psychotic-like effects in healthy individuals. However, the mechanisms through which this occurs and the factors that predict vulnerability remain unclear. We investigate whether cannabis inhalation leads to psychotic-like symptoms and speech illusion, and whether cannabidiol (CBD) blunts such effects (study 1) and adolescence heightens them (study 2).
Methods
Two double-blind, placebo-controlled studies assessed speech illusion in a white-noise task and psychotic-like symptoms on the Psychotomimetic States Inventory (PSI). Study 1 compared the effects of Cann-CBD (cannabis containing Δ-9-tetrahydrocannabinol (THC) and negligible levels of CBD) with Cann+CBD (cannabis containing THC and CBD) in 17 adults. Study 2 compared the effects of Cann-CBD in 20 adolescents and 20 adults. All participants were healthy individuals who currently used cannabis.
Results
In study 1, relative to placebo, both Cann-CBD and Cann+CBD increased PSI scores but not speech illusion. No differences between Cann-CBD and Cann+CBD emerged. In study 2, relative to placebo, Cann-CBD increased PSI scores and the incidence of speech illusion, with the odds of experiencing speech illusion 3.1 (95% CI 1.3–7.2) times higher after Cann-CBD. No age-group differences were found for speech illusion, but adults showed heightened effects on the PSI.
Conclusions
Inhalation of cannabis reliably increases psychotic-like symptoms in healthy cannabis users and may increase the incidence of speech illusion. CBD did not influence psychotic-like effects of cannabis. Adolescents may be less vulnerable to acute psychotic-like effects of cannabis than adults.
The 2017 solar eclipse was associated with mass gatherings in many of the 14 states along the path of totality. The Kentucky Department for Public Health implemented an enhanced syndromic surveillance system to detect increases in emergency department (ED) visits and other health care needs near Hopkinsville, Kentucky, where the point of greatest eclipse occurred.
Methods:
EDs flagged visits of patients who participated in eclipse events from August 17–22. Data from 14 area emergency medical services and 26 first-aid stations were also monitored to detect health-related events occurring during the eclipse period.
Results:
Forty-four potential eclipse event-related visits were identified, primarily injuries, gastrointestinal illness, and heat-related illness. First-aid stations and emergency medical services commonly attended to patients with pain and heat-related illness.
Conclusions:
Kentucky’s experience during the eclipse demonstrated the value of patient visit flagging to describe the disease burden during a mass gathering and to investigate epidemiological links between cases. A close collaboration between public health authorities within and across jurisdictions, health information exchanges, hospitals, and other first-response care providers will optimize health surveillance activities before, during, and after mass gatherings.
Dietary patterns describe the combination of foods and beverages in a diet and the frequency of habitual consumption. Better understanding of childhood dietary patterns and antenatal influences could inform intervention strategies to prevent childhood obesity. We derived empirical dietary patterns in 1142 children (average age 6·0 (sd 0·2) years) in New Zealand, whose mothers had participated in the Screening for Pregnancy Endpoints (SCOPE) cohort study and explored associations with measures of body composition. Participants (Children of SCOPE) had their diet assessed by FFQ, and dietary patterns were extracted using factor analysis. Three distinct dietary patterns were identified: ‘Healthy’, ‘Traditional’ and ‘Junk’. Associations between dietary patterns and measures of childhood body composition (waist, hip, arm circumferences, BMI, bioelectrical impedance analysis-derived body fat % and sum of skinfold thicknesses (SST)) were assessed by linear regression, with adjustment for maternal influences. Children who had higher ‘Junk’ dietary pattern scores had 0·24 (sd 0·08; 95 % CI 0·04, 0·13) cm greater arm and 0·44 (sd 0·05; 95 % CI 0·01, 0·10) cm greater hip circumferences and 1·13 (sd 0·07; 95 % CI 0·03, 0·12) cm greater SST and were more likely to be obese (OR 1·74; 95 % CI 1·07, 2·82); those with higher ‘Healthy’ pattern scores were less likely to be obese (OR 0·62; 95 % CI 0·39, 1·00). In a large mother–child cohort, a dietary pattern characterised by high-sugar and -fat foods was associated with greater adiposity and obesity risk in children aged 6 years, while a ‘Healthy’ dietary pattern offered some protection against obesity. Targeting unhealthy dietary patterns could inform public health strategies to reduce the prevalence of childhood obesity.
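As a sketch of the pattern-extraction step described here (simulated placeholder data; food-group counts and the outcome are illustrative, not the cohort's), factor analysis can be applied to standardised FFQ items and the resulting scores regressed on an adiposity measure:

```python
# Derive dietary patterns by factor analysis, then relate scores to adiposity.
import numpy as np
import statsmodels.api as sm
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
ffq = rng.poisson(3.0, size=(1142, 40)).astype(float)  # 40 hypothetical food groups

X = StandardScaler().fit_transform(ffq)
fa = FactorAnalysis(n_components=3, rotation="varimax")
scores = fa.fit_transform(X)       # per-child scores on three patterns
loadings = fa.components_          # pattern-by-food loadings, used for labelling

sst = rng.normal(30.0, 8.0, 1142)  # placeholder sum-of-skinfolds outcome
fit = sm.OLS(sst, sm.add_constant(scores)).fit()
print(fit.params)                  # association of each pattern with the outcome
```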
Early detection and intervention strategies in patients at clinical high-risk (CHR) for syndromal psychosis have the potential to contain the morbidity of schizophrenia and similar conditions. However, research criteria that have relied on severity and number of positive symptoms are limited in their specificity and risk high false-positive rates. Our objective was to examine the degree to which measures of recency of onset or intensification of positive symptoms [a.k.a., new or worsening (NOW) symptoms] contribute to predictive capacity.
Methods
We recruited 109 help-seeking individuals whose symptoms met criteria for the Progression Subtype of the Attenuated Positive Symptom Psychosis-Risk Syndrome, as defined by the Structured Interview for Psychosis-Risk Syndromes, and followed them every three months for two years or until onset of syndromal psychosis.
Results
Forty-one (40.6%) of 101 participants meeting CHR criteria developed a syndromal psychotic disorder [mostly (80.5%) schizophrenia], with half converting within 142 days (interquartile range: 69–410 days). Patients with more NOW symptoms were more likely to convert (converters: 3.63 ± 0.89; non-converters: 2.90 ± 1.27; p = 0.001). Patients with stable attenuated positive symptoms were less likely to convert than those with NOW symptoms. New symptoms in isolation, but not worsening symptoms, also predicted conversion.
Conclusions
Results suggest that the severity and number of attenuated positive symptoms are less predictive of conversion to syndromal psychosis than the timing of their emergence and intensification. These findings also suggest that the earliest phase of psychotic illness involves a rapid, dynamic process, beginning before the syndromal first episode, with potentially substantial implications for CHR research and understanding the neurobiology of psychosis.
Folic acid (FA) supplementation is recommended in the periconceptional period for the prevention of neural tube defects. Limited data are available on the folate status of New Zealand (NZ) pregnant women and its association with FA supplement intake. Our objectives were to examine the relationship between plasma folate (PF) and reported FA supplement use at 15 weeks’ gestation and to explore socio-demographic and lifestyle factors associated with PF. We used data and blood samples from NZ participants of the Screening for Pregnancy Endpoints cohort study. Healthy nulliparous women with singleton pregnancies (n 1921) were interviewed and blood samples collected. PF was analysed via microbiological assay. Of the participants, 73 % reported taking an FA supplement at 15 weeks’ gestation; of these, 79 % were taking FA as part of, or alongside, a multivitamin supplement. Of FA supplement users, 56 % reported consuming a daily dose of ≥800 μg, while 39 % reported taking less than 400 µg/d. Mean PF was significantly higher in women reporting FA supplementation (54·6 (se 1·5) nmol/l) v. no FA supplementation (35·1 (se 1·6) nmol/l) (P<0·0001). Reported daily FA supplement dose and PF were significantly positively correlated (r 0·41; P<0·05). Younger maternal age, Pacific and Māori ethnicity and obesity were negatively associated with PF levels; vegetarianism was positively associated with PF. Reported FA supplement dose remained significantly associated with PF after adjustment for socio-demographic and lifestyle confounders and multivitamin intake. The relationship observed between FA supplementation and PF demonstrates that self-reported intake is a reliable proxy for FA supplement use in this study population.
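The dose-response check reported above amounts to a simple correlation between self-reported daily FA dose and measured plasma folate among supplement users; a minimal sketch with simulated, illustrative values follows:

```python
# Correlate reported daily folic acid dose with plasma folate (simulated data).
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(7)
dose_ug = rng.choice([100, 400, 800, 1000], 500)       # reported daily FA dose, ug
plasma = 20 + 0.03 * dose_ug + rng.normal(0, 15, 500)  # plasma folate, nmol/l

r, p = pearsonr(dose_ug, plasma)
print(f"r = {r:.2f}, p = {p:.2g}")  # the abstract reports r 0.41
```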