Approximately half of Australian universities offer a degree in nutrition, nutrition science, human nutrition or nutrition combined with another discipline. In the absence of any formal accreditation requirements, the design of undergraduate nutrition courses is guided by national nutrition science competencies(1). Degrees may also include specialisations, such as public health, food industry or animal nutrition, and a range of special-interest topics intended to enhance the graduate skillset for the workforce. This diversity develops graduates with broad and transferable skills, thought to be desirable to industry employers. In an earlier study by the authors(2), graduates placed high value on nutrition science theory and practical content, but expressed a desire for more work-integrated learning opportunities and professional skill development for work in private practice. To triangulate the perspectives of students and graduates, universities need to understand how nutrition graduates are received by employers. This paper presents preliminary findings from the Working in Nutrition Employer (WIN-E) study. The aims of this study were to build on findings from the WIN-G study(2), explore the perceptions of employers of nutrition graduates in Australia regarding which aspects of the graduate skillset were highly valued, and identify training gaps. After being tested for face validity, the purpose-built WIN-E survey was delivered online via Qualtrics between June and December 2022. The survey included a mix of 32 closed- and open-ended questions about employer characteristics, additional education, employment and professional experience, and employers' perceived graduate preparedness for the workforce.
An interim analysis revealed that 110 participants had given informed consent; of these, 41 completed 75% of the questions, and 32 with relevance to nutrition graduate employment were included in the current analyses. Most respondents were female (n = 25, 78%) and aged 25–34 years (n = 13, 52%). Respondents predominantly identified as working in education (n = 8), research (n = 5) or ‘other settings’ (n = 5) such as community aged care, digital media/coaching, food preparation, agriculture or homelessness project work. Domains in retail/hospitality (n = 4), food industry (n = 3), public health/not-for-profit (n = 2), clinical (n = 2) and sport and fitness (n = 1) were represented to a lesser extent. Fifteen (47%) employers felt nutrition graduates had all, or some, of the expected skills and attributes at the time of employment. Further development was valued in written health translation (n = 2), data analysis (n = 1), working collaboratively in health systems (e.g., aged care) (n = 2), marketing (n = 1), and understanding transferable skillsets together with motivation for ongoing professional development (n = 2). Further data analysis will provide more context around the roles and responsibilities employers typically assign to nutrition graduates, highlighting potential training gaps and opportunities for universities to better prepare graduates for the workforce.
The International Theological Commission, in its document on synodality, observed that ‘the concept of synodality refers to the involvement and participation of the whole People of God in the life and mission of the Church’.1 More specifically, it refers to participation in the exercise of discernment and decision making that is intrinsic to that mission. This discernment and decision making requires structures to make it clear not merely when decisions have been made and who makes them, but how they are made and how the people who make them work together. Such structures are embodied in legal instruments appropriate to the context, which here is the area of Church life between the properly local – the diocese – and the properly universal – the Holy See. In history the usual word for the gatherings that embodied these attempts was ‘synod’. Although synods are seen as gatherings of bishops, it is clear throughout that history that many others have been present at them: a famous example is the presence of the Alexandrian deacon Athanasius at the Council of Nicaea. Discernment and decision making at the supra-diocesan level has always involved bishops – but not only bishops. This article lays out the provisions of the current law surrounding this task in the Roman Catholic Church.
In the nineteenth century, it is difficult to discern anything in an age-old form of warfare that was not an almost instinctive reaction on the part of those opposing conquest, occupation or the legitimacy of authority. Nineteenth-century guerrilla warfare was highly diverse but always the resort of the weak in the face of the strong. There could be little expectation that a guerrilla strategy could of itself result in victory unless guerrillas could transform themselves to meet regular forces conventionally or co-operate with regular forces in a partisan role. There are few nineteenth-century guerrilla leaders among whom the setting of objectives and priorities and the allocation of resources can be readily identified. Four case studies are chosen to illustrate contrasting circumstances and to show how far strategic analysis can be applied to nineteenth-century guerrilla warfare: the attempt to control Spanish resistance to Napoleon after 1808, the decision of the Southern Confederacy not to pursue a guerrilla strategy at the end of the American Civil War in 1865, Burmese resistance to British annexation between 1885 and 1895, and the decision of the Boer leadership to undertake guerrilla warfare in 1900 during the South African War.
Fruit and vegetable intakes are major modifiable determinants of risk for non-communicable disease(1), yet intake levels remain low(2) and multiple barriers (cost, access, perishability, preparation skills) exist(3,4). 100% fruit and vegetable juices contain key micronutrients and bioactive compounds(5–7) and may help circumvent these barriers to consumption(6,7). However, their role in dietary guidelines and models of healthy eating remains controversial because of their free sugars and reduced dietary fibre content relative to whole fruits and vegetables(6,7). Therefore, we conducted a systematic umbrella review of systematic literature reviews (SLRs) with meta-analyses assessing the relationships between 100% juice consumption and human health outcomes. Four databases (Medline, Cochrane Library, EMBASE, and CINAHL) were systematically searched for SLRs with meta-analyses of human prospective cohort, case-control, and intervention studies examining the relationship between 100% juice and any health outcome, through to 20 October 2022. Screening (Covidence), quality (GRADE)(8), risk of bias (ROBIS)(9) and content overlap (corrected covered area(10)) tools were applied, and extracted data were narratively synthesised. The protocol was pre-registered (PROSPERO) and the review conducted according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) checklists. Fifteen SLRs on 100% fruit juice, including 51 primary meta-analyses, 6 dose-response analyses, and 87 sub-analyses, were eligible for inclusion. No eligible studies on vegetable juice were found. Included studies represented data from almost 2 million subjects, with a range of doses (50–1200 mL/day) and timeframes (hours to years).
Significant improvements in health outcomes were found in 19.6% of included meta-analyses (blood pressure, flow-mediated dilation, IL-6, C-reactive protein, and stroke mortality), and increased disease risks were found in 5.9% (CVD mortality, prostate cancer, and type II diabetes). The remainder (74.5%) found no significant difference (blood lipids, weight, liver function, metabolic markers, colorectal and breast cancers, and multiple inflammatory markers). The ROBIS assessment rated nine SLRs as low risk of bias, three as unclear and three as high. Using GRADE, confidence in the body of evidence was very low for 27 primary and 79 secondary meta-analyses, low for 19 primary and 13 secondary meta-analyses, and medium for 4 primary and one secondary meta-analysis. Findings show 100% juice consumption carries limited risk of harm and some potential benefits across a broad range of doses, including some relatively high intakes, and time periods. The positive associations between 100% juice consumption and specific health outcomes relevant to population health may be explained by multiple mechanisms, including the vitamin, mineral, and bioactive contents. The balance of evidence suggests that 100% juice may have a neutral or beneficial place in general, population-level dietary guidelines.
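The outcome percentages above correspond to counts out of the 51 primary meta-analyses. A quick arithmetic check (the underlying counts of 10, 3 and 38 are back-calculated here for illustration, not stated in the abstract):

```python
# Back-calculated outcome tallies for the 51 primary meta-analyses
# (inferred from the reported percentages, not stated in the abstract).
n_total = 51
counts = {"benefit": 10, "increased_risk": 3, "no_difference": 38}

# Convert counts to percentages of the 51 primary meta-analyses.
percentages = {k: round(v / n_total * 100, 1) for k, v in counts.items()}
# Reproduces the reported 19.6%, 5.9% and 74.5%.
```

These counts are the only integers that reproduce the reported percentages, and they sum to 51.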
Sour foods, such as citrus fruits, some berries and fermented foods, provide a range of nutrients and benefits important to mental health [1]. When sourness is perceived as unpleasant, intake of these foods may be reduced, affecting mental health. Early research has shown changes to sour taste perception in depression and stress; however, changes in anxiety have not been studied [4-8]. To address this gap and build on the knowledge base, a survey was conducted in which participants (n = 424) rated the recalled intensity and liking of sour index foods and completed the Depression, Anxiety, and Stress Scale (DASS-21) to measure these states. Because variations in sour taste and mood have been demonstrated between females and males, the data were also explored for sex differences. Standard least squares regression (post hoc Tukey’s HSD) compared means between groups, and nominal logistic regression assessed differences in distributions between categories. Recalled sour intensity was 16–19.2% higher in those with scores indicative of mild depression than in those with normal scores in the total sample (p range 0.03–0.04), and 17.9–21.3% higher in females (p values 0.03). There were no differences in sour taste intensity between the intergroup means for anxiety or stress, and no associations between sour liking and any of the mood states. The results suggest that the perceived sourness of index foods increases in depression. Further research to elucidate the underlying biological processes and possible taste-related genetic influences would be beneficial. With this knowledge it may be possible to screen for mood conditions by measuring changes to sour taste that appear alongside other signs and symptoms, create more tailored dietary interventions and develop additional therapeutics.
Degenerative cervical myelopathy (DCM) is the most debilitating form of degenerative disc disease and the most common acquired cause of spinal cord dysfunction in adults. DCM is caused by progressive abnormalities of the vertebral column that result in spinal cord damage through both primary mechanical and secondary biological injury. DCM pathohistology demonstrates a consistent pattern of deleterious changes, including severe Wallerian degeneration cephalad and caudal to the level of compression, apoptotic oligodendrocyte cell loss, and anterior horn dropout. Spinal cord ischemia and hypoxia play a major role in DCM pathogenesis. Advanced spinal cord imaging studies such as MR spectroscopy and diffusion tensor imaging have provided novel insights into the neurobiology of this disorder. The central nervous system effects of DCM involve not only the spinal cord but also upstream functional and structural alterations that can influence disease progression and response to surgical intervention.
Between 1914 and 1918 the United Kingdom equated to a ‘nation in arms’ for the first time in a century. Yet there were wide variations between nations, regions and localities. The national war effort concealed myriad national and local loyalties. National and local political, social, economic and cultural factors all impacted upon military participation both under voluntary enlistment prior to 1916 and conscription thereafter.
The Victoria County History (VCH) was founded in 1899 as a private enterprise sponsored by the publishing company Archibald Constable & Co. It was designed as a scheme for ‘the compilation of a history of each county of England’. The expectation was that over time it would become a collective account of the localities amounting to ‘a National Survey … tracing … the story of England’s growth … into a nation which is now [in 1899] the greatest on the globe’. Arthur Doubleday, the first editor, persuaded the marquess of Lorne, Queen Victoria’s son-in-law, to solicit the personal approval of the Queen. As a result, the ‘big red books’ were dedicated to her memory, and the present monarch, Queen Elizabeth II, renewed that association: since 2012 new volumes have been dedicated to her. VCH Warwickshire was completed in 1969. This chapter looks at its setting up, research and writing, and completion. It ends with a reflection on the influence of the VCH on the history of the county.
Origins
The VCH was founded in 1899 and sponsored by Constables through one of its directors, H. Arthur Doubleday. The original scheme envisaged, for each county, two volumes of general essays guiding the reader through what was then the most up-to-date research, and two volumes of topographical studies. The four volumes made a ‘set’ in the terminology used by the VCH. The topographical volumes were intended to complement the general volumes, not simply to be parish studies. To run the VCH a General Committee was established in London, and a county committee was set up for each county.
It says a good deal for the thinking of the publisher that the county committee was designed to be chaired by the lord lieutenant and to include peers, bishops, gentlemen and justices of the peace (JPs). As was his normal practice, Arthur Doubleday approached the Lord Lieutenant of Warwickshire, Lord Leigh, with a view to persuading him to take the chair of the county committee. He was blunt and to the point: ‘in connection with the Victoria History of the Counties of England, it is proposed to form in each County a Committee for the purpose of obtaining the interest and influence of those who have information or documents which if made available, would largely help in the work’.
This is an article about two things: first, the bifurcation of public international law (PIL) into two distinct forms, the material and the narrative; and second, the methodological fragmentation of international lawyers into discrete communities. After setting the substantive fragmentation of PIL as the context of analysis, I deploy Susan Marks’ concept of “false contingency” to distinguish material and narrative PIL. I briefly examine each, and their interactions, before turning to a specific manifestation of material PIL that I call the Global Legal Order (GLO).
I then sketch the radical indeterminacy of narrative PIL, its manifestations in the ontological indeterminacy of the commonly accepted sources of PIL, and its source in PIL’s lack of authority and institutionalization. This contrasts with the determinacy and authority of the GLO. Next, I turn to the “fragmentation” of international lawyers into distinct “communities of practice.” In fact, this process turns out to be one of agglomeration: international lawyers are constructed within communities of practice, which glom together to create the appearance of PIL.
Finally, I turn to how these communities function by pitting “performances of legality” in “vicarious litigation,” using the Chagos Islands case as an illustration. This is contrasted with the functioning of the operative legal system that is the GLO. I examine the constituent institutions of this system, and how they operate together to produce direct and indirect governance in under-developed states. In practice, this policy imposition immiserates states and antagonizes local populations. It necessitates oppressive governance which entails what narrative PIL determines to be “human rights abuses.”
Photosynthetic organisms have evolved a great variety of mechanisms to optimize their use of sunlight. Some of the clearest examples of adaptations can be seen by comparing photosynthesis in different species and in different individuals of the same species that grow under high and low light levels. While the adaptations of sun and shade higher plants have been relatively well studied, much less information is available on the photobionts of lichenized Ascomycetes. An important adaptation that can protect photosynthetic organisms from the potentially harmful effects of excess light is non-photochemical quenching (NPQ); NPQ can dissipate unused light energy as heat. Here we used chlorophyll fluorescence to compare the induction and relaxation of NPQ and the induction of electron transport (rETR) in collections of the same lichen species from exposed and from more shaded locations. All species have trebouxioid photobionts and normally grow in more exposed microhabitats but can also be readily collected from more shaded locations. Shade forms display generally higher NPQ, presumably to protect lichens from occasional rapid increases in light that occur during sunflecks. Furthermore, the NPQ of shade forms relaxes quickly when light levels are reduced, presumably to ensure efficient photosynthesis after a sunfleck has passed. The maximal relative electron transport rate is lower in shade than sun collections, probably reflecting a downregulation of photosynthetic capacity to reduce energy costs. We also compared collections of pale and melanized thalli from three species of shade lichens with Symbiochloris as their photobiont. Interestingly, NPQ in melanized thalli from slightly more exposed microhabitats induced and relaxed in a way that resembled shade rather than sun forms of the trebouxioid lichens. 
This might suggest that in some locations melanization induced during a temporary period of high light may be excessive and could potentially reduce photosynthesis later in the growing season. Taken together, the results suggest that lichen photobionts can flexibly adjust the amount and type of NPQ, and their levels of rETR in response to light availability.
Clostridioides difficile infection (CDI) is the leading cause of infectious nosocomial diarrhea. Although initial fidaxomicin or vancomycin treatment is recommended by most major guidelines to treat severe CDI, recommendations for first-episode non-severe CDI vary. Given this discrepancy in current treatment guidelines, we sought to evaluate the use of initial vancomycin versus metronidazole for first-episode non-severe CDI.
Methods:
We conducted a retrospective cohort study of all adult inpatients with first-episode CDI at our institution from January 2013 to May 2018. The initial vancomycin versus initial metronidazole cohorts were examined using a multivariate logistic regression model.
Results:
The study cohort of 737 patients had a median age of 72.3 years, and 357 of these patients (48.4%) had hospital-acquired infection. Among 326 patients with non-severe CDI, recurrence, new incident infection, and 30-day mortality rates were 16.2%, 10.9%, and 5.3%, respectively, when treated with initial metronidazole, compared to 20.0%, 1.4%, and 10.0%, respectively, when treated with initial vancomycin. In an adjusted multivariable analysis, the use of initial vancomycin for the treatment of non-severe CDI was associated with a reduction in new incident infection (adjusted odds ratio [ORadj], 0.11; 95% confidence interval [CI], 0.02–0.86; P = .035), compared to initial metronidazole.
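The adjusted odds ratio above comes from the study's multivariable logistic regression, which is not reproducible from the abstract alone. As a minimal sketch of the underlying quantity, the following computes an *unadjusted* odds ratio and Wald 95% confidence interval from a 2×2 table, using purely hypothetical counts (not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with event, b = exposed without event,
    c = unexposed with event, d = unexposed without event."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) via the Woolf method.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts for illustration only: 2 of 70 vancomycin-treated
# patients vs 28 of 256 metronidazole-treated patients with a new
# incident infection.
or_, lower, upper = odds_ratio_ci(2, 68, 28, 228)
```

An odds ratio below 1 with an upper confidence limit below 1 would indicate a significant protective association, which is the pattern the study reports for new incident infection (ORadj 0.11; 95% CI, 0.02–0.86).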
Conclusions:
Initial vancomycin was associated with a reduced rate of new incident infection in the treatment of adult inpatients with first-episode non-severe CDI. These findings support the use of initial vancomycin for all inpatients with CDI when fidaxomicin is unavailable.