This article offers a critical analysis of the norms, policy, procedures and outcomes associated with contemporary decision-making under the ‘character test’ per Migration Act 1958 (Cth) s 501. Of late, there has been a steep increase in the number of visa refusals and cancellations on adverse character grounds due to the convergence of a reformulated character test and single-minded, authoritarian administration by ministerial office-holders. This article teases out the significant and, arguably, adverse consequences for the quality of administrative justice of ministerial control over visa decisions absent independent administrative review. It is argued that the integrity of ministerial decision-making and the legitimacy of outcomes are dubious. This is because the process of identifying and balancing the important countervailing community interests and individual (human) rights, in the course of reaching the preferable decision, does not appear to be carried out in a detached, proper and genuine manner, pursuant to rational and intelligible reasoning processes. In conclusion, when viewed holistically, the judicial decisions analysed in this article suggest that the unwavering pursuit of community protection has come at a significant adverse cost to administrative justice and, necessarily, to the individuals and families who bear the harsh consequences.
Health technology assessment (HTA) organizations vary in terms of how they conduct assessments. We assess whether and to what extent HTA bodies have adopted societal and novel elements of value in their economic evaluations.
Methods
After categorizing “societal” and “novel” elements of value, we reviewed fifty-three HTA guidelines. We collected data on whether each guideline mentioned each societal or novel element of value, and if so, whether the guideline recommended the element’s inclusion in the base case, sensitivity analysis, or qualitative discussion in the HTA.
Results
The HTA guidelines mention on average 5.9 of the twenty-one societal and novel value elements we identified (range 0–16), including 2.3 of the ten societal elements and 3.3 of the eleven novel value elements. Only four value elements (productivity, family spillover, equity, and transportation) appear in over half of the HTA guidelines, whereas thirteen value elements are mentioned in fewer than one-sixth of the guidelines, and two elements receive no mention. Most guidelines do not recommend value element inclusion in the base case, sensitivity analysis, or qualitative discussion in the HTA.
Conclusions
Ideally, more HTA organizations will adopt guidelines for measuring societal and novel value elements, including analytic considerations. Importantly, simply recommending in guidelines that HTA bodies consider novel elements may not lead to their incorporation into assessments or ultimate decision making.
Evidence suggests that cognitive subtypes exist in schizophrenia that may reflect different neurobiological trajectories. We aimed to identify whether IQ-derived cognitive subtypes are present in early-phase schizophrenia-spectrum disorder and examine their relationship with brain structure and markers of neuroinflammation.
Method
A total of 161 patients with recent-onset schizophrenia-spectrum disorder (<5 years) were recruited. Estimated premorbid and current IQ were calculated using the Wechsler Test of Adult Reading and a four-subtest WAIS-III. Cognitive subtypes were identified with k-means clustering. FreeSurfer was used to analyse 3.0 T MRI scans. Blood samples were analysed for hs-CRP, IL-1RA, IL-6 and TNF-α.
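The subtyping step described above can be illustrated with a minimal k-means sketch. This is not the study's analysis pipeline: the implementation below is a bare-bones clustering routine, and the (premorbid IQ, current IQ) pairs are invented to mimic the three profiles the study reports.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means for small 2-D datasets (here: premorbid IQ, current IQ)."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest centroid (squared Euclidean distance).
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k),
                      key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])))
            clusters[idx].append(p)
        # Recompute centroids as cluster means; keep old centroid if a cluster empties.
        centroids = [
            tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

# Hypothetical (premorbid IQ, current IQ) pairs mimicking the three profiles:
# preserved (high/high), deteriorated (high/low), compromised (low/low).
data = [(110, 108), (112, 111), (108, 105),
        (109, 88), (111, 85), (107, 90),
        (82, 80), (85, 79), (80, 83)]
centroids, clusters = kmeans(data, k=3)
```

In practice, IQ estimates would be standardized before clustering and cluster stability checked across seeds; this sketch omits both for brevity.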
Results
Three subtypes were identified indicating preserved (PIQ), deteriorated (DIQ) and compromised (CIQ) IQ. Absolute total brain volume was significantly smaller in CIQ compared to PIQ and DIQ, and intracranial volume was smaller in CIQ than PIQ (F(2, 124) = 6.407, p = 0.002), indicative of smaller premorbid brain size in the CIQ group. CIQ had higher levels of hs-CRP than PIQ (F(2, 131) = 5.01, p = 0.008). PIQ showed differentially impaired processing speed and verbal learning compared to IQ-matched healthy controls.
Conclusions
The findings add validity to a neurodevelopmental subtype of schizophrenia, identified by comparing estimated premorbid and current IQ and characterised by smaller premorbid brain volume and higher markers of low-grade inflammation (CRP).
This study explored the concept of ‘giving up’ from the perspective of care staff working in care homes, and their everyday communication and hidden knowledge concerning what they think about this taboo topic and the context it reflects. Moving to a care home is a major transition where cumulative losses can pose risks to mental health in later life. If not recognised, this vulnerability can lead to depression which extends to suicide ideation and behaviours in the form of self-harm and self-neglect. Care homes are a significant place of care until death, yet a discourse of silence means that self-harm and suicide are under-reported or not attended to with specialist expertise. The layperson's concept of an older person ‘giving up’ on life is hardly discussed in the literature. This co-produced qualitative study used an inductive approach to explore this phenomenon through focus groups with 33 care staff across four care homes in South-East England. Findings paint a complex picture, highlighting tensions in providing the right support and creating spaces to respond to such challenging situations. ‘Giving up’ requires skilled, detailed assessment to respond to risks, alongside improved training and support for paid carers, to achieve a more holistic strategy which capitalises on significant relationships within a wider context.
Social scientists are increasingly approaching the World Heritage Committee itself as an entry-point to understanding global heritage processes and phenomena. This article explores the subject of human rights in the operations of the World Heritage Committee—the decision-making body established by the 1972 UNESCO World Heritage Convention. It seeks to address the epistemological and methodological implications of approaching the World Heritage Committee as a point of departure for understanding global heritage and rights dynamics. It builds on an “event ethnography” undertaken by the authors to understand how rights discourse appeared in multiple contexts during the Thirty-Ninth World Heritage Committee session held in Bonn, Germany, in June 2015.
In this article, we discuss the methodological and ontological implications of studying rights discourses in the context of World Heritage events and processes. We have a particular interest in the interplay of formal and informal dynamics, revealing the entangled and multi-sited processes that shape and are shaped by the annual event. While much of the debate and analysis in heritage studies is understandably concerned with formal decision-making processes and position-taking, this work demonstrates the significance of a range of informal dynamics in appreciating future possibilities.
In Australia, at the federal, State and Territory levels, legislative steps have been taken to enhance the efficacy of judicial review over administrative action. The purpose of statutory codification of judicial review was twofold: first, to enhance access to justice for individuals aggrieved by government action or inaction; and, second, to promote, and affirm the importance of, legal accountability for public administration. This was to be achieved by, inter alia:
Simplifying the procedures for accessing the courts and applying for judicial review;
Codifying the common law grounds for review; and
Providing for a right to written reasons in respect of certain administrative decisions.
This chapter examines whether legislative codification has been ‘worth it’, in view of the rationale underpinning it. Put another way, has codification constrained or hampered the law of judicial review in Australia?
Chapter 1 has assessed the role that the Kerr and Ellicott committees played leading up to the enactment of the Administrative Decisions (Judicial Review) Act 1977 (Cth) (‘ADJR Act’). The ADJR Act was ‘an important milestone in the evolution of Australian administrative law’. It was the first attempt in Australia ‘to codify both the law and much of the procedure of judicial review’. The ADJR Act has been judicially described as ‘one of the most important Australian legal reforms of the last century’. Groves has observed that ‘during the first decade after its enactment, the ADJR Act was the leading avenue of judicial review and clearly exerted great influence over Australian administrative law’. This assessment is supported by the Administrative Review Council (‘ARC’) in a report issued in 1989. In that report, the ARC set out statistics regarding judicial review applications federally under the ADJR Act but also via s 39B of the Judiciary Act 1903 (Cth), which corresponds to the High Court’s jurisdiction under s 75 of the Commonwealth Constitution (see Figure 9.1). The preponderance of ADJR Act applications is striking.
Depression is expensive to treat, but providing ineffective treatment is more expensive. Such is the case for many patients who do not respond to antidepressant medication.
Aims
To assess the cost-effectiveness of cognitive–behavioural therapy (CBT) plus usual care for primary care patients with treatment-resistant depression compared with usual care alone.
Method
Economic evaluation at 12 months alongside a randomised controlled trial. Cost-effectiveness assessed using a cost-consequences framework comparing cost to the health and social care provider, patients and society, with a range of outcomes. Cost-utility analysis comparing health and social care costs with quality-adjusted life-years (QALYs).
Results
The mean cost of CBT per participant was £910. The difference in QALY gain between the groups was 0.057, equivalent to 21 days a year of good health. The incremental cost-effectiveness ratio was £14 911 (representing a 74% probability of the intervention being cost-effective at the National Institute for Health and Care Excellence threshold of £20 000 per QALY). Loss of earnings and productivity costs were substantial but there was no evidence of a difference between intervention and control groups.
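The cost-utility arithmetic behind these figures can be sketched as follows. The incremental cost used below is not reported in the abstract; it is a hypothetical value back-calculated from the reported ICER and QALY gain, shown only to make the calculation explicit.

```python
# Cost-utility sketch for the reported trial figures. The incremental cost
# is hypothetical (implied by the reported ICER x QALY gain), not a trial value.
NICE_THRESHOLD = 20_000.0   # £ per QALY, as cited in the abstract

delta_qaly = 0.057          # QALY gain, CBT + usual care vs usual care alone
delta_cost = 850.0          # £, hypothetical incremental cost (~0.057 * £14 911)

icer = delta_cost / delta_qaly            # incremental cost-effectiveness ratio
good_health_days = delta_qaly * 365.25    # QALY gain expressed as days per year

print(f"ICER: £{icer:,.0f} per QALY")
print(f"Equivalent to ~{good_health_days:.0f} days of good health per year")
print("Below NICE threshold:", icer < NICE_THRESHOLD)
```

Note that 0.057 QALYs does work out to roughly 21 days of full health per year, matching the abstract's framing.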
Conclusions
The addition of CBT to usual care is cost-effective in patients who have not responded to antidepressants. Primary care physicians should therefore be encouraged to refer such individuals for CBT.
The purpose of this study was to evaluate levels of spiritual well-being over time in populations with advanced congestive heart failure (CHF) or chronic obstructive lung disease (COPD).
Method:
In a prospective, longitudinal study, patients with CHF or COPD (each n = 103) were interviewed at baseline and every 3 months for up to 30 months. At each interview, patients completed: the basic faith subscale of the Functional Assessment of Chronic Illness Therapy-Spiritual Well-Being (FACIT-Sp) questionnaire, the Memorial Symptom Assessment Scale (MSAS), the Rand Mental Health Inventory (MHI), the Multidimensional Index of Life Quality (MILQ), the Sickness Impact Profile (SIP), and the Short Portable Mental Health Questionnaire (SPMSQ).
Result:
The mean age was 65 years, 59% were male, 78% were Caucasian, 50% were married, 29% lived alone, and there was no significant cognitive impairment. Baseline median FACIT-Sp score was 10.0 on a scale of 0–16. FACIT-Sp scores did not change over time and multivariate longitudinal analysis revealed higher scores for black patients and lower scores for those with more symptom distress on the MSAS-Global Distress Index (GDI) (both p = 0.02). On a separate multivariate longitudinal analysis, MILQ scores were positively associated with the FACIT-Sp and the MHI, and negatively associated with the MSAS-GDI and the SIP (all p-values < 0.001).
Significance of results:
In advanced CHF and COPD, spiritual well-being remains stable over time and varies by race and symptom distress; together with symptom distress, mental health and physical functioning, it contributes to quality of life.
Deep boreholes have been proposed for many decades as an option for permanent disposal of high-level radioactive waste and spent nuclear fuel. Disposal concepts are straightforward, and generally call for drilling boreholes to a depth of four to five kilometers (or more) into crystalline basement rocks. Waste is placed in the lower portion of the hole, and the upper several kilometers of the hole are sealed to provide effective isolation from the biosphere. The potential for excellent long-term performance has been recognized in many previous studies. This paper reports updated results of what is believed to be the first quantitative analysis of releases from a hypothetical disposal borehole repository using the same performance assessment methodology applied to mined geologic repositories for high-level radioactive waste. Analyses begin with a preliminary consideration of a comprehensive list of potentially relevant features, events, and processes (FEPs) and the identification of those FEPs that appear to be most likely to affect long-term performance in deep boreholes. The release pathway selected for preliminary performance assessment modeling is thermally-driven flow and radionuclide transport upwards from the emplacement zone through the borehole seals or the surrounding annulus of disturbed rock. Estimated radionuclide releases from deep borehole disposal of spent nuclear fuel, and the annual radiation doses to hypothetical future humans associated with those releases, are extremely small, indicating that deep boreholes may be a viable alternative to mined repositories for disposal of both high-level radioactive waste and spent nuclear fuel.
Public interest in local food continues to grow, but few analyses have examined the capacity for the US population to be supplied through local and regional food systems. This paper extends earlier work that demonstrated a method for mapping potential foodsheds and estimating the potential for New York to meet the food needs of the state's population centers. It provides a methodology for addressing the question, ‘If land is limited, which foods should be grown locally?’ A spatial model was developed to allocate the available agricultural land of New York State (NYS) to meet in-state food needs for six distinct food groups (grains, vegetables, fruits, dairy, meat and eggs) across the eight largest population centers. An optimization routine was used to allocate land to maximize economic land use value (LUV). Eleven scenarios were examined, ranging from a baseline level of consumption of New York produced foods to a 100% local diet. Across the 11 scenarios, the amount of food supplied, the LUV attained, and the area of land allocated increased as the ‘willingness’ to consume local products increased. This approach dictated that land was preferentially devoted to higher-value food groups relative to lower-value groups, and no scenario used all available land. Under the 100% local scenario, 69% of total food needs (on a fresh weight basis) were supplied in-state with an average food distance of 238 km. This scenario provided food from only four of the six groups, namely, dairy, eggs, fruit and vegetables. These results suggest that a much larger proportion of total food needs (on a weight basis) might be provided from in-state production than was found in previous work. LUV serves as a compelling optimization function, and future work should investigate the degree to which maximizing returns to land complements or conflicts with social and environmental goals of local and regional food systems.
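The land-allocation logic described above (an optimization routine that maximizes land use value, so that higher-value food groups claim land first) can be illustrated with a simple greedy sketch. The study used a formal spatial optimization model; this heuristic and every number in it are invented for illustration only.

```python
# Hypothetical sketch of value-maximising land allocation. Greedy assignment
# in descending order of land use value (LUV) per hectare illustrates why
# higher-value food groups are served first. All numbers are invented.
food_groups = {
    # group: (LUV, £/ha; land needed to meet in-state demand, ha)
    "vegetables": (5000, 120_000),
    "fruits":     (4000, 90_000),
    "dairy":      (1500, 900_000),
    "eggs":       (1200, 60_000),
    "grains":     (600,  800_000),
    "meat":       (300,  1_500_000),
}

def allocate(groups, land_available):
    """Assign land to food groups in descending order of value per hectare."""
    allocation, total_value = {}, 0.0
    for name, (value, need) in sorted(groups.items(), key=lambda kv: -kv[1][0]):
        used = min(need, land_available)
        allocation[name] = used
        total_value += used * value
        land_available -= used
        if land_available == 0:
            break
    return allocation, total_value

allocation, luv = allocate(food_groups, land_available=1_200_000)
```

With these invented figures the land runs out before the lowest-value group (meat) receives any allocation, mirroring the paper's finding that no scenario used all available land for all six groups.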
From Part VI - Emerging international and other efforts. By Jens Klump, Helmholtz Centre Potsdam German Research Centre for Geosciences; Joachim Wächter, Helmholtz Centre; Peter Löwe, Helmholtz Centre Potsdam German Research Centre for Geosciences; Ralf Bill, University of Rostock; Matthias Lendholt, Helmholtz Centre.
With computers becoming available for geosciences in the 1970s, the German research community in the earth sciences realized the potential of an informatics approach to scientific questions in the earth sciences. As early as 1979, groups started investigating how these “new media” could be used in earth science research (Vinken, 1983). The development of applications in the mid-1980s was driven mainly by the needs of land surveying and those of utility companies. By the end of the 1980s, the application of Geographic Information Systems (GIS) began to establish itself as a methodology in academic research, predominantly in the fields of geodesy and geography. However, progress in the field of geoinformatics was hampered by missing standards and immature technology. The term geoinformatik for the application of computer science in the earth sciences has been in use in Germany since the mid-1990s. However, its definition remains vague, and other terms, such as “geoinformation science” and “geomatics”, are still in use.
In the early 1990s, the German National Science Foundation (Deutsche Forschungsgemeinschaft, DFG) established a working group for interoperable GIS (AG IOGIS). The concept of IOGIS can be seen as a precursor to an interoperable geospatial infrastructure. This was also the time when geoinformatics began to establish itself as a subdiscipline of computer science and to reach beyond GIS as its application.
Amorphous Si–B–C–N ceramics with similar Si/C/N atomic ratios and boron contents of 3.7 and 6.0 at.% were synthesized and then isothermally annealed at temperatures ranging from 1550 to 1775 °C. The course of crystallization for the modifications of Si3N4 was examined by quantitative analysis of the corresponding x-ray diffraction patterns. Additionally, recent results of similar investigations on the ceramic with 8.3 at.% B were also considered. The kinetic analysis demonstrates that the controlling mechanisms of the Si3N4 crystallization, continuous nucleation and diffusion-controlled growth, are independent of the boron content. Nevertheless, the estimated activation energy of the crystallization increases significantly, from 7.8 to 11.5 eV, as the boron content rises from 3.7 to 8.3 at.%. It is concluded that the role of boron in the crystallization kinetics is mainly due to its effect on the nucleation process. Besides the kinetic analysis, the correlation between the boron content and the Si3N4 crystallite size is discussed.
To study the crystallization kinetics of β-Si3N4 in Si–B–C–N polymer-derived ceramics, amorphous ceramics with the composition SiC1.6N1.0B0.4 were synthesized and then isothermally annealed at 1700, 1775 and 1850 °C. The integrated intensities of β-Si3N4 x-ray diffraction (XRD) patterns were used to examine the course of crystallization. The average size of the Si3N4 nanocrystallites was analyzed by means of the XRD measurements and energy-filtering transmission electron microscopy. It was found that the nanocrystallite dimensions change insignificantly within the time period of crystallization; however, they depend significantly on the temperature. Subsequently, the kinetics of the β-Si3N4 crystallization was analyzed, and a large activation energy of approximately 11.5 eV was estimated. Moreover, continuous nucleation and diffusion-controlled growth have been concluded to be the main mechanisms of the crystallization process. Further analysis points to the crucial role of the nucleation rate in the crystallization kinetics of β-Si3N4.
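The extraction of an activation energy from isothermal anneals at several temperatures follows the standard Arrhenius analysis: the slope of ln k against 1/T gives -Ea/kB. The sketch below is illustrative only; its rate constants are synthetic values constructed to be consistent with the reported Ea of about 11.5 eV, not measurements from the study.

```python
import math

# Hedged sketch: recovering an activation energy from isothermal rate
# constants via an Arrhenius fit, ln k = ln A - Ea / (kB * T).
# Rate constants are synthetic, generated from assumed parameters.
KB = 8.617e-5  # Boltzmann constant, eV/K

temps_c = [1700, 1775, 1850]                   # annealing temperatures, degC
temps_k = [t + 273.15 for t in temps_c]
EA_TRUE, LN_A = 11.5, 40.0                     # assumed (hypothetical) parameters
ln_k = [LN_A - EA_TRUE / (KB * T) for T in temps_k]

# Least-squares slope of ln k against 1/T gives -Ea / kB.
x = [1.0 / T for T in temps_k]
n = len(x)
x_mean, y_mean = sum(x) / n, sum(ln_k) / n
slope = sum((xi - x_mean) * (yi - y_mean) for xi, yi in zip(x, ln_k)) / \
        sum((xi - x_mean) ** 2 for xi in x)
ea_fit = -slope * KB
print(f"Fitted activation energy: {ea_fit:.2f} eV")
```

Because the synthetic data lie exactly on an Arrhenius line, the fit recovers the assumed 11.5 eV; real kinetic data would scatter about the line and carry an uncertainty on the slope.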
The combination of robotics and medical imaging may soon provide orthopaedic surgeons with a tool that significantly increases the precision of cementless total hip replacement operations and directly links preoperative planning with surgical execution. Twenty-six successful robot-assisted operations have been performed on dogs and the first clinical trials on human patients have recently taken place.
Growing interest in local food has sparked debate about the merits of attempting to reduce the distance food travels. One point of contention is the capacity of local agriculture to meet the food needs of local people. In hopes of informing this debate, this research presents a method for mapping potential foodsheds, land areas that could theoretically feed urban centers. The model was applied to New York State (NYS). Geographic information systems were used to estimate the spatial distribution of food production capacity relative to the food needs of NYS population centers. Optimization tools were then applied to allocate production potential to meet food needs in the minimum distance possible. Overall, the model showed that NYS could provide 34% of its total food needs within an average distance of just 49 km. However, the model did not allocate production potential evenly. Most NYS population centers could have the majority of their food needs sourced in-state, except for the greater New York City (NYC) area. Thus, the study presents a mixed review of the potential for local food systems to reduce the distance food travels. While small- to medium-sized cities of NYS could theoretically meet their food needs within distances two orders of magnitude smaller than the current American food system, NYC must draw on more distant food-producing resources. Nonetheless, the foodshed model provides a successful template for considering the geography of food production and food consumption simultaneously. Such a tool could be valuable for examining how cities might change their food procurement to curb greenhouse gas emissions and adapt to depletion of petroleum and other energy resources necessary for long-distance transport of food.
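The core allocation step (matching production potential to population centers in the minimum distance possible) can be sketched with a greedy nearest-source pass. The study applied formal optimization tools over GIS-derived data; this toy version and all of its supply, demand and distance figures are invented to show the mechanics only.

```python
# Hypothetical sketch of distance-minimising allocation: serve the shortest
# zone-city pairs first until each city's demand is met or supply runs out.
# All numbers are invented; the paper used formal optimisation tools.
supply = {"zoneA": 500, "zoneB": 300, "zoneC": 800}   # food units available
demand = {"cityX": 400, "cityY": 600}                 # food units needed
dist = {                                              # km, (zone, city)
    ("zoneA", "cityX"): 30, ("zoneA", "cityY"): 120,
    ("zoneB", "cityX"): 60, ("zoneB", "cityY"): 45,
    ("zoneC", "cityX"): 150, ("zoneC", "cityY"): 70,
}

def allocate_nearest(supply, demand, dist):
    """Greedy pass over zone-city pairs in order of increasing distance."""
    supply, demand = dict(supply), dict(demand)
    shipped, food_km = [], 0.0
    for (zone, city) in sorted(dist, key=dist.get):
        qty = min(supply[zone], demand[city])
        if qty > 0:
            shipped.append((zone, city, qty))
            food_km += qty * dist[(zone, city)]
            supply[zone] -= qty
            demand[city] -= qty
    total = sum(q for *_, q in shipped)
    return shipped, food_km / total if total else 0.0

shipped, avg_km = allocate_nearest(supply, demand, dist)
```

A greedy pass is not guaranteed optimal for a general transportation problem; a linear-programming formulation would be the rigorous counterpart, but the greedy version conveys the idea of computing a demand-weighted average food distance.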