Network meta-analysis allows the synthesis of relative effects from several treatments. Two broad approaches are available to synthesize the data: arm-synthesis and contrast-synthesis, with several models that can be fitted within each. Limited evaluations comparing these approaches are available. We re-analyzed 118 networks of interventions with binary outcomes using three contrast-synthesis models (CSM; one fitted in a frequentist framework and two in a Bayesian framework) and two arm-synthesis models (ASM; both fitted in a Bayesian framework). We compared the estimated log odds ratios, their standard errors, ranking measures and the between-trial heterogeneity using the different models and investigated whether differences in the results were modified by network characteristics. In general, we observed good agreement with respect to the odds ratios, their standard errors and the ranking metrics between the two Bayesian CSMs. However, differences were observed when comparing the frequentist CSM and the ASMs to each other and to the Bayesian CSMs. The network characteristics that we investigated, which represented the connectedness of the networks and rareness of events, were associated with the differences observed between models, but no single factor was associated with the differences across all of the metrics. In conclusion, we found that different models used to synthesize evidence in a network meta-analysis (NMA) can yield different estimates of odds ratios and standard errors that can impact the final ranking of the treatment options compared.
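As a minimal illustration of the basic quantity these models synthesize, the log odds ratio and its (Woolf) standard error for a single two-arm trial with a binary outcome can be computed from the four cell counts. This is a hedged sketch with hypothetical counts, not a re-implementation of any of the five CSM/ASM models compared in the study:

```python
import math

def log_odds_ratio(events_t, total_t, events_c, total_c):
    """Log odds ratio and its Woolf standard error for one two-arm trial.

    a/b = events/non-events in the treatment arm, c/d = in the control arm.
    Assumes all four cells are non-zero; with rare events (a situation the
    study flags as a source of model disagreement), continuity corrections
    or exact/Bayesian models are needed instead.
    """
    a, b = events_t, total_t - events_t
    c, d = events_c, total_c - events_c
    log_or = math.log((a * d) / (b * c))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return log_or, se

# Hypothetical trial: 10/100 events on treatment vs 5/100 on control
lor, se = log_odds_ratio(10, 100, 5, 100)
```

In an NMA these per-comparison estimates are pooled across direct and indirect evidence: contrast-synthesis models work with the relative effects (as above), whereas arm-synthesis models work with the arm-level event counts directly.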
The Hippoboscidae are ectoparasites of birds and mammals, which, as a group, are known to vector multiple diseases. Avipoxvirus (APV) is mechanically vectored by various arthropods and causes seasonal disease in wild birds in the United Kingdom (UK). Signs of APV and the presence of louse flies (Hippoboscidae) on Dunnocks Prunella modularis were recorded over a 16·5-year period in a rural garden in Somerset, UK. Louse flies collected from this site and other sites in England were tested for the presence of APV DNA and RNA sequences. Louse fly numbers on Dunnocks peaked seasonally three weeks prior to the peak of APV lesions, an interval consistent with the previously estimated incubation period of APV in Dunnocks. APV DNA was detected on 13/25 louse flies, Ornithomya avicularia and Ornithomya fringillina, taken from Dunnocks, both with and without lesions consistent with APV, at multiple sites in England. Collectively these data support the premise that louse flies may vector APV. The detection of APV in louse flies, from apparently healthy birds, and from sites where disease has not been observed in any host species, suggests that the Hippoboscidae could provide a non-invasive and relatively cheap method of monitoring avian diseases. This could provide advance warning of diseases, including zoonoses, before they become clinically apparent.
In low- and middle-income countries, fewer than 1 in 10 people with mental health conditions are estimated to be accurately diagnosed in primary care. This is despite more than 90 countries providing mental health training for primary healthcare workers in the past two decades. The lack of accurate diagnoses is a major bottleneck to reducing the global mental health treatment gap. In this commentary, we argue that current research practices are insufficient to generate the evidence needed to improve diagnostic accuracy. Research studies commonly determine accurate diagnosis by relying on self-report tools such as the Patient Health Questionnaire-9. This is problematic because self-report tools often overestimate prevalence, primarily due to their high rates of false positives. Moreover, nearly all studies on detection focus solely on depression, not taking into account the spectrum of conditions on which primary healthcare workers are being trained. Single-condition self-report tools fail to discriminate among different types of mental health conditions, leading to a heterogeneous group of conditions masked under a single scale. As an alternative path forward, we propose improving research on diagnostic accuracy to better evaluate the reach of mental health service delivery in primary care. We recommend evaluating multiple conditions, statistically adjusting prevalence estimates generated from self-report tools, and consistently using structured clinical interviews as a gold standard. We propose clinically meaningful detection as ‘good-enough’ diagnoses that incorporate multiple conditions and account for context, health system and the types of interventions available. Clinically meaningful identification can be operationalized differently across settings based on what level of diagnostic specificity is needed to select from available treatments. Rethinking research strategies to evaluate diagnostic accuracy is vital to improving training, supervision and delivery of mental health services around the world.
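One standard way to carry out the recommendation above — statistically adjusting prevalence estimates generated from self-report tools — is the classic Rogan–Gladen correction, which maps the apparent (screen-positive) prevalence back to a true-prevalence estimate given the tool's sensitivity and specificity. A minimal sketch with hypothetical numbers, not values from the commentary:

```python
def rogan_gladen(apparent_prev, sensitivity, specificity):
    """Adjust an apparent (screen-positive) prevalence for imperfect
    test accuracy: true = (apparent + spec - 1) / (sens + spec - 1)."""
    est = (apparent_prev + specificity - 1) / (sensitivity + specificity - 1)
    return min(max(est, 0.0), 1.0)  # clip to the valid [0, 1] range

# Hypothetical: a tool screening 25% positive, with 85% sensitivity
# and 90% specificity, implies a much lower true prevalence of 20%.
adjusted = rogan_gladen(0.25, 0.85, 0.90)
```

The correction makes the false-positive problem concrete: the further specificity falls below 100%, the more an unadjusted screen-positive rate overstates prevalence.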
Paediatric patients with heart failure requiring ventricular assist devices are at heightened risk of neurologic injury and psychosocial adjustment challenges, resulting in a need for neurodevelopmental and psychosocial support following device placement. Through a descriptive survey developed in collaboration by the Advanced Cardiac Therapies Improving Outcomes Network and the Cardiac Neurodevelopmental Outcome Collaborative, the present study aimed to characterise current neurodevelopmental and psychosocial care practices for paediatric patients with ventricular assist devices.
Method:
Members of both learning networks developed a 25-item electronic survey assessing neurodevelopmental and psychosocial care practices specific to paediatric ventricular assist device patients. The survey was sent to Advanced Cardiac Therapies Improving Outcomes Network site primary investigators and co-primary investigators via email.
Results:
Of the 63 eligible sites contacted, responses were received from 24 unique North and South American cardiology centres. Access to neurodevelopmental providers, referral practices, and family neurodevelopmental education varied across sites. Inpatient neurodevelopmental care consults were available at many centres, as were inpatient family support services. Over half of heart centres had outpatient neurodevelopmental testing and individual psychotherapy services available to patients with ventricular assist devices, though few centres had outpatient group psychotherapy (12.5%) or parent support groups (16.7%) available. Barriers to inpatient and outpatient neurodevelopmental care included limited access to neurodevelopmental providers and parent/provider focus on the child’s medical status.
Conclusions:
Paediatric patients with ventricular assist devices often have access to neurodevelopmental providers in the inpatient setting, though supports vary by centre. Strengthening family neurodevelopmental education, referral processes, and family-centred psychosocial services may improve current neurodevelopmental/psychosocial care for paediatric ventricular assist device patients.
Translational research needs to show value through impact on measures that matter to the public, including health and societal benefits. To this end, the Translational Science Benefits Model (TSBM) identified four categories of impact: Clinical, Community, Economic, and Policy. However, TSBM offers limited guidance on how these areas of impact relate to equity. Central to the structure of our Center for American Indian and Alaska Native Diabetes Translation Research are seven regional, independent Satellite Centers dedicated to community-engaged research. Drawing on our collective experience, we provide empirical evidence about how TSBM applies to equity-focused research that centers community partnerships and recognizes Indigenous knowledge. For this special issue – “Advancing Understanding and Use of Impact Measures in Implementation Science” – our objective is to describe and critically evaluate gaps in the fit of TSBM as an evaluation approach with sensitivity to health equity issues. Accordingly, we suggest refinements to the original TSBM Logic model to add: 1) community representation as an indicator of providing community partners “a seat at the table” across the research life cycle to generate solutions (innovations) that influence equity and to prioritize what to evaluate, and 2) assessments of the representativeness of the measured outcomes and benefits.
Contact tracing for COVID-19 in England operated from May 2020 to February 2022. The clinical, demographic and exposure information collected on cases and their contacts offered a unique opportunity to study secondary transmission. We aimed to quantify the relative impact of host factors and exposure settings on secondary COVID-19 transmission risk using 550,000 sampled transmission links between cases and their contacts. Links, or ‘contact episodes’, were established where a contact subsequently became a case, using an algorithm accounting for incubation period, setting, and contact date. A mixed-effects logistic regression model was used to estimate adjusted odds of transmission. Of sampled episodes, 8.7% resulted in secondary cases. Living with a case (71% of episodes) was the most significant risk factor (aOR = 2.6, CI = 1.9–3.6). Other risk factors included unvaccinated status (aOR = 1.2, CI = 1.2–1.3), symptoms, and older age (66–79 years; aOR = 1.4, CI = 1.4–1.5). Whilst global COVID-19 strategies emphasized protection outside the home, including education, travel, and gathering restrictions, this study demonstrates the relative importance of household transmission. There is a need to reconsider the contribution of household transmission to future control strategies and the requirement for effective infection control within households.
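The linkage step described above — a contact "subsequently became a case" within a plausible window of the exposure — can be sketched as a simple window-matching rule. The incubation bounds below are hypothetical placeholders, and the study's actual algorithm also accounted for setting and contact date:

```python
from datetime import date

# Hypothetical incubation-period window (days); the study's exact
# parameters are not reported in the abstract.
MIN_INCUBATION, MAX_INCUBATION = 1, 14

def is_secondary_case(contact_date, contact_case_date):
    """Flag a contact episode as a transmission link if the contact's own
    case date falls within a plausible incubation window of the exposure."""
    if contact_case_date is None:
        return False  # contact never became a case
    delay = (contact_case_date - contact_date).days
    return MIN_INCUBATION <= delay <= MAX_INCUBATION

# Example: exposure on 1 March, contact becomes a case 5 days later
linked = is_secondary_case(date(2021, 3, 1), date(2021, 3, 6))
```

Episodes flagged this way would then form the binary outcome for the mixed-effects logistic regression, with the aORs reported above obtained as the exponentiated model coefficients.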
Bitter taste perception plays a dual role in human nutrition and evolutionary biology: bitterness is found in nutrient-dense foods such as cruciferous vegetables, yet historically signalled toxic compounds. The TAS2R38 gene, part of the taste 2 receptor family, is central to individual differences in bitter taste perception(1). While genetic variations are influential, dietary habits and food preparation also impact taste perception. However, research investigating the interplay between these factors and genetic variations in influencing bitter taste sensitivity and food intake is limited. This study aimed to elucidate the relationship between bitter taste sensitivity and TAS2R38 haplotype variations in the context of bitter food consumption among Australian adults. A cross-sectional, mixed-methods study was conducted. Healthy adults who had maintained a stable diet for at least three months were eligible. Data collection was via an online survey (REDCap), capturing self-reported demographics and dietary patterns specific to bitter foods, including metrics of bitter food avoidance, frequency, liking and perceived healthfulness, alongside a Dietary Quality Index (DQI) derived from a food frequency questionnaire(2). Bitter taste sensitivity was assessed using self-reported intensity perceptions of 6-n-propylthiouracil (PROP) taste strips(3). Genotyping was conducted via TaqMan qPCR assays on DNA extracted from buccal swabs to ascertain TAS2R38 haplotypes. Data analysis utilised Analysis of Covariance (ANCOVA) and regression models, with all tests adjusted for confounding variables such as gender, age, and smoking status. A total of 222 participants (47.5 ± 17.7 years; 86% female; BMI 27.3 ± 7.1 kg/m²) completed the study. PROP sensitivity was strongly correlated with TAS2R38 haplotype, with supertasters predominantly having PAV/PAV, medium tasters PAV/AVI, and non-tasters AVI/AVI (p = 0.002).
However, no relationship was observed between PROP sensitivity and the frequency, liking, or avoidance of bitter foods (p>0.05). DQI was significantly related to bitter food consumption; individuals in the lowest DQI quintile consumed bitter foods more frequently than those in the third (p = 0.007) and top quintiles (p = 0.001). The perceived healthfulness of bitter foods was significantly higher in those with AVI/AVI haplotypes (non-tasters) compared to those with PAV/AVI (medium tasters) (p = 0.001). Counterintuitively, participants who reported greater enjoyment of bitter tastes consumed bitter foods less frequently (p<0.001). Our study confirms that TAS2R38 variants are predictive of PROP taste sensitivity, consistent with literature that identifies PAV/PAV individuals as supertasters. However, neither PROP sensitivity nor TAS2R38 haplotype influenced bitter food consumption frequency or preference. Interestingly, those with lower Dietary Quality Index scores and less enjoyment of bitter taste consumed bitter foods more often. These observations highlight the need to investigate other factors influencing bitter food intake, such as additional sensory characteristics or psychological and behavioural aspects.
High-quality evidence is lacking for the impact on healthcare utilisation of short-stay alternatives to psychiatric inpatient services for people experiencing acute and/or complex mental health crises (known in England as psychiatric decision units [PDUs]). We assessed the extent to which changes in psychiatric hospital and emergency department (ED) activity were explained by implementation of PDUs in England using a quasi-experimental approach.
Methods
We conducted an interrupted time series (ITS) analysis of weekly aggregated data pre- and post-PDU implementation in one rural and two urban sites using segmented regression, adjusting for temporal and seasonal trends. Primary outcomes were changes in the number of voluntary inpatient admissions to (acute) adult psychiatric wards and number of ED adult mental health-related attendances in the 24 months post-PDU implementation compared to that in the 24 months pre-PDU implementation.
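The core segmented-regression model for an interrupted time series — a baseline trend plus an immediate level change and a slope change at implementation — can be sketched as follows. This is a simplified illustration on synthetic noiseless data; the study's actual analysis also adjusted for temporal and seasonal trends:

```python
import numpy as np

def segmented_regression(y, intervention_week):
    """Fit the basic ITS model by least squares:
    y = b0 + b1*time + b2*level_change + b3*slope_change."""
    t = np.arange(len(y))
    post = (t >= intervention_week).astype(float)
    X = np.column_stack([
        np.ones(len(y)),                    # intercept
        t.astype(float),                    # pre-existing weekly trend
        post,                               # immediate level change
        post * (t - intervention_week),     # change in slope post-intervention
    ])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Synthetic weekly series: baseline 50, slope +0.2/week, then a -10
# level drop and a -0.5/week slope change at week 24 (hypothetical values).
t = np.arange(48)
y = 50 + 0.2 * t + np.where(t >= 24, -10 - 0.5 * (t - 24), 0.0)
b0, b1, b2, b3 = segmented_regression(y, 24)
```

In this framing, the "level reduction" and "trend change" estimates reported in the results correspond to b2 and b3 respectively.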
Results
The two PDUs (one urban and one rural) with longer (average) stays and high staff-to-patient ratios observed post-PDU decreases in the pattern of weekly voluntary psychiatric admissions relative to pre-PDU trend (Rural: −0.45%/week, 95% confidence interval [CI] = −0.78%, −0.12%; Urban: −0.49%/week, 95% CI = −0.73%, −0.25%); PDU implementation in each was associated with an estimated 35–38% reduction in total voluntary admissions in the post-PDU period. The (urban) PDU with the highest throughput, lowest staff-to-patient ratio and shortest average stay observed a 20% (−20.4%, CI = −29.7%, −10.0%) level reduction in mental health-related ED attendances post-PDU, although there was little impact on long-term trend. Pooled analyses across sites indicated a significant reduction in the number of voluntary admissions following PDU implementation (−16.6%, 95% CI = −23.9%, −8.5%) but no significant (long-term) trend change (−0.20%/week, 95% CI = −0.74%, 0.34%) and no short- (−2.8%, 95% CI = −19.3%, 17.0%) or long-term (0.08%/week, 95% CI = −0.13, 0.28%) effects on mental health-related ED attendances. Findings were largely unchanged in secondary (ITS) analyses that considered the introduction of other service initiatives in the study period.
Conclusions
The introduction of PDUs was associated with an immediate reduction of voluntary psychiatric inpatient admissions. The extent to which PDUs change long-term trends of voluntary psychiatric admissions or impact on psychiatric presentations at ED may be linked to their configuration. PDUs with a large capacity, short length of stay and low staff-to-patient ratio can positively impact ED mental health presentations, while PDUs with longer length of stay and higher staff-to-patient ratios have potential to reduce voluntary psychiatric admissions over an extended period. Taken as a whole, our analyses suggest that when establishing a PDU, consideration of the primary crisis-care need that underlies the creation of the unit is key.
Behavioural treatments are recommended first-line for insomnia, but long-term benzodiazepine receptor agonist (BZRA) use remains common and engaging patients in a deprescribing consultation is challenging. Few deprescribing interventions directly target patients. Prescribers’ support of patient-targeted interventions may facilitate their uptake. Recently assessed in the Your Answers When Needing Sleep in New Brunswick (YAWNS NB) study, Sleepwell (mysleepwell.ca) was developed as a direct-to-patient behaviour change intervention promoting BZRA deprescribing and non-pharmacological insomnia management. BZRA prescribers of YAWNS NB participants were invited to complete an online survey assessing the acceptability of Sleepwell as a direct-to-patient intervention. The survey was developed using the seven construct components of the theoretical framework of acceptability (TFA) framework. Respondents (40/250, 17.2%) indicated high acceptability, with positive responses per TFA construct averaging 32.3/40 (80.7%). Perceived as an ethical, credible, and useful tool, Sleepwell also promoted prescriber–patient BZRA deprescribing engagements (11/19, 58%). Prescribers were accepting of Sleepwell and supported its application as a direct-to-patient intervention.
Odd Radio Circles (ORCs) are a class of low surface brightness, circular objects approximately one arcminute in diameter. ORCs were recently discovered in the Australian Square Kilometre Array Pathfinder (ASKAP) data and subsequently confirmed with follow-up observations on other instruments, yet their origins remain uncertain. In this paper, we suggest that ORCs could be remnant lobes of powerful radio galaxies, re-energised by the passage of a shock. Using relativistic hydrodynamic simulations with synchrotron emission calculated in post-processing, we show that buoyant evolution of remnant radio lobes is alone too slow to produce the observed ORC morphology. However, the passage of a shock can produce both filled and edge-brightened ORC-like morphologies for a wide variety of shock and observing orientations. Circular ORCs are predicted to have host galaxies near the geometric centre of the radio emission, consistent with observations of these objects. Significantly offset hosts are possible for elliptical ORCs, potentially causing challenges for accurate host galaxy identification. Observed ORC number counts are broadly consistent with a paradigm in which moderately powerful radio galaxies are their progenitors.
We aimed to investigate the symptoms of SARS-CoV-2 infection, their dynamics and their discriminatory power for the disease, using longitudinally, prospectively collected information reported at the time of occurrence. We analysed data from a large phase 3 clinical UK COVID-19 vaccine trial. The alpha variant was the predominant strain. Participants were assessed for SARS-CoV-2 infection via nasal/throat PCR at recruitment, vaccination appointments, and when symptomatic. Statistical techniques were implemented to infer estimates representative of the UK population, accounting for multiple symptomatic episodes associated with one individual. An optimal diagnostic model for SARS-CoV-2 infection was derived. The 4-month prevalence of SARS-CoV-2 was 2.1%, increasing to 19.4% (16.0%–22.7%) in participants reporting loss of appetite and 31.9% (27.1%–36.8%) in those with anosmia/ageusia. The model identified anosmia and/or ageusia, fever, congestion, and cough to be significantly associated with SARS-CoV-2 infection. Symptom dynamics differed markedly between the two groups: in PCR+ participants, symptoms started slowly, peaked later and lasted longer, whilst in PCR− participants they declined consistently, with, on average, fewer than 3 days of symptoms reported. Anosmia/ageusia peaked late in confirmed SARS-CoV-2 infection (day 12), indicating a low discrimination power for early disease diagnosis.
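The jump from the 2.1% baseline prevalence to a much higher prevalence among participants reporting a given symptom is, in essence, a Bayesian update. A minimal sketch with hypothetical symptom rates (the abstract does not report the conditional symptom probabilities):

```python
def conditional_prevalence(prior, p_symptom_if_pos, p_symptom_if_neg):
    """P(infection | symptom) via Bayes' rule, given baseline prevalence
    and the symptom rate among PCR+ and PCR- participants."""
    numerator = p_symptom_if_pos * prior
    return numerator / (numerator + p_symptom_if_neg * (1 - prior))

# Baseline 4-month prevalence of 2.1% (from the trial); the 50% and 5%
# symptom rates below are hypothetical, for illustration only.
posterior = conditional_prevalence(0.021, 0.5, 0.05)
```

The more specific a symptom is to infection (the lower its rate in PCR− participants), the larger this update — which is why anosmia/ageusia is so discriminating overall, even though its late peak limits its value for early diagnosis.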
This chapter explores the ways irony unfolds in music. Turner and DiBernardo examine representative pop songs, both original compositions and cover versions, to suggest several ways that irony is created and perhaps detected by listeners. As they argue, “Musical irony requires an interpretive ear for hearing contradictory or disjunctive sounds (and lyrics) within a musical context.” But inferring irony from music involves a special challenge given that music lacks its own semantic or representational signification. Lyrics are clearly a driving force in expressing ironic intent, but instrumental sounds often interact with the spoken words to convey richer ironic complexes, including both rhetorical and situational ironies. Listeners may be especially attentive to the tension, or the discrepancy, between the musical form, style, or genre of a song (e.g., the upbeat, lyrical form in Randy Newman’s song “Political Science”) and its lyrical content (e.g., the use of weapons of mass destruction). Many musical ironies may be “post-modern” because of their self-referential style (e.g., not just criticizing others, but ourselves as well). This chapter offers a compelling, beautifully detailed, argument that “music is a largely underexplored wellspring of ironic activity.”
Even a casual perusal of the works of Tirso de Molina reveals his fascination with various forms of disguise. These range from simple erasure, such as when Diego de Marsilla covers his face with a cloth in Los amantes de Teruel [The Lovers of Teruel], or misdirection, where Don Juan de Cardona changes his voice in Privar contra su gusto [The Reluctant Councilor], to much more complex structures that layer on changes of gender, nationality, and language such as we see in El amor médico [Love the Doctor]. In the latter case, disguise forms a complicated dance where language, clothing, and performance rebut or reinforce each other depending on Jerónima’s need. Tirso’s use of disguise is so ubiquitous that this theatrical technique becomes more than recourse to further the plot or add drama to the performance. It is, in fact, a thematic element. In the following pages, I will provide the reader with a sense of how Tirso makes use of disguise and an understanding of the various permutations of disguise that are common in Tirso’s theater, in the hope of illustrating how disguise plays a key role in exploring and explicating the themes of justice and identity that permeate Tirso’s theater.
At its heart, disguise is an attempt to separate oneself from one’s self, frequently in the hope of a better future. The justifications, excuses, and mechanisms may vary, but disguise permits words, actions, and thoughts that are not tolerable within the strictures of society or that are denied to the person because of their status, gender, or race, for example—strictures that Michel Foucault describes as a “normalizing gaze” (1995: 184). In this view, one that is seen again and again in Tirso’s plays, disguise, in any form, is used to break with the social roles that the characters are expected to play. Thus, disguise is both a liberating escape from social surveillance and a means to achieve a character’s goals. In Tirso, the use of disguise is generally the resort of the powerless and used to achieve a morally defensible goal or to right a moral wrong, although it is occasionally used to hide abuses as well.
Disguise, in one form or another, is used in more than fifty of Tirso’s plays.
Gestational diabetes is treated with medical nutrition therapy, delivered by healthcare professionals; however, the optimal diet for affected women is unknown. Randomised controlled trials, such as the DiGest (Dietary Intervention in Gestational Diabetes) trial, will address this knowledge gap, but the acceptability of whole-diet interventions in pregnancy is unclear. Whole-diet approaches reduce bias but require high levels of participant commitment and long intervention periods to generate meaningful clinical outcomes. We aimed to assess healthcare professionals’ views on the acceptability of the DiGest dietbox intervention for women with gestational diabetes and to identify any barriers to adherence which could be addressed to support good recruitment and retention to the DiGest trial. Female healthcare professionals (n 16) were randomly allocated to receive a DiGest dietbox containing 1200 or 2000 kcal/d including at least one week’s food. A semi-structured interview was conducted to explore participants’ experience of the intervention. Interviews were audio-recorded, transcribed verbatim and analysed thematically using NVivo software. Based on the findings of qualitative interviews, modifications were made to the dietboxes. Participants found the dietboxes convenient and enjoyed the variety and taste of the meals. Factors which facilitated adherence included participants having a good understanding of study aims and sufficient organisational skills to facilitate weekly meal planning in advance. Barriers to adherence included peer pressure during social occasions and feelings of deprivation or hunger (affecting both standard and reduced calorie groups). Healthcare professionals considered random allocation to a whole-diet replacement intervention to be acceptable and feasible in a clinical environment, offering benefits to participants including convenience.
Poor air quality is associated with poor health. Little attention is given to the complex array of environmental exposures and air pollutants that affect mental health during the life course.
Aims
We gather interdisciplinary expertise and knowledge across the air pollution and mental health fields. We seek to propose future research priorities and how to address them.
Method
Through a rapid narrative review, we summarise the key scientific findings, knowledge gaps and methodological challenges.
Results
There is emerging evidence of associations between poor air quality, both indoors and outdoors, and poor mental health more generally, as well as specific mental disorders. Furthermore, pre-existing long-term conditions appear to deteriorate, requiring more healthcare. Evidence of critical periods for exposure among children and adolescents highlights the need for more longitudinal data as the basis of early preventive actions and policies. Particulate matter, including bioaerosols, is implicated, but forms part of a complex exposome influenced by geography, deprivation, socioeconomic conditions and biological and individual vulnerabilities. Critical knowledge gaps need to be addressed to design interventions for mitigation and prevention, reflecting ever-changing sources of air pollution. The evidence base can inform and motivate multi-sector and interdisciplinary efforts of researchers, practitioners, policy makers, industry, community groups and campaigners to take informed action.
Conclusions
There are knowledge gaps and a need for more research, for example, around bioaerosols exposure, indoor and outdoor pollution, urban design and impact on mental health over the life course.
Oil palm is one of Southeast Asia’s most common crops, and its expansion has caused substantial modification of natural habitats and put increasing pressure on biodiversity. Rising global demand for vegetable oil, coupled with oil palm’s high yield per unit area and the versatility of the palm oil product, has driven the expansion of oil palm agriculture in the region. Therefore, it is critical to identify management practices that can support biodiversity in plantations without exacerbating negative impacts on the environment. This study focuses on day-flying Lepidoptera (butterflies and moths), which contribute to ecosystem functioning as pollinators, prey, and herbivores. We assessed whether density and behaviours of day-flying Lepidoptera varied between different habitats within oil palm plantations and across seasons. We surveyed the density and behaviours of Lepidoptera communities in mature industrial oil palm plantations within the Biodiversity and Ecosystem Function in Tropical Agriculture (BEFTA) Programme sites, in Riau, Indonesia. We surveyed two distinct habitats within the plantations in March and September 2013: Edge habitats, which were bordered by plantation roads on one side, and Core habitats in the centre of oil palm planting blocks. We conducted analyses on the effect of habitat type and season on both the overall density and behaviour of Lepidoptera communities and, independently, on the most common species. In our surveys, we observed 1464 individuals across 41 species, with a significantly higher density in Edge than in Core habitats. While there was no significant difference between overall density in March and September surveys, there was an interaction between season and habitat, with density increasing more markedly in Edge than Core areas in September.
There was also a significant effect of habitat and season on behavioural time budget for the community as a whole, with more active behaviours, such as foraging and mating, being recorded more frequently in Edge than Core habitats, and more commonly in September than March. The effect of habitat type, season, and their interaction differed between the six most common species. Our findings indicate that Lepidoptera abundance is affected by habitat characteristics in a plantation and can therefore be influenced by plantation management practices. In particular, our study highlights the value of road edges and paths in plantations for day-flying Lepidoptera. We suggest that increased non-crop vegetation in these areas, achieved through reduced clearing practices or planting of flowering plants, could foster abundant and active butterfly communities in plantations. These practices could form part of sustainability management recommendations for oil palm, such as those of the Roundtable on Sustainable Palm Oil.
In November 1995, the Laboratory of Archaeology at the University of Georgia submitted inventories and summaries of Indigenous ancestors and funerary objects in its holdings to comply with the passage of the Native American Graves Protection and Repatriation Act (NAGPRA). However, after this submission, the Laboratory's attempts at consultation with federally recognized descendant Tribal communities who have cultural ties in the state of Georgia were not successful, and NAGPRA-related activities essentially stalled at the Laboratory. Beginning in 2019, the Laboratory's staff recognized a lack of formal NAGPRA policies or standards, which led to a complete reevaluation of the Laboratory's approach to NAGPRA. In essence, it was the Laboratory's renewed engagement with NAGPRA and descendant Tribal communities that became the catalyst for change in the Laboratory's philosophy as a curation repository. This shift in thinking set the Laboratory on a path toward building a descendant community–informed institutional integrity (DCIII) level of engagement with consultation and collaborative efforts in all aspects of collections management and archaeological research. In this article, we outline steps that the Laboratory has taken toward implementing meaningful policies and practices created with descendant Tribal communities that both fulfill and extend the bounds of NAGPRA compliance.
The term “blue justice” was coined in 2018 during the 3rd World Small-Scale Fisheries Congress. Since then, academic engagement with the concept has grown rapidly. This article reviews 5 years of blue justice scholarship and synthesizes some of the key perspectives, developments, and gaps. We then connect this literature to wider relevant debates by reviewing two key areas of research – first on blue injustices and second on grassroots resistance to these injustices. Much of the early scholarship on blue justice focused on injustices experienced by small-scale fishers in the context of the blue economy. In contrast, more recent writing and the empirical cases reviewed here suggest that intersecting forms of oppression render certain coastal individuals and groups vulnerable to blue injustices. These developments signal an expansion of the blue justice literature to a broader set of affected groups and underlying causes of injustice. Our review also suggests that while grassroots resistance efforts led by coastal communities have successfully stopped unfair exposure to environmental harms, preserved their livelihoods and ways of life, defended their culture and customary rights, renegotiated power distributions, and proposed alternative futures, these efforts have been underemphasized in blue justice scholarship and in the marine and coastal literature more broadly. We conclude with some suggestions for understanding and supporting blue justice now and into the future.