People with non-communicable diseases (NCDs) have a higher prevalence of comorbid depression than the general population. While previous research has shown that behavioural activation is effective for general depression, its efficacy and safety in treating depression associated with NCDs remain unclear.
Aims
To compare the efficacy and safety of behavioural activation against comparators in reducing depression symptoms in people with NCDs.
Method
We searched six databases from inception until 30 March 2023 (updated 23 September 2024) for randomised controlled trials (RCTs) comparing behavioural activation with comparators for depression in people with NCDs. Risk of bias was assessed using the Cochrane Collaboration’s Risk of Bias 2 tool. We conducted a random-effects, inverse-variance-weighted meta-analysis.
Results
Of the 21 386 records initially identified, 12 RCTs (with 2144 patients) comparing behavioural activation with any comparator on treatment outcomes for depression comorbid with an NCD met the inclusion criteria. Six studies were rated as low risk of bias. For short-term follow-up (up to 6 months), meta-analysis showed that behavioural activation had little effect on depression symptom improvement in people with NCDs compared with comparators (Hedges’ g = −0.24; 95% CI −0.62 to 0.15), with high heterogeneity (I² = 91.91%). Of the 12 included studies, three RCTs provided data on adverse events occurring during the trial.
Conclusions
Evidence from this systematic review is not sufficient to draw clear conclusions about the efficacy and safety of behavioural activation for reducing depression symptoms in people with NCDs. Future reviews need to include more high-quality, well-designed RCTs to better understand the potential benefits of behavioural activation for comorbid depression.
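To make the pooling step above concrete, the following is a minimal sketch of a random-effects, inverse-variance-weighted meta-analysis (DerSimonian–Laird estimator) that returns a pooled Hedges’ g, its 95% CI and I². The per-study effect sizes and standard errors are hypothetical placeholders, not the trial data.

```python
import numpy as np

def random_effects_meta(g, se):
    """DerSimonian-Laird random-effects pooling of per-study Hedges' g values.
    g, se: per-study effect sizes and their standard errors."""
    g, se = np.asarray(g, float), np.asarray(se, float)
    w_fixed = 1.0 / se**2                          # inverse-variance (fixed-effect) weights
    g_fixed = np.sum(w_fixed * g) / np.sum(w_fixed)
    Q = np.sum(w_fixed * (g - g_fixed) ** 2)       # Cochran's Q
    df = len(g) - 1
    C = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
    tau2 = max(0.0, (Q - df) / C)                  # between-study variance
    w_rand = 1.0 / (se**2 + tau2)                  # random-effects weights
    g_pooled = np.sum(w_rand * g) / np.sum(w_rand)
    se_pooled = np.sqrt(1.0 / np.sum(w_rand))
    i2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0
    ci = (g_pooled - 1.96 * se_pooled, g_pooled + 1.96 * se_pooled)
    return g_pooled, ci, i2

# Hypothetical per-study values, for illustration only
g_pooled, ci, i2 = random_effects_meta(
    g=[-0.60, -0.05, -0.35, 0.10], se=[0.15, 0.12, 0.20, 0.18])
print(f"pooled g = {g_pooled:.2f}, 95% CI {ci[0]:.2f} to {ci[1]:.2f}, I2 = {i2:.1f}%")
```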
Contemporary Screen Ethics focuses on the intertwining of the ethical with the socio-political, considering such topics as: care, decolonial feminism, ecology, histories of political violence, intersectionality, neoliberalism, race, and sexual and gendered violence. The collection advocates looking anew at the global complexity and diversity of such ethical issues across various screen media: from Netflix movies to VR, from Chinese romcoms to Brazilian pornochanchadas, from documentaries to drone warfare, from Jordan Peele movies to Google Earth. The analysis exposes the ethical tension between the inclusions and exclusions of global structural inequality (the identities of the haves, the absences of the have nots), alongside the need to understand our collective belonging to the planet demanded by the climate crisis. Informing the analysis, established thinkers like Deleuze, Irigaray, Jameson and Rancière are joined by an array of different voices - Ferreira da Silva, Gill, Lugones, Milroy, Muñoz, Sheshadri-Crooks, Vergès - to unlock contemporary screen ethics.
Climate change exacerbates existing risks and vulnerabilities for people globally, and migration is a longstanding adaptation response to climate risk. The mechanisms through which climate change shapes human mobility are complex, however, and gaps in data and knowledge persist. In response to these gaps, the United Nations Development Programme’s (UNDP) Predictive Analytics, Human Mobility, and Urbanization Project employed a hybrid approach that combined predictive analytics with participatory foresight to explore climate change-related mobility in Pakistan and Viet Nam from 2020 to 2050. Focusing on Karachi and Ho Chi Minh City, the project estimated temporal and spatial mobility patterns under different climate change scenarios and evaluated the impact of such in-migration across key social, political, economic, and environmental domains. Findings indicate that net migration into these cities could significantly increase under extreme climate scenarios, highlighting both the complex spatial patterns of population change and the potential for anticipatory policies to mitigate these impacts. While extensive research exists on foresight methods and theory, process reflections are underrepresented. The innovative approach employed within this project offers valuable insights on foresight exercise design choices and their implications for effective stakeholder engagement, as well as the applicability and transferability of insights in support of policymaking. Beyond substantive findings, this paper offers a critical reflection on the methodological alignment of data-driven and participatory foresight with the aim of anticipatory policy ideation, seeking to contribute to the enhanced effectiveness of foresight practices.
Vaccines have revolutionised the field of medicine, eradicating and controlling many diseases. Recent pandemic vaccine successes have highlighted the accelerated pace of vaccine development and deployment. Leveraging this momentum, attention has shifted to cancer vaccines and personalised cancer vaccines, aimed at targeting individual tumour-specific abnormalities. The UK, now recognised for its vaccine capabilities, is an ideal nation for pioneering cancer vaccine trials. This article convened experts to share insights and approaches to navigate the challenges of cancer vaccine development with personalised or precision cancer vaccines, as well as fixed vaccines. Emphasising partnership and proactive strategies, this article outlines the ambition to harness national and local system capabilities in the UK; to work in collaboration with potential pharmaceutical partners; and to seize the opportunity to deliver rapid advances in cancer vaccine technology.
Accurate diagnosis of bipolar disorder (BPD) is difficult in clinical practice, with an average delay between symptom onset and diagnosis of about 7 years. A depressive episode often precedes the first manic episode, making it difficult to distinguish BPD from unipolar major depressive disorder (MDD).
Aims
We use genome-wide association analyses (GWAS) to identify differential genetic factors and to develop predictors based on polygenic risk scores (PRS) that may aid early differential diagnosis.
Method
Based on individual genotypes from case–control cohorts of BPD and MDD shared through the Psychiatric Genomics Consortium, we compile case–case–control cohorts, applying a careful quality control procedure. In a resulting cohort of 51 149 individuals (15 532 BPD patients, 12 920 MDD patients and 22 697 controls), we perform a variety of GWAS and PRS analyses.
Results
Although our GWAS is not well powered to identify genome-wide significant loci, we find significant chip heritability and demonstrate the ability of the resulting PRS to distinguish BPD from MDD, including BPD cases with depressive onset (BPD-D). We replicate our PRS findings in an independent Danish cohort (iPSYCH 2015, N = 25 966). We observe strong genetic correlation between our case–case GWAS and that of case–control BPD.
Conclusions
We find that MDD and BPD, including BPD-D, are genetically distinct. Our findings support a model in which controls, MDD patients and BPD patients lie primarily on a continuum of genetic risk. Future studies with larger and richer samples will likely yield a better understanding of these findings and enable the development of better genetic predictors distinguishing BPD and, importantly, BPD-D from MDD.
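As background for the PRS analyses above, the sketch below shows the standard way a polygenic risk score is computed from GWAS summary statistics: a weighted sum of risk-allele dosages, with weights taken from the discovery GWAS. The variant IDs, weights and dosages are illustrative placeholders, not values from this study.

```python
# Hypothetical GWAS effect sizes (log odds ratios) for a handful of variants
weights = {"rs0001": 0.08, "rs0002": -0.05, "rs0003": 0.12}

def polygenic_risk_score(dosages, weights):
    """dosages: dict mapping variant ID -> risk-allele dosage (0, 1 or 2) for one person.
    Returns the dosage-weighted sum over variants present in both inputs."""
    shared = set(dosages) & set(weights)
    return sum(dosages[v] * weights[v] for v in shared)

person = {"rs0001": 2, "rs0002": 1, "rs0003": 0}
print(polygenic_risk_score(person, weights))   # 2*0.08 + 1*(-0.05) + 0*0.12 = 0.11
```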
Control of carbapenem-resistant Acinetobacter baumannii and Pseudomonas aeruginosa spread in healthcare settings begins with timely and accurate laboratory testing practices. Survey results show most Veterans Affairs facilities are performing recommended tests to identify these organisms. Most facilities report sufficient resources to perform testing, though medium-complexity facilities report some perceived barriers.
The psychometric rigor of unsupervised, smartphone-based assessments, and the factors that impact remote protocol engagement, are critical to evaluate prior to the use of such methods in clinical contexts. We evaluated the validity of a high-frequency, smartphone-based cognitive assessment protocol, including examining convergence and divergence with standard cognitive tests, and investigating factors that may impact adherence and performance (i.e., time of day and anticipated receipt of feedback vs. no feedback).
Methods:
Cognitively unimpaired participants (N = 120, mean age = 68.8 years, 68.3% female, 87% White, mean education = 16.5 years) completed 8 consecutive days of the Mobile Monitoring of Cognitive Change (M2C2), a mobile app-based testing platform, with brief morning, afternoon, and evening sessions. Tasks included measures of working memory, processing speed, and episodic memory. Traditional neuropsychological assessments included measures from the Preclinical Alzheimer’s Cognitive Composite battery.
Results:
Findings showed overall high compliance (89.3%) across M2C2 sessions. Average compliance by time of day was 90.2% for morning sessions, 77.9% for afternoon sessions, and 84.4% for evening sessions. There was evidence of faster reaction times among participants who expected to receive performance feedback. We observed excellent convergent and divergent validity in our comparison of M2C2 tasks and traditional neuropsychological assessments.
Conclusions:
This study supports the validity and reliability of self-administered, high-frequency cognitive assessment via smartphones in older adults. Insights into factors affecting adherence, performance, and protocol implementation are discussed.
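The adherence and convergent-validity figures above reduce to two simple calculations: completed sessions divided by scheduled sessions, and the correlation between an app-based score and a standard neuropsychological score. A small sketch with hypothetical numbers (not the study data) is given below.

```python
import numpy as np
from scipy import stats

# Hypothetical data: 8 days x 3 sessions scheduled per participant
completed = np.array([22, 24, 20, 23])        # sessions completed by 4 participants
scheduled = 8 * 3
compliance = completed / scheduled
print(f"mean compliance = {compliance.mean():.1%}")

# Convergent validity: correlate an app-based memory score with a standard test score
m2c2_memory = np.array([0.71, 0.64, 0.80, 0.58])   # hypothetical app scores
pacc_score = np.array([0.10, -0.30, 0.55, -0.45])  # hypothetical composite scores
r, p = stats.pearsonr(m2c2_memory, pacc_score)
print(f"Pearson r = {r:.2f} (p = {p:.3f})")
```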
Agriculture can be pivotal in mitigating climate change through soil carbon sequestration. Land conversion to pasture has been identified as the most effective method to achieve this. Yet, it creates a perceived trade-off between increasing soil carbon and maintaining arable food crop production. In this on-farm study, we assessed the potential of incorporating a 2-year diverse ley (consisting of 23 species of legumes, herbs, and grasses) within a 7-year arable crop rotation for soil organic matter accumulation. We established upper and lower boundaries of soil organic matter accumulation by comparing this approach to positive (permanent ley, akin to conversion to permanent pasture) and negative (bare soil) references. Our findings in the 2-year diverse ley treatment show greater soil organic matter accumulation in plots with lower baseline levels, suggesting a potential plateau of carbon sequestration under this management practice. In contrast, the positive reference consistently showed a steady rate of organic matter accumulation regardless of baseline levels. Moreover, we observed a concurrent increase in labile carbon content in the 2-year ley treatment and positive reference, indicating improved soil nutrient cycling and ecological processes that facilitate soil carbon sequestration. Our results demonstrate that incorporating a 2-year diverse ley within arable rotations surpasses the COP21 global target of a 0.4% annual increase in soil organic carbon. These findings, obtained under the practical and economic constraints of a working farm, provide compelling evidence that productive arable agriculture can contribute to climate change mitigation efforts.
Background: Carbapenem-resistant Acinetobacter baumannii (CRAB) and Pseudomonas aeruginosa (CRPA) are drug-resistant pathogens causing high mortality rates with limited treatment options. Understanding the incidence of these organisms and laboratory knowledge of testing protocols is important for controlling their spread in healthcare settings. This project assessed how often Veterans Affairs (VA) healthcare facilities identify CRAB and CRPA and the testing practices used. Method: An electronic survey was distributed to 126 VA acute care facilities in September–October 2023. The survey focused on CRAB and CRPA incidence, testing and identification, and availability of testing resources. Responses were analyzed by complexity of patients treated at VA facilities (High, Medium, Low) using Fisher’s exact tests. Result: 77 (61.1%) facilities responded, most in urban settings (85.4%). Most respondents were lead or supervisory laboratory technologists (84.2%) from high-complexity facilities (69.0%). Few facilities detected CRAB ≥ once/month (4.4%), with most reporting that they had not seen CRAB at their facility (55.0%). CRPA was detected more frequently: 19% of facilities reported isolates ≥ once/month, 29.2% a few times per year, and 26.9% reported they had not seen the organism. No differences in CRAB or CRPA incidence were found by facility complexity. Nearly all facilities, regardless of complexity, utilize the recommended methods of MIC or disk diffusion to identify CRAB or CRPA (91.9%), with the remaining facilities reporting that testing is done off-site (7.8%). More high-complexity facilities perform on-site testing compared with low-complexity facilities (32.0% vs 2.7%, p=0.04). 83% of laboratories test for carbapenemase production, with one-fourth using off-site reference labs. One-fourth of facilities perform additional antibiotic susceptibility testing for CRAB and CRPA isolates, most of which test for susceptibility to combination antibiotics; no differences between complexity levels were found. Agreement that sufficient laboratory and equipment resources were available was higher in high-complexity than in medium-complexity facilities (70.7% vs 33.3%, p=0.01), but not in low-complexity facilities (43.8%). Conclusion: Having timely and accurate testing protocols for CRAB and CRPA is important to quickly control spread and reduce associated mortality. This study shows that most VA protocols follow recommended testing and identification guidelines. Interestingly, there was no difference in CRAB or CRPA incidence between facilities providing higher vs lower complexity of care. While high- and low-complexity facilities generally reported sufficient resources for CRAB and CRPA evaluation, some medium-complexity labs, which may feel more compelled than low-complexity labs to bring testing in house, reported that additional resources would be required.
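The complexity-level comparisons above (e.g., on-site testing in high- vs low-complexity facilities) rest on Fisher’s exact tests of 2×2 tables. The sketch below illustrates that calculation with hypothetical counts, not the survey’s actual cell counts.

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table: rows = facility complexity (high, low),
# columns = on-site susceptibility testing (yes, no)
table = [[16, 34],   # high-complexity facilities
         [1, 26]]    # low-complexity facilities
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```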
Previous research showed that behavioural activation is as effective as cognitive–behavioural therapy for general depression. However, it remains unclear whether behavioural activation leads to greater improvement in depressive symptoms than standard treatment for post-stroke depression.
Aims
To compare the effectiveness of behavioural activation against control conditions in reducing depression symptoms in individuals with post-stroke depression.
Method
This review searched five databases from inception until 13 July 2021 (updated 15 September 2023) for randomised controlled trials comparing behavioural activation with any control condition for post-stroke depression. Risk of bias was assessed with the Cochrane Collaboration's Risk-of-Bias 2 tool. The primary outcome was improvement in depressive symptoms in individuals with post-stroke depression. We conducted a random-effects, inverse-variance-weighted meta-analysis.
Results
Of 922 initial studies, five randomised controlled trials with 425 participants met the inclusion criteria. Meta-analysis showed that behavioural activation was associated with reduced depressive symptoms in individuals with post-stroke depression at 6-month follow-up (Hedges’ g −0.39; 95% CI −0.64 to −0.14). The risk of bias was low for two (40%) of five trials, and the remaining three (60%) trials were rated as having a high risk of bias. Heterogeneity was low, with no indication of inconsistency.
Conclusions
Evidence from this review was insufficient to confirm the effectiveness of behavioural activation as a useful treatment for post-stroke depression when compared with control conditions. Further high-quality studies are needed to conclusively establish the efficacy of behavioural activation as a treatment option for post-stroke depression.
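For context on the effect-size metric used in both behavioural activation reviews, the sketch below shows how a single trial’s Hedges’ g is computed from group means and standard deviations (Cohen’s d with a small-sample correction). The group summaries are hypothetical, not trial data.

```python
import numpy as np

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Hedges' g: standardised mean difference with the small-sample correction J."""
    sd_pooled = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sd_pooled                   # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)             # small-sample correction factor
    return j * d

# Hypothetical post-treatment depression scores (lower = fewer symptoms)
print(hedges_g(m1=12.0, sd1=5.0, n1=40,         # behavioural activation arm
               m2=14.5, sd2=5.5, n2=42))        # control arm
```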
To develop and implement a system in the Veterans Health Administration (VA) to alert local medical center personnel in real time when an acute- or long-term care patient/resident is admitted to their facility with a history of colonization or infection with a multidrug-resistant organism (MDRO) previously identified at any VA facility across the nation.
Methods:
An algorithm was developed to extract clinical microbiology and local facility census data from the VA Corporate Data Warehouse, initially targeting carbapenem-resistant Enterobacterales (CRE) and methicillin-resistant Staphylococcus aureus (MRSA). The algorithm was validated by chart review of CRE cases from 2010–2018, trialed and refined in 24 VA healthcare systems over two years, expanded to other MDROs, and implemented nationwide in April 2022 as “VA Bug Alert” (VABA). Use through August 2023 was assessed.
Results:
VABA performed well for CRE, with recall of 96.3%, precision of 99.8% and an F1 score of 98.0%. At the 24 trial sites, feedback was recorded for 1,011 admissions with a history of CRE (130), MRSA (814), or both (67). Among Infection Preventionists and MDRO Prevention Coordinators, 338 (33%) reported being previously unaware of the patient’s MDRO history and, of these, 271 (80%) reported they would not otherwise have known this information. By fourteen months after nationwide implementation, 113/130 (87%) of VA healthcare systems had at least one VABA subscriber.
Conclusions:
A national system for alerting facilities in real-time of patients admitted with an MDRO history was successfully developed and implemented in VA. Next steps include understanding facilitators and barriers to use and coordination with non-VA facilities nationwide.
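The validation metrics reported for the CRE alerts (recall, precision, F1) follow directly from true-positive, false-positive and false-negative counts against chart review. The sketch below shows the arithmetic using hypothetical counts chosen only to roughly reproduce the reported figures.

```python
def classification_metrics(tp, fp, fn):
    """Recall, precision and F1 from chart-review-confirmed counts."""
    recall = tp / (tp + fn)
    precision = tp / (tp + fp)
    f1 = 2 * precision * recall / (precision + recall)
    return recall, precision, f1

# Hypothetical counts from a chart-review validation (illustrative only)
recall, precision, f1 = classification_metrics(tp=520, fp=1, fn=20)
print(f"recall = {recall:.1%}, precision = {precision:.1%}, F1 = {f1:.1%}")
```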
The moderation of user-generated content on online platforms remains a key means of protecting people online, but it is also a perpetual challenge, as the appropriateness of content-moderation guidelines depends on the online community that they aim to govern. This challenge affects marginalized groups in particular, as they more frequently experience online abuse but also end up falsely being targeted by content-moderation guidelines. While there have been calls for democratic community moderation, there has so far been little research into how to implement such approaches. Here, we present the co-creation of content-moderation strategies with the users of an online platform to address some of these challenges. Within the context of AutSPACEs—an online citizen science platform that aims to allow autistic people to share their own sensory processing experiences publicly—we used a community-based and participatory approach to co-design a content-moderation solution that would fit the preferences, priorities, and needs of its autistic user community. We outline how this approach helped us discover context-specific moderation dilemmas around participant safety and well-being and how we addressed them. These trade-offs have resulted in a moderation design that differs from more general social networks in aspects such as how to contribute, when to moderate, and what to moderate. While these dilemmas, processes, and solutions are specific to the context of AutSPACEs, we highlight how the co-design approach itself could be applied to other communities to uncover challenges and help other online spaces embed safety and empowerment.
Decreasing the time to contact precautions (CP) is critical to carbapenem-resistant Enterobacterales (CRE) prevention. Identifying factors associated with delayed CP can help decrease spread from patients with CRE. In this study, a shorter length of stay was associated with being placed in CP within 3 days.
OBJECTIVES/GOALS: The Appalachian Translational Research Network (ATRN) Newsletter provides a unique platform that facilitates communication among Appalachian-serving CTSAs/CTSIs and partnering academic and community organizations, strengthening research efforts and advancing translational science across the region. METHODS/STUDY POPULATION: Published biannually, each ATRN Newsletter features content submitted by ATRN member universities and organizations. Members of the Communications Committee, who represent both CTSA- and non-CTSA-affiliated ATRN member institutions, provide, review, and edit content for the Newsletter. Regular features include researcher and community member spotlights; funding opportunity announcements; information on upcoming seminars, trainings, and special events; and opportunities for collaborations among partnering ATRN institutions. Complementing regularly scheduled Newsletters, special editions are released as warranted, such as a special COVID-19-focused edition published in 2020. RESULTS/ANTICIPATED RESULTS: First published in 2012, the ATRN Newsletter initially represented the founding ATRN institutions, the University of Kentucky and the Ohio State University CTSAs, and a readership of 50. Reflecting the growth of the ATRN, which now represents 9 academic centers including NCATS- and IDeA-funded hubs, affiliated universities, and partnering organizations, readership has grown to include 500 subscribers from across the U.S. and 3 other countries. With the establishment of the official ATRN website in 2019, the ATRN Newsletter became a prominent addition, providing ATRN members access to both new and archived editions, thereby expanding reach and further strengthening critical communication across the Network. DISCUSSION/SIGNIFICANCE: Providing a vehicle for communication that supports ATRN collaborations and networking, the Newsletter is foundational to the success of the ATRN mission to improve health outcomes across Appalachia by fostering collaborative inter-institutional and community-academic research partnerships.
Class and social disadvantage have long been identified as significant factors in the etiology and epidemiology of psychosis. Few studies have explicitly examined the impact of intersecting social disadvantage on long-term employment and financial independence.
Methods
We applied latent class analysis (LCA) to 20-year longitudinal data from participants with affective and non-affective psychosis (n = 256) within the Chicago Longitudinal Research. LCA groups were modeled using multiple indicators of pre-morbid disadvantage (parental social class, educational attainment, race, gender, and work and social functioning prior to psychosis onset). The comparative longitudinal work and financial functioning of LCA groups were then examined.
Results
We identified three distinct latent classes: one comprised entirely of White participants, with the highest parental class and highest levels of educational attainment; a second, predominantly working-class group, with equal numbers of Black and White participants; and a third with the lowest parental social class, lowest levels of education and a mix of Black and White participants. The latter, our highest social disadvantage group, experienced significantly poorer employment and financial outcomes at all time-points, controlling for diagnosis, symptoms, and hospitalizations prior to baseline. Contrary to our hypotheses, on most measures, the two less disadvantaged groups did not significantly differ from each other.
Conclusions
Our analyses add to a growing literature on the impact of multiple forms of social disadvantage on long-term functional trajectories, underscoring the importance of proactive attention to sociostructural disadvantage early in treatment, and the development and evaluation of interventions designed to mitigate ongoing social stratification.
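For readers unfamiliar with latent class analysis, the toy sketch below fits a simple latent class model to binary indicators with expectation–maximisation. It is a generic illustration with made-up data, not the model fitted to the disadvantage indicators described in the Methods above.

```python
import numpy as np

def lca_em(X, n_classes=3, n_iter=200, seed=0):
    """Toy latent class analysis for binary indicators, fitted by EM.
    X: (n_people, n_items) array of 0/1 indicators."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    pi = np.full(n_classes, 1.0 / n_classes)               # class prevalences
    theta = rng.uniform(0.25, 0.75, size=(n_classes, m))   # P(item = 1 | class)
    for _ in range(n_iter):
        # E-step: posterior probability of class membership for each person
        loglik = (X[:, None, :] * np.log(theta) +
                  (1 - X[:, None, :]) * np.log(1 - theta)).sum(axis=2)
        logpost = np.log(pi) + loglik
        logpost -= logpost.max(axis=1, keepdims=True)
        post = np.exp(logpost)
        post /= post.sum(axis=1, keepdims=True)
        # M-step: update prevalences and item-response probabilities
        pi = np.clip(post.mean(axis=0), 1e-6, 1.0)
        theta = np.clip((post.T @ X) / post.sum(axis=0)[:, None], 1e-6, 1 - 1e-6)
    return pi, theta, post

# Hypothetical binary indicators of pre-morbid disadvantage for 6 people
X = np.array([[1, 1, 1], [1, 1, 0], [0, 0, 0],
              [0, 0, 1], [1, 0, 1], [0, 0, 0]])
pi, theta, post = lca_em(X, n_classes=2)
print("class prevalences:", np.round(pi, 2))
```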
Wastewater-based epidemiology (WBE) has proven to be a powerful tool for the population-level monitoring of pathogens, particularly severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). For assessment, several wastewater sampling regimes and methods of viral concentration have been investigated, mainly targeting SARS-CoV-2. However, the use of passive samplers in near-source environments for a range of viruses in wastewater is still under-investigated. To address this, near-source passive samples were taken at four locations targeting student halls of residence. These were chosen as an exemplar due to their high population density and perceived risk of disease transmission. Viruses investigated were SARS-CoV-2 and its variants of concern (VOCs), influenza viruses, and enteroviruses. Sampling was conducted either in the morning, when passive samplers had been in place overnight (17 h), or during the day, with an exposure of 7 h. We demonstrated the usefulness of near-source passive sampling for the detection of VOCs using quantitative polymerase chain reaction (qPCR) and next-generation sequencing (NGS). Furthermore, several outbreaks of influenza A and sporadic outbreaks of enteroviruses (some associated with enterovirus D68 and coxsackieviruses) were identified among the resident student population, providing evidence of the usefulness of near-source, in-sewer sampling for monitoring the health of high-population-density communities.
To explore the benefits of and barriers to using an interactive robotic seal (PARO, Figure 1), based on the experiences of nursing home residents living with dementia and chronic pain, their family members, and formal caregivers.
Methods:
Semi-structured interviews were conducted alongside a feasibility randomized controlled trial at one nursing home in Brisbane, Australia, between July 2021 and January 2022 (Trial registration: ACTRN 12621000837820). Residents with dementia and chronic pain interacted with PARO individually for 15 min once or twice daily, five days per week, for three consecutive weeks. Afterwards, individual interviews were conducted with residents who were capable of communicating (n=13), family members (n=3), registered nurses (n=4), care assistants (n=11), a physical therapist (n=1), a diversional therapist (n=1) and the facility manager (n=1), all of whom experienced or observed the residents’ interactions with PARO. The interviews were audio-recorded, transcribed, and analyzed using thematic analysis.
Results:
Almost all participants reported that interacting with PARO benefited residents with dementia and their caregivers. These benefits included (1) reducing pain by providing distraction and stimulation; (2) reducing behavioral and psychological symptoms of dementia; (3) promoting positive emotions by recalling memories; and (4) reducing anxiety and care burden for family and formal caregivers. Three residents with mild cognitive impairment reported neutral attitudes toward PARO, saying it did not make any difference to them. Barriers to using PARO included limited staff training and limited resources for implementing person-centered care.
Conclusion:
Overall, multiple stakeholders were positive about using PARO to reduce pain and behavioral symptoms of nursing home residents living with dementia and chronic pain. PARO may also reduce the care burden of family and formal caregivers. PARO might be incorporated into daily practice to support nursing home residents living with dementia. Improving staff training and understanding individual preferences of residents may enhance the implementation of PARO in this population.
Figure 1. A resident living with dementia and her family after interacting with PARO (distribution of this photo has been approved by the resident and her family).
The UK Soft Drinks Industry Levy (SDIL) (announced in March 2016; implemented in April 2018) aims to incentivise reformulation of soft drinks to reduce added sugar levels. The SDIL has been applauded as a policy success, and it has survived calls from parliamentarians for it to be repealed. We aimed to explore parliamentary reaction to the SDIL following its announcement until two years post-implementation in order to understand how health policy can become established and resilient to opposition.
Design:
Searches of Hansard for parliamentary debate transcripts that discussed the SDIL retrieved 186 transcripts, with 160 included after screening. Five stages of Applied Thematic Analysis were conducted: familiarisation and creation of initial codebooks; independent second coding; codebook finalisation through team consensus; final coding of the dataset to the complete codebook; and theme finalisation through team consensus.
Setting:
The United Kingdom Parliament
Participants:
N/A
Results:
Between the announcement (16/03/2016) and royal assent (26/04/2017), two themes were identified: (1) ‘SDIL welcomed cross-party’; and (2) ‘SDIL a good start but not enough’. Between royal assent and implementation (05/04/2018), one theme was identified: (3) ‘The SDIL worked – what next?’. The final theme, identified from implementation until 16/03/2020, was (4) ‘Moving on from the SDIL’.
Conclusions:
After the announcement, the SDIL had cross-party support and was recognised to have encouraged reformulation prior to implementation. Lessons for governments indicate that the combination of cross-party support and a policy’s documented success in achieving its aim can help cement its resilience to opposition and threats of repeal.
Grain-cooking traditions in Neolithic China have been characterised as a ‘wet’ cuisine based on the boiling and steaming of sticky varieties of cereal. One of these, broomcorn millet, was one of the earliest Chinese crops to move westward into Central Asia and beyond, into regions where grains were typically prepared by grinding and baking. Here, the authors present the genotypes and reconstructed phenotypes of 13 desiccated broomcorn millet samples from Xinjiang (1700 BC–AD 700). The absence in this area of sticky-starch millet and vessels for boiling and steaming suggests that, as they moved west, East Asian cereal crops were decoupled from traditional cooking practices and were incorporated into local cuisines.
Clay minerals are abundant in soils and sediments and often contain Fe. Some varieties, such as nontronites, contain as much as 40 wt.% Fe2O3 within their molecular structure. Several studies have shown that various Fe-reducing micro-organisms can use ferric iron in Fe-bearing clay minerals as their terminal electron acceptor, thereby reducing it to ferrous iron. Laboratory experiments have also demonstrated that chemically or bacterially reduced clays can promote the reductive degradation of various organics, including chlorinated pesticides and nitroaromatics. Therefore, Fe-bearing clays may play a crucial role in the natural attenuation of various redox-sensitive contaminants in soils and sediments. Although the organochlorinated pesticide p,p′-DDT is one of the most abundant and recalcitrant sources of contamination in many parts of the world, the impact of reduced Fe-bearing clays on its degradation has never been documented. The purpose of the present study was to evaluate the extent of degradation of p,p′-DDT during the bacterial reduction of Fe(III) in an Fe-rich clay. Microcosm experiments were conducted under anaerobic conditions using nontronite (sample NAu-2) spiked with p,p′-DDT and the metal-reducing bacteria Shewanella oneidensis MR-1. Similar experiments were conducted using a sand sample to better ascertain the true impact of the clay vs. the bacteria on the degradation of DDT. Samples were analyzed for DDT and degradation products after 0, 3, and 6 weeks of incubation at 30°C. Results revealed a progressive decrease in p,p′-DDT and increase in p,p′-DDD concentrations in the clay experiments compared to sand and abiotic controls, indicating that Fe-bearing clays may substantially contribute toward the reductive degradation of DDT in soils and sediments. These new findings further demonstrate the impact that clay materials can have on the natural attenuation of pollutants in natural and artificial systems and open new avenues for the passive treatment of contaminated land.