In March 1933, the United States Congress declared beer up to 3.2 percent alcohol by weight to be “non-intoxicating,” thus allowing it to be produced and sold while the nation was still under the 18th Amendment’s ban on intoxicating liquors. Brewers had long argued that beer was a temperance beverage that should be regulated with a lighter touch than harder liquor. In fact, the declaration that 3.2 beer was non-intoxicating opened several markets that would otherwise have been closed to brewers. In the decades that followed Repeal, 3.2 beer continued to be treated differently than stronger alcohol with respect to who, when, where, and how it was legally available. This paper explores the important—and continuing—role that 3.2 beer has played in the post-Prohibition United States.
The Pediatric Acute Care Cardiology Collaborative (PAC3) previously showed decreased postoperative chest tube duration and length of stay in children undergoing 9 Society of Thoracic Surgeons benchmark operations. Here we report how these gains were sustained over time and spread to 8 additional centres within the PAC3 network.
Methods:
Patient data were prospectively collected across baseline and intervention phases at the original 9 centres (Pioneer) and 8 new centres (Spread). The Pioneer baseline phase was 6/2017–6/2018 and Spread was 5/2019–9/2019. The Pioneer intervention phase was 7/2018–7/2021 and Spread 10/2019–7/2021. The primary outcome measure was postoperative chest tube duration in hours, with the aim of 20% overall reduction. Balancing measures included chest tube reinsertion and readmission for pleural effusion. Statistical process control methods and traditional statistics were used to analyse outcomes over time.
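For readers unfamiliar with the statistical process control methods mentioned above, the sketch below shows how the centre line and limits of an individuals (XmR) control chart can be derived for a measure such as monthly mean chest tube duration. The monthly values, function name, and constants are illustrative assumptions, not the collaborative's actual analysis code.

```python
# Minimal sketch of an individuals (XmR) control chart calculation,
# as one might apply it to monthly mean chest tube duration.
# The data and function below are hypothetical, for illustration only.
import numpy as np

def xmr_limits(values):
    """Return the centre line and 3-sigma control limits for an XmR chart."""
    values = np.asarray(values, dtype=float)
    centre = values.mean()
    mean_moving_range = np.abs(np.diff(values)).mean()
    sigma = mean_moving_range / 1.128   # d2 constant for subgroups of size 2
    return centre, centre + 3 * sigma, centre - 3 * sigma

monthly_mean_hours = [93.1, 90.4, 92.7, 88.9, 91.5, 89.8]  # hypothetical baseline months
cl, ucl, lcl = xmr_limits(monthly_mean_hours)
print(f"CL = {cl:.1f} h, UCL = {ucl:.1f} h, LCL = {lcl:.1f} h")
```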
Results:
Among 5,042 patients at 17 centres, demographics were comparable. The Pioneer cohort (n = 3,383) sustained a 22.6% reduction in mean chest tube duration (from 91.9 hours to 70.5 hours), while the Spread cohort (n = 1,659) showed a 9.7% reduction (from 73.1 hours to 66.0 hours) in the first 13 months following intervention. Across both cohorts, rates of reinsertion (2.0% versus 2.1%, p = 0.869) and readmission for effusion did not change (0.3% versus 0.5%, p = 0.285).
Conclusions:
This multicentre prospective quality improvement study demonstrated sustained reduction in chest tube duration at 9 centres while successfully spreading improvement to 8 additional centres. This project serves as a model for postoperative multicentre quality improvement across a large cohort of congenital cardiac surgery patients.
The Australian SKA Pathfinder (ASKAP) offers powerful new capabilities for studying the polarised and magnetised Universe at radio wavelengths. In this paper, we introduce the Polarisation Sky Survey of the Universe’s Magnetism (POSSUM), a groundbreaking survey with three primary objectives: (1) to create a comprehensive Faraday rotation measure (RM) grid of up to one million compact extragalactic sources across the southern $\sim50$% of the sky (20,630 deg$^2$); (2) to map the intrinsic polarisation and RM properties of a wide range of discrete extragalactic and Galactic objects over the same area; and (3) to contribute interferometric data with excellent surface brightness sensitivity, which can be combined with single-dish data to study the diffuse Galactic interstellar medium. Observations for the full POSSUM survey commenced in May 2023 and are expected to conclude by mid-2028. POSSUM will achieve an RM grid density of around 30–50 RMs per square degree with a median measurement uncertainty of $\sim$1 rad m$^{-2}$. The survey operates primarily over a frequency range of 800–1088 MHz, with an angular resolution of 20” and a typical RMS sensitivity in Stokes Q or U of 18 $\mu$Jy beam$^{-1}$. Additionally, the survey will be supplemented by similar observations covering 1296–1440 MHz over 38% of the sky. POSSUM will enable the discovery and detailed investigation of magnetised phenomena in a wide range of cosmic environments, including the intergalactic medium and cosmic web, galaxy clusters and groups, active galactic nuclei and radio galaxies, the Magellanic System and other nearby galaxies, galaxy halos and the circumgalactic medium, and the magnetic structure of the Milky Way across a very wide range of scales, as well as the interplay between these components. This paper reviews the current science case developed by the POSSUM Collaboration and provides an overview of POSSUM’s observations, data processing, outputs, and its complementarity with other radio and multi-wavelength surveys, including future work with the SKA.
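For context, the rotation measures that POSSUM catalogues follow the standard Faraday rotation relations (textbook definitions, not restated from the survey paper itself): the polarisation angle $\chi$ rotates linearly with wavelength squared, and the RM is the line-of-sight integral of the thermal electron density and the parallel magnetic field,
\[ \chi(\lambda^2) = \chi_0 + \mathrm{RM}\,\lambda^2, \qquad \mathrm{RM} \simeq 0.812 \int n_e\, B_\parallel\, \mathrm{d}l \ \ \mathrm{rad\,m^{-2}}, \]
with $n_e$ in cm$^{-3}$, $B_\parallel$ in $\mu$G, and the path length d$l$ in pc.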
This study uses established procedures to estimate the effects of changes in mortality and growth implant protocols on feedlot net returns (NRs). We then propose new methods for estimating concurrent impacts on feedlot greenhouse gas emissions intensity. Reducing mortality consistently increases NRs and reduces greenhouse gas emissions intensity in the feedlot, regardless of sex or placement weight. Results indicate that use of two implants in the feedlot may increase NRs and reduce greenhouse gas emissions per pound of dressed beef produced, compared with just one growth implant.
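As a rough illustration of the emissions-intensity metric discussed above (the function name and all numbers below are hypothetical, not taken from the study), emissions intensity is total greenhouse gas emissions divided by pounds of dressed beef produced, so anything that raises output from the same placed cattle, such as lower mortality, dilutes the intensity figure.

```python
# Illustrative sketch only: hypothetical numbers, not study results.
def emissions_intensity(total_emissions_kg_co2e: float, dressed_beef_lb: float) -> float:
    """Greenhouse gas emissions per pound of dressed beef produced."""
    return total_emissions_kg_co2e / dressed_beef_lb

# Lower mortality means more dressed beef from the same pen-level emissions.
print(emissions_intensity(450_000, 150_000))  # 3.0 kg CO2e per lb
print(emissions_intensity(450_000, 155_000))  # ~2.9 kg CO2e per lb
```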
As the use of guided digitally-delivered cognitive-behavioral therapy (GdCBT) grows, pragmatic analytic tools are needed to evaluate coaches’ implementation fidelity.
Aims
We evaluated how natural language processing (NLP) and machine learning (ML) methods might automate the monitoring of coaches’ implementation fidelity to GdCBT delivered as part of a randomized controlled trial.
Method
Coaches served as guides to 6-month GdCBT with 3,381 assigned users with or at risk for anxiety, depression, or eating disorders. CBT-trained and supervised human coders used a rubric to rate the implementation fidelity of 13,529 coach-to-user messages. NLP methods abstracted data from text-based coach-to-user messages, and 11 ML models predicting coach implementation fidelity were evaluated.
Results
Inter-rater agreement by human coders was excellent (intra-class correlation coefficient = .980–.992). Coaches achieved behavioral targets at the start of the GdCBT and maintained strong fidelity throughout most subsequent messages. Coaches also avoided prohibited actions (e.g. reinforcing users’ avoidance). Sentiment analyses generally indicated a higher frequency of coach-delivered positive than negative sentiment words and predicted coach implementation fidelity with acceptable performance metrics (e.g. area under the receiver operating characteristic curve [AUC] = 74.48%). The final best-performing ML algorithms that included a more comprehensive set of NLP features performed well (e.g. AUC = 76.06%).
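The sketch below illustrates the kind of evaluation reported above: scoring a classifier of coder-rated fidelity by its area under the receiver operating characteristic curve. The features, labels, and model are synthetic placeholders under stated assumptions, not the study's pipeline.

```python
# Minimal sketch: AUC for a fidelity classifier built on NLP-style features.
# All data below are synthetic placeholders, not the trial's data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))              # stand-in for sentiment and other NLP features
y = (X[:, 0] + rng.normal(size=1000)) > 0    # stand-in for human fidelity ratings

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"AUC = {auc:.2%}")
```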
Conclusions
NLP and ML tools could help clinical supervisors automate monitoring of coaches’ implementation fidelity to GdCBT. These tools could maximize allocation of scarce resources by reducing the personnel time needed to measure fidelity, potentially freeing up more time for high-quality clinical care.
The recommended first-line treatment for insomnia is cognitive behavioral therapy for insomnia (CBTi), but access is limited. Telehealth- or internet-delivered CBTi are alternative ways to increase access. To date, these intervention modalities have never been compared within a single study. Further, few studies have examined (a) predictors of response to the different modalities, (b) whether successfully treating insomnia can result in improvement of health-related biomarkers, and (c) mechanisms of change in CBTi. This protocol was designed to compare the three CBTi modalities to each other and to a waitlist control for adults aged 50–65 years (N = 100). Participants are randomly assigned to one of four study arms: in-person- (n = 30), telehealth- (n = 30), or internet-delivered (n = 30) CBTi, or a 12-week waitlist control (n = 10). Outcomes include self-reported insomnia symptom severity, polysomnography, circadian rhythms of activity and core body temperature, blood- and sweat-based biomarkers, cognitive functioning, and magnetic resonance imaging.
The Global Legend of Prester John delves into the enduring fascination with Prester John, an unreachable, collectively-imagined Christian priest-king who figured prominently in Europe's entrance into an interconnected global world. This Element draws on “The International Prester John Project,” an archive of Prester John narratives, from papal epistles to missionary diaries to Marvel comics, all of which respond to the Christian heterotopia promised in the twelfth-century Letter of Prester John. During the medieval and early modern periods, the desire to legitimize the letter's contents influenced military tactics and papal policy while serving as a cultural touchstone for medieval maps, travel narratives, and romance tales. By providing an overview of distinct narrative paths the legend took along with an analysis of the themes of malleability and elasticity within and across these paths, this Element addresses how belief in Prester John persisted for six centuries despite a lack of evidence.
We aimed to explore participant perspectives on social prescribing (SP) for mental health and well-being and the acceptability of community pharmacists (CP) as members of SP pathways that support people with mild to moderate depression and anxiety.
Background:
SP aims to support people with poor health related to socio-demographic determinants. Positive effects of SP on self-belief, mood, well-being, and health are well documented, including a return to work for the long-term unemployed.
Methods:
The study was set in a city in southwest England with diverse cultural and socio-demographic characteristics. We recruited SP stakeholders, including CP, to either one of 17 interviews or a focus group with nine members of the public.
Findings:
An inductive, iterative approach to thematic analysis produced four superordinate themes: (1) offering choice – a non-pharmacological option, (2) supporting pharmacy communities – ‘it is an extension of what we do’, (3) stakeholder perspectives – pharmacists are very busy and their expertise unknown by some, and (4) potential for pharmacy in primary care.
Stakeholders viewed CP as local to and accessible by their community. Pharmacists perceived referral to SP services as part of their current role. General practitioner participants considered that pharmacy involvement could reduce their workload and expand the primary healthcare team. Importantly, general practitioners and CP viewed SP as a non-pharmacological alternative to prescribing unnecessary antidepressants, with the potential to reduce associated adverse effects. All participants voiced concerns about the busyness of pharmacy dispensing as a potential barrier to involvement, and pharmacists requested mental health training updates.
Key findings suggest CP offer a potential alternative to the general practitioner for people with mild to moderate depression and anxiety seeking access to support and health information. However, CP need appropriately commissioned and funded involvement in SP, including backfill for ongoing dispensing, medicines optimization, and mental health first aid training.
This study investigated cases of pregnancy-related listeriosis in British Columbia (BC), Canada, from 2005 to 2014. We described all diagnosed cases in pregnant women (n = 15) and neonates (n = 7), estimated the excess healthcare costs associated with listeriosis, calculated the fraction of stillbirths attributable to listeriosis, and masked cell sizes of 1–5 due to data requirements. Pregnant women had a median gestational age of 31 weeks at listeriosis onset (range: 20–39) and delivered at a median of 37 weeks gestation (range: 20–40). Neonates experienced complications but no fatalities. Stillbirths occurred in 1–5 of the 15 pregnant women with listeriosis, and very few (0.05–0.24%) of the 2,088 stillbirths in BC over the 10-year period were attributed to listeriosis (exact numbers masked). Pregnant women and neonates with listeriosis had significantly more hospital visits, days in hospital, and physician visits than those without listeriosis. Pregnant women with listeriosis had 2.59 times higher mean total healthcare costs during their pregnancy, and neonates with listeriosis had 9.85 times higher mean total healthcare costs during their neonatal period, adjusting for various factors. Despite small case numbers and no reported deaths, these results highlight the substantial additional health service use and costs associated with individual cases of pregnancy-related listeriosis in BC.
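The attributable-fraction range quoted above follows directly from the masked cell size; the short calculation below reproduces it using only the figures given in the abstract (1–5 masked stillbirths among 2,088 stillbirths over the 10-year period).

```python
# Reproducing the 0.05%-0.24% range from the masked cell size of 1-5.
total_stillbirths = 2088
masked_low, masked_high = 1, 5

low_pct = masked_low / total_stillbirths * 100
high_pct = masked_high / total_stillbirths * 100
print(f"Stillbirths attributable to listeriosis: {low_pct:.2f}%-{high_pct:.2f}%")
```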
Observational studies consistently report associations between tobacco use, cannabis use and mental illness. However, the extent to which this association reflects an increased risk of new-onset mental illness is unclear and may be biased by unmeasured confounding.
Methods
A systematic review and meta-analysis (CRD42021243903). Electronic databases were searched until November 2022. Longitudinal studies in general population samples assessing tobacco and/or cannabis use and reporting the association (e.g. risk ratio [RR]) with incident anxiety, mood, or psychotic disorders were included. Estimates were combined using random-effects meta-analyses. Bias was explored using a modified Newcastle–Ottawa Scale, confounder matrix, E-values, and Doi plots.
Results
Seventy-five studies were included. Tobacco use was associated with mood disorders (K = 43; RR: 1.39, 95% confidence interval [CI] 1.30–1.47), but not anxiety disorders (K = 7; RR: 1.21, 95% CI 0.87–1.68), and evidence for psychotic disorders was influenced by the treatment of outliers (K = 4, RR: 3.45, 95% CI 2.63–4.53; K = 5, RR: 2.06, 95% CI 0.98–4.29). Cannabis use was associated with psychotic disorders (K = 4; RR: 3.19, 95% CI 2.07–4.90), but not mood (K = 7; RR: 1.31, 95% CI 0.92–1.86) or anxiety disorders (K = 7; RR: 1.10, 95% CI 0.99–1.22). Confounder matrices and E-values suggested potential overestimation of effects. Only 27% of studies were rated as high quality.
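For readers unfamiliar with the E-values mentioned in the Methods, the sketch below applies the standard E-value formula of VanderWeele and Ding to two of the pooled point estimates quoted above; it is purely illustrative and not the authors' code.

```python
# E-value for a risk ratio greater than 1 (VanderWeele & Ding, 2017):
# the minimum strength of association an unmeasured confounder would need
# with both exposure and outcome to fully explain away the observed RR.
import math

def e_value(rr: float) -> float:
    return rr + math.sqrt(rr * (rr - 1))

print(round(e_value(1.39), 2))  # tobacco -> mood disorders: ~2.13
print(round(e_value(3.19), 2))  # cannabis -> psychotic disorders: ~5.83
```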
Conclusions
Both substances were associated with psychotic disorders and tobacco use was associated with mood disorders. There was no clear evidence of an association between cannabis use and mood or anxiety disorders. Limited high-quality studies underscore the need for future research using robust causal inference approaches (e.g. evidence triangulation).
In Georgia plasticulture vegetable production, a single installation of plastic mulch is used for up to five cropping cycles over an 18-mo period. Preplant applications of glyphosate and glufosinate ensure fields are weed-free before transplanting, but recent data suggest that residual activity of these herbicides may pose a risk to transplanted vegetables. Glyphosate and glufosinate were applied preplant in combination with three different planting configurations: 1) a new plant hole punched into new mulch, 2) a preexisting plant hole, or 3) a new plant hole spaced 15 cm from a preexisting plant hole (adjacent). Following herbicide application, overhead irrigation was used to remove residues from the mulch before punching transplanting holes for tomato, cucumber, or squash. Visible injury, width, biomass, and yield of tomato, cucumber, and squash were not influenced by herbicide in the new mulch or adjacent planting configurations. When glyphosate was applied at 5.0 kg ae ha−1 and the new crop was planted into preexisting holes, tomato was injured by 45%, with reduced heights, biomass, and yields; at 2.5 kg ae ha−1, injury of 8% and a biomass reduction were observed. Cucumber and squash were injured by 23% to 32% by glyphosate at 5.0 kg ae ha−1, with reductions in growth and early-season yield; lower rates did not influence crop growth or production when the crop was placed into a preexisting plant hole. Glufosinate applied at the same rates did not affect tomato growth or yield when planted into preexisting plant holes. Cucumber, when planted into preexisting plant holes, was injured by 43% to 75% by glufosinate at 1.3 to 2.6 kg ai ha−1, with reductions in height and biomass and yield losses; similar results from glufosinate were observed in squash. In multi-crop plasticulture production, growers should ensure vegetable transplants are placed a minimum of 15 cm away from soil exposed to these herbicides.
Herbicide drift to sensitive crops can result in significant injury, yield loss, and even crop destruction. When pesticide drift is reported to the Georgia Department of Agriculture (GDA), tissue samples are collected and analyzed for residues. Seven field studies were conducted in 2020 and 2021 in cooperation with the GDA to evaluate the effects of (1) the time interval between a simulated drift event and sampling, (2) low-dose herbicide rates, and (3) the sample collection method on the detection of herbicide residues in cotton (Gossypium hirsutum L.) foliage. Simulated drift rates of 2,4-D, dicamba, and imazapyr were applied to non-tolerant cotton at the 8- to 9-leaf stage, with plant samples collected at 7 or 21 d after treatment (DAT). Plant sampling consisted of either removing entire plants or removing new growth occurring after the 7-leaf stage. Visual cotton injury from 2,4-D reached 43% to 75% at 0.001 and 0.004 kg ae ha−1, respectively; for dicamba, it was 9% to 41% at 0.003 and 0.014 kg ae ha−1, respectively; and for imazapyr, it was 1% to 74% at 0.004 and 0.03 kg ae ha−1, respectively. Yield loss was observed with both rates of 2,4-D (11% to 51%) and with the high rate of imazapyr (52%); dicamba did not influence yield. Herbicide residues were detected in 88%, 88%, and 69% of samples collected from plants treated with 2,4-D, dicamba, and imazapyr, respectively, at 7 DAT, compared with 25%, 16%, and 22% when samples were collected at 21 DAT, highlighting the importance of sampling quickly after a drift event. Although the interval between drift event and sampling, drift rate, and sampling method can all influence residue detection for 2,4-D, dicamba, and imazapyr, the factor with the greatest influence is the amount of time between drift and sample collection.
To evaluate the impact of a system-wide antimicrobial stewardship program (ASP) update on intravenous (IV)-to-oral (PO) antimicrobial conversion in select community hospitals through pre- and postimplementation trend analysis.
Methods:
Retrospective study across seven hospitals: region one (four hospitals, 827 beds) with tele-ASP managed by infectious diseases (ID)-trained pharmacists, and region two (three hospitals, 498 beds) without. Data were collected preimplementation (April 2022–September 2022) and postimplementation (April 2023–September 2023) on nine antimicrobials for IV and PO days of therapy (DOTs). Antimicrobial administration route and DOTs/1,000 patient days were extracted from the electronic medical record (EMR). Primary outcome: reduction in IV DOTs/1,000 patient days. Secondary outcomes: decrease in IV usage via PO:total antimicrobial ratios and cost reduction.
Results:
In region one, IV usage decreased from 461 to 209/1,000 patient days (P < .001), while PO usage increased from 289 to 412/1,000 patient days (P < .001). Total antimicrobial use decreased from 750 to 621/1,000 patient days (P < .001). In region two, IV usage decreased from 300 to 243/1,000 patient days (P = .005), and PO usage rose from 154 to 198/1,000 patient days (P = .031). The PO:total antimicrobial ratios increased in both regions, from .42–.52 to .60–.70 in region one and from .36–.55 to .46–.55 in region two. IV cost savings amounted to $19,359.77 in region one and $4,038.51 in region two.
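The sketch below shows how the days-of-therapy metric used above is normalised per 1,000 patient days and how a PO:total ratio is derived; the counts are hypothetical, not the study's data.

```python
# Illustrative only: hypothetical counts, not study data.
def dot_per_1000_pd(days_of_therapy: int, patient_days: int) -> float:
    """Days of therapy normalised per 1,000 patient days."""
    return days_of_therapy / patient_days * 1000

iv_dot = dot_per_1000_pd(4_200, 20_000)   # 210 IV DOTs/1,000 patient days
po_dot = dot_per_1000_pd(8_200, 20_000)   # 410 PO DOTs/1,000 patient days
po_to_total = po_dot / (po_dot + iv_dot)  # PO:total antimicrobial ratio, ~0.66
print(iv_dot, po_dot, round(po_to_total, 2))
```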
Conclusion:
The ASP intervention improved IV-to-PO conversion rates in both regions, highlighting the contribution of ID-trained pharmacists in enhancing ASP initiatives in region one and suggesting tele-ASP expansion may be beneficial in resource-constrained settings.
Removal and disposal of nonnative trees is expensive and time-consuming. Using these nonnative trees as a substrate to produce edible mushrooms could diversify farming operations and provide additional income to small-scale farmers. This research compared the production of shiitake mushrooms (Lentinula edodes) on nonnative tree logs to shiitake mushroom production on native oak (Quercus L.) logs, which are the traditional substrate. In a 2-yr study, we evaluated nonnative tree species as alternate substrates for growing shiitake mushrooms at farms in northern Florida and southern Georgia. A mix of native Quercus spp. and nonnative trees was targeted for removal on participating farms. Five nonnative tree species were initially tested for their ability to produce edible mushrooms, either shiitake or oyster (Pleurotus ostreatus var. florida). Of the nonnative trees we tested – Chinaberry (Melia azedarach L.), Chinese tallowtree [Triadica sebifera (L.) Small], silktree (Albizia julibrissin Durazz.), earleaf acacia (Acacia auriculiformis A. Cunn. ex Benth.), and paperbark tree [Melaleuca quinquenervia (Cav.) S.F. Blake] – only T. sebifera produced shiitake mushrooms, and none produced native Florida oyster mushrooms. In on-farm trials, Quercus spp. logs produced more total mushrooms and more mushrooms per log and had a higher total mushroom yield per log. However, mushrooms produced on T. sebifera logs had a higher mean weight per mushroom. Edible fungi can be used to recycle invasive, nonnative T. sebifera and transform their biomass from waste into an income-producing resource.
Operative cancellations adversely affect patient health and impose resource strain on the healthcare system. Here, our objective was to describe neurosurgical cancellations at five Canadian academic institutions.
Methods:
The Canadian Neurosurgery Research Collaborative performed a retrospective cohort study capturing neurosurgical procedure cancellation data at five Canadian academic centres, during the period between January 1, 2014 and December 31, 2018. Demographics, procedure type, reason for cancellation, admission status and case acuity were collected. Cancellation rates were compared on the basis of demographic data, procedural data and between centres.
Results:
Overall, 7,734 cancellations were captured across five sites. Mean age of the aggregate cohort was 57.1 ± 17.2 years. The overall procedure cancellation rate was 18.2%. The five-year neurosurgical operative cancellation rate differed between Centres 1 and 2 (Centre 1: 25.9%; Centre 2: 13.0%; p = 0.008). Female patients less frequently experienced procedural cancellation. Elective, outpatient and spine procedures were more often cancelled. Reasons for cancellation included surgeon-related factors (28.2%), cancellation for a higher-acuity case (23.9%), patient condition (17.2%), other factors (17.0%), resource availability (7.0%), operating room running late (6.4%) and anaesthesia-related factors (0.3%). When clustered, the reason for cancellation was patient-related in 17.2%, staffing-related in 28.5% and operational or resource-related in 54.3% of cases.
Conclusions:
Neurosurgical operative cancellations were common and most often related to operational or resource-related factors. Elective, outpatient and spine procedures were more often cancelled. These findings highlight areas for optimizing efficiency and targeted quality improvement initiatives.
Globally, human rights violations experienced by persons with psychosocial, intellectual or cognitive disabilities continue to be a concern. The World Health Organization's (WHO) QualityRights initiative presents practical remedies to address these abuses. This paper presents an overview of the implementation of the initiative in Ghana.
Aims
The main objective of the QualityRights initiative in Ghana was to train and change attitudes among a wide range of stakeholders to promote recovery and respect for human rights for people with psychosocial, intellectual and cognitive disabilities.
Method
Reports of in-person and online training, minutes of meetings and correspondence among stakeholders of the QualityRights initiative in Ghana, including activities of international collaborators, were analysed to shed light on the implementation of the project in Ghana.
Results
In-person training and online e-training on mental health were conducted. At the time of writing, 40 443 people had registered for the training, 25 416 had started the training and 20 865 had completed the training and obtained a certificate. The team conducted 27 in-person training sessions with 910 people. The successful implementation of the project was underpinned by a committed partnership among stakeholders, strong leadership from the coordinating agency, and acceptance of the initiative and its outcomes. A few challenges, both in implementation and acceptance, are discussed.
Conclusions
The exposure of the WHO QualityRights initiative to a substantial number of key stakeholders involved in mental healthcare in Ghana is critical to reducing human rights abuses for people with psychosocial, intellectual and cognitive disabilities.
Although food insecurity affects a significant proportion of young children in New Zealand (NZ)(1), evidence of its association with dietary intake and sociodemographic characteristics in this population is lacking. This study aims to assess the household food security status of young NZ children and its association with energy and nutrient intake and sociodemographic factors. This study included 289 caregiver and child (1–3 years old) dyads from the same household in either Auckland, Wellington, or Dunedin, NZ. Household food security status was determined using a validated, NZ-specific eight-item questionnaire(2). Usual dietary intake was determined from two 24-hour food recalls, using the multiple source method(3). The prevalence of inadequate nutrient intake was assessed using the Estimated Average Requirement (EAR) cut-point method and the full probability approach. Sociodemographic factors (i.e., socioeconomic status, ethnicity, caregiver education, employment status, household size and structure) were collected from questionnaires. Linear regression models were used to estimate associations, with statistical significance set at p < 0.05. Over 30% of participants had experienced food insecurity in the past 12 months. Of all eight indicator statements, “the variety of foods we are able to eat is limited by a lack of money” had the highest proportion of participants responding “often” or “sometimes” (35.8%). Moderately food-insecure children exhibited higher fat and saturated fat intakes, consuming 3.0 (0.2, 5.8) g/day more fat and 2.0 (0.6, 3.5) g/day more saturated fat than food-secure children (p < 0.05). Severely food-insecure children had lower protein intake (g/kg/day) than food-secure children (p < 0.05). In comparison to food-secure children, moderately and severely food-insecure children had lower fibre intake, consuming 1.6 (2.8, 0.3) g/day and 2.6 (4.0, 1.2) g/day less fibre, respectively. Severely food-insecure children had the highest prevalence of inadequate calcium (7.0%) and vitamin C (9.3%) intakes, compared with food-secure children [prevalence of inadequate intakes: calcium (2.3%) and vitamin C (2.8%)]. Household food insecurity was more common among those of Māori or Pacific ethnicity; those living in areas of high deprivation; those with a caregiver who was younger, not in paid employment, or had low educational attainment; those living with ≥2 other children in the household; and those living in a sole-parent household. Food-insecure young NZ children consume a diet of lower nutritional quality in certain measures compared with their food-secure counterparts. Food insecurity was associated with various sociodemographic factors that are closely linked with poverty or low income. As such, there is an urgent need for poverty mitigation initiatives to safeguard vulnerable young children from the adverse consequences of food insecurity.
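For readers unfamiliar with the Estimated Average Requirement cut-point method named above, the sketch below shows the basic calculation: the prevalence of inadequate intake is the proportion of children whose usual intake falls below the EAR. The simulated intakes and the EAR value used here are placeholders, not study data.

```python
# EAR cut-point method, minimal sketch with simulated data.
import numpy as np

def ear_cutpoint_prevalence(usual_intakes, ear: float) -> float:
    """Percentage of the group whose usual intake is below the EAR."""
    usual_intakes = np.asarray(usual_intakes, dtype=float)
    return float((usual_intakes < ear).mean() * 100)

rng = np.random.default_rng(1)
calcium_mg = rng.normal(loc=700, scale=150, size=300)  # simulated usual calcium intakes
print(f"{ear_cutpoint_prevalence(calcium_mg, ear=500):.1f}% below the EAR")
```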
The mycosis histoplasmosis is also considered a zoonosis that affects humans and other mammalian species worldwide. Among the wild mammals predisposed to infection with the etiologic agent of histoplasmosis, bats are particularly relevant because they are reservoirs of Histoplasma species and play a fundamental role in maintaining and spreading fungal propagules in the environment, since the infective mycelial phase of Histoplasma grows in their accumulated guano. In this study, we detected the fungus in organ samples from bats randomly captured in urban areas of Araraquara City, São Paulo, Brazil. Fungal detection was performed using a nested polymerase chain reaction to amplify a molecular marker (Hcp100) unique to H. capsulatum, which revealed the presence of the pathogen in organ samples from 15 of the 37 captured bats, an infection rate of 40.5%. Of the 22 Hcp100 amplicons generated, 41% corresponded to lung and trachea samples and 59% to spleen, liver, and kidney samples. Data from these last three organs suggest that bats develop disseminated infections. Considering that infected bats create environments with a high risk of infection, it is important to monitor the percentage of infected bats living in urban areas to reduce the risk of infection for humans, domestic animals, and wildlife.