The perinatal period has gained increasing attention from developmental psychopathologists; however, experiences during birth have been minimally examined using this framework. The current study aimed to evaluate longitudinal associations between childhood maltreatment, negative birth experiences, and postpartum mental health across levels of self-reported emotion dysregulation and respiratory sinus arrhythmia (RSA). Expectant mothers (N = 223) participated in a longitudinal study from the third trimester of pregnancy to 7 months postpartum. Participants contributed prenatal resting RSA and completed questionnaires prenatally, 24 hours after birth, and 7 months postpartum. Results indicated that more childhood maltreatment was associated with higher birth fear and more severe postpartum anxiety and depressive symptoms. Resting RSA moderated the association between childhood maltreatment and birth fear, such that more childhood maltreatment combined with higher resting RSA was associated with increased birth fear. Additionally, self-reported prenatal emotion dysregulation moderated the association between childhood maltreatment and postpartum depressive symptoms, such that more childhood maltreatment combined with higher emotion dysregulation was associated with increased depressive symptoms. Emotion dysregulation across multiple levels may amplify vulnerability to negative birth experiences and postpartum psychopathology among individuals with childhood maltreatment histories. Thus, emotion dysregulation, addressed in the context of trauma-informed care, may be a worthwhile intervention target during the perinatal period.
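As an illustrative sketch only (not the authors' analysis code), a moderation model of the kind described above can be fit with an interaction term between mean-centered predictors; the file and variable names below are assumptions.

```python
# Illustrative moderation analysis: childhood maltreatment x resting RSA predicting
# birth fear. File name, column names, and data are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("perinatal_study.csv")  # hypothetical data file

# Mean-center the predictors so the interaction term is interpretable.
df["maltreatment_c"] = df["maltreatment"] - df["maltreatment"].mean()
df["resting_rsa_c"] = df["resting_rsa"] - df["resting_rsa"].mean()

# Birth fear regressed on maltreatment, resting RSA, and their interaction.
model = smf.ols("birth_fear ~ maltreatment_c * resting_rsa_c", data=df).fit()
print(model.summary())
```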
To combat the decline in North American grasslands and prairies, innovative strategies to establish new native grass and forb plantings must be considered. Integrated vegetation management entails the use of many practices to cultivate desirable vegetation along roadsides, including mowing, applying herbicides, burning, and replanting. Currently, only a limited selection of postemergence herbicides is available to improve native plant establishment along roadsides. A greenhouse herbicide screen that included four postemergence herbicides registered for use on Conservation Reserve Program (CRP) acres and rights-of-way was conducted to test their safety for use on four native grasses (big bluestem, buffalograss, sideoats grama, and switchgrass) and seven forb species (ashy sunflower, black-eyed Susan, butterfly milkweed, desert false indigo, Illinois bundleflower, Mexican hat plant, and purple coneflower). Clopyralid (689 g ae ha−1), metsulfuron (4.18 g ai ha−1), and quinclorac (418 g ai ha−1) applied at labeled rates caused no injury to the native grass species or butterfly milkweed. However, florpyrauxifen-benzyl (38.4 g ai ha−1) caused significant injury to buffalograss and switchgrass. None of the herbicides tested were universally safe to use on all forb species evaluated in this trial, with each herbicide causing unacceptable injury (≥25%) to one or more forb species. None of the herbicides studied here would be completely safe for use on mixed stands of native grasses and native forbs at the seedling growth stage, indicating that prairie establishment must use alternative chemistries, plant mixes with fewer species, or avoid postemergence applications shortly after emergence of native forbs.
Around 1000 years ago, Madagascar experienced the collapse of populations of large vertebrates that ultimately resulted in many species going extinct. The factors that led to this collapse appear to have differed regionally, but in some ways, key processes were similar across the island. This review evaluates four hypotheses that have been proposed to explain the loss of large vertebrates on Madagascar: Overkill, aridification, synergy, and subsistence shift. We explore regional differences in the paths to extinction and the significance of a prolonged extinction window across the island. The data suggest that people who arrived early and depended on hunting, fishing, and foraging had little effect on Madagascar’s large endemic vertebrates. Megafaunal decline was triggered initially by aridification in the driest bioclimatic zone, and by the arrival of farmers and herders in the wetter bioclimatic zones. Ultimately, it was the expansion of agropastoralism across both wet and dry regions that drove large endemic vertebrates to extinction everywhere.
The Society for Healthcare Epidemiology of America, the Association for Professionals in Infection Control and Epidemiology, the Infectious Diseases Society of America, and the Pediatric Infectious Diseases Society represent the core expertise regarding healthcare infection prevention and infectious diseases and have written a multisociety statement for healthcare facility leaders, regulatory agencies, payors, and patients to strengthen requirements and expectations around facility infection prevention and control (IPC) programs. Based on a systematic literature search and formal consensus process, the authors advocate raising the expectations for facility IPC programs, moving to effective programs that are:
• Foundational and influential parts of the facility’s operational structure
• Resourced with the correct expertise and leadership
• Prioritized to address all potential infectious harms
This document discusses the IPC program’s leadership—a dyad model that includes both physician and infection preventionist leaders—its reporting structure, expertise, and competencies of its members, and the roles and accountability of partnering groups within the healthcare facility. The document outlines a process for identifying minimum IPC program medical director support. It applies to all types of healthcare settings except post-acute long-term care and focuses on resources for the IPC program. Long-term acute care hospital (LTACH) staffing and antimicrobial stewardship programs will be discussed in subsequent documents.
In Georgia plasticulture vegetable production, a single installation of plastic mulch is used for up to five cropping cycles over an 18-mo period. Preplant applications of glyphosate and glufosinate ensure fields are weed-free before transplanting, but recent data suggest that residual activity of these herbicides may pose a risk to transplanted vegetables. Glyphosate and glufosinate were applied preplant in combination with three different planting configurations: 1) a new plant hole punched into new mulch, 2) a preexisting plant hole, or 3) a new plant hole spaced 15 cm from a preexisting plant hole (adjacent). Following herbicide application, overhead irrigation was used to remove residues from the mulch before punching transplanting holes for tomato, cucumber, or squash. Visible injury, width, biomass, and yield of tomato, cucumber, and squash were not influenced by herbicide in the new mulch or adjacent planting configurations. When glyphosate was applied at 5.0 kg ae ha−1 and the new crop was planted into preexisting holes, tomato was injured by 45%, with reduced height, biomass, and yield; at 2.5 kg ae ha−1, 8% injury and a biomass reduction were observed. Cucumber and squash were injured by 23% to 32% by glyphosate at 5.0 kg ae ha−1, with reductions in growth and early-season yield; lower rates did not influence crop growth or production when the crop was placed into a preexisting plant hole. Glufosinate did not affect tomato growth or yield when tomato was planted into preexisting plant holes. Cucumber, when planted into preexisting plant holes, was injured by 43% to 75% by glufosinate applied at 1.3 to 2.6 kg ai ha−1, with reductions in height, biomass, and yield; similar results from glufosinate were observed in squash. In multi-crop plasticulture production, growers should ensure vegetable transplants are placed a minimum of 15 cm away from soil exposed to these herbicides.
Herbicide drift to sensitive crops can result in significant injury, yield loss, and even crop destruction. When pesticide drift is reported to the Georgia Department of Agriculture (GDA), tissue samples are collected and analyzed for residues. Seven field studies were conducted in 2020 and 2021 in cooperation with the GDA to evaluate the effects of (1) the time interval between a simulated drift event and sampling, (2) low-dose herbicide rates, and (3) the sample collection method on detecting herbicide residues in cotton (Gossypium hirsutum L.) foliage. Simulated drift rates of 2,4-D, dicamba, and imazapyr were applied to non-tolerant cotton at the 8- to 9-leaf stage, with plant samples collected at 7 or 21 d after treatment (DAT). During collection, plant sampling consisted of removing entire plants or removing only new growth occurring after the 7-leaf stage. Visual cotton injury from 2,4-D reached 43% to 75% at 0.001 and 0.004 kg ae ha−1, respectively; for dicamba, it was 9% to 41% at 0.003 and 0.014 kg ae ha−1, respectively; and for imazapyr, it was 1% to 74% at the 0.004 and 0.03 kg ae ha−1 rates, respectively. Yield loss was observed with both rates of 2,4-D (11% to 51%) and with the high rate of imazapyr (52%); dicamba did not influence yield. Herbicide residues were detected in 88%, 88%, and 69% of samples collected from plants treated with 2,4-D, dicamba, and imazapyr, respectively, at 7 DAT compared with 25%, 16%, and 22% when samples were collected at 21 DAT, highlighting the importance of sampling quickly after a drift event. Although the interval between drift event and sampling, drift rate, and sampling method can all influence residue detection for 2,4-D, dicamba, and imazapyr, the factor with the greatest influence is the amount of time between drift and sample collection.
Herbicides that inhibit protoporphyrinogen oxidase (PPO) are used in more than 40 agronomic and specialty crops across Georgia to manage weeds through residual and postemergence (POST) control. In 2017, a population of Palmer amaranth exhibiting reduced sensitivity to POST applications of PPO-inhibiting herbicides was identified by the University of Georgia. Seed were collected from the site along with a known sensitive population; the distance between the samples was 200 m, increasing the likelihood of similar environmental and genetic characteristics. To quantify sensitivity for both preemergence (PRE) and POST uses, 21 greenhouse dose-response assessments were conducted from 2017 to 2022. After initial rate-response studies, 13 doses per herbicide were chosen for the POST experiment: doses ranged from 0× to 4× the field use rate for the susceptible population and from 0× to 40× for the suspect population, where 1× field use rates were fomesafen at 420 g ai ha−1, lactofen at 219 g ai ha−1, acifluorfen at 420 g ai ha−1, and trifludimoxazin at 25 g ai ha−1. Herbicide treatments included adjuvants and were applied to plants 8 to 10 cm in height. Relative resistance factors (RRFs) were calculated for control ratings, mortality, and biomass, and ranged from 105 to 318 for fomesafen, 36 to 1,477 for lactofen, 215 to 316 for acifluorfen, and 9 to 49 for trifludimoxazin. In the PRE experiment, herbicide applications included five to nine doses of fomesafen (1× = 210 g ai ha−1), flumioxazin (1× = 57 g ai ha−1), oxyfluorfen (1× = 561 g ai ha−1), and trifludimoxazin (1× = 38 g ai ha−1); doses ranged from 0× to 6× for the suspect population and 0× to 2× for the susceptible population. Visual control, mortality, and biomass RRFs ranged from 3 to 5 for fomesafen, 21 to 31 for flumioxazin, 6 to 22 for oxyfluorfen, and 8 to 38 for trifludimoxazin. Results confirm that a Georgia Palmer amaranth population is resistant to PPO-inhibiting herbicides applied both PRE and POST.
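For illustration, a relative resistance factor of the kind reported above can be computed as the ratio of ED50 values from fitted dose-response curves. The sketch below uses a generic three-parameter log-logistic model and placeholder data, not the study's measurements.

```python
# Illustrative RRF calculation: fit a three-parameter log-logistic dose-response
# curve to each population and take the ratio of ED50 values. All numbers are placeholders.
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(dose, upper, ed50, slope):
    """Response declines from `upper` toward zero as dose increases."""
    return upper / (1.0 + (dose / ed50) ** slope)

doses = np.array([1, 26, 105, 420, 1680, 4200])          # g ai/ha (hypothetical series)
biomass_susceptible = np.array([100, 80, 45, 12, 4, 1])   # % of untreated control
biomass_suspect = np.array([100, 98, 92, 75, 40, 18])

p_s, _ = curve_fit(log_logistic, doses, biomass_susceptible, p0=[100, 100, 1])
p_r, _ = curve_fit(log_logistic, doses, biomass_suspect, p0=[100, 1000, 1])

rrf = p_r[1] / p_s[1]  # ED50(suspect) / ED50(susceptible)
print(f"ED50 susceptible = {p_s[1]:.0f}, ED50 suspect = {p_r[1]:.0f}, RRF = {rrf:.1f}")
```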
Individuals tend to overestimate their abilities in areas where they are less competent. This cognitive bias is known as the Dunning-Kruger effect. Research shows that the Dunning-Kruger effect occurs in persons with traumatic brain injury as well as in healthy comparison participants. Walker and colleagues (2017) suggested that deficits in cognitive awareness may be due to brain injury. Confrontational naming tasks (e.g., the Boston Naming Test) are used to evaluate language abilities. The Cordoba Naming Test (CNT) is a 30-item confrontational naming task developed to be administered in multiple languages. Hardy and Wright (2018) conditionally validated a measure of perceived mental workload called the NASA Task Load Index (NASA-TLX). They found that workload ratings on the NASA-TLX increased with increased task demands on a cognitive task. The purpose of the present study was to determine whether the Dunning-Kruger effect occurs in a Latinx population and to identify possible factors driving individuals to overestimate their abilities on the CNT. We predicted that the low-performance group would report better CNT performance than the high-performance group, yet underperform on the CNT.
Participants and Methods:
The sample consisted of 129 Latinx participants with a mean age of 21.07 years (SD = 4.57). Participants were neurologically and psychologically healthy. Our sample was divided into two groups: a low-performance group and a high-performance group. Participants completed the CNT and the NASA-TLX in English. The NASA-TLX examines perceived workload (e.g., performance), and it was used in the present study to evaluate possible factors driving individuals to overestimate their abilities on the CNT. Participants completed the NASA-TLX after completing the CNT. Moreover, the CNT raw scores were averaged to create the following two groups: low-performance (CNT raw score < 17) and high-performance (CNT raw score of 18 or higher). A series of ANCOVAs, controlling for gender and years of education completed, was used to evaluate CNT performance and CNT perceived workload.
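A minimal sketch of the ANCOVA setup described above, assuming hypothetical column names; this is illustrative only and not the authors' analysis code.

```python
# Illustrative ANCOVA: perceived CNT performance (NASA-TLX) by performance group,
# controlling for gender and years of education. File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("cnt_nasa_tlx.csv")  # hypothetical data file

model = smf.ols("tlx_performance ~ C(group) + C(gender) + education_years", data=df).fit()
print(anova_lm(model, typ=2))  # Type II sums of squares for the group effect and covariates
```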
Results:
We found that the low-performance group reported better performance on the CNT compared to the high-performance group, p = .021, ηp² = .04. However, the high-performance group outperformed the low-performance group on the CNT, p < .001, ηp² = .53. Additionally, results revealed that the low-performance group reported higher temporal demand and effort levels on the CNT compared to the high-performance group, ps < .05, ηp²s = .05.
Conclusions:
As we predicted, the low-performance group overestimated their CNT performance compared to the high-performance group. The current data suggest that the Dunning-Kruger effect occurs in healthy Latinx participants. We also found that temporal demand and effort may be influencing awareness of performance in the low-performance group relative to the high-performance group. The present study suggests that subjective factors may influence confrontational naming task performance more for low-performance individuals than for high-performance individuals on the CNT. Current literature shows that bilingual speakers underperform on confrontational naming tasks compared to monolingual speakers. Future studies should investigate whether the Dunning-Kruger effect occurs in Latinx English monolingual speakers compared to Spanish-English bilingual speakers on the CNT.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
Mental health problems are elevated in autistic individuals, but there is limited evidence on the developmental course of problems across childhood. We compare the level and growth of anxious-depressed, behavioral, and attention problems in an autistic cohort and a typically developing (TD) cohort.
Methods
Latent growth curve models were applied to repeated parent-report Child Behavior Checklist data from age 2–10 years in an inception cohort of autistic children (Pathways, N = 397; 84% boys) and a general population TD cohort (Wirral Child Health and Development Study; WCHADS; N = 884, 49% boys). Percentile plots were generated to quantify the differences between autistic and TD children.
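The analysis above uses latent growth curve models; as a simplified, illustrative analogue, a mixed-effects growth model with random intercepts and slopes can be sketched as follows (file and variable names are assumptions, not the study's data).

```python
# Illustrative growth model (a mixed-effects analogue of a latent growth curve):
# repeated CBCL scores modeled with child-specific random intercepts and slopes.
# File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

long = pd.read_csv("cbcl_long.csv")       # one row per child per assessment wave
long["age_c"] = long["age_years"] - 2.0    # time since the first assessment at age 2

model = smf.mixedlm("anx_dep ~ age_c * cohort + iq + sex",
                    data=long,
                    groups=long["child_id"],
                    re_formula="~age_c").fit()
print(model.summary())
```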
Results
Autistic children showed elevated levels of mental health problems, but this difference was substantially reduced by accounting for IQ and sex differences between the autistic and TD samples. There were small differences in growth patterns: anxious-depressed problems were particularly elevated at preschool age and attention problems in late childhood. Higher family income predicted a lower base level on all three dimensions but a steeper increase in anxious-depressed problems. Higher IQ predicted a lower level of attention problems and a faster decline over childhood. Female sex predicted a higher level of anxious-depressed problems and a faster decline in behavioral problems. Social-affect autism symptom severity predicted an elevated level of attention problems. Autistic girls' problems were particularly elevated relative to their same-sex non-autistic peers.
Conclusions
Autistic children, and especially girls, show elevated mental health problems compared to TD children and there are some differences in predictors. Assessment of mental health should be integrated into clinical practice for autistic children.
The coronavirus disease 2019 (COVID-19) pandemic highlighted the lack of agreement regarding the definition of aerosol-generating procedures and potential risk to healthcare personnel. We convened a group of Massachusetts healthcare epidemiologists to develop consensus through expert opinion in an area where broader guidance was lacking at the time.
Cole crops, including broccoli and collard, contribute more than $119 million to Georgia’s farm gate value yearly. To ensure maximum profitability, these crops must be planted into weed-free fields. Glyphosate is a tool often used to help achieve this goal because of its broad-spectrum activity on weeds coupled with the knowledge that it poses no threat to the succeeding crop when used as directed. However, recent research suggests that with certain soil textures and production systems, the residual soil activity of glyphosate may damage some crops. Therefore, field experiments were conducted in fall 2019 and 2020 to evaluate the response of transplanted broccoli and collard to glyphosate applied preplant onto bare soil and to determine what practical mitigation measures could be implemented to reduce crop injury. Herbicide treatments consisted of 0, 2.5, or 5 kg ae ha−1 glyphosate applied preplant followed by 1) no mitigation measure, 2) tillage, 3) irrigation, or 4) tillage and irrigation prior to transplanting broccoli and collard by hand. When no mitigation was implemented, the residual activity of glyphosate at 2.5 and 5.0 kg ae ha−1 resulted in 43% to 71% and 79% to 93% injury to broccoli and collard transplants, respectively. This resulted in a 35% to 50% reduction in broccoli marketable head weights and a 63% to 71% reduction in collard leaf weights. Irrigation reduced visible damage by 28% to 48%, whereas tillage reduced injury by 43% to 76%, for both crops. Irrigation alleviated yield losses for broccoli, but only tillage eliminated yield loss for both crops. Care must be taken when transplanting broccoli and collard into a field recently treated with glyphosate at rates ≥2.5 kg ae ha−1, because its residual activity can damage transplants, with injury levels influenced by glyphosate rate and by whether tillage or irrigation occurs after application and prior to planting.
While unobscured and radio-quiet active galactic nuclei are regularly being found at redshifts $z > 6$, their obscured and radio-loud counterparts remain elusive. We build upon our successful pilot study, presenting a new sample of low-frequency-selected candidate high-redshift radio galaxies (HzRGs) over a sky area 20 times larger. We have refined our selection technique, in which we select sources with curved radio spectra between 72–231 MHz from the GaLactic and Extragalactic All-sky Murchison Widefield Array (GLEAM) survey. In combination with the requirements that our GLEAM-selected HzRG candidates have compact radio morphologies and be undetected in near-infrared $K_{\rm s}$-band imaging from the Visible and Infrared Survey Telescope for Astronomy Kilo-degree Infrared Galaxy (VIKING) survey, we find 51 new candidate HzRGs over a sky area of approximately $1200\ \mathrm{deg}^2$. Our sample also includes two sources from the pilot study: the second-most distant radio galaxy currently known, at $z=5.55$, with another source potentially at $z \sim 8$. We present our refined selection technique and analyse the properties of the sample. We model the broadband radio spectra between 74 MHz and 9 GHz by supplementing the GLEAM data with both publicly available data and new observations from the Australia Telescope Compact Array at 5.5 and 9 GHz. In addition, deep $K_{\rm s}$-band imaging from the High-Acuity Widefield K-band Imager (HAWK-I) on the Very Large Telescope and from the Southern Herschel Astrophysical Terahertz Large Area Survey Regions $K_{\rm s}$-band Survey (SHARKS) is presented for five sources. We discuss the prospects of finding very distant radio galaxies in our sample, potentially within the epoch of reionisation at $z \gtrsim 6.5$.
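As an illustrative sketch of curved-spectrum fitting of the general kind used to select such candidates (the exact parameterisation adopted in this work may differ), a generic curved power law can be fit to low-frequency flux densities; the values below are placeholders.

```python
# Illustrative fit of a generic curved power law, S(nu) = S0 (nu/nu0)^alpha exp(q ln^2(nu/nu0));
# q < 0 indicates a convex (peaked) spectrum. Frequencies are approximate GLEAM sub-band
# centres and the flux densities are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def curved_power_law(nu, s0, alpha, q, nu0=150e6):
    x = np.log(nu / nu0)
    return s0 * np.exp(alpha * x + q * x**2)

freqs_hz = np.array([76, 107, 151, 189, 227]) * 1e6
flux_jy = np.array([0.80, 1.05, 1.10, 1.02, 0.90])   # hypothetical source

popt, pcov = curve_fit(curved_power_law, freqs_hz, flux_jy, p0=[1.0, -0.5, -0.2])
print("S0 = %.2f Jy, alpha = %.2f, q = %.2f" % tuple(popt))
```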
The Hierarchical Taxonomy of Psychopathology (HiTOP) has emerged out of the quantitative approach to psychiatric nosology. This approach identifies psychopathology constructs based on patterns of co-variation among signs and symptoms. The initial HiTOP model, which was published in 2017, is based on a large literature that spans decades of research. HiTOP is a living model that undergoes revision as new data become available. Here we discuss advantages and practical considerations of using this system in psychiatric practice and research. We especially highlight limitations of HiTOP and ongoing efforts to address them. We describe differences and similarities between HiTOP and existing diagnostic systems. Next, we review the types of evidence that informed development of HiTOP, including populations in which it has been studied and data on its validity. The paper also describes how HiTOP can facilitate research on genetic and environmental causes of psychopathology as well as the search for neurobiologic mechanisms and novel treatments. Furthermore, we consider implications for public health programs and prevention of mental disorders. We also review data on clinical utility and illustrate clinical application of HiTOP. Importantly, the model is based on measures and practices that are already used widely in clinical settings. HiTOP offers a way to organize and formalize these techniques. This model already can contribute to progress in psychiatry and complement traditional nosologies. Moreover, HiTOP seeks to facilitate research on linkages between phenotypes and biological processes, which may enable construction of a system that encompasses both biomarkers and precise clinical description.
Understanding place-based contributors to health requires geographically and culturally diverse study populations, but sharing location data is a significant challenge to multisite studies. Here, we describe a standardized and reproducible method to perform geospatial analyses for multisite studies. Using census tract-level information, we created software for geocoding and geospatial data linkage that was distributed to a consortium of birth cohorts located throughout the USA. Individual sites performed geospatial linkages and returned tract-level information for 8810 children to a central site for analyses. Our generalizable approach demonstrates the feasibility of geospatial analyses across study sites to promote collaborative translational research.
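A minimal sketch of a tract-level linkage of the kind described above, assuming hypothetical file and column names; only tract identifiers are retained for sharing.

```python
# Illustrative tract-level linkage: spatially join participant coordinates to census-tract
# polygons so that only tract identifiers need to leave the site. Paths and columns are hypothetical.
import geopandas as gpd
import pandas as pd

tracts = gpd.read_file("census_tracts.shp")[["GEOID", "geometry"]]

points = pd.read_csv("participant_locations.csv")   # columns: id, longitude, latitude
gpoints = gpd.GeoDataFrame(
    points,
    geometry=gpd.points_from_xy(points.longitude, points.latitude),
    crs="EPSG:4326",
).to_crs(tracts.crs)

linked = gpd.sjoin(gpoints, tracts, how="left", predicate="within")
linked[["id", "GEOID"]].to_csv("tract_linkage.csv", index=False)  # share tract IDs only
```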
Potential loss of energetic ions including alphas and radio-frequency tail ions due to classical orbit effects and magnetohydrodynamic instabilities (MHD) are central physics issues in the design and experimental physics programme of the SPARC tokamak. The expected loss of fusion alpha power due to ripple-induced transport is computed for the SPARC tokamak design by the ASCOT and SPIRAL orbit-simulation codes, to assess the expected surface heating of plasma-facing components. We find good agreement between the ASCOT and SPIRAL simulation results not only in integrated quantities (fraction of alpha power loss) but also in the spatial, temporal and pitch-angle dependence of the losses. If the toroidal field (TF) coils are well-aligned, the SPARC edge ripple is small (0.15–0.30 %), the computed ripple-induced alpha power loss is small (${\sim } 0.25\,\%$) and the corresponding peak surface power density is acceptable ($244\ \textrm{kW}\ \textrm {m}^{-2}$). However, the ripple and ripple-induced losses increase strongly if the TF coils are assumed to suffer increasing magnitudes of misalignment. Surface heat loads may become problematic if the TF coil misalignment approaches the centimetre level. Ripple-induced losses of the energetic ion tail driven by ion cyclotron range of frequency (ICRF) heating are not expected to generate significant wall or limiter heating in the nominal SPARC plasma scenario. Because the expected classical fast-ion losses are small, SPARC will be able to observe and study fast-ion redistribution due to MHD including sawteeth and Alfvén eigenmodes (AEs). SPARC's parameter space for AE physics even at moderate $Q$ is shown to reasonably overlap that of the demonstration power plant ARC (Sorbom et al., Fusion Engng Des., vol. 100, 2015, p. 378), and thus measurements of AE mode amplitude, spectrum and associated fast-ion transport in SPARC would provide relevant guidance about AE behaviour expected in ARC.
The SPARC tokamak is a critical next step towards commercial fusion energy. SPARC is designed as a high-field ($B_0 = 12.2$ T), compact ($R_0 = 1.85$ m, $a = 0.57$ m), superconducting, D-T tokamak with the goal of producing fusion gain $Q>2$ from a magnetically confined fusion plasma for the first time. Currently under design, SPARC will continue the high-field path of the Alcator series of tokamaks, utilizing new magnets based on rare earth barium copper oxide high-temperature superconductors to achieve high performance in a compact device. The goal of $Q>2$ is achievable with conservative physics assumptions ($H_{98,y2} = 0.7$) and, with the nominal assumption of $H_{98,y2} = 1$, SPARC is projected to attain $Q \approx 11$ and $P_{\textrm {fusion}} \approx 140$ MW. SPARC will therefore constitute a unique platform for burning plasma physics research with high density ($\langle n_{e} \rangle \approx 3 \times 10^{20}\ \textrm {m}^{-3}$), high temperature ($\langle T_e \rangle \approx 7$ keV) and high power density ($P_{\textrm {fusion}}/V_{\textrm {plasma}} \approx 7\ \textrm {MW}\,\textrm {m}^{-3}$) relevant to fusion power plants. SPARC's place in the path to commercial fusion energy, its parameters and the current status of SPARC design work are presented. This work also describes the basis for global performance projections and summarizes some of the physics analysis that is presented in greater detail in the companion articles of this collection.
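As a rough consistency check of the quoted power density (not taken from the paper), the plasma volume can be approximated with a simple elongated-torus formula; the elongation value below is an assumption.

```python
# Rough check of the quoted fusion power density using a simple elongated-torus
# volume estimate, V ~ 2 * pi^2 * R0 * a^2 * kappa.
# The elongation kappa is an assumed value, not stated in the text above.
import math

R0, a, kappa = 1.85, 0.57, 1.75   # m, m, assumed elongation
P_fusion = 140e6                   # W, projected fusion power

V = 2 * math.pi**2 * R0 * a**2 * kappa                                  # ~21 m^3
print(f"V ~ {V:.0f} m^3; P_fusion/V ~ {P_fusion / V / 1e6:.1f} MW/m^3")  # ~7 MW/m^3
```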
Simulations are playing an increasingly important role in paleobiology. When designing a simulation study, many decisions have to be made and common challenges will be encountered along the way. Here, we outline seven rules for executing a good simulation study. We cover topics including the choice of study question, the empirical data used as a basis for the study, statistical and methodological concerns, how to validate the study, and how to ensure it can be reproduced and extended by others. We hope that these rules and the accompanying examples will guide paleobiologists when using simulation tools to address fundamental questions about the evolution of life.
The pan-Canadian Oncology Drug Review (pCODR) evaluates new cancer drugs for public funding recommendations. While pCODR's deliberative framework evaluates overall clinical benefit and includes considerations for exceptional circumstances, rarity of indication is not explicitly addressed. Given the high unmet need that typically accompanies these indications, we explored the impact of rarity on oncology health technology assessment (HTA) recommendations and funding decisions.
Methods
We examined pCODR submissions with final recommendations from 2012 to 2017. Incidence rates were calculated using pCODR recommendation reports and statistics from the Canadian Cancer Society. Indications were classified as rare if the incidence rate was lower than 1/100,000 diagnoses, a definition referenced by the Canadian Agency for Drugs and Technologies in Health. Each pCODR final report was examined for the funding recommendation/justification, level of supporting evidence (presence of a randomized controlled trial [RCT]), and time to funding (if applicable).
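A toy illustration of the rarity threshold described above; the case count and population below are placeholders, not pCODR data.

```python
# Toy illustration of the rarity rule: an indication is rare if its incidence rate is
# below 1 case per 100,000. The case count and population below are placeholders.
def is_rare(annual_cases: float, population: float, threshold: float = 1.0 / 100_000) -> bool:
    return (annual_cases / population) < threshold

print(is_rare(annual_cases=250, population=36_000_000))  # incidence ~0.7/100,000 -> True
```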
Results
Of the ninety-six pCODR reviews examined, 16.6 percent were classified as rare indications per the above criteria. While the frequency of positive funding recommendations was similar between rare and nonrare indications (78.6 vs. 75 percent), rare indications were less likely to be supported by evidence from an RCT (50 vs. 90 percent). The average time to funding did not differ significantly across provinces.
Conclusion
Rare indications appear to be associated with weaker clinical evidence. There appears to be no association between rarity and either positive funding recommendations or time to funding. Further work will evaluate factors associated with positive recommendations and the real-world utilization of funded treatments for rare indications.