Newborn bloodspot screening (NBS) is used to identify rare, serious health conditions. The choice of conditions to screen for can be contentious, with no internationally agreed panel. Multicriteria decision analysis (MCDA), involving a priori identification and weighting of criteria, has been proposed as an approach to assess and rank conditions for inclusion in newborn screening panels or for further assessment.
Methods
The aim of this review was to summarize existing MCDA-like processes used internationally in the context of NBS conditions. Publications were identified using a scoping methodology including electronic database and gray literature searches. Information on the methodology used to derive criteria and weightings, the criteria used, the scoring systems, and the weightings were extracted. The results were synthesized narratively.
Results
Five publications reporting on the use of MCDA or MCDA-like processes in the context of NBS were identified. In three publications, the aim of the MCDA processes was to select conditions for inclusion in an NBS panel. Two referred to the potential use of MCDA to inform prioritization of candidate conditions for future evaluation. While the criteria used were broadly consistent across studies, assigned weightings varied. Key challenges noted in the studies in relation to applying MCDA included the subjectivity associated with the choice and weighting of criteria, the absence of data to inform criteria, and its low discriminatory power.
Conclusions
Choosing criteria and weightings for MCDA processes regarding NBS can be a highly deliberative process that is intrinsically linked to the values and perspectives of those involved. Given the lack of international consensus on criteria and weightings, MCDA processes to aid decision-making would likely require a de novo tool or careful adaptation of an existing tool to reflect local contexts.
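The weighted-sum scoring at the heart of an MCDA process can be sketched briefly. The criteria, weights, and scores below are purely hypothetical illustrations (no published NBS panel uses these values); the abstract's point is precisely that such choices are deliberative and context-dependent.

```python
# Minimal weighted-sum MCDA sketch for ranking candidate screening
# conditions. Criteria, weights, and per-condition scores are
# hypothetical, chosen only to illustrate the mechanics.

CRITERIA_WEIGHTS = {
    "test_accuracy": 0.30,
    "treatment_benefit": 0.40,
    "condition_severity": 0.20,
    "cost_feasibility": 0.10,
}

def mcda_score(scores: dict) -> float:
    """Weighted sum of per-criterion scores (each on a 0-10 scale)."""
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

# Hypothetical candidate conditions scored against each criterion.
candidates = {
    "condition_A": {"test_accuracy": 9, "treatment_benefit": 8,
                    "condition_severity": 7, "cost_feasibility": 6},
    "condition_B": {"test_accuracy": 6, "treatment_benefit": 9,
                    "condition_severity": 9, "cost_feasibility": 4},
}

ranking = sorted(candidates, key=lambda c: mcda_score(candidates[c]),
                 reverse=True)
```

Changing the weights (e.g., putting more weight on condition severity) can reorder the ranking, which is the subjectivity the studies above flag.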
Developmental dysplasia of the hip (DDH) is a congenital condition in which the hip joint develops abnormally in infancy. Ultrasound screening has the potential to enable earlier identification and diagnosis of DDH, facilitating earlier and less invasive treatment. Ultrasound screening programs can be selective or universal, but the optimal method is unclear.
Methods
The aim of this review was to examine the comparative effectiveness of universal and selective ultrasound screening for DDH in infants. The domains of the Health Technology Assessment Core Model® selected for assessment were consistent with a rapid relative effectiveness assessment approach (i.e., focusing on the clinical benefit of the intervention) and included the following: (i) the health problem; (ii) a description of the technology; and (iii) clinical effectiveness and safety outcomes. An expert advisory group comprising nominated representatives from key stakeholder groups was convened for the purposes of quality assurance and to assist in interpreting the evidence.
Results
DDH severity can range from mild dysplasia to complete dislocation, with incidence varying internationally. Ultrasound screening can result in unnecessary treatment given the potential for spontaneous correction of hip instability. Furthermore, treatment may give rise to complications. Appropriate governance of a screening program and associated training may reduce the risk of unnecessary treatment. Limited high-quality evidence from four studies was identified. This evidence suggested that increased rates of non-surgical intervention were associated with universal ultrasound screening, compared with selective screening, without a corresponding reduction in the incidence of late DDH or requirement for surgical intervention.
Conclusions
The relative benefit of universal ultrasound screening, compared with selective screening, remains unclear. Screening all infants has the potential to lead to unnecessary treatment, with the risk of clinically significant consequences. Consideration could be given to implementing a selective ultrasound screening program, with appropriate governance, end-to-end care, quality assurance, and outcome monitoring.
Clinical research is critical for healthcare advancement, but participant recruitment remains challenging. Clinical research professionals (CRPs; e.g., clinical research coordinators, research assistants) perform eligibility prescreening, ensuring adherence to study criteria while upholding scientific and ethical standards. This study investigates the key information CRPs prioritize during eligibility prescreening, providing insights to optimize data standardization and recruitment approaches.
Methods:
We conducted a freelisting survey targeting 150 CRPs from diverse domains (i.e., neurological disorders, rare diseases, and other diseases) where they listed essential information they look for from medical records, participant/caregiver inquiries, and discussions with principal investigators to determine a potential participant’s research eligibility. We calculated the salience scores of listed items using Anthropac, followed by a two-level analytic procedure to classify and thematically categorize the data.
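Anthropac's headline measure for freelist data is commonly Smith's salience index, which rewards items that appear early and in many lists. A minimal sketch of that calculation (the freelists below are hypothetical examples, not study data):

```python
# Sketch of Smith's salience index for freelist data: each list
# contributes (list_length - rank + 1) / list_length for an item,
# and the sum is divided by the total number of lists.
# The example lists are hypothetical.

def smiths_salience(freelists):
    """Return {item: salience} over a collection of ranked freelists."""
    n_lists = len(freelists)
    salience = {}
    for fl in freelists:
        length = len(fl)
        for rank, item in enumerate(fl, start=1):
            salience[item] = salience.get(item, 0.0) + (length - rank + 1) / length
    return {item: s / n_lists for item, s in salience.items()}

lists = [
    ["age", "medication list", "medical history"],
    ["medical history", "age"],
]
scores = smiths_salience(lists)
```

An item listed first in every list scores 1.0; an item listed late, or by few respondents, scores near 0.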
Results:
The majority of participants were female (81%); 44% identified as White and 64.5% as non-Hispanic. The first-level analysis universally emphasized age, medication list, and medical history across all domains. The second-level analysis illuminated domain-specific approaches in information retrieval: for instance, history of present illness was notably significant in neurological disorders during participant and principal investigator inquiries, while research participation was distinctly salient in potential participant inquiries within the rare disease domain.
Conclusion:
This study unveils the intricacies of eligibility prescreening, with both universal and domain-specific methods observed. Variations in data use across domains suggest the need for tailored prescreening in clinical research. Incorporating these insights into CRP training and refining prescreening tools, combined with an ethical, participant-focused approach, can advance eligibility prescreening practices.
To examine associations between three different plant-based diet quality indices, chronic kidney disease (CKD) prevalence and related risk factors in a nationally representative sample of the Australian population.
Design:
Cross-sectional analysis. Three plant-based diet scores were calculated using data from two 24-h recalls: an overall plant-based diet index (PDI), a healthy PDI (hPDI) and an unhealthy PDI (uPDI). Consumption of plant and animal ingredients from ‘core’ and ‘discretionary’ products was also differentiated. Associations between the three PDI scores and CKD prevalence, BMI, waist circumference (WC), blood pressure (BP) measures, blood cholesterol, apo B, fasting TAG, blood glucose levels (BGL) and HbA1c were examined.
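The abstract does not spell out the scoring algorithm, but PDI/hPDI/uPDI indices of this kind are commonly built by quintile scoring with reverse coding. A minimal sketch under that assumption; the food groups, cutoffs, and intakes here are illustrative (the published method uses 18 food groups):

```python
# Simplified sketch of the common quintile-scoring construction of
# plant-based diet indices. Groups and values are hypothetical.

HEALTHY_PLANT = {"whole_grains", "vegetables"}
UNHEALTHY_PLANT = {"refined_grains", "sugary_drinks"}
ANIMAL = {"meat", "dairy"}

def quintile_score(value, cutoffs):
    """Score 1-5 by position among four quintile cutoffs."""
    return 1 + sum(value > c for c in cutoffs)

def pdi_scores(intakes, cutoffs):
    """Return (PDI, hPDI, uPDI) for one person's food-group intakes.
    PDI: all plant groups scored positively, animal groups reversed.
    hPDI: only healthy plant groups positive. uPDI: only unhealthy
    plant groups positive."""
    pdi = hpdi = updi = 0
    for group, value in intakes.items():
        q = quintile_score(value, cutoffs[group])
        rev = 6 - q  # reverse scoring
        if group in ANIMAL:
            pdi += rev; hpdi += rev; updi += rev
        elif group in HEALTHY_PLANT:
            pdi += q; hpdi += q; updi += rev
        else:  # unhealthy plant group
            pdi += q; hpdi += rev; updi += q
    return pdi, hpdi, updi

example = pdi_scores(
    {"whole_grains": 5, "refined_grains": 5, "meat": 0},
    {g: [1, 2, 3, 4] for g in ("whole_grains", "refined_grains", "meat")},
)
```

This structure explains why the three indices can diverge for the same diet: a diet heavy in refined grains raises uPDI while lowering hPDI.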
Setting:
Australian Health Survey 2011–2013.
Participants:
n 2060 adults aged ≥ 18 years (males: n 928; females: n 1132).
Results:
A higher uPDI score was associated with a 3·7 % higher odds of moderate-severe CKD (OR = 1·037; 95 % CI 1·0057, 1·0697; P = 0·021). A higher uPDI score was also associated with increased TAG (P = 0·032) and BGL (P < 0·001), but lower total- and LDL-cholesterol (P = 0·035 and P = 0·009, respectively). In contrast, a higher overall PDI score was inversely associated with WC (P < 0·001) and systolic BP (P = 0·044), while higher scores for both the overall PDI and hPDI were inversely associated with BMI (P < 0·001 and P = 0·019, respectively).
Conclusions:
A higher uPDI score, reflecting greater intakes of refined grains, salty plant-based foods and added sugars, was associated with increased CKD prevalence, TAG and BGL. In the Australian population, attention to diet quality remains paramount, even among those with higher intakes of plant foods who wish to reduce their risk of CKD.
We recently reported on the radio-frequency attenuation length of cold polar ice at Summit Station, Greenland, based on bi-static radar measurements of radio-frequency bedrock echo strengths taken during the summer of 2021. Those data also allow studies of (a) the relative contributions of coherent (such as discrete internal conducting layers with sub-centimeter transverse scale) vs incoherent (e.g. bulk volumetric) scattering, (b) the magnitude of internal layer reflection coefficients, (c) limits on signal propagation velocity asymmetries (‘birefringence’) and (d) limits on signal dispersion in-ice over a bandwidth of ~100 MHz. We find that (1) attenuation lengths approach 1 km in our band, (2) after averaging 10 000 echo triggers, reflected signals observable over the thermal floor (to depths of ~1500 m) are consistent with being entirely coherent, (3) internal layer reflectivities are ≈ −60 to −70 dB, (4) birefringent effects for vertically propagating signals are smaller by an order of magnitude relative to South Pole and (5) within our experimental limits, glacial ice is non-dispersive over the frequency band relevant for neutrino detection experiments.
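The averaging of 10 000 echo triggers works because coherent (phase-aligned) signal adds linearly in N while Gaussian noise adds as √N, so the amplitude signal-to-noise ratio improves by √N (a factor of 100, i.e. 40 dB, for N = 10⁴). A toy simulation with hypothetical numbers (0.01-amplitude signal in unit-variance noise, not the experiment's actual waveforms) illustrates this:

```python
# Toy illustration of coherent averaging: a weak deterministic signal
# buried 40 dB below single-trigger thermal noise is recovered after
# averaging N = 10 000 triggers. All amplitudes are hypothetical.
import math
import random

random.seed(0)
N, SAMPLES = 10_000, 64
signal = [0.01 * math.sin(2 * math.pi * k / 16) for k in range(SAMPLES)]

avg = [0.0] * SAMPLES
for _ in range(N):
    for k in range(SAMPLES):
        avg[k] += (signal[k] + random.gauss(0.0, 1.0)) / N

rms_signal = math.sqrt(sum(s * s for s in signal) / SAMPLES)
# Residual noise RMS after averaging is ~1/sqrt(N) = 0.01, comparable
# to the signal amplitude that was invisible in any single trigger.
residual = math.sqrt(sum((a - s) ** 2 for a, s in zip(avg, signal)) / SAMPLES)
```

Any incoherent (trigger-to-trigger random-phase) scattering component averages down the same way as noise, which is how the measurement can bound the incoherent contribution.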
Despite evidence for favourable health outcomes associated with plant-based diets, a database containing the plant and animal content of all foods eaten is required to undertake a reliable assessment of plant-based diets within a population. This study aimed to expand an existing Australian food database to include the plant and animal content of all whole foods, beverages, multi-ingredient products and mixed dishes. Twenty-three plant- and animal-based food group classifications were first defined. The food servings per 100 g of each product were then systematically calculated using either a recipe-based approach, a food label-based approach, estimates based on similar products or online recipes. Overall, 4687 (83·5 %) foods and beverages were identified as plant or plant-containing products, and 3701 (65·9 %) were animal or animal-containing products. Results highlighted the versatility of plant and animal ingredients as they were found in various foods across many food categories, including savoury and sweet foods, as well as discretionary and core foods. For example, over 97 % of animal fat-containing foods were found in major food groups outside the AUSNUT 2011–2013 ‘fats and oils’ group. Surprisingly, fruits, nuts and seeds were present in a greater percentage of discretionary products than in core foods and beverages. This article describes a systematic approach that is suitable for the development of other novel food databases. This database allows more accurate quantitative estimates of plant and animal intakes, which is significant for future epidemiological and clinical research aiming to investigate plant-based diets and their related health outcomes.
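The recipe-based approach described above can be sketched as apportioning a product's ingredient weights per 100 g and converting each to servings. The recipe, food groups, and grams-per-serving values below are hypothetical illustrations, not entries from the database:

```python
# Hypothetical sketch of the recipe-based approach: a mixed dish's
# food-group servings per 100 g are derived from ingredient weights
# and per-group serving sizes. All values are illustrative.

# grams of each ingredient per batch, with its food-group classification
recipe = {
    "wheat_flour": (500, "refined_grains"),
    "butter":      (250, "animal_fat"),
    "sugar":       (250, "added_sugar"),
}

# illustrative grams per standard serving for each group
GRAMS_PER_SERVING = {"refined_grains": 40, "animal_fat": 10, "added_sugar": 13}

def servings_per_100g(recipe):
    """Return {food_group: servings} per 100 g of the finished product."""
    total = sum(grams for grams, _ in recipe.values())
    out = {}
    for grams, group in recipe.values():
        grams_per_100g = (grams / total) * 100
        out[group] = out.get(group, 0.0) + grams_per_100g / GRAMS_PER_SERVING[group]
    return out

result = servings_per_100g(recipe)
```

Applying such a decomposition to every multi-ingredient product is what lets the database report both plant and animal servings for mixed dishes.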
The Trial Innovation Network (TIN) is a collaborative initiative within the National Center for Advancing Translational Science (NCATS) Clinical and Translational Science Awards (CTSA) Program. To improve and innovate the conduct of clinical trials, it is exploring the uses of gamification to better engage the trial workforce and improve the efficiencies of trial activities. The gamification structures described in this article are part of a TIN website gamification toolkit, available online to the clinical trial scientific community.
Methods:
The game designers used existing electronic trial platforms to gamify the tasks required to meet trial start-up timelines to create friendly competitions. Key indicators and familiar metrics were mapped to scoreboards. Webinars were organized to share and applaud trial and game performance.
Results:
Game scores were significantly associated with increased achievement of start-up milestones in activation, institutional review board (IRB) submission, and IRB approval times, indicating a higher probability of completing site activation faster when the games were used. Overall game enjoyment, and the feeling that the game did not apply too much pressure, appeared to be an important moderator of performance in one trial but had little effect on performance in a second.
Conclusion:
This retrospective examination of available data from gaming experiences may be a first-of-kind use in clinical trials. There are signals that gaming may accelerate performance and increase enjoyment during the start-up phase of a trial. Isolating the effect of gamification on trial outcomes will depend on a larger sampling from future trials, using well-defined, hypothesis-driven statistical analysis plans.
The risk of tort liability for health maintenance organizations (HMOs) and other managed care plans has dramatically increased in recent years. This is due in part to the growing percentage of health care rendered through managed care plans. The cost-containment mechanisms commonly used by managed care plans, such as limiting access to services and/or choice of providers, create a climate ripe for disputes that may end up in court. As dissatisfied patients and providers seek recourse in the courts, tort doctrines are extended and new legal theories emerge as needed. For example, the concepts of direct and vicarious tort liability developed in the hospital context have been extended by courts to encompass HMOs. Vicarious liability claims, based on ostensible agency or respondeat superior doctrines, have been brought against HMOs and managed care plans for negligent treatment by physicians selected to provide care to members.
Having a serious illness like breast cancer is a calamity for individuals and families. Along with the pain, discomfort, and dislocation comes the issue of how to pay the medical expenses for the care and treatment of the disease. If the seriously ill person has inadequate or no insurance, these problems are aggravated.
Stories abound about seriously ill people losing private health insurance following diagnosis with a catastrophic disease, remaining in jobs just to maintain health insurance, or facing financial hardship because of gaps in coverage. Yet surprisingly little research has focused on the problems that people with serious illness face with health coverage and, in particular, how concerns about access to health insurance coverage shape their lives.
Further, despite profoundly moving anecdotes of cancer victims and other seriously ill people about their problems with health insurance and despite recent federal and state efforts to reform the private health insurance market in ways discussed below, neither the federal government, states, nor the private sector has crafted comprehensive strategies to enhance health coverage for the seriously ill.
OBJECTIVES/GOALS: One of the most significant challenges to community engagement experienced by Clinical and Translational Science Award (CTSA) institutions is inadequate capacity of academic and community partners to engage in collaborative research. Several CTSAs within the consortium provide consultation services to help address this gap. METHODS/STUDY POPULATION: For over 10 years, the Michigan Institute for Clinical and Health Research (MICHR), a CTSA at the University of Michigan, has provided CEnR-specific consultations to partners seeking support for a variety of needs. Consultations can be requested for assistance with identifying potential partners, developing partnership infrastructure, finding CEnR funding opportunities, and incorporating CEnR approaches into research plans. When a consultation is requested, MICHR’s Community Engagement (CE) Program responds by planning a meeting with staff and faculty who have relevant skills, expertise, and connections. After the initial meeting, the CE Program provides follow-up communication and support based on the needs of the specific request, and often facilitates connections with potential partners. RESULTS/ANTICIPATED RESULTS: The two most frequent types of consultation requests involve 1) making connections with potential researchers or community partner organizations, and 2) providing guidance on research grant applications that involve community engagement. MICHR provides approximately 50 CEnR consultations each year, which have resulted in development of new partnerships, grant submissions, and research projects that utilize CEnR principles and address community-identified health priorities. DISCUSSION/SIGNIFICANCE OF IMPACT: This presentation will describe the evolution of MICHR’s CEnR consultation process and highlight successful outcomes and lessons learned over its 12-year history. CONFLICT OF INTEREST DESCRIPTION: NA
Lieder and Griffiths rightly urge that computational cognitive models be constrained by resource usage, but they should go further. The brain's primary function is to regulate resource usage. As a consequence, resource usage should not simply select among algorithmic models of “aspects of cognition.” Rather, “aspects of cognition” should be understood as existing in the service of resource management.
The 2017 Atlantic hurricane season was especially memorable for 3 major hurricanes—Harvey, Irma, and Maria—that devastated population centers across Texas, Florida, and Puerto Rico, respectively. Each storm had unique hazard properties that posed distinctive challenges for persons living with type 1 diabetes (T1D). Diabetes care specialists and educators took on leadership roles for coordinating care and establishing insulin supply lifelines for people with T1D living in the hardest-hit neighborhoods affected by these extreme storms. Strategies and resources were customized for each population. Diabetes specialists strategized to provide mutual support and shared insulins and supplies across sites.
OBJECTIVES/SPECIFIC AIMS: Glioblastoma (GBM) is a brain cancer with a devastatingly short overall survival of under two years. The poor prognosis of GBM is largely due to cell invasion and maintenance of cancer-initiating cells that evade the brain’s innate and adaptive immune responses, enabling escape from surgical resection and driving inevitable recurrence. While targeting the brain’s immune microenvironment has long been proposed as a strategy for treating GBM, translational progress has been slow, underscoring the need to investigate the brain’s immune microenvironment for therapeutic avenues. METHODS/STUDY POPULATION: Recent advancements in tunable synthetic immunomodulatory gene circuits targeting metastatic cancers have demonstrated the novel ability to use engineering principles to induce infiltrative cancer cells to express combinatorial immunomodulatory outputs that enable T-cell killing. Our central hypothesis is that we can significantly improve survival, with lasting immune-mediated control of GBM, by using synthetic immunomodulatory gene circuits to drive GBM cells to express a local combination of immunomodulatory proteins: human IL15, a surface T-cell engager (a PD-L1-CD3 bispecific antibody), and the protein LIGHT (TNFRSF14). Importantly, co-expression of LIGHT and anti-PD-L1 therapies was recently shown to rescue PD-L1 checkpoint blockade in preclinical models of brain tumors and significantly enhance survival outcomes, highlighting the benefits of novel combinations of immunomodulatory proteins for treatment of GBM. To identify genes whose expression is dramatically upregulated in GBM compared to normal human brain cells, a pool of six thousand lentiviral oncogene promoters driving expression of a red-fluorescent protein has been infected into three human GBM cell lines.
RESULTS/ANTICIPATED RESULTS: We have successfully infected our GBM cells and are preparing samples for next-generation DNA sequencing to determine highly active promoters in GBM that are not expressed in multiple normal brain cell types (astrocytes and neurons). These chosen promoters will then be used to drive AND-gate logic gene circuit immunotherapy outputs, currently under development for both in-vitro and in-vivo experiments. DISCUSSION/SIGNIFICANCE OF IMPACT: We anticipate that local expression of multiple immune effector proteins will significantly enhance tumor control and survival in both syngeneic murine and human-murine xenograft preclinical models of GBM. Ultimately, our goal is to rapidly translate this technological advance into clinical trials for adult GBM patients.
Survivors of adolescent and young adult (AYA) central nervous system (CNS) neoplasms are at risk for late effects (LE): treatment-related health problems occurring more than 5 years after therapy. Since, in Canada, AYA survivors are usually followed in the community, information must be conveyed to primary care providers to guide risk-based follow-up care. Objective: To assess documentation of LE risks and screening recommendations (SR) in the medical records of AYA CNS tumor survivors treated with radiation therapy. Methods: The medical records of all patients diagnosed with a CNS neoplasm (benign or malignant) at ages 15-39 years, treated between 1985 and 2010 in the province of British Columbia, surviving >5 years, and discharged to the community were assessed. Documentation of LE and SR was extracted and analyzed descriptively. Results: Among 132 survivors (52% female), treated with radiation therapy (95% partial brain, 10% craniospinal, 8% partial spine, and 4% whole brain) and chemotherapy (17%), 19% of charts included no documentation of LE risks, 26% included only non-specific documentation, and 55% had minimal documentation (1 or 2 LE). Documentation of at least one specific LE increased from 24% in 1980-1989, to 54% in 1990-1999, to 86% in 2000-2010. Based on treatment information, all survivors were at high risk for LE such as radiation-induced neoplasms, meningioma, and cerebrovascular events. Yet SR were documented in only 25% of charts. Conclusions: The documentation of LE risks and screening recommendations has been limited, highlighting the need to improve written communication with primary care providers.
Purpose: Cranial radiotherapy (CRT) was commonly given for childhood leukemia and brain tumors. Survivors are at risk of late effects, including radiation-induced meningioma (RIM). Surveillance for RIM is not standardized. We aimed to determine the incidence, latency, and screening patterns for RIM. Materials and Methods: Retrospective chart review of all patients aged <18 years at the time of radiation (RT), treated with CRT for leukemia or a brain tumor in BC between 1981 and 2006. Patient, tumor, and treatment characteristics were collected. Actuarial statistics were calculated with Kaplan-Meier curves. Patients were censored at the date of last normal cranial imaging or at development of a RIM. Results: 392 patients were identified. Median age at CRT was 9.6 years. Median CRT dose was 28 Gy. The original diagnosis was leukemia in 50%, glioma in 13%, medulloblastoma in 8%, ependymoma in 7%, neuroectodermal tumor in 7%, germ cell tumor in 5%, craniopharyngioma in 4%, and other pathologies in 6%. Median (range) of clinical follow-up (FU) was 13.2 (0-37.5) years. Median (range) of cranial imaging FU was 15.5 (0-21.2) years. There was no documented cranial imaging FU in 144 patients. Forty-eight patients developed a RIM. The median age at RT for patients with RIM was 6.7 years. Only 8 of these cases presented with associated symptoms. The earliest RIM in our cohort occurred 10.2 years after CRT. On actuarial analysis, the median (95% CI) time to development of a meningioma was 29.8 (28.9-30.7) years. Incidence (95% CI) of meningioma was 0% at 10 years, 5 (2-9)% at 15 years, 12 (6-18)% at 20 years, 33 (23-43)% at 25 years, and 47 (37-68)% at 30 years. Amongst patients with a RIM, the median dose of CRT was 45 Gy. The lowest RT dose in a patient who developed RIM was 12 Gy. RT was delivered to the whole brain in 58% and partial brain in 42% of patients with a RIM.
Conclusions: After CRT in pediatric patients, there is a significant risk of developing a RIM and a steady increase in this risk with ongoing follow-up. We recommend standardization of surveillance for these patients with screening beginning 10 years after completion of CRT.
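The actuarial (Kaplan-Meier) approach used above, with censoring at last normal imaging, can be sketched compactly. The event times below are hypothetical examples, not the study's data:

```python
# Minimal Kaplan-Meier product-limit sketch of the actuarial method
# described above: True marks a RIM diagnosis, False a censoring at
# last normal imaging. Times (years since CRT) are hypothetical.

def kaplan_meier(times, events):
    """Return [(t, S(t))] at each event time, with right censoring."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv, out = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:
            surv *= (at_risk - deaths) / at_risk
            out.append((t, surv))
        at_risk -= removed
    return out

# Cumulative incidence at time t is 1 - S(t).
curve = kaplan_meier([12, 15, 15, 20, 25, 30],
                     [True, False, True, True, False, True])
```

Because survivors are censored at last imaging rather than treated as event-free, the estimator avoids the downward bias in incidence that would come from counting unimaged patients as unaffected.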
The effect of folic acid (FA) on breast cancer (BC) risk is uncertain. We hypothesised that this uncertainty may be due, in part, to differential effects of FA between BC cells with different phenotypes. To test this, we investigated the effect of treatment with FA concentrations within the range of unmetabolised FA reported in humans on the expression of the transcriptome of non-transformed (MCF10A) and cancerous (MCF7 and Hs578T) BC cells. The total number of transcripts altered was: MCF10A, seventy-five (seventy up-regulated); MCF7, twenty-four (fourteen up-regulated); and Hs578T, 328 (156 up-regulated). Only the cancer-associated gene TAGLN was altered by FA in all three cell lines. In MCF10A and Hs578T cells, FA treatment decreased pathways associated with apoptosis, cell death and senescence, but increased those associated with cell proliferation. The folate transporters SLC19A1, SLC46A1 and FOLR1 were differentially expressed between the cell lines tested. However, the level of expression was not altered by FA treatment. These findings suggest that physiological concentrations of FA can induce cell type-specific changes in gene regulation in a manner that is consistent with a proliferative phenotype. This has implications for understanding the role of FA in BC risk. In addition, these findings support the suggestion that differences in gene expression induced by FA may involve differential activities of folate transporters. Together these findings indicate the need for further studies of the effect of FA on BC.
Historically, economic development has been strongly correlated with increasing energy use and growth of greenhouse gas (GHG) emissions. Renewable energy (RE) can help decouple that correlation, contributing to sustainable development (SD). In addition, RE offers the opportunity to improve access to modern energy services for the poorest members of society, which is crucial for the achievement of any single of the eight Millennium Development Goals.
Theoretical concepts of SD can provide useful frameworks to assess the interactions between SD and RE. SD addresses concerns about relationships between human society and nature. Traditionally, SD has been framed in the three-pillar model—Economy, Ecology, and Society—allowing a schematic categorization of development goals, with the three pillars being interdependent and mutually reinforcing. Within another conceptual framework, SD can be oriented along a continuum between the two paradigms of weak sustainability and strong sustainability. The two paradigms differ in assumptions about the substitutability of natural and human-made capital. RE can contribute to the development goals of the three-pillar model and can be assessed in terms of both weak and strong SD, since RE utilization is defined as sustaining natural capital as long as its resource use does not reduce the potential for future harvest.
A method for removing SiO2 and producing an ordered Si(100) surface using Sr or SrO has been developed. In this technique, a few monolayers of Sr or SrO are deposited onto the as-received Si(100) wafer in an ultrahigh-vacuum molecular beam epitaxy system. The substrate is then heated to ∼800°C for about 5 minutes, removing the SiO2 and leaving behind a Sr-terminated Si(100) surface. This Sr-terminated Si(100) surface is well suited for the growth of crystalline high-k dielectric SrTiO3 films. Temperature programmed desorption measurements were carried out to understand the mechanism of SiO2 removal from Si(100) using Sr or SrO. The species observed coming off the surface during the temperature cycle were mainly SiO and O; no significant amount of Sr-containing species was observed. We conclude that the SiO2 removal is due to the catalytic reaction SiO2 + Sr (or SrO) → SiO (g) + O + Sr (or SrO), which proceeds through several intermediate steps. The reaction SiO2 + Si → 2SiO (g) at the SiO2/Si interface is limited, and pit formation is suppressed. The main role of Sr or SrO during the oxide removal process is catalytic: promoting SiO formation while preventing further etching and pit formation in the substrate.