A museum should be a place where cultures, dialogue, and social relations are fostered. Given the renewed public interest in the topic, the author poses two questions: is there a need, and a possibility, to decolonize ethnographic museums, and should there be common, shared practices? In an attempt to eliminate colonial vestiges in museums, an analysis of the literature and of current practices leads the author to examine five European ethnographic museums in order to understand their merits and shortcomings. The subjectivity of these institutions and the diversity with which colonization can be presented make a single generalized solution inadvisable. An objective analysis based on actions and variables nevertheless leads the author to conclude that revitalizing museum practices requires a sharable framework. The design of minimum standards can help museums set clear and measurable goals to achieve a higher level of decolonization.
Background: The use of Tenecteplase (TNK) in the Extended Time Window (ETW) for Acute Ischemic Stroke (AIS) remains a subject of ongoing debate. Methods: A systematic review of three randomized controlled trials (RCTs) – TIMELESS, TRACE 3, and CHABLIS-T II – was conducted. Results: 1,198 patients were enrolled: 603 received TNK and 595 were controls. All three trials included patients with internal carotid and/or proximal middle cerebral artery occlusions; however, in TRACE 3, patients did not have access to endovascular thrombectomy. TIMELESS and CHABLIS-T II showed better recanalization in the TNK group, but the median Modified Rankin Score was 3 at 90 days in both groups, demonstrating no benefit in clinical outcomes. Symptomatic intracranial hemorrhage (sICH) was similar in the two groups. In TRACE 3, there was an improvement in functional outcomes at 90 days in the TNK group (33.0% vs. 24.2%), but the incidence of sICH was also higher (3.0% vs. 0.8%). Conclusions: Better recanalization rates are seen with TNK in the ETW, but they may not be associated with improved functional outcomes at 90 days compared with medical management. The incidence of sICH also remains largely favorable, except in TRACE 3, which showed a higher incidence in the TNK group. There remains a need for more RCTs in this population.
Students’ interfering behaviour is a common concern among educators working in special and general education classrooms. Interfering behaviour can significantly compromise students’ educational experiences and educators’ ability to create a conducive learning environment. Evidence-based assessments and interventions for interfering behaviour in the classroom involve identifying the variables in the student’s immediate environment that influence these behaviours. There has been little to no dissemination of evidence-based assessments for classroom management in developing nations such as South Africa and Vietnam. In the current study, we used a single-case design to assess the effectiveness and acceptability of behavioural skills training (BST) in teaching educators from South Africa and Vietnam how to assess students’ interfering behaviour in the classroom. The training was divided into four phases covering the different steps involved in teaching participants how to assess interfering behaviour. All participants successfully acquired the trained skills and demonstrated a shift in their explanation of the causes of interfering behaviour.
Psychiatric symptoms are typically highly inter-correlated at the group level. Collectively, these correlations define the architecture of psychopathology – informing taxonomic and mechanistic models in psychiatry. However, to date, it remains unclear if this architecture differs between etiologically distinct subgroups, despite the core relevance of this understanding for personalized medicine. Here, we introduce a new analytic pipeline to probe group differences in the psychopathology architecture – demonstrated through the comparison of two distinct neurogenetic disorders.
Methods
We use a large questionnaire battery in 300 individuals aged 5–25 years (n = 102 XXY/KS, n = 64 XYY, n = 134 age-matched XY) to characterize the structure of correlations among 53 diverse measures of psychopathology in XXY/KS and XYY syndrome – enabling us to compare the effects of X- versus Y-chromosome dosage on the architecture of psychopathology at multiple, distinctly informative levels.
Results
Behavior correlation matrices describe the architecture of psychopathology in each syndrome. A comparison of matrix rows reveals that social problems and externalizing symptoms are most differentially coupled to other aspects of psychopathology in XXY/KS versus XYY. Clustering the difference between matrices captures coordinated group differences in pairwise coupling between measures of psychopathology: XXY/KS shows greater coherence among externalizing, internalizing, and autism-related features, while XYY syndrome shows greater coherence in dissociality and early neurodevelopmental impairment.
Conclusions
These methods offer new insights into X- and Y-chromosome dosage effects on behavior, and our shared code can now be applied to other clinical groups of interest – helping to hone mechanistic models and inform the tailoring of care.
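To make the matrix-comparison logic described above concrete, here is a minimal Python sketch of the general approach: build a symptom correlation matrix per group, contrast matrix rows to find the most differentially coupled measures, and cluster the difference matrix. The toy data, variable names, and the choice of hierarchical clustering are illustrative assumptions, not the authors' published pipeline.

```python
# Sketch (not the authors' code): contrast psychopathology correlation matrices
# between two groups and cluster the difference matrix.
import numpy as np
import pandas as pd
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Placeholder data: rows = individuals, columns = psychopathology measures.
measures = [f"measure_{i}" for i in range(53)]
group_a = pd.DataFrame(rng.normal(size=(102, 53)), columns=measures)  # e.g. XXY/KS
group_b = pd.DataFrame(rng.normal(size=(64, 53)), columns=measures)   # e.g. XYY

corr_a = group_a.corr()        # architecture of psychopathology, group A
corr_b = group_b.corr()        # architecture of psychopathology, group B
diff = corr_a - corr_b         # group difference in pairwise coupling

# Row-level contrast: which measures differ most in their coupling to everything else?
row_divergence = diff.abs().sum(axis=1).sort_values(ascending=False)
print(row_divergence.head())

# Cluster the difference matrix to find coordinated blocks of group differences.
labels = fcluster(linkage(diff.values, method="ward"), t=4, criterion="maxclust")
print(dict(zip(measures, labels)))
```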
A key step toward understanding psychiatric disorders that disproportionately impact female mental health is delineating the emergence of sex-specific patterns of brain organisation at the critical transition from childhood to adolescence. Prior work suggests that individual differences in the spatial organisation of functional brain networks across the cortex are associated with psychopathology and differ systematically by sex.
Aims
We aimed to evaluate the impact of sex on the spatial organisation of person-specific functional brain networks.
Method
We leveraged person-specific atlases of functional brain networks, defined using non-negative matrix factorisation, in a sample of n = 6437 youths from the Adolescent Brain Cognitive Development Study. Across independent discovery and replication samples, we used generalised additive models to uncover associations between sex and the spatial layout (topography) of personalised functional networks (PFNs). We also trained support vector machines to classify participants’ sex from multivariate patterns of PFN topography.
Results
Sex differences in PFN topography were greatest in association networks including the frontoparietal, ventral attention and default mode networks. Machine learning models trained on participants’ PFNs were able to classify participant sex with high accuracy.
Conclusions
Sex differences in PFN topography are robust, and replicate across large-scale samples of youth. These results suggest a potential contributor to the female-biased risk in depressive and anxiety disorders that emerge at the transition from childhood to adolescence.
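As a rough illustration of the classification step reported in this abstract, the sketch below trains a linear support vector machine to predict sex from flattened network-loading features and evaluates it with cross-validation. The feature construction, array dimensions, and random data are stand-ins; the study's actual PFN derivation (non-negative matrix factorisation) and evaluation details are not reproduced here.

```python
# Illustrative sketch only: SVM classification of sex from PFN-style topography features.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_subjects, n_vertices, n_networks = 500, 400, 17     # assumed toy dimensions
# Each subject's PFN topography: network loadings at every cortical vertex, flattened.
X = rng.random((n_subjects, n_vertices * n_networks)).astype(np.float32)
y = rng.integers(0, 2, n_subjects)                     # 0/1 coding of sex

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, cv=5)              # out-of-sample accuracy
print(f"mean CV accuracy: {scores.mean():.2f}")
```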
Objectives/Goals: This pilot study aims to assess the implementation of the DataDay app in memory clinics for patients with MCI or dementia, focusing on usability, user satisfaction, and impact on health outcomes. We seek to identify barriers and facilitators to implementation and evaluate its effect on reducing unnecessary hospital stays. Methods/Study Population: This mixed-methods study will involve 50 participants: 25 dyads of patients with MCI or mild-to-moderate dementia and their caregivers, recruited from the community. Participants will use DataDay for 12 weeks, receiving reminders to log daily activities such as nutrition, mood, cognition, and physical activity. Baseline demographic data will be collected from self-reported surveys. Participants will receive training on app use, with follow-up interviews at 4, 8, and 12 weeks to gather feedback. Quantitative data analysis will include repeated measures analysis of variance to compare pre- and post-intervention outcomes, such as medication use and ER visits. Thematic analysis will be conducted on interview transcripts to understand user experiences. Results/Anticipated Results: We anticipate the study will demonstrate the feasibility of the DataDay app for self-management in individuals with MCI or dementia. Expected outcomes include improved medication adherence, reduced emergency room visits, and increased user engagement with daily health monitoring. Qualitative feedback is expected to highlight user satisfaction with the app’s reminders and ease of integration into daily routines. We also expect potential challenges to be identified, such as initial learning difficulties and technology-related frustration. The data will help refine the app for better usability and inform strategies for widespread implementation in memory assessment clinics. Discussion/Significance of Impact: The study will provide insights into the practicality of implementing DataDay in memory clinics. The results will highlight necessary adjustments and identify key factors for successful adoption in other clinics. DataDay aims to allow individuals with MCI or dementia to manage their condition at home and enhance their quality of life.
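The planned repeated measures analysis of variance could be run along the following lines. This is a hedged sketch on mock data: the column names, the ER-visit outcome, and the two-timepoint design are assumptions rather than the study's actual analysis code.

```python
# Sketch of a repeated-measures ANOVA comparing a pre- vs. post-intervention outcome.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(1)
subjects = np.repeat(np.arange(25), 2)                 # 25 dyads, two time points each
timepoint = np.tile(["baseline", "week_12"], 25)
# Mock outcome: ER visits, drawn with a slightly lower rate post-intervention.
er_visits = rng.poisson(lam=np.where(timepoint == "baseline", 2.0, 1.5))

df = pd.DataFrame({"subject": subjects, "timepoint": timepoint, "er_visits": er_visits})
res = AnovaRM(df, depvar="er_visits", subject="subject", within=["timepoint"]).fit()
print(res)
```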
The definition of tool use has long been debated, especially when applied beyond humans. Recent work argues that the phenomena included within tool use are so broad and varied that there is little hope of using the category for scientific generalizations, explanations, and predictions about the evolution, ecology, and psychology of tool users. One response to this argument has been the development of tooling as a replacement for tool use. In this article, we analyze the tool use and tooling frameworks. Identifying advantages and limitations in each, we offer a synthetic approach that suggests promising avenues for future research.
Objectives/Goals: Lack of comparative data limits research operations quality improvement (QI). The Northwest Participant and Clinical Interactions (NW PCI) Network, a group of 17 unaffiliated university and health system-based research centers, built an operations dashboard to track efficiency and enable multisite QI projects. Methods/Study Population: A Data Governance Working Group was assembled to establish shared data governance; draft nondisclosure agreements (NDAs) and data transfer and use agreements (DTUAs) suitable across organizations; and standardize research operations metric definitions. Sites in the NW PCI Network were recruited to participate in a pilot program to assess data sharing and governance infrastructure, data collection, upload procedures, and data visualization tools. The NW PCI Coordinating Center developed an analytical data dashboard of research operations metrics and conducted semi-structured interviews with participating sites to understand barriers and facilitators of program success. Results/Anticipated Results: Four sites (2 health systems, 2 universities) were recruited for the pilot and reviewed and executed NDAs and DTUAs. Three of the sites have submitted data for a total of 1,405 studies. Of the 24 requested data operations metrics (e.g., study startup, recruitment, implementation, and basic study information), 71% were submitted by all three sites (n = 17), 25% were submitted by at least one site (n = 6), and 4% were not submitted by any site (n = 1). Interviews with sites after data submission identified areas for improvement (clarification of data definitions, efficiency of the data upload process) and positive effects for sites (e.g., the process improved insight into their own data operations). Discussion/Significance of Impact: Unaffiliated research centers created data governance procedures to enable sharing of operations data. Pilot sites successfully loaded most but not all operations data to the dashboard. Interviews identified process limitations and opportunities for improvement to inform expansion to all NW PCI sites.
Building on the success of the Soft Drinks Industry Levy (SDIL), new tax proposals have been considered in the public health policy debate in the UK. To inform such debate, estimates of the potential impacts of alternative tax scenarios are of critical importance. Using a modelling approach, we studied the effects of two tax scenarios: (1) a hypothetical excise tax designed to tax food products included in the Sugar Reduction Programme (SRP), accounting for pack size to reduce the convenience of purchasing larger quantities at once; (2) an ad valorem tax targeting products based on the UK Nutrient Profile Model (NPM). Simulations of scenario 1 show a reduction in sugar purchased of up to 38 %, with the largest decreases observed for sweet confectionery under a tiered tax similar in structure to the SDIL. Expected food reformulation in scenario 1 led to further decreases in sugar purchased for all categories. In scenario 2, under the assumption that the tax would not affect purchases of healthier products, a 20 % tax on less healthy products would reduce total sugar purchased by 4·3 % to 14·7 % and total energy by 4·7 % to 14·8 %. Despite some limitations and assumptions, our results suggest that new fiscal policy options hold significant potential for improving diet quality beyond what has been achieved by the SDIL and SRP. An estimated increase in consumer expenditure in both scenarios suggests that attention needs to be paid to potentially regressive effects in the design of any new food taxes.
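As a back-of-the-envelope illustration of the ad valorem logic in scenario 2, the sketch below applies a 20 % price increase to 'less healthy' products only and propagates it through an assumed own-price elasticity. The product list, sugar contents, and the elasticity value of -0.8 are illustrative assumptions, not figures or methods taken from the study.

```python
# Toy simulation: effect of an ad valorem tax on purchases of less healthy products.
import pandas as pd

products = pd.DataFrame({
    "category": ["sweet confectionery", "biscuits", "fruit"],
    "less_healthy": [True, True, False],      # NPM-style classification (assumed)
    "baseline_kg": [100.0, 80.0, 120.0],       # purchased quantity at baseline
    "sugar_g_per_kg": [600.0, 300.0, 90.0],
})
tax_rate = 0.20
elasticity = -0.8                              # assumed own-price elasticity

price_change = products["less_healthy"] * tax_rate            # 20% only on taxed items
products["new_kg"] = products["baseline_kg"] * (1 + elasticity * price_change)
baseline_sugar = (products["baseline_kg"] * products["sugar_g_per_kg"]).sum()
new_sugar = (products["new_kg"] * products["sugar_g_per_kg"]).sum()
print(f"change in total sugar purchased: {100 * (new_sugar / baseline_sugar - 1):.1f}%")
```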
While reformulation policies are commonly used to incentivise manufacturers to improve the nutrient profile of the foods and beverages they produce, only a few countries have implemented mandatory reformulation policies. This paper aimed to review evidence on the design, implementation challenges and effectiveness of mandatory reformulation policies and compare them to voluntary reformulation policies. The systematic search retrieved seventy-one studies, including twelve on mandatory reformulation policies. Most mandatory reformulation policies were aimed at reducing trans-fatty acids or sodium in foods. Overall, mandatory reformulation policies were found to be more effective than voluntary ones in improving dietary intakes. Mandatory policies were implemented when voluntary policies either failed or were found to be insufficient to improve the composition of foods. Typical features of mandatory policies could also improve the design of voluntary policies; examples include strict but attainable targets and tight monitoring of compliance.
Cost-effectiveness models fully informed by real-world epidemiological parameters yield the best results, but such parameters are costly to obtain. Model calibration using real-world data/evidence (RWD/E) on routine health indicators can provide an alternative that improves the validity and acceptability of the results. We calibrated the transition probabilities of the reference chemotherapy treatment using RWE on patient overall survival (OS) to model the survival benefit of adjuvant trastuzumab in Indonesia.
Methods
A Markov model comprising four health states was initially parameterized using the reference-treatment transition probabilities obtained from published international evidence. We then calibrated these probabilities, targeting a 2-year OS of 86.11 percent from the RWE sourced from hospital registries. We compared projected OS duration and life-years gained (LYG) before and after calibration for the Nelder–Mead, Bound Optimization BY Quadratic Approximation (BOBYQA), and generalized reduced gradient (GRG) nonlinear optimization methods.
Results
The pre-calibrated transition probabilities overestimated the 2-year OS (92.25 percent). The GRG nonlinear method performed best, showing the smallest difference from the RWD/E OS. After calibration, the projected OS duration was substantially lower than the pre-calibrated estimates across all optimization methods for both standard chemotherapy (~7.50 vs. 11.00 years) and adjuvant trastuzumab (~9.50 vs. 12.94 years). LYG measures were, however, similar (~2 years) for the pre-calibrated and calibrated models.
Conclusions
RWD/E calibration resulted in realistically lower survival estimates. Despite the small difference in LYG, calibration is useful for adapting the external evidence commonly used to derive transition probabilities to the policy context, thereby enhancing the validity and acceptability of the modeling results.
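To show what the calibration step can look like in practice, here is a minimal sketch that tunes a single per-cycle death probability so that a simplified survival model reproduces the 2-year OS target of 86.11 percent, using scipy's Nelder–Mead optimizer. The published model has four health states and more parameters; the monthly cycle length, starting value, and one-parameter structure below are simplifying assumptions.

```python
# Sketch: calibrate a per-cycle transition probability to a real-world 2-year OS target.
import numpy as np
from scipy.optimize import minimize

TARGET_OS_2Y = 0.8611          # RWE target from the abstract
CYCLES_PER_YEAR = 12           # assumed monthly Markov cycles

def two_year_os(p_death_per_cycle):
    """Survival after 2 years given a constant per-cycle probability of death."""
    alive = 1.0
    for _ in range(2 * CYCLES_PER_YEAR):
        alive *= (1.0 - p_death_per_cycle)
    return alive

def loss(params):
    p = float(np.clip(params[0], 0.0, 1.0))
    return (two_year_os(p) - TARGET_OS_2Y) ** 2   # squared distance from the target OS

start = np.array([0.01])       # illustrative pre-calibration guess
fit = minimize(loss, start, method="Nelder-Mead")
print(f"calibrated per-cycle death probability: {fit.x[0]:.5f}")
print(f"implied 2-year OS: {two_year_os(fit.x[0]):.4f}")
```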
Bringing together an interdisciplinary team of scholars, this book explores three interconnected aspects of syntax – its origins and evolution, its acquisition by children, and its role in languages' ongoing development and change. These three distinct areas were linked through Bickerton's most provocative work, the 'Language Bioprogram Hypothesis' (LBH). This book highlights the discussions on syntax that have emerged over the years as a result of the LBH model. Each chapter includes a discussion of Bickerton's work, and a special focus is placed on Creole languages, which provide unique case studies of the evolution, acquisition and development of languages. The book also discusses the relevance of the LBH for other natural languages, including sign languages. Shedding light on the relevance of syntax in language, it is essential reading for researchers and students in a wide range of linguistic disciplines.
Adone’s chapter focuses on home signs, bringing to light the acquisition process against the background of a ‘normless’ language environment (Bakker, this volume) and in the absence of exposure to a ‘conventional language model’ (Adone 2005). She thus discusses what absence of exposure means when looking at child home signers. In comparison to previous work, Adone shows that the absence of a conventional language model does not mean a complete absence of input. She argues that children ‘scan’ their environment for input and use every bit of language-related information as input. Adone further argues that the verb chains in child home signers’ initial grammars develop into adult-like serial verb constructions. This development can be interpreted as evidence for the view that children exploit input to the best of their ability to ‘create language’.
“Syntax lies at the very heart of what it means to be human” (Bickerton & Szathmary 2009: xviii). It has been argued that no other species has been able to acquire a rudimentary syntax, thus reinforcing the view that acquiring syntax is a unique ability of humans (Bickerton & Szathmary 2009). The present volume describes the current state of the discussion on syntax with a special focus on Creole languages. It sheds light on the relevance of syntax in Language by bringing together scholars from the fields of language evolution, language acquisition and development of young languages, that is, Creoles.