Multidimensional item response theory (MIRT) offers psychometric models for various data settings, most popularly for dichotomous and polytomous data. Less attention has been devoted to count responses. A recent growth in interest in count item response models (CIRM)—perhaps sparked by increased occurrence of psychometric count data, e.g., in the form of process data, clinical symptom frequency, number of ideas or errors in cognitive ability assessment—has focused on unidimensional models. Some recent unidimensional CIRMs rely on the Conway–Maxwell–Poisson distribution as the conditional response distribution which allows conditionally over-, under-, and equidispersed responses. In this article, we generalize to the multidimensional case, introducing the Multidimensional Two-Parameter Conway–Maxwell–Poisson Model (M2PCMPM). Using the expectation-maximization (EM) algorithm, we develop marginal maximum likelihood estimation methods, primarily for exploratory M2PCMPMs. The resulting discrimination matrices are rotationally indeterminate. Recently, regularization of the discrimination matrix has been used to obtain a simple structure (i.e., a sparse solution) for dichotomous and polytomous data. For count data, we also (1) rotate or (2) regularize the discrimination matrix. We develop an EM algorithm with lasso ($\ell _1$) regularization for the M2PCMPM and compare (1) and (2) in a simulation study. We illustrate the proposed model with an empirical example using intelligence test data.
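As a hedged illustration of the conditional response distribution used above (not the authors' code), the Conway–Maxwell–Poisson pmf can be sketched in Python; the truncation point for the normalizing constant is an assumption of this sketch:

```python
import math

def cmp_pmf(y, lam, nu, max_terms=200):
    """Conway-Maxwell-Poisson pmf: P(Y = y) = lam^y / ((y!)^nu * Z(lam, nu)).

    nu = 1 recovers the Poisson distribution; nu < 1 allows conditional
    overdispersion and nu > 1 conditional underdispersion. The infinite
    series for Z is truncated at max_terms (an assumption of this sketch),
    and summed in log space for numerical stability.
    """
    log_terms = [j * math.log(lam) - nu * math.lgamma(j + 1) for j in range(max_terms)]
    m = max(log_terms)
    log_z = m + math.log(sum(math.exp(t - m) for t in log_terms))
    return math.exp(y * math.log(lam) - nu * math.lgamma(y + 1) - log_z)
```

For nu = 1 this reproduces the Poisson pmf, e.g. cmp_pmf(0, 2.0, 1.0) equals exp(-2).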
We model transient mushy-layer growth for a binary alloy solidifying from a cooled boundary, characterising the impact of liquid composition and thermal growth conditions on the mush porosity and growth rate. We consider cooling from a perfectly conducting isothermal boundary, and from an imperfectly conducting boundary governed by a linearised thermal boundary condition. For an isothermal boundary we characterise different growth regimes depending on a concentration ratio, which can also be viewed as characterising the ratio of composition-dependent freezing point depression versus the temperature difference across the mushy layer. Large concentration ratio leads to high porosity throughout the mushy layer and an asymptotically simplified model for growth with an effective thermal diffusivity accounting for latent heat release from internal solidification. Low concentration ratio leads to low porosity throughout most of the mushy layer, except for a high-porosity boundary layer localised near the mush–liquid interface. We identify scalings for the boundary-layer thickness and mush growth rate. An imperfectly conducting boundary leads to an initial lag in the onset of solidification, followed by an adjustment period, before asymptoting to the perfectly conducting state at large time. We develop asymptotic solutions for large concentration ratio and large effective heat capacity, and characterise the mush structure, growth rate and transition times between the regimes. For low concentration ratio the high porosity zone spans the full mush depth at early times, before localising near the mush–liquid interface at later times. Such variation of porosity has important implications for the properties and biological habitability of mushy sea ice.
Studies have shown that some covertly conscious brain-injured patients, who are behaviorally unresponsive, can reply to simple questions via neuronal responses. Given the possibility of such neuronal responses, Andrew Peterson et al. have argued that there is warrant for some covertly conscious patients being included in low-stakes medical decisions using neuronal responses, which could protect and enhance their autonomy. The justification for giving credence to alleged neuronal responses must be analyzed from various perspectives, including neurology, bioethics, law, and as we suggest, philosophy of mind. In this article, we analyze the warrant for giving credence to neuronal responses from two different views in philosophy of mind. We consider how nonreductive physicalism’s causal exclusion problem elicits doubt about interpreting neural activity as indicating a conscious response. By contrast, such an interpretation is supported by the mind-body powers model of neural correlates of consciousness inspired by hylomorphism.
Treating inertial measurement unit (IMU) measurements as inputs to a motion model and then preintegrating these measurements has almost become a de facto standard in many robotics applications. However, this approach has a few shortcomings. First, it conflates the IMU measurement noise with the underlying process noise. Second, it is unclear how the state will be propagated in the case of IMU measurement dropout. Third, it does not lend itself well to dealing with multiple high-rate sensors such as a lidar and an IMU or multiple asynchronous IMUs. In this paper, we compare treating an IMU as an input to a motion model against treating it as a measurement of the state in a continuous-time state estimation framework. We methodically compare the performance of these two approaches on a 1D simulation and show that they perform identically, assuming that each method’s hyperparameters have been tuned on a training set. We also provide results for our continuous-time lidar-inertial odometry in simulation and on the Newer College Dataset. In simulation, our approach exceeds the performance of an IMU-as-input baseline during highly aggressive motion. On the Newer College Dataset, we demonstrate state-of-the-art results. These results show that continuous-time techniques and the treatment of the IMU as a measurement of the state are promising areas of further research. Code for our lidar-inertial odometry can be found at: https://github.com/utiasASRL/steam_icp.
This systematic literature review explores the applications of social network platforms for disaster health care management and resiliency and investigates their potential to enhance decision-making and policy formulation for public health authorities during disasters.
Methods
A comprehensive search across academic databases yielded 90 relevant studies. Utilizing qualitative and thematic analysis, the study identified the primary applications of social network data analytics during disasters, organizing them into 5 key themes: communication, information extraction, disaster management, situational awareness, and location identification.
Results
The findings highlight the potential of social networks as an additional tool to enhance decision-making and policymaking for public health authorities in disaster settings, providing a foundation for further research and innovative approaches in this field.
Conclusions
Analyzing social network data nevertheless poses significant challenges due to the massive volume of information generated and the prevalence of misinformation. Moreover, social network users do not represent individuals without access to technology, such as some elderly populations. Relying solely on social network data analytics is therefore insufficient for effective disaster health care management. To ensure efficient disaster management and control, alternative sources of information should be explored as part of a comprehensive approach.
This study aimed to investigate the prevalence and nature of cognitive impairment among severely ill COVID-19 patients and the effectiveness of the Montreal Cognitive Assessment (MoCA) in detecting it.
Method:
We evaluated cognition in COVID-19 patients hospitalized during the first wave (March to June 2020) from six Dutch hospitals, nine months post-discharge, using a comprehensive multi-domain neuropsychological test battery. Test performance was corrected for sex, age, and education differences and transformed into z-scores. Scores within each cognitive domain were averaged and categorized as average and above (z-score ≥ −0.84), low average (z-score −1.28 to −0.84), below average (z-score −1.65 to −1.28), and exceptionally low (z-score < −1.65). Patients were classified with cognitive impairment if at least one domain’s z-score fell below −1.65. We assessed the MoCA’s accuracy using both the original cutoff (<26) and an “optimal” cutoff determined by Youden’s index.
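The z-score banding and impairment rule described above are simple enough to state as code. The following is an illustrative sketch of the classification rules reported in the Method, not the study's analysis script:

```python
def classify_z(z):
    """Band a domain z-score using the study's reported cutoffs."""
    if z >= -0.84:
        return "average and above"
    if z >= -1.28:
        return "low average"
    if z >= -1.65:
        return "below average"
    return "exceptionally low"

def impaired(domain_z_scores):
    """Cognitive impairment: at least one domain z-score below -1.65."""
    return any(z < -1.65 for z in domain_z_scores)
```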
Results:
Cognitive impairment was found in 12.1% (24/199) of patients, with verbal memory and mental speed most affected (6.5% and 7% below −1.65, respectively). The MoCA had an area under the curve of 0.84. The original cutoff showed sensitivity of 83% and specificity of 66%. The identified optimal cutoff of <24 maintained sensitivity while improving specificity to 81%.
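An "optimal" screening cutoff by Youden's index maximizes sensitivity + specificity − 1 over candidate cutoffs. A minimal sketch, with hypothetical MoCA-style scores; the function and data are illustrative, not the study's:

```python
import numpy as np

def youden_optimal_cutoff(scores, is_impaired, cutoffs):
    """Pick the cutoff maximizing Youden's J = sensitivity + specificity - 1.

    `scores` are screening totals (lower = worse); a case screens positive
    when score < cutoff. `is_impaired` flags true impairment per case.
    """
    scores = np.asarray(scores)
    is_impaired = np.asarray(is_impaired, dtype=bool)
    best_cut, best_j = None, -np.inf
    for c in cutoffs:
        positive = scores < c
        sens = (positive & is_impaired).sum() / is_impaired.sum()
        spec = (~positive & ~is_impaired).sum() / (~is_impaired).sum()
        j = sens + spec - 1
        if j > best_j:
            best_cut, best_j = c, j
    return best_cut, best_j
```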
Conclusions:
Cognitive impairment prevalence in initially hospitalized COVID-19 patients is lower than expected. Verbal memory and processing speed are primarily affected. The MoCA is a valuable screening tool for these impairments, and lowering the MoCA cutoff to <24 improves specificity.
Despite the versatility of generalized linear mixed models in handling complex experimental designs, they often suffer from misspecification and convergence problems. This makes inference on the values of coefficients problematic. In addition, the researcher’s choice of random and fixed effects directly affects statistical inference correctness. To address these challenges, we propose a robust extension of the “two-stage summary statistics” approach using sign-flipping transformations of the score statistic in the second stage. Our approach efficiently handles within-variance structure and heteroscedasticity, ensuring accurate regression coefficient testing for 2-level hierarchical data structures. The approach is illustrated by analyzing the reduction of health issues over time for newly adopted children. The model is characterized by a binomial response with unbalanced frequencies and several categorical and continuous predictors. The proposed approach efficiently deals with critical problems related to longitudinal nonlinear models, surpassing common statistical approaches such as generalized estimating equations and generalized linear mixed models.
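The second-stage sign-flipping idea can be illustrated on a toy scale: given one first-stage score statistic per cluster, randomly flipping signs generates a null distribution for testing a zero-mean hypothesis. This is a simplified sketch of the general sign-flip logic, not the proposed estimator:

```python
import numpy as np

rng = np.random.default_rng(0)

def sign_flip_test(cluster_scores, n_flips=5000, rng=rng):
    """Two-sided sign-flipping test of H0: E[score] = 0.

    `cluster_scores` holds one first-stage statistic per cluster (a
    hypothetical second-stage input); under H0 each entry is symmetric
    around zero, so random sign flips generate the null distribution.
    """
    s = np.asarray(cluster_scores, dtype=float)
    observed = abs(s.sum())
    flips = rng.choice([-1.0, 1.0], size=(n_flips, s.size))
    null = np.abs((flips * s).sum(axis=1))
    # include the identity transformation so the p-value is never zero
    return (1 + (null >= observed).sum()) / (n_flips + 1)
```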
The focus of the job satisfaction literature remains on the subordinate, even though supervisors are responsible for evaluating employee performance and determining employee pay, raises, promotions, and growth opportunities, all of which impact employees’ subsequent performance and thereby contribute (or not) to organizational success. Using a psychological contracts lens, we develop and test theoretical arguments predicting that supervisors’ responses to contributions are not uniformly positive but depend on the type and amount of contribution involved. Across two studies, we ask supervisors to evaluate subordinates’ delivered contributions relative to promised contributions. Our results challenge the assumption that supervisors always desire larger amounts of work from their subordinates; excess contributions were associated with lower supervisor satisfaction with subordinates for some types of contributions. The results imply that subordinates’ contributions of work to supervisors may influence supervisors’ satisfaction with subordinates, perhaps affecting their performance reviews and career opportunities.
Children continue to be an underrepresented population in research and clinical trials due to difficulties encountered in recruitment, assenting, and retention processes. “Sofia Learns About Research” is a children’s activity book that introduces youth to clinical research and basic elements of clinical trials.
Methods:
Development of the activity book began in 2016, with publication of the first paper version in 2017 and an online version adapted for computer and tablet users in 2019. In 2019, we developed institutional review board-approved pre/post surveys with five statements (written at ≤ 3rd-grade level) reflecting key concepts covered in the book. Participants were asked to indicate whether they agreed, disagreed, or were not sure about each of the statements, and whether they would ever want to be part of a research study. Preliminary analyses included descriptive statistics and cross-tabulations with chi-square tests.
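A cross-tabulation with a Pearson chi-square statistic, as used in the preliminary analyses above, can be sketched as follows; the function is a generic textbook computation, and any example counts are invented:

```python
def chi_square(table):
    """Pearson chi-square statistic for an r x c cross-tabulation.

    `table` is a list of rows of counts, e.g. pre/post survey rows by
    agree / not sure / disagree columns. Returns (chi2, degrees of freedom).
    """
    n = sum(sum(row) for row in table)
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / n  # expected count under independence
            chi2 += (obs - exp) ** 2 / exp
    df = (len(table) - 1) * (len(col_tot) - 1)
    return chi2, df
```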
Results:
Despite delays in dissemination and outreach due to the COVID-19 pandemic, we obtained feedback from over 170 diverse persons across a spectrum of communities and community partners. After book exposure, more participants knew that both children and parents have to assent/consent and that participants can withdraw from a study at any time.
Conclusions:
The book is an important advocacy tool with a long-term aim of increasing children’s knowledge and awareness about clinical research, ultimately leading to enhanced participation in clinical research and trials.
The nonlinear stability of two-dimensional (2-D) plane Couette flow subject to a constant throughflow is analysed at finite and asymptotically large Reynolds numbers $\textit {Re}$. The speed of this throughflow is quantified by the non-dimensional throughflow number $\eta$. The base flow exhibits a linear instability provided $\eta \gtrsim 3.35$, with multi-deck upper and lower branch structures developing in the limit $1\ll \eta \ll \mathit {O}(\textit {Re})$. This instability provides a springboard for the computation of nonlinear travelling waves which bifurcate subcritically from the linear neutral curve, allowing us to map out a neutral surface at different values of $\eta$. Using strongly nonlinear critical layer theory, we investigate the waves that bifurcate from the upper branch at asymptotically large $\textit {Re}$. This asymptotic structure exists provided the throughflow number is larger than the critical value of $\eta _c\approx 1.20$ and is shown to give quantitatively similar results to the numerical solutions at Reynolds numbers of $\mathit {O}(10^5)$.
Given the rate of advancement in predictive psychiatry, there is a threat that it outpaces public and professional willingness for use in clinical care and public health. Prediction tools in psychiatry estimate the risk of future development of mental health conditions. Prediction tools used with young populations have the potential to reduce the worldwide burden of depression. However, little is known globally about adolescents’ and other stakeholders’ attitudes toward use of depression prediction tools. To address this, key informant interviews and focus group discussions were conducted in Brazil, Nepal, Nigeria and the United Kingdom with 23 adolescents, 45 parents, 47 teachers, 48 health-care practitioners and 78 other stakeholders (total sample = 241) to assess attitudes toward using a depression prediction risk calculator based on the Identifying Depression Early in Adolescence Risk Score. Three attributes were identified for an acceptable depression prediction tool: it should be understandable, confidential and actionable. Understandability includes depression literacy and differentiating between having a condition versus risk of a condition. Confidentiality concerns are disclosing risk and impeding educational and occupational opportunities. Prediction results must also be actionable through prevention services for high-risk adolescents. Six recommendations are provided to guide research on attitudes and preparedness for implementing prediction tools.
A super-stable granular heap is a pile of grains whose free surface is inclined above the angle of repose, and which forms when particles are poured onto a plane that is confined laterally by frictional sidewalls that are separated by a narrow gap. During continued mass supply, the heap free surface gradually steepens until all the inflowing grains can flow out of the domain. As soon as the supply of grains is stopped, the heap is progressively eroded, and if the base of the domain is inclined above the angle of repose, then all the grains eventually flow out. This phenomenology is modelled using a system of two-dimensional width-averaged mass and momentum balances that incorporate the sidewall friction. The granular material is assumed to be incompressible and satisfy the partially regularized $\mu (I)$-rheology. This is implemented in OpenFOAM$^{\circledR}$ and compared against small-scale experiments that study the formation, steady-state behaviour and drainage of a super-stable heap. The simulations accurately capture the dense liquid-like flows as well as the evolving heap shape. The steady uniform flow that develops along the heap surface has non-trivial inertial number dependence through its depth. Super-stable heaps are therefore a sensitive rheometer that can be used to determine the dependence of the friction $\mu$ on the inertial number $I$. However, these flows are challenging to simulate because the free-surface inertial number is high, and can exceed the threshold for ill-posedness even for the partially regularized theory.
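For readers unfamiliar with the $\mu(I)$-rheology, the classical (unregularized) friction law has a simple closed form. The parameter values below are illustrative defaults, not values from this study, and the paper itself uses a partially regularized variant of this law:

```python
def mu_I(I, mu_s=0.38, mu_2=0.64, I_0=0.28):
    """Classical mu(I) friction law: mu rises monotonically from mu_s
    at I -> 0 toward mu_2 as I -> infinity (I > 0 assumed).
    Parameter values are illustrative defaults only.
    """
    return mu_s + (mu_2 - mu_s) / (1.0 + I_0 / I)
```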
Compared with first-tier cities in China, which have abundant funds and resources such as numerous high-level hospitals, nurses’ disaster nursing preparedness in non-first-tier (inland) cities is relatively low. For example, nurses’ knowledge of specific disasters is often incomplete, they lack confidence in disaster rescue skills, and their understanding of the roles they should take in disaster coordination and management is ambiguous. Overcoming these challenges could improve local medical staff’s confidence and capabilities in disaster relief. The objective of this research is therefore to explore approaches for improving disaster nursing strategies in smaller Chinese cities.
Methods
To ascertain the factors that influence disaster preparedness, a cross-sectional study was conducted, with data analyzed in SPSS 25.0. The sample comprised nursing personnel from 4 comprehensive hospitals at or above the second level in Yongcheng, Henan Province, China; nurses were chosen because they are first-line responders in disasters and pandemics and the largest group involved in disaster rescue.
Results
From 813 distributed questionnaires, 784 completed questionnaires were returned, for a response rate of 96.43%. The total average score on the Disaster Preparedness Evaluation Tool was 146.76 ± 34.98, and the average score across all items was 3.26 ± 0.45, indicating moderate performance. Age affected post-disaster management scores (P < 0.05): as age increased, scores tended to be higher. Work experience, professional titles, disaster relief experience, and disaster training were associated with significant differences in knowledge, skills, post-disaster management, and total scores (P < 0.01). Multivariate analysis indicated that professional titles, disaster relief experience, and disaster training were the main factors affecting the disaster preparedness of nurses in Chinese non-first-tier (inland) cities (P < 0.05).
Conclusions
To improve disaster nursing preparedness in Chinese non-first-tier (inland) cities with limited funds and resources, it is important to formulate training and education methods suitable for the local area, conduct tailored simulation exercises, and expand experience exchange between hospitals. Local governments also have an important role to play in coordinating and organizing the division of labor, resource allocation, and management of hospitals at all levels across the different phases of a disaster, which can give nursing staff a clearer understanding of their roles when preparing for disasters.
The aim of this article and the ensuing Special Issue is to assess, fifteen years on, the effects on the EU legal and political system of the overhaul of executive delegated powers inaugurated by the Lisbon Treaty. It identifies core parameters – i.e. (institutional) balance of powers, (democratic) legitimacy, control and accountability, effectiveness of EU policy implementation – considered by the contributions to this Special Issue to map and examine, both constitutionally and normatively, the EU system of delegated powers in law and practice. It also puts forward seven overarching reflections revealing some of the core issues and challenges posed by the current stage of development of the post-Lisbon EU system of delegated powers.
The amplitude modulation coefficient, $R$, that is widely used to characterize nonlinear interactions between large- and small-scale motions in wall-bounded turbulence is not compatible with detecting the convective nonlinearity of the Navier–Stokes equations. Through a spectral decomposition of $R$ and a simplified model of triadic convective interactions, we show that $R$ suppresses the signature of convective scale interactions, but is strongly influenced by linear interactions between large-scale motions and the background mean flow. We propose an additional coefficient that is specifically designed for the detection of convective nonlinearities, and we show how this new coefficient, $R_T$, quantifies the turbulent kinetic energy transport involved in turbulent scale interactions and reveals a classical energy cascade across widely separated scales.
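The classical amplitude modulation coefficient $R$ can be sketched as the correlation between the large-scale component of a signal and the low-pass-filtered envelope of its small-scale component. The sharp spectral filter and FFT-based Hilbert envelope below are assumptions of this sketch, and the transport-based coefficient $R_T$ proposed here is not implemented:

```python
import numpy as np

def lowpass(x, cutoff):
    """Sharp spectral low-pass: keep integer Fourier modes |k| < cutoff."""
    X = np.fft.fft(x)
    k = np.fft.fftfreq(x.size, d=1.0 / x.size)  # integer mode numbers
    return np.fft.ifft(np.where(np.abs(k) < cutoff, X, 0)).real

def envelope(x):
    """Amplitude envelope via the analytic signal (FFT-based Hilbert transform)."""
    n = x.size
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.abs(np.fft.ifft(X * h))

def amplitude_modulation_R(u, cutoff):
    """Classical AM coefficient (sketch): correlation between the
    large-scale signal and the large-scale part of the small-scale envelope."""
    u = u - u.mean()
    u_large = lowpass(u, cutoff)
    env_large = lowpass(envelope(u - u_large), cutoff)
    return np.corrcoef(u_large, env_large)[0, 1]
```

On a synthetic amplitude-modulated signal, R approaches +1 when the small scales are amplified where the large-scale signal is positive, and −1 in the anti-modulated case.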
Some effects are considered to be higher level than others. High-level effects provide expressive and succinct abstraction of programming concepts, while low-level effects allow more fine-grained control over program execution and resources. Yet, often it is desirable to write programs using the convenient abstraction offered by high-level effects, and meanwhile still benefit from the optimizations enabled by low-level effects. One solution is to translate high-level effects to low-level ones.
This paper studies how algebraic effects and handlers allow us to simulate high-level effects in terms of low-level effects. In particular, we focus on the interaction between state and nondeterminism known as the local state, as provided by Prolog. We map this high-level semantics in successive steps onto a low-level composite state effect, similar to that managed by Prolog’s Warren Abstract Machine. We first give a translation from the high-level local-state semantics to the low-level global-state semantics, by explicitly restoring state updates on backtracking. Next, we eliminate nondeterminism altogether in favour of a lower-level state containing a choicepoint stack. Then we avoid copying the state by restricting ourselves to incremental, reversible state updates. We show how these updates can be stored on a trail stack with another state effect. We prove the correctness of all our steps using program calculation where the fusion laws of effect handlers play a central role.
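The trail-stack idea in the final steps can be illustrated in miniature: depth-first nondeterminism over a single mutable state, where incremental updates are recorded on a trail and undone on backtracking instead of copying the state. A hedged Python sketch; the helper names and the toy constraint are invented for illustration:

```python
def set_key(state, trail, key, value):
    """Incremental update: record the old binding on the trail before writing."""
    trail.append((key, state.get(key)))
    state[key] = value

def undo_to(state, trail, mark):
    """Backtrack: pop trail entries down to `mark`, restoring old bindings."""
    while len(trail) > mark:
        key, old = trail.pop()
        if old is None:
            del state[key]
        else:
            state[key] = old

def assign_all(names, domain, ok, state, trail, out):
    """Depth-first nondeterminism over one shared mutable state.

    Each loop iteration is a choicepoint; instead of copying `state`,
    exhausted branches are undone via the trail (Prolog/WAM-style).
    """
    if not names:
        out.append(dict(state))
        return
    first, rest = names[0], names[1:]
    for value in domain:
        mark = len(trail)
        set_key(state, trail, first, value)
        if ok(state):
            assign_all(rest, domain, ok, state, trail, out)
        undo_to(state, trail, mark)
```

After the search completes, the shared state is empty again: every update has been reversed exactly once.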
A common way of acquiring multiword expressions is through language input, such as during reading and listening. However, this type of learning is slow. Identifying approaches that optimize learning from input is therefore an important language-learning endeavor. In the present study, 85 learners of English as a foreign language read short texts containing 42 figurative English phrasal verbs, each repeated three times. In a counterbalanced design, we manipulated access to definitions (before text, after text, no definition) and typographic enhancement (with bolding, without bolding). Learning was measured by immediate and delayed gap-fill and meaning-generation posttests. All posttests showed that learning with definitions was better than without, and that access to definitions after reading was more beneficial than before reading. Typographic enhancement effectively promoted contextual learning of phrasal verbs and increased the learning advantage associated with presenting definitions after reading.
We document significant increases in the suspension of ongoing drug projects following the passage of the Food and Drug Administration Amendments Act of 2007 (FDAAA), which mandates that pharmaceutical companies publicly disclose detailed clinical study results. Our results suggest a causal interpretation through difference-in-differences analyses that exploit variations in pre-FDAAA information environments. We also show evidence that fewer new projects are initiated after the FDAAA. Drug developers’ learning from peer failures is the primary mechanism, further amplified by financial constraints. We also examine the consequences of enhanced information disclosure, including changes in firm investment efficiency, drug quality, and disease morbidity.
Multivariate analysis using graphical models is rapidly gaining ground in psychology. In particular, Markov random field (MRF) graphical models have become popular because their graph structure reflects the conditional associations between psychological variables. Despite the fact that most psychological variables are assessed on an ordinal scale, the analysis of MRFs for ordinal variables has received little attention in the psychometric literature. To fill this gap, we present an MRF for ordinal data that so far has not been considered in network psychometrics. We present statistical methodology to test the structure of the proposed MRF, which requires us to determine the plausibility of the opposing hypotheses of conditional dependence and independence. To this end, we develop a Bayesian approach using the inclusion Bayes factor to quantify the (lack of) evidence for a given edge. We use a Bayesian variable selection approach to model the inclusion and exclusion of edges in the network, and Bayesian model averaging to compare network structures with and without the given edge. We provide an implementation in the new R package bgms, evaluate its performance in simulations, and illustrate it with empirical data.
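The inclusion Bayes factor itself reduces to a ratio of inclusion odds once posterior and prior edge-inclusion probabilities are available (e.g., from Bayesian model averaging). A minimal sketch, assuming a fixed prior inclusion probability:

```python
def inclusion_bayes_factor(post_incl_prob, prior_incl_prob=0.5):
    """Inclusion Bayes factor for an edge: posterior inclusion odds divided
    by prior inclusion odds. Values > 1 favor conditional dependence
    (edge present); values < 1 favor conditional independence.
    """
    post_odds = post_incl_prob / (1.0 - post_incl_prob)
    prior_odds = prior_incl_prob / (1.0 - prior_incl_prob)
    return post_odds / prior_odds
```

For example, a posterior inclusion probability of 0.9 under a 0.5 prior gives an inclusion Bayes factor of 9, i.e., the data multiply the prior odds for the edge by nine.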