The global population and status of Snowy Owls Bubo scandiacus are particularly challenging to assess because individuals are irruptive and nomadic, and the breeding range is restricted to the remote circumpolar Arctic tundra. The International Union for Conservation of Nature (IUCN) uplisted the Snowy Owl to “Vulnerable” in 2017 because the suggested population estimates appeared considerably lower than historical estimates, and it recommended actions to clarify the population size, structure, and trends. Here we present a broad review and status assessment, an effort led by the International Snowy Owl Working Group (ISOWG) and researchers from around the world, to estimate population trends and the current global status of the Snowy Owl. We use long-term breeding data, genetic studies, satellite-GPS tracking, and survival estimates to assess current population trends at several monitoring sites in the Arctic and we review the ecology and threats throughout the Snowy Owl range. An assessment of the available data suggests that current estimates of a worldwide population of 14,000–28,000 breeding adults are plausible. Our assessment of population trends at five long-term monitoring sites suggests that breeding populations of Snowy Owls in the Arctic have decreased by more than 30% over the past three generations and the species should continue to be categorised as Vulnerable under the IUCN Red List Criterion A2. We offer research recommendations to improve our understanding of Snowy Owl biology and future population assessments in a changing world.
Objective:
To understand healthcare workers’ (HCWs) beliefs and practices toward blood culture (BCx) use.
Design:
Cross-sectional electronic survey and semi-structured interviews.
Setting:
Academic hospitals in the United States.
Participants:
HCWs involved in BCx ordering and collection in adult intensive care units (ICU) and wards.
Methods:
We administered an anonymous electronic survey to HCWs and conducted semi-structured interviews with unit staff and quality improvement (QI) leaders in these institutions to understand their perspectives regarding BCx stewardship between February and November 2023.
Results:
Of 314 HCWs who responded to the survey, most (67.4%) were physicians and were involved in BCx ordering (82.3%). Most survey respondents reported that clinicians had a low threshold to culture patients for fever (84.4%) and agreed they could safely reduce the number of BCx obtained in their units (65%). However, only half of them believed BCx was overused. Although most made BCx decisions as a team (74.1%), a minority reported these team discussions occurred daily (42.4%). A third of respondents reported not usually collecting the correct volume per BCx bottle, half were unaware of the improved sensitivity of 2 BCx sets, and most were unsure of the nationally recommended BCx contamination threshold (87.5%). Knowledge regarding the utility of BCx for common infections was limited.
Conclusions:
HCWs’ understanding of best collection practices and yield of BCx was limited.
Fructose-containing sugars can exaggerate postprandial lipaemia and stimulate hepatic de novo lipogenesis (DNL) when compared to glucose-based carbohydrates(1). Galactose has recently been shown to increase postprandial lipaemia compared to glucose(2), but mechanisms remain uncharacterised. The aim of this study was to assess the effect and mechanisms of lactose-induced lipaemia.
Twenty-four non-obese adults (12 male and 12 female) completed three trials in a randomised, crossover design (28 ± 7-day washout). During trials, participants consumed test drinks containing 50 g fat with 100 g of carbohydrate. The control carbohydrate was a glucose polymer (maltodextrin), the experimental carbohydrate was a galactose-containing carbohydrate (lactose), and the active comparator was a fructose-containing carbohydrate (sucrose). Hepatic DNL was assessed by the 2H2O method, and [U-13C]-palmitate was added to the test drink to trace the fate of the ingested fat. Blood and breath samples were taken to determine plasma metabolite and hormone concentrations, in addition to plasma and breath 2H and 13C enrichments. Data were converted into incremental area under the curve (iAUC) and checked for normality by visual inspection of residuals. Differences between trials were assessed by one-way ANOVA. Where a main effect of trial was detected, post-hoc t-tests were performed to determine which trials differed from lactose according to the principle of closed-loop testing.
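For clarity, the sketch below shows a minimal iAUC calculation by the trapezoidal rule. The sampling times and triacylglycerol values are hypothetical, and the exact baseline convention used in the study is not specified here; this is an illustration, not the study's analysis code.

```python
# Minimal sketch: incremental area under the curve (iAUC) by the trapezoidal
# rule. Sampling times and plasma triacylglycerol values are hypothetical.
import numpy as np

def iauc(times_min, conc, baseline=None):
    """Incremental AUC: area of the rise above baseline (first sample by default).
    Note: some iAUC conventions truncate negative increments to zero; the
    study's exact convention is not stated here."""
    times = np.asarray(times_min, dtype=float)
    conc = np.asarray(conc, dtype=float)
    base = conc[0] if baseline is None else baseline
    increments = conc - base              # rise above the fasting concentration
    return np.trapz(increments, times)    # mmol/L * min if inputs are mmol/L and min

# Hypothetical postprandial sampling over 360 min
times = [0, 30, 60, 120, 180, 240, 300, 360]
tag = [1.0, 1.1, 1.4, 1.8, 2.0, 1.9, 1.6, 1.3]   # mmol/L (illustrative values)
print(f"iAUC = {iauc(times, tag):.0f} mmol/L*min")
```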
The plasma triacylglycerol iAUC (mean ± SD) in response to maltodextrin was 51 ± 68 mmol/L*360 min. Following lactose ingestion, plasma triacylglycerol iAUC increased to 98 ± 88 mmol/L*360 min (p<0.001 vs maltodextrin), which was comparable to sucrose [90 ± 95 mmol/L*360 min (p=0.41 vs lactose)]. Hepatic DNL in response to maltodextrin was 6.6 ± 3.0%. Following ingestion of lactose, hepatic DNL increased to 12.4 ± 6.9% (p=0.02 vs maltodextrin), which was comparable to sucrose [12.2 ± 6.9% (p=0.96 vs lactose)]. Exhaled 13CO2 in response to maltodextrin was 10.4 ± 4.1 mmol/kgFFM*360 min. Following ingestion of lactose, exhaled 13CO2 was 8.8 ± 4.9 mmol/kgFFM*360 min (p=0.09 vs maltodextrin), which was lower than sucrose [11.1 ± 3.9 mmol/kgFFM*360 min (p=0.01 vs lactose)].
These data are consistent with the hypothesis that hepatic de novo lipogenesis contributes to both lactose and sucrose-induced lipaemia and provide a rationale to investigate the longer-term effects of lactose and sucrose on metabolism.
Cation-exchange capacities determined by methylene blue adsorption (CEC-MB) and by the amount of K displaced from a K-saturated clay by NH4 (CEC-K/NH4) correlate closely in five soil clays from Pennsylvania, but differ greatly in two soil clays with a large content of amorphous material. CEC-MB was found to provide a more precise distinction between montmorillonite and vermiculite than CEC-K/NH4. Synthetic aluminosilicate gels showed CEC-K/NH4 > CEC-Ca/Mg > CEC-MB, but no relation to the behavior of the two soil clays with a large content of amorphous material was found.
Medical researchers are increasingly prioritizing the inclusion of underserved communities in clinical studies. However, mere inclusion is not enough. People from underserved communities frequently experience chronic stress that may lead to accelerated biological aging and early morbidity and mortality. It is our hope and intent that the medical community come together to engineer improved health outcomes for vulnerable populations. Here, we introduce Health Equity Engineering (HEE), a comprehensive scientific framework to guide research on the development of tools to identify individuals at risk of poor health outcomes due to chronic stress, the integration of these tools within existing healthcare system infrastructures, and a robust assessment of their effectiveness and sustainability. HEE is anchored in the premise that strategic intervention at the individual level, tailored to the needs of the most at-risk people, can pave the way for achieving equitable health standards at a broader population level. HEE provides a scientific framework guiding health equity research to equip the medical community with a robust set of tools to enhance health equity for current and future generations.
Although the link between alcohol involvement and behavioral phenotypes (e.g. impulsivity, negative affect, executive function [EF]) is well-established, the directionality of these associations, specificity to stages of alcohol involvement, and extent of shared genetic liability remain unclear. We estimate longitudinal associations between transitions among alcohol milestones, behavioral phenotypes, and indices of genetic risk.
Methods
Data came from the Collaborative Study on the Genetics of Alcoholism (n = 3681; ages 11–36). Alcohol transitions (first: drink, intoxication, alcohol use disorder [AUD] symptom, AUD diagnosis), internalizing, and externalizing phenotypes came from the Semi-Structured Assessment for the Genetics of Alcoholism. EF was measured with the Tower of London and Visual Span Tasks. Polygenic scores (PGS) were computed for alcohol-related and behavioral phenotypes. Cox models estimated associations among PGS, behavior, and alcohol milestones.
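As an illustration of the modelling step only (not the authors' code), the sketch below fits a Cox proportional-hazards model relating a polygenic score and a behavioral phenotype to a hypothetical age at first drink, using the lifelines package; all variable names and data are invented.

```python
# Illustrative Cox proportional-hazards sketch with synthetic data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "dpw_pgs": rng.normal(0, 1, n),        # drinks-per-week polygenic score (hypothetical)
    "cd_symptoms": rng.poisson(1.5, n),    # conduct disorder symptom count (hypothetical)
})
# synthetic age at first drink, loosely dependent on the covariates
df["age_at_event"] = 12 + rng.exponential(6, n) * np.exp(-0.1 * df["dpw_pgs"])
df["event"] = rng.integers(0, 2, n)        # 1 = transition observed, 0 = censored

cph = CoxPHFitter()
cph.fit(df, duration_col="age_at_event", event_col="event")
cph.print_summary()   # hazard ratios are exp(coef) for each covariate
```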
Results
Externalizing phenotypes (e.g. conduct disorder symptoms) were associated with future initiation and drinking problems (hazard ratio (HR)⩾1.16). Internalizing (e.g. social anxiety) was associated with hazards for progression from first drink to severe AUD (HR⩾1.55). Initiation and AUD were associated with increased hazards for later depressive symptoms and suicidal ideation (HR⩾1.38), and initiation was associated with increased hazards for future conduct symptoms (HR = 1.60). EF was not associated with alcohol transitions. Drinks per week PGS was linked with increased hazards for alcohol transitions (HR⩾1.06). Problematic alcohol use PGS increased hazards for suicidal ideation (HR = 1.20).
Conclusions
Behavioral markers of addiction vulnerability precede and follow alcohol transitions, highlighting dynamic, bidirectional relationships between behavior and emerging addiction.
In the United States, lesbian, gay, bisexual, transgender, queer, intersex, asexual and other sexually minoritized and gender expansive (LGBTQ+) young adults are at increased risk for experiencing mental health inequities, including anxiety, depression and psychological distress-related challenges associated with their sexual and gender identities. LGBTQ+ young adults may have unique experiences of sexual and gender minority-related vulnerability because of LGBTQ+-related minority stress and stressors, such as heterosexism, family rejection, identity concealment and internalized homophobia. Identifying and understanding specific LGBTQ+-related minority stress experiences and their complex roles in contributing to mental health burden among LGBTQ+ young adults could inform public health efforts to eliminate mental health inequities experienced by LGBTQ+ young adults. Therefore, this study sought to form empirically based risk profiles (i.e., latent classes) of LGBTQ+ young adults based on their experiences with familial heterosexist experiences, LGBTQ+-related family rejection, internalized LGBTQ+-phobia and LGBTQ+ identity concealment, and then identify associations of derived classes with psychological distress.
Methods
We recruited and enrolled participants using nonprobability sampling and analyzed cross-sectional online survey data collected between May and August 2020 (N = 482). We used a three-step latent class analysis (LCA) approach to identify unique classes of response patterns to LGBTQ+-related minority stressor subscale items (i.e., familial heterosexist experiences, LGBTQ+-related family rejection, internalized LGBTQ+-phobia and LGBTQ+ identity concealment), and multinomial logistic regression to characterize the associations between the derived classes and psychological distress.
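As a rough illustration of the final regression step only (the three-step LCA itself is not reproduced here), the sketch below regresses a psychological distress category on hypothetical derived class labels via multinomial logistic regression in statsmodels; all class labels, codings, and data are invented.

```python
# Illustrative multinomial logistic regression of distress category on
# latent class membership (synthetic data; reference class = "low").
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    # hypothetical derived latent class assignments
    "latent_class": ["low", "concealment", "rejection", "moderate", "high"] * 20,
    # hypothetical distress coding: 0 = none/mild, 1 = moderate, 2 = severe
    "distress": np.random.default_rng(0).integers(0, 3, size=100),
})

X = pd.get_dummies(df["latent_class"]).drop(columns="low")   # "low" as reference
X = sm.add_constant(X.astype(float))
model = sm.MNLogit(df["distress"], X).fit(disp=False)
print(model.summary())          # coefficients are log relative-risk ratios
print(np.exp(model.params))     # exponentiate for relative-risk ratios
```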
Results
Five distinct latent classes emerged from the LCA: (1) low minority stress, (2) LGBTQ+ identity concealment, (3) family rejection, (4) moderate minority stress and (5) high minority stress. Participants who were classified in the high and moderate minority stress classes were more likely to suffer from moderate and severe psychological distress compared to those classified in the low minority stress class. Additionally, relative to those in the low minority stress class, participants who were classified in the LGBTQ+ identity concealment group were more likely to suffer from severe psychological distress.
Conclusion
Familial heterosexist experiences, LGBTQ+-related family rejection, internalized LGBTQ+-phobia and LGBTQ+ identity concealment are four constructs that have been extensively examined as predictors for mental health outcomes among LGBTQ+ persons, and our study is among the first to reveal nuanced gradients of these stressors. Additionally, we found that more severe endorsement of minority stress was associated with greater psychological distress. Given our study results and the previously established negative mental health impacts of minority stressors among LGBTQ+ young adults, findings from our study can inform research, practice, and policy reform and development that could prevent and reduce mental health inequities among LGBTQ+ young adults.
OBJECTIVES/GOALS: Super refractory status epilepticus (SRSE) is associated with high mortality, often due to withdrawal of life-sustaining therapy (WLST) based on perceived poor neurological prognosis. Factors influencing decision making are underreported and poorly understood. We surveyed clinicians who treat SRSE to identify factors that influence WLST. METHODS/STUDY POPULATION: Health care providers (HCP), including physicians, pharmacists, and advanced practice providers, who treat SRSE answered a 51-question survey on respondent demographics, institutional characteristics, and SRSE management that was distributed through professional societies. Respondents described approaches to prognostication and rated the importance of clinical factors in the management of two hypothetical clinical cases, followed by their prediction of recovery potential for the same two cases. Neurointensivist and other HCP responses were compared using descriptive statistics to differentiate group characteristics; a p-value <0.05 was considered significant. Logistic regression models were employed to identify associations between clinician-specific factors and prognostication. RESULTS/ANTICIPATED RESULTS: One hundred and sixty-four respondents were included in the analysis. Compared to other HCPs (neurologists, epileptologists, neurosurgeons, other intensivists; n=122, 74%), neurointensivists (n=42, 26%) were less likely to use prognostic severity scores [odds ratio (OR) 0.3, 95% confidence interval (CI) 0.14–0.68, p=.004] and less likely to prognosticate the likelihood of good functional recovery (OR 0.28, 95% CI 0.13–0.62, p=.002), controlling for potential confounders including professional degree, years of experience, country of practice, and annual volume of SRSE cases. There was, however, significant overlap in factors deemed necessary for determining futility in care escalation. DISCUSSION/SIGNIFICANCE: Neurointensivists value similar clinical factors to other HCPs when evaluating medical futility in SRSE but are less likely to predict definitive outcomes. Pending final survey results, future studies aimed at understanding why neurointensivists may be less likely to prognosticate decisively (i.e. avoiding nihilism) in SRSE may be warranted.
A program of simultaneous linear equations has been developed to calculate component proportions and/or component property values for mineral mixtures in soil clays and sediments. The analysis is based on quantitatively measured chemical and physical properties of samples and involves (1) qualitative identification of the mineral components in the mixture by any appropriate means; (2) quantitative measure of the sample property values selected for use in the program; (3) estimation of the proportion of each component in the mixture by a technique such as X-ray powder diffraction; (4) assignment of limits to component property ranges; (5) selection of one of four available calculation options and application of the simultaneous linear-equations program; (6) examination of the residuals of the analysis and, if appropriate, adjustment of the initial estimates for component proportions or property ranges and then repeating step 5; and (7) verification of the final component proportions by comparison with information from step 1. Completeness and/or accuracy of the final results for component proportions may be checked by the closeness of approach to 1.0 for the sum of the component proportions. The method requires that, at minimum, the number of properties measured must equal the number of components in the samples being analyzed and that the minimum number of samples must equal the number of properties measured.
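A minimal sketch of the core calculation is given below, expressed as a bounded least-squares problem: each row of the matrix holds one measured property's assumed value for each mineral component, the right-hand side holds the measured sample properties, and the solution vector gives component proportions whose sum and residuals can be checked as in steps 6 and 7. The property values are hypothetical placeholders, and this is not the original program.

```python
# Sketch: solve A x = b for component proportions x, with x constrained to [0, 1].
import numpy as np
from scipy.optimize import lsq_linear

# rows: measured properties (e.g. CEC, %K2O, %SiO2); columns: components
# (e.g. illite, kaolinite, smectite) -- all values are illustrative only
A = np.array([
    [25.0,  5.0, 100.0],   # CEC, meq/100 g
    [ 8.0,  0.2,   0.3],   # %K2O
    [52.0, 46.0,  60.0],   # %SiO2
])
b = np.array([40.0, 2.5, 52.0])   # measured sample property values

res = lsq_linear(A, b, bounds=(0.0, 1.0))    # proportions bounded to [0, 1]
proportions = res.x
print("component proportions:", np.round(proportions, 3))
print("sum of proportions   :", proportions.sum())    # closeness to 1.0 checks completeness
print("residuals            :", A @ proportions - b)  # large residuals flag poor estimates
```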
Using the clay fractions of 15 Pennsylvania soils containing kaolinite, illite, smectite, vermiculite, chlorite, interstratified vermiculite/chlorite, quartz, and noncrystalline material, and measuring methylene blue cation-exchange capacity, the amount of Ca displaced by Mg from a Ca-saturated clay, the amount of K displaced by NH4 from a K-saturated clay heated to 110°C, % K2O, % SiO2, % MgO, and weight loss at 110°–300°C and 300°–950°C, the adjustment of property values was found to have the lowest residual value and the most consistent results. The source of analytical errors was also located by examination of residual tables. Samples that were similar in composition gave more reliable component proportions.
High-quality evidence is lacking for the impact on healthcare utilisation of short-stay alternatives to psychiatric inpatient services for people experiencing acute and/or complex mental health crises (known in England as psychiatric decision units [PDUs]). We assessed the extent to which changes in psychiatric hospital and emergency department (ED) activity were explained by implementation of PDUs in England using a quasi-experimental approach.
Methods
We conducted an interrupted time series (ITS) analysis of weekly aggregated data pre- and post-PDU implementation in one rural and two urban sites using segmented regression, adjusting for temporal and seasonal trends. Primary outcomes were changes in the number of voluntary inpatient admissions to (acute) adult psychiatric wards and number of ED adult mental health-related attendances in the 24 months post-PDU implementation compared to that in the 24 months pre-PDU implementation.
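To make the modelling approach concrete, the sketch below shows a generic segmented (interrupted time series) regression of weekly counts on time, a post-implementation level change, and a post-implementation slope change, with simple harmonic seasonal terms. It is not the study's exact specification; all data and variable names are synthetic.

```python
# Illustrative ITS segmented regression on synthetic weekly admission counts.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
weeks = np.arange(208)                      # ~24 months pre + 24 months post
post = (weeks >= 104).astype(int)           # hypothetical PDU implementation at week 104
df = pd.DataFrame({
    "admissions": rng.poisson(30, size=208),             # synthetic weekly counts
    "time": weeks,
    "post": post,
    "time_since": np.clip(weeks - 104, 0, None) * post,  # weeks since implementation
    "sin52": np.sin(2 * np.pi * weeks / 52),              # simple seasonal terms
    "cos52": np.cos(2 * np.pi * weeks / 52),
})

model = smf.poisson("admissions ~ time + post + time_since + sin52 + cos52",
                    data=df).fit(disp=False)
print(model.summary())
# exp(coef on post)       -> immediate level change after implementation
# exp(coef on time_since) -> change in weekly trend after implementation
```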
Results
The two PDUs (one urban and one rural) with longer (average) stays and high staff-to-patient ratios observed post-PDU decreases in the pattern of weekly voluntary psychiatric admissions relative to pre-PDU trend (Rural: −0.45%/week, 95% confidence interval [CI] = −0.78%, −0.12%; Urban: −0.49%/week, 95% CI = −0.73%, −0.25%); PDU implementation in each was associated with an estimated 35–38% reduction in total voluntary admissions in the post-PDU period. The (urban) PDU with the highest throughput, lowest staff-to-patient ratio and shortest average stay observed a 20% (−20.4%, CI = −29.7%, −10.0%) level reduction in mental health-related ED attendances post-PDU, although there was little impact on long-term trend. Pooled analyses across sites indicated a significant reduction in the number of voluntary admissions following PDU implementation (−16.6%, 95% CI = −23.9%, −8.5%) but no significant (long-term) trend change (−0.20%/week, 95% CI = −0.74%, 0.34%) and no short- (−2.8%, 95% CI = −19.3%, 17.0%) or long-term (0.08%/week, 95% CI = −0.13, 0.28%) effects on mental health-related ED attendances. Findings were largely unchanged in secondary (ITS) analyses that considered the introduction of other service initiatives in the study period.
Conclusions
The introduction of PDUs was associated with an immediate reduction of voluntary psychiatric inpatient admissions. The extent to which PDUs change long-term trends of voluntary psychiatric admissions or impact on psychiatric presentations at ED may be linked to their configuration. PDUs with a large capacity, short length of stay and low staff-to-patient ratio can positively impact ED mental health presentations, while PDUs with longer length of stay and higher staff-to-patient ratios have potential to reduce voluntary psychiatric admissions over an extended period. Taken as a whole, our analyses suggest that when establishing a PDU, consideration of the primary crisis-care need that underlies the creation of the unit is key.
Transient acquisition of methicillin-resistant Staphylococcus aureus (MRSA) on healthcare personnel (HCP) gloves and gowns following patient care has been examined. However, the potential for transmission to the subsequent patient has not been studied. We explored the frequency of MRSA transmission from patient to HCP, and then in separate encounters from contaminated HCP gloves and gowns to a subsequent simulated patient as well as the factors associated with these 2 transmission pathways.
Methods:
We conducted a prospective cohort study with 2 parts. In objective 1, we studied MRSA transmission from random MRSA-positive patients to HCP gloves and gowns after specific routine patient care activities. In objective 2, we simulated subsequent transmission from random HCP gloves and gowns without hand hygiene to the next patient using a manikin proxy.
Results:
For the first objective, among 98 MRSA-positive patients with 333 randomly selected individual patient–HCP interactions, HCP gloves or gowns were contaminated in 54 interactions (16.2%). In a multivariable analysis, performing endotracheal tube care had the greatest odds of glove or gown contamination (OR, 4.06; 95% CI, 1.3–12.6 relative to physical examination). For the second objective, after 147 simulated HCP–patient interactions, the subsequent transmission of MRSA to the manikin proxy occurred 15 times (10.2%).
Conclusion:
After caring for a patient with MRSA, contamination of HCP gloves and gowns, and transmission to subsequent patients following HCP-patient interactions, occur frequently if contact precautions are not used. Proper infection control practices, including the use of gloves and gowns, can prevent this potential subsequent transmission.
Unsupervised remote digital cognitive assessment makes frequent testing feasible and allows for measurement of learning across days on participants’ own devices. More rapid detection of diminished learning may provide a potentially valuable metric that is sensitive to cognitive change over short intervals. In this study we examine the feasibility and predictive validity of a novel digital assessment that measures learning of the same material over 7 days in older adults.
Participants and Methods:
The Boston Remote Assessment for Neurocognitive Health (BRANCH) (Papp et al., 2021) is a web-based assessment administered over 7 consecutive days, repeating the same stimuli each day to capture multi-day learning slopes. The assessment includes Face-Name (verbal-visual associative memory), Groceries-Prices (numeric-visual associative memory), and Digit-Sign (speeded processing of numeric-visual associations). Our sample consisted of 200 cognitively unimpaired older adults enrolled in ongoing observational studies (mean age=74.5, 63% female, 87% Caucasian, mean education=16.6) who completed the tasks daily, at home, on their own digital devices. Participants had previously completed in-clinic paper-and-pencil tests to compute a Preclinical Alzheimer’s Cognitive Composite (PACC-5). Mixed-effects models controlling for age, sex, and education were used to examine the associations between PACC-5 scores and both initial performance and multi-day learning on the three BRANCH measures.
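As an illustration of this analysis (not the study's exact model), the sketch below fits a linear mixed-effects model of a daily BRANCH-style score on study day and a baseline composite, with a random intercept and random day slope per participant; the day-by-composite interaction indexes whether multi-day learning relates to baseline cognition. Variable names and data are hypothetical.

```python
# Illustrative mixed-effects model of daily score on day and baseline composite.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n, days = 50, 7
df = pd.DataFrame({
    "id": np.repeat(np.arange(n), days),
    "day": np.tile(np.arange(days), n),
    "pacc5": np.repeat(rng.normal(0, 1, n), days),     # hypothetical baseline composite
    "age": np.repeat(rng.integers(65, 85, n), days),
})
df["score"] = 0.5 * df["day"] + 0.3 * df["pacc5"] + rng.normal(0, 1, len(df))

m = smf.mixedlm("score ~ day * pacc5 + age", data=df,
                groups=df["id"], re_formula="~day").fit()
print(m.summary())   # fixed effect of day = average learning slope; day:pacc5 = its link to baseline cognition
```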
Results:
Adherence was high, with 96% of participants completing all seven days of consecutive assessment; demographic factors were not associated with differences in adherence. Younger participants had higher Day 1 scores on all three measures and steeper learning slopes on Digit-Sign. Female participants performed better on Face-Name (T=3.35, p<.001) and Groceries-Prices (T=2.00, p=0.04) on Day 1, but no sex differences were seen in learning slopes; there were no sex differences on Digit-Sign. Black participants had lower Day 1 scores on Face-Name (T=-3.34, p=0.003) and Digit-Sign (T=3.44, p=0.002), but no racial differences were seen in learning slopes for any measure. Education was not associated with any measure. First-day performance on Face-Name (B=0.39, p<.001), but not learning slope (B=0.008, p=0.302), was associated with the PACC-5. For Groceries-Prices, both Day 1 (B=0.27, p<.001) and learning slope (B=0.02, p=0.03) were associated with PACC-5. The Digit-Sign scores at Day 1 (B=0.31, p<.001) and learning slope (B=0.06, p<.001) were also both associated with PACC-5.
Conclusions:
Seven days of remote, brief cognitive assessment was feasible in a sample of cognitively unimpaired older adults. Although various demographic factors were associated with initial performance on the tests, multi-day learning slopes were largely unrelated to demographics, signaling their potential utility in diverse samples. Both initial performance and learning scores on an associative memory and a processing speed test were independently related to baseline cognition, indicating that these tests’ initial performance and learning metrics are convergent but unique in their contributions. The findings signal the value of measuring differences in learning across days as a means of sensitively identifying differences in cognitive function before signs of frank impairment are observed. Next steps will involve identifying the optimal way to model multi-day learning on these subtests to evaluate their potential associations with Alzheimer’s disease biomarkers.
The gold standard for hand hygiene (HH) while wearing gloves requires removing gloves, performing HH, and donning new gloves between WHO moments. The novel strategy of applying alcohol-based hand rub (ABHR) directly to gloved hands might be effective and efficient.
Design:
A mixed-method, multicenter, 3-arm, randomized trial.
Setting:
Adult and pediatric medical-surgical, intermediate, and intensive care units at 4 hospitals.
Participants:
Healthcare personnel (HCP).
Interventions:
HCP were randomized to 3 groups: ABHR applied directly to gloved hands, the current standard, or usual care.
Methods:
Gloved hands were sampled via direct imprint. Gold-standard and usual-care arms were compared with the ABHR intervention.
Results:
Bacteria were identified on gloved hands after 432 (67.4%) of 641 observations in the gold-standard arm versus 548 (82.8%) of 662 observations in the intervention arm (P < .01). HH required a mean of 14 seconds in the intervention arm and a mean of 28.7 seconds in the gold-standard arm (P < .01). Bacteria were identified on gloved hands after 133 (98.5%) of 135 observations in the usual-care arm versus 173 (76.6%) of 226 observations in the intervention arm (P < .01). Of 331 gloves tested, 6 (1.8%) were found to have microperforations; all were identified in the intervention arm [6 (2.9%) of 205].
Conclusions:
Compared with usual care, contamination of gloved hands was significantly reduced by applying ABHR directly to gloved hands but statistically higher than the gold standard. Given time savings and microbiological benefit over usual care and lack of feasibility of adhering to the gold standard, the Centers for Disease Control and Prevention and the World Health Organization should consider advising HCP to decontaminate gloved hands with ABHR when HH moments arise during single-patient encounters.
Three-dimensional (3D) food printing is a rapidly emerging technology offering unprecedented potential for customised food design and personalised nutrition. Here, we evaluate the technological advances in extrusion-based 3D food printing and its possibilities to promote healthy and sustainable eating. We consider the challenges in implementing the technology in real-world applications. We propose viable applications for 3D food printing in health care, health promotion and food waste upcycling. Finally, we outline future work on 3D food printing in food safety, acceptability and economics, ethics and regulations.
Multiplex polymerase chain reaction (PCR) respiratory panels are rapid, highly sensitive tests for viral and bacterial pathogens that cause respiratory infections. In this study, we (1) described best practices in the implementation of respiratory panels based on expert perspectives and (2) identified tools for diagnostic stewardship to enhance the usefulness of testing.
Methods:
We conducted a survey of the Society for Healthcare Epidemiology of America Research Network to explore current and future approaches to diagnostic stewardship of multiplex PCR respiratory panels.
Results:
In total, 41 sites completed the survey (response rate, 50%). Multiplex PCR respiratory panels were perceived as supporting accurate diagnoses at 35 sites (85%), supporting more efficient patient care at 33 sites (80%), and improving patient outcomes at 23 sites (56%). Thirteen sites (32%) reported that testing may support diagnosis or patient care without improving patient outcomes. Furthermore, 24 sites (58%) had implemented diagnostic stewardship, with a median of 3 interventions (interquartile range, 1–4) per site. The interventions most frequently reported as effective were structured order sets to guide test ordering (4 sites), restrictions on test ordering based on clinician or patient characteristics (3 sites), and structured communication of results (2 sites). Education was reported as “helpful” but with limitations (3 sites).
Conclusions:
Many hospital epidemiologists and experts in infectious diseases perceive multiplex PCR respiratory panels as useful tests that can improve diagnosis, patient care, and patient outcomes. However, institutions frequently employ diagnostic stewardship to enhance the usefulness of testing, including most commonly clinical decision support to guide test ordering.
OBJECTIVES/GOALS: The goal of this proposal is to develop a technology that combines calcium imaging via confocal microscopy and force measurement via monolayer stress microscopy to perform simultaneous quantitative measurements of agonist-induced Ca2+ and mechanical signals in HASMCs. METHODS/STUDY POPULATION: The methods by which second messenger signals and changes in mechanical forces determine specific physiological responses are complex. Recent studies point to the importance of temporal and spatial encoding in determining signal specificity. Hence, approaches that probe both chemical and mechanical signals are needed. We combine hyperspectral imaging for second messenger signal measurements, monolayer stress microscopy for mechanical force measurements, and S8 analysis software for quantifying localized signals. Imaging was performed using an excitation-scanning hyperspectral microscope. Hyperspectral images were unmixed to identify signals from fluorescent labels and microparticles. Images were analyzed to quantify localized force dynamics through monolayer stress microscopy. RESULTS/ANTICIPATED RESULTS: Results indicate that localized and transient cellular signals can be quantified and mapped within cell populations. Importantly, these results establish a method for simultaneous interrogation of cellular signals and mechanical forces that may play synergistic roles in regulating downstream cellular physiology in confluent monolayers. DISCUSSION/SIGNIFICANCE: We will measure the distribution of chemical and mechanical signals within cells, providing insight into the dynamics of cell signaling. These studies will have implications for understanding infections, for drug delivery, in which non-uniform distributions of drugs are a certainty, and for understanding coordinated responses in cellular systems.
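A minimal sketch of the general idea behind spectral unmixing is shown below: each pixel spectrum is modelled as a non-negative combination of reference (endmember) spectra. The endmember spectra and pixel values are hypothetical, and this is not the S8 analysis software or the microscope's processing pipeline.

```python
# Sketch: linear spectral unmixing of one pixel by non-negative least squares.
import numpy as np
from scipy.optimize import nnls

# columns: reference spectra for two labels plus background/autofluorescence,
# sampled at 8 excitation wavelengths (illustrative values only)
endmembers = np.array([
    [0.9, 0.1, 0.3],
    [0.8, 0.2, 0.3],
    [0.6, 0.4, 0.3],
    [0.4, 0.6, 0.3],
    [0.3, 0.8, 0.2],
    [0.2, 0.9, 0.2],
    [0.1, 0.7, 0.2],
    [0.1, 0.5, 0.2],
])
pixel = endmembers @ np.array([0.5, 0.3, 0.2]) + 0.01   # synthetic mixed-pixel spectrum

abundances, residual = nnls(endmembers, pixel)   # non-negative abundance per label
print("estimated abundances:", np.round(abundances, 3))
print("fit residual        :", round(residual, 4))
```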
Hydrodynamic clogging in planar channels is studied via direct numerical simulation for the first time, utilising a novel numerical test cell and stochastic methodology with special focus on the influence of electrostatic forces. Electrostatic physics is incorporated into an existing coupled lattice Boltzmann-discrete element method framework, which is verified rigorously. First, the dynamics of the problem is governed by the Stokes number, $St$. At low $St$, the clogging probability, $P$, increases with $St$ due to increasing collision frequency. At high $St$, however, $P$ decreases with $St$ due to quadratic scaling of hydrodynamic force acting on arches. Under electrostatic forces, clogging is well represented by the wall adhesion number, $Ad_w$. For $Ad_w \lesssim 4$, the mechanical dependence on $St$ is exhibited, while for $4 < Ad_w < 20$, there is a transition to high $P$ as sliding along, and attachment to, the channel surface occurs increasingly. For $Ad_w \gtrsim 20$, clogging occurs with $P > 0.95$. Particle agglomeration, however, can also decrease $P$ due to diminished interaction with channel walls. Distinct parametric regions of clogging are also observed in relation to the channel width, while a critical width $w/d^*=2.6$ is reported, which increases to $w/d^*=4$ with strong electrostatic surface attachment. The number of particles that form stable arches across a planar channel is determined to be $n=\left \lceil {w/d}\right \rceil + 1$. Finally, sensitivity to the Coulomb friction coefficient is determined in favour of calibrating numerical parameters to bulk system behaviour. The greatest sensitivities occur in situations where the arch stability is lowest, while clogging becomes independent of friction for strong wall adhesion.
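The reported relations can be illustrated with a small sketch: the arch particle count $n=\left \lceil {w/d}\right \rceil + 1$ and a qualitative classification by the wall adhesion number thresholds quoted above. The regime labels simply paraphrase the text and add no new results.

```python
# Illustration of the reported arch-size formula and Ad_w clogging regimes.
import math

def arch_particle_count(channel_width: float, particle_diameter: float) -> int:
    """Particles needed to form a stable arch across a planar channel: ceil(w/d) + 1."""
    return math.ceil(channel_width / particle_diameter) + 1

def clogging_regime(ad_w: float) -> str:
    """Qualitative regime by wall adhesion number, per the thresholds quoted above."""
    if ad_w <= 4:
        return "mechanical regime: clogging probability governed by St"
    if ad_w < 20:
        return "transition: wall sliding/attachment raises clogging probability"
    return "adhesion-dominated: clogging probability > 0.95"

print(arch_particle_count(2.6, 1.0))   # critical width w/d* = 2.6 -> 4-particle arches
print(clogging_regime(10.0))
```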