Interest groups are an important influence in the subnational policymaking process. Previously, environmental policy scholars measured the strength of environmental groups in the American subnational policymaking process by using proxies like state-level group membership in major nationwide environmental organizations (e.g., Sierra Club). Although these prior measures of group strength have face validity, recent scholarship suggests that the utilization of group financial resources is a better measure of the influence of interest groups in state-level models. We take this approach and provide a new way to measure state-level environmental interests by using aggregated financial information (income and assets) from Internal Revenue Service (IRS) data obtained via the National Center for Charitable Statistics (NCCS). This measure provides several advantages over previous approaches because it varies over time, is derived from easily accessible public data, includes a greater diversity of environmental organizations, and is considered reliable by prior scholars. We demonstrate its empirical value by deploying our measure in a model of state policy adoption. We encourage researchers to further utilize this new measure in their analyses of environmental advocacy at the subnational level.
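To make the construction concrete, the sketch below (hypothetical file and column names; actual NCCS core-file layouts vary by release) aggregates organization-level income and assets to the state-year level for environmental nonprofits, identified here by NTEE major group C:

```python
import pandas as pd

# Hypothetical NCCS/IRS extract: one row per organization-year filing.
# Column names are illustrative; actual NCCS core files differ by release.
orgs = pd.read_csv("nccs_core.csv")  # columns: ein, state, year, ntee, income, assets

# NTEE major group "C" covers environmental organizations.
env = orgs[orgs["ntee"].str.upper().str.startswith("C", na=False)]

# Aggregate financial resources to the state-year level.
state_env = (
    env.groupby(["state", "year"], as_index=False)[["income", "assets"]]
    .sum()
    .rename(columns={"income": "env_income", "assets": "env_assets"})
)
print(state_env.head())
```

Because IRS filings are annual, the resulting panel varies over time, which is the property the measure exploits.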
Observational studies consistently report associations between tobacco use, cannabis use and mental illness. However, the extent to which this association reflects an increased risk of new-onset mental illness is unclear and may be biased by unmeasured confounding.
Methods
A systematic review and meta-analysis (CRD42021243903). Electronic databases were searched until November 2022. Longitudinal studies in general population samples assessing tobacco and/or cannabis use and reporting the association (e.g. risk ratio [RR]) with incident anxiety, mood, or psychotic disorders were included. Estimates were combined using random-effects meta-analyses. Bias was explored using a modified Newcastle–Ottawa Scale, confounder matrix, E-values, and Doi plots.
Results
Seventy-five studies were included. Tobacco use was associated with mood disorders (K = 43; RR: 1.39, 95% confidence interval [CI] 1.30–1.47), but not anxiety disorders (K = 7; RR: 1.21, 95% CI 0.87–1.68) and evidence for psychotic disorders was influenced by treatment of outliers (K = 4, RR: 3.45, 95% CI 2.63–4.53; K = 5, RR: 2.06, 95% CI 0.98–4.29). Cannabis use was associated with psychotic disorders (K = 4; RR: 3.19, 95% CI 2.07–4.90), but not mood (K = 7; RR: 1.31, 95% CI 0.92–1.86) or anxiety disorders (K = 7; RR: 1.10, 95% CI 0.99–1.22). Confounder matrices and E-values suggested potential overestimation of effects. Only 27% of studies were rated as high quality.
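For context, an E-value answers how strong an unmeasured confounder would have to be (on the risk-ratio scale, with both exposure and outcome) to fully explain away an observed association. A minimal sketch using VanderWeele and Ding's formula, applied to the pooled estimates above:

```python
from math import sqrt

def e_value(rr: float) -> float:
    """E-value for a risk ratio (VanderWeele & Ding, 2017).
    For protective estimates (RR < 1), invert first."""
    if rr < 1:
        rr = 1 / rr
    return rr + sqrt(rr * (rr - 1))

# Pooled estimates reported above.
print(round(e_value(1.39), 2))  # tobacco -> mood disorders: ~2.13
print(round(e_value(3.19), 2))  # cannabis -> psychotic disorders: ~5.83
```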
Conclusions
Both substances were associated with psychotic disorders and tobacco use was associated with mood disorders. There was no clear evidence of an association between cannabis use and mood or anxiety disorders. Limited high-quality studies underscore the need for future research using robust causal inference approaches (e.g. evidence triangulation).
A surge of pediatric respiratory illnesses beset the United States in late 2022 and early 2023. This study evaluated pediatric hospital acute and critical care resource availability and utilization during the surge.
Methods
Between January and February 2023, an online survey was sent to the sections of hospital medicine and critical care of the American Academy of Pediatrics, community discussion forums of the Children’s Hospital Association, and PedSCCM—a pediatric critical care website. Data were summarized with median values and interquartile range.
Results
Across 35 hospitals with pediatric intensive care units (PICUs), critical care resource use increased significantly. In the month preceding the survey, 26 (74%) hospitals diverted patients away from their emergency department (ED) to other hospitals: 46% diverted 1-5 patients, 23% diverted 6-10 patients, and 31% diverted more than 10 patients. One in 5 hospitals reported moving patients on mechanical ventilation from the PICU to other settings, including the ED (n = 2), an intermediate care unit (n = 2), a cardiac ICU (n = 1), a ward converted to an ICU (n = 1), and a ward (n = 1). Utilization of human critical care resources was high, with PICU faculty, nurses, and respiratory therapists working at 100% capacity.
Conclusions
The respiratory illness surge triggered significant hospital resource use and diversion of patients away from hospitals. Pediatric public health emergency preparedness should develop innovative approaches to expanding resource capacity.
Emergency psychiatric care, unplanned hospital admissions, and inpatient health care are the costliest forms of mental health care. According to Statistics Canada (2018), almost 18% of Canadians (5.3 million) reported needing mental health support; however, only just over half (56.2%) reported that those needs were fully met. To further expand capacity and access to mental health care in the province, Nova Scotia Health has launched a novel mental health initiative, the Rapid Access and Stabilization Program (RASP).
Objectives
This study evaluates the effectiveness and impact of the RASP on high-cost health services utilization (e.g. ED visits, mobile crisis visits, and inpatient treatments) and related costs. It also assesses healthcare partners' (e.g. healthcare providers, policymakers, community leaders) perceptions of the program and patient experiences and satisfaction with it, and identifies sociodemographic characteristics, psychological conditions, recovery, well-being, and risk measures in the assisted population.
Methods
This is a hypothesis-driven program evaluation study that employs a mixed methods approach. A within-subject comparison will examine health services utilization data from patients attending RASP, one year before and one year after their psychiatry assessment at the program. A controlled between-subject comparison will use historical data from a control population to examine whether changes in high-cost health services utilization are associated with the intervention (RASP). The primary analysis involves extracting secondary data from provincial information systems, electronic medical records, and regular self-reported clinical assessments. Additionally, a qualitative sub-study will examine patient experience and satisfaction, and health care partners' impressions.
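As an illustration of the planned within-subject comparison, the sketch below (entirely hypothetical counts and variable names; the protocol does not specify its statistical tests) pairs each patient's ED visits in the year before and after assessment:

```python
import pandas as pd
from scipy.stats import wilcoxon

# Hypothetical extract: one row per RASP patient with counts of ED visits
# in the 12 months before and after their psychiatry assessment.
visits = pd.DataFrame({
    "ed_pre":  [3, 1, 0, 5, 2, 4],
    "ed_post": [1, 0, 0, 2, 2, 1],
})

# Paired nonparametric test of the pre/post change; utilization counts are
# typically skewed, so a Wilcoxon signed-rank test is a reasonable default.
stat, p = wilcoxon(visits["ed_pre"], visits["ed_post"])
pct_change = 100 * (visits["ed_post"].sum() - visits["ed_pre"].sum()) / visits["ed_pre"].sum()
print(f"Wilcoxon p = {p:.3f}, change in ED visits = {pct_change:.1f}%")
```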
Results
The results for the primary, secondary, and qualitative outcome measures will be available within 6 months of study completion. We expect the RASP evaluation findings to demonstrate a minimum 10% reduction in high-cost health services utilization, corresponding 10% cost savings, and a reduction in wait times for patient consultations with psychiatrists to less than 30 calendar days. In addition, we anticipate that patients, healthcare providers, and healthcare partners will express high levels of satisfaction with the new service.
Conclusions
This study will demonstrate the results of the Mental Health and Addictions Program (MHAP) efforts to provide stepped-care, particularly community-based support, to individuals with mental illnesses. Results will provide new insights into a novel community-based approach to mental health service delivery and contribute to knowledge on how to implement mental health programs across varying contexts.
The small angle X-ray scattering data obtained in an earlier investigation of a series of Na-montmorillonite clay samples containing varying concentrations of sodium metaphosphate have been used to calculate the potential energy φ(x) of the interaction between two isolated parallel clay platelets separated by a distance x. All φ(x) curves have the form expected for Na-montmorillonite. In each curve there is a potential well for a platelet separation approximately equal to the most probable separation distance determined in the earlier study. Because the depth of the potential well is of the order of 0.01 eV for all samples, the attractive forces are relatively weak. While the calculated φ(x) functions are not highly accurate, in future investigations precautions can be taken to increase the reliability of the computed potential energy functions. This preliminary study suggests that determination of φ(x) from small angle X-ray scattering data can be a useful method for quantitative study of interparticle forces in Na-montmorillonite clays.
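The abstract does not state the computational procedure, but one standard route from a measured distribution of separations to an interaction potential is Boltzmann inversion, φ(x) = −k_BT ln P(x), defined up to an additive constant. A sketch under that assumption, with synthetic data standing in for the scattering-derived distribution:

```python
import numpy as np

KB_T_EV = 0.0257  # k_B * T in eV at ~298 K

# Hypothetical normalized probability distribution P(x) of platelet
# separations, standing in for one derived from scattering data.
x = np.linspace(1.0, 10.0, 200)          # separation (arbitrary length units)
p = np.exp(-((x - 4.0) ** 2) / 2.0)      # stand-in for a measured P(x)
p /= p.sum() * (x[1] - x[0])             # normalize to unit area

# Boltzmann inversion: phi(x) = -kT ln P(x), defined up to a constant.
phi = -KB_T_EV * np.log(p)
phi -= phi.min()                          # set the well bottom to zero
print(f"potential minimum near x = {x[np.argmin(phi)]:.2f}")
```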
Analysis of the Mössbauer effect in layer silicates provides a spectroscopic method for determining the valences and coordination of iron. In this study Mössbauer spectra were obtained for amesite, cronstedtite, nontronite, two glauconites, biotite, lepidomelane, chlorite, minnesotaite, vermiculite, stilpnomelane, and chloritoid.
Trivalent iron was detected in tetrahedral coordination. Abundant trivalent iron in octahedral coordination apparently causes quadrupole splitting values of divalent iron in the same mineral to decrease. This phenomenon was noted in cronstedtite and glauconite. In cases where divalent iron predominates in the mineral, the quadrupole splitting is larger. It is generally accepted that ferrous iron is largely in octahedral coordination. This suggests that the octahedral sites may be more distorted when ferric iron is present in the octahedral sheet. In biotite, quadrupole splitting of divalent iron is decreased when trivalent iron is present in tetrahedral sheets. This suggests that there is also more distortion in the octahedral sheet because of iron in tetrahedral positions.
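Quadrupole splitting is read off a Mössbauer spectrum as the velocity separation between the two lines of a doublet. A minimal illustration (synthetic data, Lorentzian line shapes; not the authors' fitting procedure) of extracting it by least-squares fitting:

```python
import numpy as np
from scipy.optimize import curve_fit

def doublet(v, a, v1, v2, w, base):
    """Two Lorentzian absorption lines; quadrupole splitting = |v2 - v1|."""
    l1 = a / (1 + ((v - v1) / w) ** 2)
    l2 = a / (1 + ((v - v2) / w) ** 2)
    return base - l1 - l2

# Synthetic velocity spectrum (mm/s) standing in for measured counts.
v = np.linspace(-4, 4, 400)
rng = np.random.default_rng(0)
y = doublet(v, 0.3, -0.2, 2.4, 0.15, 1.0) + rng.normal(0, 0.005, v.size)

popt, _ = curve_fit(doublet, v, y, p0=[0.2, -0.5, 2.0, 0.2, 1.0])
qs = abs(popt[2] - popt[1])
print(f"quadrupole splitting = {qs:.2f} mm/s")
```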
The complementary feeding period (6-23 months of age) is when solid foods are introduced alongside breastmilk or infant formula and is the most significant dietary change a person will experience. The introduction of complementary foods is important to meet changing nutritional requirements(1). Despite the rising Asian population in New Zealand, and the importance of nutrition during the complementary feeding period, there is currently no research on Asian New Zealand (NZ) infants' micronutrient intakes from complementary foods. Complementary foods are a more easily modifiable component of the diet than breastmilk or other infant milk intake. This study aimed to compare the dietary intake of micronutrients from complementary foods of Asian and non-Asian infants in NZ. This study was a secondary analysis of the First Foods New Zealand cross-sectional study of infants (aged 7.0-9.9 months) in Dunedin and Auckland. 24-hour recall data were analysed using FoodFiles 10 software with the NZ food composition database FOODfiles 2018 and additional data for commercial complementary foods(2). The multiple source method was used to estimate usual dietary intake. Ethnicity was collected from the main questionnaire of the study, answered by the respondents (the infant's parent/caregiver). Within the Asian NZ group, three Asian subgroups were identified – South East Asian, East Asian, and South Asian. The non-Asian group included all remaining participants of non-Asian ethnicities. Most nutrient reference values (NRVs)(3) available for the 7-12 month age group are for total intake from complementary foods and infant milks, so the adequacy of micronutrient intakes from complementary foods alone could not be determined. Vitamin A was the only micronutrient investigated in this analysis with an NRV available for complementary foods only, allowing conclusions about adequacy to be made. The Asian NZ group (n = 99) had lower mean group intakes than the non-Asian group (n = 526) for vitamin A (274 µg vs. 329 µg) and vitamin B12 (0.49 µg vs. 0.65 µg), and similar intakes for vitamin C (27.8 mg vs. 28.5 mg) and zinc (1.7 mg vs. 1.9 mg). Mean group iron intakes were the same for both groups (3.0 mg). The adequate intake (AI) for vitamin A from complementary foods (244 µg) was exceeded by the mean intakes of both groups, suggesting that vitamin A intakes were adequate. The complementary feeding period is a critical time for obtaining nutrients essential for development and growth. The results from this study indicate that Asian NZ infants have lower intakes of two of the micronutrients of interest than non-Asian infants in NZ. However, future research should include infant milk intake in these groups to understand total micronutrient intakes. Vitamin A intakes do appear to be adequate in NZ infants.
The prevalence of food allergies in New Zealand infants is unknown; however, it is thought to be similar to Australia, where the prevalence is over 10% of 1-year-olds(1). Current New Zealand recommendations for reducing the risk of food allergies are to: offer all infants major food allergens (age appropriate texture) at the start of complementary feeding (around 6 months); ensure major allergens are given to all infants before 1 year; once a major allergen is tolerated, maintain tolerance by regularly (approximately twice a week) offering the allergen food; and continue breastfeeding while introducing complementary foods(2). To our knowledge, there is no research investigating whether parents follow these recommendations. Therefore, this study aimed to explore parental offering of major food allergens to infants during complementary feeding and parental-reported food allergies. The cross-sectional study included 625 parent-infant dyads from the multi-centred (Auckland and Dunedin) First Foods New Zealand study. Infants were 7-10 months of age and participants were recruited in 2020-2022. This secondary analysis used a study questionnaire and 24-hour diet recall data. The questionnaire determined whether the infant was currently breastfed, whether major food allergens had been offered to the infant, whether parents intended to avoid any foods during the first year of life, and whether the infant had any known food allergies and, if so, how they were diagnosed. To assess consumption of major food allergens, 24-hour diet recall data were used (2 days per infant). The questionnaire showed that all major food allergens had been offered to only 17% of infants aged 9-10 months. On the diet recall days, dairy (94.4%) and wheat (91.2%) were the most common major food allergens consumed. Breastfed infants (n = 414) were more likely to consume sesame than non-breastfed infants (n = 211) (48.8% vs 33.7%, p≤0.001). Overall, 12.6% of infants had a parental-reported food allergy, with egg allergy being the most common (45.6% of the parents who reported a food allergy). A symptomatic response after exposure was the most common diagnostic tool. In conclusion, only 17% of infants were offered all major food allergens by 9-10 months of age. More guidance may be required to ensure current recommendations are followed and that all major food allergens are introduced by 1 year of age. These results provide critical insight into parents' current practices, which is essential in determining whether more targeted advice regarding allergy prevention and diagnosis is required.
Although food insecurity affects a significant proportion of young children in New Zealand (NZ)(1), evidence of its association with dietary intake and sociodemographic characteristics in this population is lacking. This study aims to assess the household food security status of young NZ children and its association with energy and nutrient intake and sociodemographic factors. This study included 289 caregiver and child (1-3 years old) dyads from the same household in either Auckland, Wellington, or Dunedin, NZ. Household food security status was determined using a validated and NZ-specific eight-item questionnaire(2). Usual dietary intake was determined from two 24-hour food recalls, using the multiple source method(3). The prevalence of inadequate nutrient intake was assessed using the Estimated Average Requirement (EAR) cut-point method and full probability approach. Sociodemographic factors (i.e., socioeconomic status, ethnicity, caregiver education, employment status, household size and structure) were collected from questionnaires. Linear regression models were used to estimate associations with statistical significance set at p <0.05. Over 30% of participants had experienced food insecurity in the past 12 months. Of all eight indicator statements, “the variety of foods we are able to eat is limited by a lack of money,” had the highest proportion of participants responding “often” or “sometimes” (35.8%). Moderately food insecure children exhibited higher fat and saturated fat intakes, consuming 3.0 (0.2, 5.8) g/day more fat, and 2.0 (0.6, 3.5) g/day more saturated fat compared to food secure children (p<0.05). Severely food insecure children had lower g/kg/day protein intake compared to food secure children (p<0.05). In comparison to food secure children, moderately and severely food insecure children had lower fibre intake, consuming 1.6 (2.8, 0.3) g/day and 2.6 (4.0, 1.2) g/day less fibre, respectively. Severely food insecure children had the highest prevalence of inadequate calcium (7.0%) and vitamin C (9.3%) intakes, compared with food secure children [prevalence of inadequate intakes: calcium (2.3%) and vitamin C (2.8%)]. Household food insecurity was more common in those of Māori or Pacific ethnicity; living in areas of high deprivation; having a caregiver who was younger, not in paid employment, or had low educational attainment; living with ≥2 other children in the household; and living in a sole-parent household. Food insecure young NZ children consume a diet that exhibits lower nutritional quality in certain measures compared to their food-secure counterparts. Food insecurity was associated with various sociodemographic factors that are closely linked with poverty or low income. As such, there is an urgent need for poverty mitigation initiatives to safeguard vulnerable young children from the adverse consequences of food insecurity.
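For readers unfamiliar with the EAR cut-point method used above: the prevalence of inadequate intake is estimated as the proportion of children whose usual intake falls below the Estimated Average Requirement. A minimal sketch with illustrative values (the EAR shown is a placeholder; consult current NRV tables):

```python
import numpy as np

# Usual calcium intakes (mg/day) estimated from repeated 24-hour recalls;
# illustrative values only.
usual_intake = np.array([480, 520, 610, 350, 700, 455, 390, 640])

EAR_CALCIUM = 360  # mg/day, illustrative placeholder for the 1-3 y EAR

# EAR cut-point method: prevalence of inadequacy = share below the EAR.
prevalence = np.mean(usual_intake < EAR_CALCIUM) * 100
print(f"prevalence of inadequate calcium intake: {prevalence:.1f}%")
```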
OBJECTIVES/GOALS: Donor hearts are transported in cold storage (CS) and undergo ischemia-reperfusion injury (IRI) when transplanted. IRI injures microvascular endothelial cells (ECs), heightens the immune response, and has been associated with increased autophagy. We aim to understand the changes in autophagy during CS and IRI and their impact on EC immunogenicity. METHODS/STUDY POPULATION: To study autophagy changes during IRI, immunoblotting for autophagy markers was performed on mouse cardiac EC (MCEC) lysates. MCECs were held in cold preservation solution in a hypoxic chamber for 6 hours (h), then under warm conditions with culture medium for 24 h. MCECs under standard conditions served as controls. Secreted interferon-gamma (IFN-γ) levels were quantified via ELISA to study autophagy and EC immunogenicity. MCEC-sensitized CD8+ T-cells were isolated from C57BL/6 spleens and co-cultured with MCECs pre-treated for 16 h with rapamycin or starvation (autophagy inducers) or chloroquine (an autophagy inhibitor) under normal or IRI conditions. MCECs without any treatment served as controls. RESULTS/ANTICIPATED RESULTS: Immunoblotting of MCEC lysates revealed a significant increase (P<0.01) in the established autophagy marker LC3 at early time points post-reperfusion compared to NT conditions, indicating greater autophagosome formation during CS and IRI. In the co-culture experiment, autophagy induction in MCECs with rapamycin under NT and HCS conditions produced 74.9-fold and 51.5-fold reductions in IFN-γ (pg/mL), respectively, compared to non-treated controls. In contrast, autophagy inhibition in MCECs with chloroquine resulted in a 1.82-fold increase in IFN-γ compared to untreated controls. This suggests a protective role of autophagy in ECs during IRI. DISCUSSION/SIGNIFICANCE: We observed that autophagy may be protective during IRI by mitigating EC immunogenicity. Thus, pharmacologically modulating microvascular EC autophagy in donor hearts prior to transplantation may mitigate insults incurred during CS and IRI.
The National Environmental Isotope Facility (NEIF) Radiocarbon Laboratory at the Scottish Universities Environmental Research Centre (SUERC) performs radiocarbon measurement of a wide range of sample matrices for applications in environmental research. Radiocarbon is applied to palaeoenvironmental, palaeoceanographic, and palaeoclimatic investigations, as well as work to understand the source, fate, turnover, and age of carbon in the modern carbon cycle. The NEIF Radiocarbon Laboratory supports users in the development and deployment of novel sampling techniques and laboratory approaches. Here, we give an overview of methods and procedures used by the laboratory to support the field collection, laboratory processing, and measurement of samples. This includes in-house development of novel and/or specialized methods and approaches, such as field collection of CO2 and CH4, hydropyrolysis, and ramped oxidation. The sample types covered include organic remains (e.g., plant material, peat, wood, charcoal, proteins), carbonates (e.g., speleothems, foraminifera, mollusc shell, travertine), waters (dissolved organic and inorganic carbon), gases (CO2 and CH4), soils and sediments (including sub-fractions).
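As background to these measurements, a conventional radiocarbon age is computed from the fraction modern (F14C) using the Libby mean-life of 8033 years (Stuiver and Polach, 1977):

```python
from math import log

def radiocarbon_age(f14c: float) -> float:
    """Conventional radiocarbon age (years BP) from fraction modern,
    using the Libby mean-life of 8033 years (Stuiver & Polach, 1977)."""
    return -8033 * log(f14c)

print(round(radiocarbon_age(0.5)))   # ~5568 BP (one Libby half-life)
print(round(radiocarbon_age(0.25)))  # ~11137 BP
```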
To (1) understand the role of antibiotic-associated adverse events (ABX-AEs) in antibiotic decision-making, (2) understand clinician preferences for ABX-AE feedback, and (3) identify the ABX-AEs of greatest clinical concern.
Design:
Focus groups.
Setting:
Academic medical center.
Participants:
Medical and surgical house staff, attending physicians, and advanced practice practitioners.
Methods:
Focus groups were conducted from May 2022 to December 2022. Participants discussed the role of ABX-AEs in antibiotic decision-making and feedback preferences and evaluated the prespecified categorization of ABX-AEs based on degree of clinical concern. Thematic analysis was conducted using inductive coding.
Results:
Four focus groups were conducted (n = 15). Six themes were identified. (1) ABX-AE risks during initial prescribing influence the antibiotic prescribed rather than the decision of whether to prescribe. (2) The occurrence of an ABX-AE leads to reassessment of the clinical indication for antibiotic therapy. (3) The impact of an ABX-AE on other management decisions is as important as the direct harm of the ABX-AE. (4) ABX-AEs may be overlooked because of limited feedback regarding the occurrence of ABX-AEs. (5) Clinicians are receptive to feedback regarding ABX-AEs but are concerned about it being punitive. (6) Feedback must be curated to prevent clinicians from being overwhelmed with data. Clinicians generally agreed with the prespecified categorizations of ABX-AEs by degree of clinical concern.
Conclusions:
The themes identified and assessment of ABX-AEs of greatest clinical concern may inform antibiotic stewardship initiatives that incorporate reporting of ABX-AEs as a strategy to reduce unnecessary antibiotic use.
This essay recovers the newspaper writings of the Omaha journalist Susette Bright Eyes La Flesche as the first Indigenous woman to publish about the 1890 Wounded Knee Massacre. Her eyewitness accounts challenge mainstream histories of the massacre that focus largely on frontier violence and Indigenous death by rewriting Wounded Knee as a place of Indigenous resilience and of an Indigenous community bound together by the rights and responsibilities of kinship. By prioritizing the stories of surviving Indigenous women and girls, Bright Eyes's reporting speaks to and becomes a precedent for ongoing acts and discourses of Indigenous activism, feminism, resurgence, and self-determination.
We studied the extent of carbapenemase-producing Enterobacteriaceae (CPE) sink contamination and transmission to patients in a nonoutbreak setting.
Methods:
During 2017–2019, 592 patient-room sinks were sampled in 34 departments. Weekly rectal-swab CPE surveillance was performed universally on patients. Repeated sink sampling was conducted in 9 departments. Isolates from patients and sinks were characterized using pulsed-field gel electrophoresis (PFGE), and pairs with high resemblance were sequenced on Oxford Nanopore and Illumina platforms. Hybrid assembly was used to fully assemble plasmids shared between paired isolates.
Results:
In total, 144 (24%) of 592 sinks were CPE contaminated, across 25 of 34 departments. Repeated sampling (n = 7,123) revealed that 52%–100% of sinks were contaminated at least once during the sampling period. Persistent contamination for >1 year by a dominant strain was common. During the study period, 318 patients acquired CPE. The most common species were Klebsiella pneumoniae, Escherichia coli, and Enterobacter spp. In 127 (40%) patients, a contaminated sink was the suspected source of CPE acquisition. For 20 cases with an identical sink-patient strain, the temporal relation suggested sink-to-patient transmission. Hybrid assembly of specific sink-patient isolates revealed that shared plasmids were structurally identical, and SNP differences between shared pairs, along with signatures of potential recombination events, suggest recent sharing of the plasmids.
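As a toy illustration of the sink-patient pair comparison, counting SNP differences between two aligned sequences reduces to comparing positions; real analyses would call variants against the hybrid assemblies, so the sketch below (pre-aligned, equal-length toy fragments) is illustrative only:

```python
def count_snps(seq_a: str, seq_b: str) -> int:
    """Count single-nucleotide differences between two aligned,
    equal-length sequences, ignoring alignment gaps."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    return sum(
        1 for a, b in zip(seq_a.upper(), seq_b.upper())
        if a != b and a != "-" and b != "-"
    )

# Toy sink/patient plasmid fragments (illustrative only).
sink    = "ATGCGTACGTTAGC"
patient = "ATGCGTACGTCAGC"
print(count_snps(sink, patient))  # 1
```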
Conclusions:
CPE-contaminated sinks are an important source of transmission to patients. Although person-to-person transmission has traditionally been considered the main route of CPE transmission, these data suggest a paradigm shift that may influence strategies for preventing CPE dissemination.
Inductive reasoning training has been found to be particularly effective at improving inductive reasoning, with some evidence of improved everyday functioning and driving. Telehealth may be useful for increasing access to, reducing time and travel burdens of, and reducing the need for physical spaces for cognitive training. On the other hand, telehealth increases technology burden. The present study investigated the feasibility and effectiveness of implementing an inductive reasoning training program, designed to mimic the inductive reasoning arm used in a large multi-site clinical trial (Advanced Cognitive Training for Independent and Vital Elderly (ACTIVE)), via telehealth (using Zoom and Canvas as delivery platforms).
Participants and Methods:
31 older adult participants (mean age = 71.2, range = 65-85; mean education = 15.5, range = 13-18; 64.5% female; 87.1% white) received 10-sessions of telehealth-delivered inductive reasoning training over 5 weeks. Comparison groups (inductive reasoning trained and no-contact controls) were culled from the in-person ACTIVE trial via propensity matching. All participants completed three pretest and posttest inductive reasoning measures (Word Series, Letter Series, Letter Sets), as well as a posttest measure assessing participant perceptions of the telehealth intervention. In addition, at the end of each of the ten training sessions, participants received a final inductive reasoning assessment.
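The comparison groups were formed by propensity matching against the in-person ACTIVE sample. A minimal sketch of one common approach, 1:1 nearest-neighbor matching (with replacement, for simplicity) on a logistic-regression propensity score, using illustrative covariates and simulated data rather than the authors' exact procedure:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)

# Illustrative covariates (age, years of education) for the telehealth
# (treated) group and the in-person comparison pool (control).
X_treated = rng.normal([71, 15.5], [5, 1.5], size=(31, 2))
X_pool    = rng.normal([73, 14.5], [6, 2.0], size=(300, 2))

# Fit a propensity model: P(telehealth | covariates).
X = np.vstack([X_treated, X_pool])
y = np.array([1] * len(X_treated) + [0] * len(X_pool))
ps = LogisticRegression().fit(X, y).predict_proba(X)[:, 1]
ps_treated, ps_pool = ps[: len(X_treated)], ps[len(X_treated):]

# 1:1 nearest-neighbor matching on the propensity score.
nn = NearestNeighbors(n_neighbors=1).fit(ps_pool.reshape(-1, 1))
_, idx = nn.kneighbors(ps_treated.reshape(-1, 1))
matched_controls = X_pool[idx.ravel()]
print(matched_controls.shape)  # (31, 2)
```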
Results:
Telehealth participants provided high levels of endorsement suggesting that the telehealth training program was useful, reliable, easy to use and interact on, and employed a usable interface. Participants were generally satisfied with the training program. With regard to performance, telehealth participants demonstrated greater gains than untrained controls on Letter Series [F(1, 116) = 9.81, p = 0.002, partial eta-squared = 0.084] and Letter Sets [F(1, 116) = 8.69, p = 0.004, partial eta-squared = 0.074], but did not differ in improvement on Word Series [F(1, 116) = 1.145, p = 0.287, partial eta-squared = 0.010]. Furthermore, telehealth participants evinced similar inductive reasoning gains as matched in-person inductive reasoning trained participants on Letter Series [F(1, 116) = 1.24, p = 0.226, partial eta-squared = 0.01] and Letter Sets [F(1, 116) = 1.29, p = 0.259, partial eta-squared = 0.01], but demonstrated fewer gains in Word Series performance [F(1, 116) = 25.681, p < 0.001, partial eta-squared = 0.181]. On the end-of-session reasoning tests, telehealth-trained participants showed a similar general pattern of improvement across the ten training sessions and did not differ significantly from in-person trained comparison participants.
Conclusions:
Cognitive training via telehealth evinced gains similar to its in-person counterpart across nearly all measures. However, telehealth delivery also posed substantial challenges related to the training platform. Despite these challenges, participants reported perceiving increased competence with computer use, peripherals (mice, trackpads), and videoconferencing. These may be ancillary benefits of such training and may be maximized if more age-friendly learning management systems are investigated. Overall, this study suggests that telehealth delivery may be a viable form of inductive reasoning training, and future studies could increase performance gains by optimizing the online training platform for older adults.
An ideal vision model accounts for behavior and neurophysiology in both naturalistic conditions and designed lab experiments. Unlike psychological theories, artificial neural networks (ANNs) actually perform visual tasks and generate testable predictions for arbitrary inputs. These advantages enable ANNs to engage the entire spectrum of the evidence. Failures of particular models drive progress in a vibrant ANN research program of human vision.
This paper describes the development process of a mobile app-based version of the World Health Organization mental health Gap Action Programme Intervention Guide, testing of the app prototypes, and its functionality in the assessment and management of people with mental health conditions in Nepal. Health workers' perceptions of the feasibility and acceptability of using mobile technology in mental health care were assessed during the inspiration phase (N = 43); the ideation phase involved the creation of prototypes; and prototype testing was conducted over multiple rounds with 15 healthcare providers. The app provides provisional diagnoses and treatment options based on reported symptoms. Participants found the app prototype useful in reminding them of the process of assessment and management of mental disorders. Some challenges were noted: the app prototype was slow and had multiple technical problems, including difficulty navigating 'yes'/'no' options and difficulty reviewing detailed symptoms of a particular disorder using the "more information" icon. The initial feasibility work suggests that, if the technical issues are addressed, the e-mhGAP warrants further research to determine whether it improves the detection of people with mental health conditions and the initiation of evidence-based treatment in primary healthcare facilities.