The objective of this study was to explore barriers and facilitators to utilising a range of food assistance resources as reported by parents living with or at risk for food insecurity (FI), as well as parents’ recommendations for improving utilisation of these resources. Qualitative data from semi-structured interviews about parents’ perspectives on interventions to address FI were analysed using a hybrid deductive/inductive thematic approach. Parents were drawn from the larger Family Matters longitudinal cohort study (N = 1,307), which was recruited from primary care clinics in Minnesota. Forty racially and ethnically diverse parents (mean age = 38.5 years; 97.5% mothers; 85% parents of colour) were recruited by food security level, with ten parents representing each level (i.e. high, marginal, low, very low). Six overarching qualitative themes were identified, which indicated the importance of (1) comfort level seeking assistance; (2) routine screening to assess need; (3) advertising, referrals, and outreach; (4) adequacy of policies and programmes to address need; (5) resource proximity and delivery; and (6) acceptability of foods/benefits provided. With some exceptions, these themes were generally represented from more than one angle (i.e. as barriers, facilitators, recommendations) and raised as relevant across different types of assistance (e.g. federal food assistance programmes, food pantries) and different settings (e.g. schools, healthcare). This study identified key factors influencing food assistance utilisation across multiple dimensions of access. These factors, which range from psychosocial to logistical in nature, should be considered in efforts to expand the reach of food assistance programmes and, in turn, improve food security among families.
The 2021 State of the World’s Children Report (UNICEF 2021) makes it clear that mental health is a human right and a global good. Research in a variety of fields, including DOHaD, suggests that infancy is a critical period in both brain formation and the formation of positive relational networks that are the grounds for development and adult well-being. Strong evidence that mental health is adversely affected by poor socio-economic conditions suggests the need for carefully directing resources towards structural conditions. At the same time, positive attachment relations within caregiver–child dyads can offset some environmental insults and futures of ill health. The field of infant mental health (IMH) pays attention to the formation of these relationships in the earliest periods of life. This chapter describes efforts to localise universalist models of infant well-being in South Africa, a low-resource setting. These include a new master’s-level training programme and diagnostic tools that can help to sensitise health practitioners to infant well-being. The discussion offers one route to reframing Euro-American models for local contexts while retaining the insight that strong relational capacities can generate resilience in difficult contexts. Its emphasis on historical context, local meaning, and social environment is instructive for DOHaD scholarship.
Deinstitutionalization of nursing care in European countries relies profoundly on the mobilization of caregivers and municipal homecare services. Yet, caring for home-dwelling people with dementia (PwD) can be stressful and resource demanding. The LIVE@Home.Path trial tailored, implemented, and evaluated the effect of the multicomponent LIVE intervention on informal caregiver burden in dyads of home-dwelling PwD and their families.
Method:
From 2019 to 2021, we conducted a 24-month multicenter, multicomponent, stepped-wedge randomized controlled trial including dyads of people ≥65 years with mild to moderate dementia who had a minimum of 1 h/week contact with their informal caregiver. The user-developed Learning, Innovation, Volunteer support, and Empowerment (LIVE) intervention was implemented by municipal coordinators over 6-month periods. In an intention-to-treat analysis, we applied mixed-effect regression models accounting for time and confounding factors to evaluate the effect of the intervention on the Relative Stress Scale (RSS), Resource Utilization in Dementia (RUD), and Clinical Global Impression of Change (CGIC).
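To make the stated analysis concrete, below is a minimal sketch of a stepped-wedge mixed-effect model in Python; the column names (rss, intervention, time, sequence, dyad_id) and the input file are illustrative assumptions, not the trial’s actual code, and the real analysis adjusted for additional confounders.

```python
# Minimal sketch of a stepped-wedge mixed-effect analysis (assumed column names).
import pandas as pd
import statsmodels.formula.api as smf

# long_df: one row per dyad per assessment, with hypothetical columns
#   rss          - Relative Stress Scale total score (0-60)
#   intervention - 1 if the dyad is in its active intervention period, else 0
#   time         - months since baseline
#   sequence     - randomisation sequence (1, 2, or 3)
#   dyad_id      - identifier for the PwD-caregiver dyad
long_df = pd.read_csv("live_home_path_long.csv")  # hypothetical file

model = smf.mixedlm(
    "rss ~ intervention + time + C(sequence)",  # fixed effects: intervention, time, sequence
    data=long_df,
    groups=long_df["dyad_id"],                  # random intercept per dyad
).fit()
print(model.summary())
```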
Results:
A total of 280 dyads were included at baseline; the mean age of the PwD was 82.2 years, 63% were female, 43% lived alone, 36% had Alzheimer’s dementia, median MMSE was 20 (range 0-30), and median FAST score was 4 (range 1-7). Caregivers were on average 66 years old, 64% were female, and 49% were the PwD’s child. At baseline, 80 dyads were randomized to intervention sequence 1, of which 67 received the intervention; the corresponding numbers for sequences 2 and 3 were 97/57 and 103/50. During the active intervention period, time spent on personal activities of daily living increased significantly, by 2.8 hours/month compared with a 1.2 hours/month increase in the control period; the total RSS score (range 0-60) remained stable in the intervention period (0.36 points) while it increased significantly in the control period (27.0 points); and CGIC (range: -5 worsening, 5 improvement) increased significantly only in the intervention period (0.5 points).
Conclusion:
Although caregivers reported more care time during the intervention periods, they did not experience more stress, which may be related to their increased understanding of dementia. The increase in reported care time might likewise reflect this improved understanding, leading to a more realistic evaluation of their own time contribution.
Dementia is not an unavoidable consequence of aging but, for most home-dwelling people with dementia (PwD), the result of complex chronic health conditions. About 95% of PwD have multimorbidity, which requires a multicomponent approach and interdisciplinary collaboration to support patients and relatives and to implement welfare technology and smart solutions.
Method:
The LIVE@Home.Path study is a 2-year, mixed-method, stepped-wedge, cluster randomized controlled trial including home-dwelling PwD and their informal caregivers (N=320 dyads) in Norway (May 2019 – December 2021), investigating the efficacy of the multicomponent LIVE intervention (LIVE is the acronym for Learning, Innovation, Volunteerism, and Empowerment) on resource utilization and use of welfare technology. The intervention was implemented by a skilled coordinator from the municipality, with a strong focus on the use, usefulness, and experience of welfare technology, both at baseline and during the implementation period.
Results:
At baseline, we found that most participants had traditional equipment such as stove guards (43.3%), social alarms (39.5%), or everyday technology (45.3%) (e.g., calendars, door locks). A social alarm was more often available for older women living alone, while tracking devices (14.9%) were associated with lower age. Everyday technology was more often available for women with higher age, greater comorbidity, and poorer instrumental activities of daily living (IADL). In people with severe dementia, welfare technology was associated with poor IADL function, having a child as the main caregiver (61.3%), and having caregivers who contributed 81–100% of their care (49.5%).
Discussion:
We describe unmet potential for communication, tracking, and sensing technology, especially for devices not offered by the municipalities. In our symposium, we will present early findings on the implementation effect of welfare technology and participants’ experiences related to usage and awareness.
Schizophrenia is a severe and common psychiatric disorder characterized by disturbed brain development. Brain-derived neurotrophic factor (BDNF) mediates differentiation and survival of neurons as well as synaptic plasticity during brain development. Several studies have shown decreased serum levels of BDNF in chronic, first-episode, and drug-naïve schizophrenia patients. Folate provides the substrate for intracellular methylation reactions that are essential to normal brain development and function. Abnormal folate metabolism has been implicated in schizophrenia. For example, reduced maternal folate intake has been associated with an increased risk for schizophrenia. Also, low blood levels of folate have been reported in patients with schizophrenia and are associated with clinical manifestations, especially in the negative symptom domain.
Objectives
This study aimed to determine how baseline BDNF levels in drug-naïve first-episode psychosis (FEP) patients are associated with folic acid levels.
Methods
Fifty drug-naïve FEP patients treated between April 2013 and July 2017 at the ETEP Program at Hospital del Mar were included. Inclusion criteria were: 1) age 18-35 years; 2) DSM-IV-TR criteria for brief psychotic disorder, schizophreniform disorder, schizophrenia, or unspecified psychosis; 3) no previous history of severe neurological medical conditions or severe traumatic brain injury; 4) presumed IQ > 80; and 5) no substance abuse or dependence disorders except for cannabis and/or nicotine use. All patients underwent an assessment at baseline including sociodemographic and clinical variables. Fasting blood samples were obtained at baseline, before administering any medication, and used to determine folic acid and BDNF levels.
Results
In our drug-naïve FEP sample, folic acid levels showed a significant positive correlation with BDNF levels at baseline (r = 0.584; p = 0.003). Moreover, a linear regression model showed that the baseline variables that best predicted BDNF levels were folic acid levels and cannabis use.
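For illustration, a minimal sketch of the two reported steps (Pearson correlation, then linear regression) follows; the DataFrame columns folic_acid, bdnf, and cannabis_use are assumed names, and the study’s exact variable coding is not stated in the abstract.

```python
# Sketch of the reported correlation and regression (assumed column names).
import pandas as pd
from scipy.stats import pearsonr
import statsmodels.formula.api as smf

df = pd.read_csv("fep_baseline.csv")  # hypothetical baseline dataset

# Step 1: Pearson correlation between folic acid and BDNF levels
r, p = pearsonr(df["folic_acid"], df["bdnf"])
print(f"r = {r:.3f}, p = {p:.3f}")  # abstract reports r = 0.584, p = 0.003

# Step 2: linear regression of BDNF on folic acid and cannabis use
ols = smf.ols("bdnf ~ folic_acid + cannabis_use", data=df).fit()
print(ols.summary())
```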
Conclusions
Our results are consistent with findings from previous studies showing that lower folic acid levels are associated with lower BDNF levels at baseline in drug-naïve FEP. Folate deficiency is associated with cerebrovascular and neurological diseases, and mood disorders. The importance of folate in the nervous system was initially demonstrated in studies that established a greatly increased risk of neurodevelopmental disorders in the offspring of folate-deficient pregnant women. In adults, epidemiological studies have linked lack of folate to neurodegenerative and neuropsychiatric diseases. However, the mechanisms by which chronic folate deficiency adversely affects CNS function are incompletely understood. Some studies in animal models have hypothesized that folate deficiency could be associated with pyramidal cell loss and reduced hippocampal BDNF.
There is a paucity of research examining the patterning of socioeconomic disadvantages and mental health problems across multiple generations. The significance of research on multigenerational processes lies in a concern with whether and how (dis)advantages are generated and sustained across generations, and how socioeconomic, mental health, and gender inequalities evolve over a longer period of time.
Objectives
The current study therefore aimed to investigate the interconnected transmissions of socioeconomic disadvantages and mental health problems from grandparents to grandchildren through the parents, as well as the extent to which these transmissions differ according to lineage (i.e., through matrilineal/patrilineal descent) and grandchild gender.
Methods
Drawing on the Stockholm Birth Cohort Multigenerational Study, the sample included 21,416 unique lineages by grandchild gender, centered around cohort members born in 1953 (parental generation) as well as their children (grandchild generation) and their parents (grandparental generation). Based on local and national register data, socioeconomic disadvantages were operationalized as low income, and mental health problems as psychiatric disorders. A series of path models based on structural equation modelling was applied to estimate the associations between low income and psychiatric disorders across generations, for each lineage by grandchild gender combination.
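To make the modelling approach concrete, the sketch below specifies a simplified three-generation path model in semopy, a Python SEM library; the variable names and the reduced set of paths are illustrative assumptions, whereas the study estimated its models separately for each lineage by grandchild gender combination.

```python
# Illustrative three-generation path model (simplified; assumed variable names).
import pandas as pd
import semopy

# G1 = grandparents, G2 = parents, G3 = grandchildren (assumed variable names)
desc = """
g2_low_income ~ g1_low_income + g1_psych_disorder
g2_psych_disorder ~ g1_psych_disorder + g1_low_income
g3_low_income ~ g2_low_income + g2_psych_disorder
g3_psych_disorder ~ g2_psych_disorder + g2_low_income
"""

df = pd.read_csv("three_generation_lineages.csv")  # hypothetical register extract
model = semopy.Model(desc)
model.fit(df)
print(model.inspect())  # path estimates, standard errors, p-values
```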
Results
We found a multigenerational transmission of low income through the patriline to grandchildren. Psychiatric disorders were transmitted through both the patriline and the matriline, but only to grandsons. The patriline-grandson transmission of psychiatric disorders partially operated via the fathers’ low income. Furthermore, grandparents’ psychiatric disorders influenced their children’s and grandchildren’s income.
Conclusions
We conclude that there is evidence of transmissions of socioeconomic disadvantages and mental health problems across three generations, although these transmissions differ by lineage and grandchild gender. Our findings further highlight that grandparents’ mental health problems could cast a long shadow on their children’s and grandchildren’s socioeconomic outcomes, and that socioeconomic disadvantages in the intermediate generation may play an important role in the multigenerational transmission of mental health problems.
In Denmark and Sweden, surveys were undertaken to estimate the prevalence of leg problems in conventional broiler production. The Danish survey included 28 Ross 208 flocks, and the Swedish survey included 15 Ross 208 and 16 Cobb flocks. Leg problems included reduced walking ability (gait), tibial dyschondroplasia (TD), varus/valgus deformations (VV) and foot-pad dermatitis (FPD). Danish Ross chicks showed a significantly higher prevalence of gait score > 0, gait score > 2 and TD, but a lower prevalence of VV, than Swedish Ross chicks. Cobb chicks showed a significantly higher prevalence of gait score > 0, gait score > 2 and TD than Swedish Ross chicks, a significantly higher prevalence of VV than Danish Ross chicks, and a significantly lower prevalence of FPD than both Danish and Swedish Ross chicks. The two genotypes of Swedish chicks showed similar relationships between body weight and probability of gait score > 0, TD and VV, indicating that the difference in prevalence of these leg problems may be due to the difference in mean body weight at slaughter age. At body weights below 2300 g, Danish chicks showed a higher probability of gait score > 2 than Swedish chicks. Furthermore, at body weights below 1900 g, Danish chicks had a higher probability of TD than Swedish chicks, whereas at body weights above 2200 g they had a lower probability of TD. This indicates that the difference in prevalence of TD between Danish and Swedish chicks was due to differences in mean body weight at slaughter age as well as housing conditions. Therefore, further studies on the risk factors in relation to management and housing conditions are required.
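The body-weight/probability relationships described above are the kind of pattern typically estimated with logistic regression. Below is a minimal sketch under assumed column names; the survey’s actual model terms are not given in the abstract.

```python
# Sketch: probability of gait score > 2 as a function of body weight and country.
import pandas as pd
import statsmodels.formula.api as smf

birds = pd.read_csv("broiler_survey.csv")  # hypothetical bird-level data
birds["gait_gt2"] = (birds["gait_score"] > 2).astype(int)

logit = smf.logit("gait_gt2 ~ body_weight * C(country)", data=birds).fit()
print(logit.summary())
# The body_weight:country interaction lets the weight-probability curve
# differ between Danish and Swedish flocks, as the abstract describes.
```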
Bodily contact with water is a novel and aversive experience for broiler chickens, and this has been used when designing the Latency to Lie (LTL) test. The original testing procedure, in which the birds are tested in groups, involves a certain settling period, which makes the test time-consuming to carry out on commercial broiler farms. Our modifications of the LTL test for on-farm use mean that a) the birds are tested individually without visual contact with other birds; and b) the water tub is already filled with water when the birds are placed in it. The results from the LTL tests can then be compared with the scores achieved for each individual bird on the commonly used ‘gait scoring’ procedure. At 14 farms participating in a larger survey, we used three birds of each gait score from 0 to 4 (when available) for LTL testing. The time spent standing before making the first attempt to lie down was recorded. The results show a clear negative correlation (r = -0.86, P < 0.001) between time spent standing and gait score. The mean LTL values for the different gait scores were all significantly (P < 0.01) different. There was no significant difference in LTL results between flocks. The method described appears to be well suited for on-farm use. If further developed, it could become a useful tool in monitoring programmes supporting ongoing efforts to decrease the levels of leg weakness in modern broiler production.
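A minimal sketch of the reported correlation check follows; a rank correlation is used here because gait score is an ordinal 0-4 scale, although the abstract does not state which coefficient was computed, and the column names are assumptions.

```python
# Sketch: correlation between latency-to-lie and gait score (assumed columns).
import pandas as pd
from scipy.stats import spearmanr

ltl = pd.read_csv("ltl_onfarm.csv")  # hypothetical per-bird records
rho, p = spearmanr(ltl["seconds_standing"], ltl["gait_score"])
print(f"rho = {rho:.2f}, p = {p:.4f}")  # abstract reports r = -0.86, P < 0.001
```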
The relationship between animal welfare at slaughter and slaughterhouse profitability is complex, with potential trade-offs between animal welfare costs and benefits. Slaughterhouses currently lack data support for decisions on investments that can improve both animal welfare and profitability. Therefore, this study mapped the economic impacts for slaughterhouse businesses of improved cattle and pig welfare at slaughter. Specific aims were to: (i) highlight the possible economic impact of animal welfare improvements, based on the scientific literature; (ii) develop an economic model demonstrating the theoretical contribution of animal welfare to slaughterhouse profitability; and (iii) validate the economic model through focus group interviews with slaughterhouse personnel in Sweden. The findings indicated that investing in animal welfare improvements could result in accumulation of an intangible asset that can be considered together with other production factors in the economic model. Model validation stressed the importance of selling by-products for the economic outcome and of smooth workflow for productivity. The study thus improves understanding of the economic impacts of animal welfare at slaughter and incentives for slaughterhouse businesses to improve animal welfare. The results are important for public and private policy-makers interested in enhancing animal welfare at slaughter.
The term ‘depopulation’ is used in this case to describe mass euthanasia or killing of groups of animals on a farm for emergency disease eradication purposes. There are a number of guidelines for monitoring animal welfare during such operations, eg the OIE Terrestrial Health Code and the EU regulation on protection of animals at the time of killing, which can be useful when designing a specific monitoring system for depopulation. In this paper, the responsibilities of the competent authorities are identified, and a systematic approach to monitoring on-farm killing is proposed, including three major critical points: i) animal handling prior to killing; ii) the stun/kill quality, ie the effectiveness of the method used to render the animals unconscious and dead; and iii) confirmation of death prior to carcase disposal. The importance of good biosecurity routines, efficient disease detection systems, relevant training of staff and thorough contingency planning to prevent animal welfare problems from arising is strongly emphasised. It is the responsibility of national competent authorities to provide the appointed official veterinarians in charge of monitoring animal welfare during depopulation with proper tools, including anything from appropriate knowledge and practical checklists to the authority to demand immediate corrective action when necessary, and to develop systems for feedback and incorporation of experiences from previous outbreaks into the national contingency plans.
In Sweden, laying hens are killed using the following methods: i) traditional slaughter; ii) on-farm with CO2 in a mobile container combined with a grinder; or iii) with CO2 inside the barn. The number of hens killed using the latter method has increased. During these killings a veterinarian is required to be present and report to the Swedish Board of Agriculture. Data were registered during four commercial killings and extracted from all official veterinary reports at CO2 whole-house killings in 2008-2010. On-farm monitoring showed that temperature decreased greatly and with high variability. The time until birds became unconscious after coming into contact with the gas, based on time until loss of balance, was 3-5 min. Veterinary reports show that 1.5 million laying hens were killed, in 150 separate instances. The most common non-compliance with legislation was failure to notify the regional animal welfare authorities prior to the killings. Six out of 150 killings were defined as animal welfare failures, eg delivery of insufficient CO2 or failure to seal buildings to achieve adequate gas concentration. Eleven were either potentially or completely unacceptable from the perspective of animal welfare. We conclude that, on the whole, the CO2 whole-house gas killing of spent hens was carried out in accordance with the appropriate legislation. Death was achieved reliably. However, there remain several risks to animal welfare and increased knowledge would appear vital in order to limit mistakes related to miscalculations of house volume, improper sealing or premature ventilation turn-off.
Electrical head-only stunning is a widely used method in sheep (Ovis aries) slaughter. To investigate the influence of current level on stun and meat quality in practice, two studies were carried out at a commercial slaughterhouse. In trial one, 200 lambs were randomly assigned to four groups with a current level of 0.6, 0.8, 1.0 and 1.25 A, respectively, using 50-Hz sine wave supply voltage and a stun duration of 10.5 s. In trial two, 135 lambs were randomly assigned to two groups, with electrical current of 1.25 A applied for 14 and 3 s. For each lamb, the position of the tongs was observed and classified as correct or incorrect. The stun quality was evaluated based on observations of the corneal reflex, eye movements, rhythmic breathing, head-righting reflex and kicking during the tonic phase. Blood splash (haemorrhages in Longissimus dorsi muscle) was evaluated four days after slaughter. Incorrect tongs’ positioning was seen commonly, and positively correlated with poor stun quality. The lowest current level tested produced an unsatisfactory stun in the majority of animals observed. Short stun duration increased the risk of a poor stun quality. There was no significant effect of current level, stun duration or tongs’ position on the risk of blood splash. These data underline the importance of a correct technique, including choice of tongs’ positioning, sufficient current levels and sufficient stun duration, for electrical stunning of lambs to achieve unconsciousness before sticking and thereby avoiding unnecessary suffering at commercial slaughter.
Family farming is still the main source of income for many people in the tropical regions of the world. At the same time, modern society is quickly becoming more aware of the welfare of animals for human consumption. The main objective of this study was to illustrate the need to modify certain aspects of the original Welfare Quality® (WQ) protocols, developed by the EU-funded WQ project, under the conditions of small community farmers in the tropics. Thirty-four dual-purpose farms in the State of Chiapas, Mexico, whose main production focus was milk but for whom beef production was also of significant value, were evaluated utilising a merged version of the WQ protocols for dairy and beef cattle. Based on their average score, the farms obtained at least an acceptable level in each indicator of welfare. However, after merging indicators from the dairy and beef cattle protocols of WQ to adjust it to the prevailing conditions in the tropics, a number of sections were not applicable. This is particularly true of the section related to good housing, where most of the items do not apply due to the absence of infrastructure; the farms obtained a very high score in this section, but further studies should be carried out to verify whether this reflects an accurate assessment of the welfare status. In general, the approach of the WQ protocol was useful; however, certain aspects of these systems are quite different from the conventional intensive farming systems predominantly used in Europe, and a number of modifications need to be implemented.
Studies suggest the inter-rater reliability of judges at wine competitions is higher than what would be expected by random chance, but lower than what is observed when experts in other fields make judgments specific to their expertise. To further contextualize the (un)reliability of wine judging while also extending the study of fine water, we examine the inter-rater reliability of judges at an annual international competition for bottled waters. We find that the inter-rater reliability of water judging is generally better than chance and, at best, about the same as the inter-rater reliability of wine judging at some wine competitions. These results suggest that perceptible differences between fine waters exist but are less pronounced than those between fine wines and, also, that aesthetic standards with respect to fine waters exist but are currently less established than those for fine wines.
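One common way to quantify agreement among a panel of judges is Kendall’s coefficient of concordance (W); the generic sketch below (without a tie correction) illustrates the idea and is not necessarily the measure used in the paper.

```python
# Kendall's W for m judges rating n waters (generic sketch, no tie correction).
import numpy as np
from scipy.stats import rankdata

def kendalls_w(ratings: np.ndarray) -> float:
    """ratings: (m judges, n items) array of scores; returns W in [0, 1]."""
    m, n = ratings.shape
    ranks = np.apply_along_axis(rankdata, 1, ratings)  # rank items within each judge
    rank_sums = ranks.sum(axis=0)
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()    # spread of the rank sums
    return 12.0 * s / (m**2 * (n**3 - n))

# Toy example: 4 judges, 5 waters; W = 1 means perfect agreement, 0 means none.
scores = np.array([[90, 85, 70, 60, 95],
                   [88, 80, 72, 65, 92],
                   [85, 82, 75, 61, 90],
                   [91, 79, 68, 66, 94]])
print(f"W = {kendalls_w(scores):.2f}")
```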
Alternate day fasting (ADF), with consumption of up to 25 % of daily energy intake on fast days, is one of the most widely used intermittent fasting regimens and is promoted as a promising alternative approach for treating obesity. Feelings of appetite are critical for adherence to dietary approaches, and therefore for the success of dietary interventions. This systematic review aimed to assess the effects of a minimum of 8 weeks of ADF on subjective feelings of appetite and body weight in adults with overweight and obesity. We conducted the review in accordance with the Cochrane guidelines, including systematic searches in four databases. Because of the high level of clinical and methodological heterogeneity, a narrative approach was used to synthesise the results. Eight studies with a total of 456 participants met the eligibility criteria: three randomised controlled trials and five uncontrolled before-after studies. Seven of the studies had a high risk of bias. Feelings of appetite were assessed via hunger in eight studies, fullness in seven studies, satisfaction in four studies, and desire to eat in one study. All the studies assessed weight loss. The certainty of the evidence was rated low or very low for all outcomes, so no firm conclusions can be drawn about the potential benefits of ADF for subjective feelings of appetite and body weight. Despite the high interest in ADF, good-quality evidence is still needed to determine its effectiveness; if offered in clinical practice, ADF should be offered cautiously and concomitantly evaluated.
To examine associations among neighbourhood food environments (NFE), household food insecurity (HFI) and child weight-related outcomes in a racially/ethnically diverse sample of US-born and immigrant/refugee families.
Design:
This cross-sectional, observational study involving individual and geographic-level data used multilevel models to estimate associations between neighbourhood food environment and child outcomes. Interactions between HFI and NFE were employed to determine whether HFI moderated the association between NFE and child outcomes and whether the associations differed for US-born v. immigrant/refugee groups.
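As an illustration of this design, a minimal sketch of one such multilevel model follows, with a random intercept for census tract and an HFI-by-food-environment interaction; the column names are assumptions and the study’s full covariate set is not reproduced.

```python
# Sketch: multilevel model with an HFI x food-environment interaction.
import pandas as pd
import statsmodels.formula.api as smf

fm = pd.read_csv("family_matters_nfe.csv")  # hypothetical analysis file

model = smf.mixedlm(
    "child_bmi_pct ~ food_desert * food_insecure + C(immigrant_group)",
    data=fm,
    groups=fm["census_tract"],   # random intercept per census tract
).fit()
print(model.summary())
# A positive food_desert:food_insecure coefficient would match the reported
# higher child BMI percentile in food deserts when the household is food insecure.
```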
Setting:
The sample resided in 367 census tracts in the Minneapolis/St. Paul, MN metropolitan area, and the data were collected in 2016–2019.
Participants:
The sample was from the Family Matters study of families (n 1296) with children from six racial/ethnic and immigrant/refugee groups (African American, Latino, Hmong, Native American, Somali/Ethiopian and White).
Results:
Living in a neighbourhood with low perceived access to affordable fresh fruits and vegetables was found to be associated with lower food security (P < 0·01), poorer child diet quality (P < 0·01) and reduced availability of a variety of fruits (P < 0·01), vegetables (P < 0·05) and whole grains in the home (P < 0·01). Moreover, residing in a food desert was found to be associated with a higher child BMI percentile if the child’s household was food insecure (P < 0·05). No differences in associations were found for immigrant/refugee groups.
Conclusions:
Poor NFE were associated with worse weight-related outcomes for children; the association with weight was more pronounced among children with HFI. Interventions aiming to improve child weight-related outcomes should consider both NFE and HFI.
Conspiracy theories are popular during the COVID-19 pandemic. Conspiratorial thinking is characterised by the strong conviction that a certain situation that one sees as unjust is the result of a deliberate conspiracy of a group of people with bad intentions. Conspiratorial thinking appears to have many similarities with paranoid delusions.
Objectives
To explore the nature, consequences, and social-psychological dimensions of conspiratorial thinking, and describe similarities and differences with paranoid delusions.
Methods
Critically assessing relevant literature about conspiratorial thinking and paranoid delusions.
Results
Conspiratorial thinking meets epistemic, existential, and social needs. It provides clarity in uncertain times and connection with an in-group of like-minded people. Both conspiratorial thinking and paranoid delusions involve an unjust, persistent, and sometimes bizarre conviction. Unlike conspiracy theorists, people with a paranoid delusion are almost always the only target of the presumed conspiracy, and they usually stand alone in their conviction. Furthermore, conspiracy theories are not based as much on unusual experiences of one’s inner self, reality, or interpersonal contacts.
Conclusions
Conspiratorial thinking is common in uncertain circumstances. It provides a sense of grip, certainty, moral superiority, and social support. Extreme conspiratorial thinking seems to fit current psychiatric definitions of paranoid delusions, but there are also important differences. To distinguish delusions from conspiratorial thinking, a deepening of conventional definitions of delusions is required. Instead of the strong focus on the erroneous content of delusions, more attention should be given to the underlying idiosyncratic, changed way of experiencing reality.
COVID-19 has brought several psychosocial stressors that are having an impact on global mental health. The impact of the pandemic on the incidence of first episodes of psychosis (FEP) is not clear.
Objectives
To describe the clinical and sociodemographic characteristics of FEP patients diagnosed since the onset of the COVID-19 pandemic and compare them with those from the equivalent period of the previous year.
Methods
We included all FEP patients attended at Parc de Salut Mar (Barcelona, Spain) from March 14, 2020 (when the state of emergency in Spain began) to December 31, 2020, and compared them with those from the same period of 2019. We assessed sociodemographic variables, duration of untreated psychosis (DUP), cannabis and alcohol use, psychiatric diagnosis, and psychiatric symptom scales. We performed univariate analyses between the groups using the Mann-Whitney U test for continuous variables and the chi-square test for qualitative variables.
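For illustration, the sketch below runs the two named tests; the DUP values are placeholders, while the cannabis counts are back-calculated from the percentages reported in the Results (60% vs 90% of n = 20 per group).

```python
# Sketch of the group comparisons (placeholder DUP values; cannabis counts
# back-calculated from the reported 60% vs 90% of n = 20 per group).
from scipy.stats import mannwhitneyu, chi2_contingency

dup_2020 = [30, 45, 12, 60, 25]   # hypothetical DUP values (days), pandemic cohort
dup_2019 = [28, 50, 15, 55, 33]   # hypothetical DUP values (days), 2019 cohort
u, p = mannwhitneyu(dup_2020, dup_2019)
print(f"Mann-Whitney U = {u}, p = {p:.3f}")

# Cannabis users vs non-users in each period: 12/20 (2020) vs 18/20 (2019)
table = [[12, 8],
         [18, 2]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```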
Results
A total of 20 FEP patients were diagnosed in each period. No differences were found in sociodemographic variables, scale scores, or DUP. During the COVID-19 period there was a smaller proportion of cannabis users (60% vs 90%; p=0.028) and a trend toward lower weekly consumption (14.44 vs 16.42; p=0.096). There were more cases of BPD (25% vs 5%; p=0.077) and fewer of affective psychosis (0% vs 25%; p=0.017).
Conclusions
During the COVID-19 pandemic we did not find an increase in FEP or more severe clinical presentations. However, we identified differences in the type of FEP that could be related to the psychosocial stressors of this period.
Introduction. Tobacco use increases risks for numerous diseases, including respiratory illnesses. We examined the literature to determine whether a history of tobacco use increases risks for adverse outcomes among COVID-19 patients. Methods. We conducted a systematic search of PubMed, LitCovid, Scopus, and Europe PMC (for preprints) using COVID-19 and tobacco-related terms. We included studies of human subjects with lab-confirmed COVID-19 infections that examined tobacco use history as an exposure and used multivariable analyses. The data were collected between March 31, 2020, and February 20, 2021. Outcomes included mortality, hospitalization, ICU admission, mechanical ventilation, and illness severity. Results. Among the 39 studies (33 peer-reviewed, 6 preprints) included, the most common outcome assessed was mortality (n = 32). The majority of these studies (17/32) found that tobacco use increased risk, one found decreased risk, and 14 found no association. Tobacco use was associated with increased risk of hospitalization in 7 of 10 studies, ICU admission in 6 of 9 studies, mechanical ventilation in 2 of 6 studies, and illness severity in 3 of 9 studies. One study found that tobacco use history increased the risk of pulmonary embolism in COVID-19 patients. Tobacco use was found to compound risks associated with diabetes (n = 1), cancer (n = 2), and chronic liver disease (n = 1). Conclusion. There is strong evidence that tobacco use increases risks of mortality and disease severity/progression among COVID-19 patients. Public health efforts during the pandemic should encourage tobacco users to quit and to seek care early, and should promote vaccination and other preventive behaviors among those with a history of tobacco use.