Unmet legal needs contribute to housing, income, and food insecurity, along with other conditions that harm health and drive health inequity. Addressing health injustice requires new tools for the next generations of lawyers, doctors, and other healthcare professionals. An interprofessional group of co-authors argues that law and medical schools and other university partners should develop and cultivate Academic Medical-Legal Partnerships (A-MLPs), which are uniquely positioned to leverage service, education, and research resources to advance health justice.
Objective:
To examine the impact of SARS-CoV-2 infection on the rate of central-line–associated bloodstream infection (CLABSI) and to characterize the patients who developed a CLABSI. We also examined the impact of a CLABSI-reduction quality-improvement project in patients with and without COVID-19.
Design:
Retrospective cohort analysis.
Setting:
Academic 889-bed tertiary-care teaching hospital in urban Los Angeles.
Patients or participants:
Inpatients 18 years and older with CLABSI as defined by the National Healthcare Safety Network (NHSN).
Intervention(s):
CLABSI rate and patient characteristics were analyzed for 2 cohorts during the pandemic era (March 2020–August 2021): COVID-19 CLABSI patients and non–COVID-19 CLABSI patients, based on diagnosis of COVID-19 during admission. Secondary analyses were non–COVID-19 CLABSI rate versus a historical control period (2019), ICU CLABSI rate in COVID-19 versus non–COVID-19 patients, and CLABSI rates before and after a quality-improvement initiative.
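As a rough illustration of the kind of rate comparison described above (not the authors' code), the sketch below computes CLABSI rates per 1,000 central-line days for two cohorts and a Wald confidence interval on the rate ratio; all counts are hypothetical.

```python
import numpy as np

def rate_ratio(events_a, linedays_a, events_b, linedays_b, z=1.96):
    """Compare two CLABSI rates (per 1,000 central-line days) via the
    rate ratio, with a Wald confidence interval on the log scale."""
    rate_a = 1000 * events_a / linedays_a
    rate_b = 1000 * events_b / linedays_b
    rr = rate_a / rate_b
    se_log_rr = np.sqrt(1 / events_a + 1 / events_b)  # Poisson approximation
    lo, hi = np.exp(np.log(rr) + np.array([-z, z]) * se_log_rr)
    return rate_a, rate_b, rr, (lo, hi)

# Hypothetical counts, for illustration only (not the study's data):
print(rate_ratio(events_a=25, linedays_a=8000, events_b=40, linedays_b=30000))
```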
Results:
The rate of COVID-19 CLABSI was significantly higher than the non–COVID-19 CLABSI rate. We did not detect a difference between the non–COVID-19 CLABSI rate and the historical control. COVID-19 CLABSIs occurred predominantly in the ICU, and the ICU COVID-19 CLABSI rate was significantly higher than the ICU non–COVID-19 CLABSI rate. A hospital-wide quality-improvement initiative reduced the rate of non–COVID-19 CLABSI but not COVID-19 CLABSI.
Conclusions:
Patients hospitalized for COVID-19 have a significantly higher CLABSI rate, particularly in the ICU setting. Reasons for this increase are likely multifactorial, including both patient-specific and process-related issues. Focused quality-improvement efforts were effective in reducing CLABSI rates in non–COVID-19 patients but were less effective in COVID-19 patients.
Housing First (HF), a recovery-oriented approach, has proven effective in stabilising the housing situations of homeless individuals with severe mental disorders, yet has shown limited short-term effectiveness on recovery outcomes compared with standard treatment. The objective was to assess the effects of the HF model among homeless people with high support needs for mental and physical health services on recovery, housing stability, quality of life, health care use, mental symptoms and addiction issues, using 4 years of data from the Un Chez Soi d'Abord trial.
Methods
A multicentre randomised controlled trial was conducted from August 2011 to April 2018 with intent-to-treat analysis in four French cities: Lille, Marseille, Paris and Toulouse. Participants were homeless or precariously housed patients with a DSM-IV-TR diagnosis of bipolar disorder or schizophrenia. Two groups were compared: the HF group (n = 353) had immediate access to independent housing and support from an assertive community treatment team; the Treatment-As-Usual (TAU) group (n = 350) had access to existing support and services. Main outcomes were personal recovery (Recovery Assessment Scale (RAS)), housing stability, quality of life (S-QoL), global physical and mental status (Medical Outcomes Study 36-item Short Form Health Survey (SF-36)), inpatient days, mental symptoms (Modified Colorado Symptom Index (MCSI)) and addictions (Mini International Neuropsychiatric Interview (MINI) and Alcohol Use Disorders Identification Test (AUDIT)). Mixed models using longitudinal and cluster designs were fitted, adjusting for age when first on the street, gender and mental disorder diagnosis. Models were tested for time × group and site × time interactions.
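To make the modelling strategy concrete, here is a minimal Python sketch (using statsmodels, with synthetic data and hypothetical variable names; the trial's own analysis may differ) of a mixed model with random intercepts and slopes and a time × group interaction:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic long-format data standing in for the trial's variables
# (names and values are hypothetical, for illustration only).
rng = np.random.default_rng(0)
n, waves = 200, 5
df = pd.DataFrame({
    "pid": np.repeat(np.arange(n), waves),
    "months": np.tile([0, 12, 24, 36, 48], n),
    "group": np.repeat(rng.choice(["HF", "TAU"], n), waves),
})
df["ras"] = 60 + 0.05 * df["months"] + rng.normal(0, 8, len(df))

# Random intercept and slope per participant; the months:group
# coefficient carries the time x group interaction of interest.
model = smf.mixedlm("ras ~ months * group", data=df,
                    groups=df["pid"], re_formula="~months")
result = model.fit(reml=True)
print(result.summary())
```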
Results
The 703 participants [123 (18%) female] had a mean age of 39 years (95% CI 38.0–39.5 years). Both groups improved on the RAS index from baseline to 48 months, with no statistically significant differences found between the HF and TAU groups over time. HF patients exhibited better autonomy (adjusted β = 2.6, 95% CI 1.2–4.1) and sentimental life (2.3, 95% CI 0.5–4.1), higher housing stability (28.6, 95% CI 25.1–32.1), fewer inpatient days (−3.14, 95% CI −5.2 to −1.1) and an improved SF-36 mental composite score (−0.8, 95% CI −1.6 to −0.1) over the 4-year follow-up. HF participants reported higher alcohol consumption between baseline and 48 months. No significant differences were observed for self-reported mental symptoms or substance dependence.
Conclusion
Data at 4 years were consistent with the 2-year follow-up data: similar improvement in personal recovery outcomes, but higher housing stability and autonomy and lower use of hospital services in the HF group compared with the TAU group, with the exception of an ongoing alcohol issue. These sustained benefits support HF as a valuable intervention for homeless patients with severe mental illness.
Many homeless people with severe mental illness are high users of healthcare and social services, yet this heavy service use does not reduce the wide health inequalities affecting this vulnerable population. This study aimed to determine whether independent housing with support from mental health teams taking a recovery-oriented approach (the Housing First (HF) program) reduces hospital and emergency department use among homeless people with severe mental disorders.
Methods
We did a randomised controlled trial in four French cities: Lille, Marseille, Paris and Toulouse. Participants were eligible if they were 18 years or older, were absolutely homeless or precariously housed, had a diagnosis of schizophrenia (SCZ) or bipolar disorder (BD), and had a high level of need (moderate-to-severe disability and past hospitalisations over the last 5 years, or a comorbid alcohol or substance use disorder). Participants were randomly assigned (1:1) to immediate access to independent housing and support from an Assertive Community Treatment team (social worker, nurse, doctor, psychiatrist and peer worker) (HF group) or to treatment as usual (TAU group), namely pre-existing dedicated homeless-targeted programs and services. Participants and interviewers were unmasked to assignment. The primary outcomes were the number of emergency department (ED) visits, hospital admissions and inpatient days at 24 months. Secondary outcomes were recovery (Recovery Assessment Scale), quality of life (S-QOL and SF-36), mental health symptoms, addiction issues, days stably housed and cost savings from a societal perspective. Intention-to-treat analysis was performed.
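For count outcomes such as ED visits, relative risks of the kind reported below are typically estimated with a log-linear count model and a follow-up offset. The following is a hedged Python sketch with synthetic data, not the trial's analysis code:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical per-participant counts over follow-up (not the trial's data).
rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "group": rng.choice(["HF", "TAU"], n),
    "followup_days": rng.integers(600, 731, n),
})
df["ed_visits"] = rng.poisson(np.where(df["group"] == "HF", 1.8, 2.0))

# Poisson regression with a log(follow-up) offset; exp(coef) is the
# rate ratio (relative risk) for HF v. TAU, as reported in the abstract.
fit = smf.glm("ed_visits ~ group", data=df,
              family=sm.families.Poisson(),
              offset=np.log(df["followup_days"])).fit()
print(np.exp(fit.params), np.exp(fit.conf_int()))
```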
Results
Eligible patients were randomly assigned to the HF group (n = 353) or the TAU group (n = 350). No differences were found in the number of hospital admissions (relative risk (95% CI), 0.96 (0.76–1.21)) or ED visits (0.89 (0.66–1.21)). Significantly fewer inpatient days were found for HF v. TAU (0.62 (0.48–0.80)). The HF group exhibited higher housing stability (difference in slope, 116 (103–128)) and higher scores on sub-dimensions of the S-QOL scale (psychological well-being and autonomy). No differences were found for the SF-36 physical composite score, mental health symptoms or rates of alcohol or substance dependence. The mean difference in costs was −€217 per patient over 24 months in favour of the HF group. HF was associated with savings in healthcare costs (RR 0.62 (0.48–0.78)) and residential costs (0.07 (0.05–0.11)).
Conclusion
Immediate access to independent housing and support from a mental health team resulted in fewer inpatient days, higher housing stability and cost savings in homeless persons with SCZ or BD.
OBJECTIVES/SPECIFIC AIMS: Delirium, a form of acute brain dysfunction characterized by changes in attention and alertness, is a known independent predictor of mortality in the Intensive Care Unit (ICU). We sought to understand whether catatonia, a more recently recognized form of acute brain dysfunction, is associated with increased 30-day mortality in critically ill older adults. METHODS/STUDY POPULATION: We prospectively enrolled critically ill patients at a single institution who were on a ventilator or in shock and evaluated them daily for delirium using the Confusion Assessment Method for the ICU and for catatonia using the Bush-Francis Catatonia Rating Scale. Coma was defined as a Richmond Agitation-Sedation Scale score of −4 or −5. We used a Cox proportional hazards model predicting 30-day mortality after adjusting for delirium, coma and catatonia status. RESULTS/ANTICIPATED RESULTS: We enrolled 335 medical, surgical or trauma critically ill patients with 1103 matched delirium and catatonia assessments. Median age was 58 years (IQR: 48–67). Main indications for admission to the ICU included airway disease or protection (32%; N=100) and sepsis and/or shock (25%; N=79). In the unadjusted analysis, regardless of the presence of catatonia, non-delirious individuals had the highest median survival times, while delirious patients had the lowest median survival times. Comparing the absence and presence of catatonia, the presence of catatonia worsened survival (Figure 1). In a time-dependent Cox model, holding catatonia status constant, delirious individuals had 1.72 times the hazard of death of non-delirious individuals (95% CI: 1.321, 2.231), while those with coma had 5.48 times the hazard of death (95% CI: 4.298, 6.984). For DSM-5 catatonia scores, each 1-unit increase in the score was associated with 1.178 times the hazard of in-hospital mortality (95% CI: 1.086, 1.278); comparing two individuals with the same delirium status, an individual with 3 catatonia items present therefore had approximately 1.63 times the hazard of death of an individual with a score of 0 (no catatonia). DISCUSSION/SIGNIFICANCE OF IMPACT: Non-delirious individuals had the highest median survival times, while those who were comatose had the lowest median survival times after a critical illness, holding catatonia status constant. Comparing the absence and presence of catatonia, the presence of catatonia appears to worsen survival. Individuals who were both comatose and catatonic had the lowest median survival times.
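A time-dependent Cox model of this kind can be sketched with the lifelines library; the long-format patient-day data below are synthetic and the column names hypothetical. The final line illustrates how the per-item hazard ratio compounds to the 3-item estimate quoted above.

```python
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Hypothetical long-format data: one row per patient-day, the usual
# input for a time-dependent Cox model (not the study's data).
rng = np.random.default_rng(2)
rows = []
for pid in range(100):
    days = rng.integers(3, 30)
    dead = rng.random() < 0.3
    for d in range(days):
        rows.append({
            "id": pid, "start": d, "stop": d + 1,
            "delirium": int(rng.random() < 0.4),
            "coma": int(rng.random() < 0.1),
            "catatonia": rng.integers(0, 4),
            "death": int(dead and d == days - 1),
        })
df = pd.DataFrame(rows)

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="death",
        start_col="start", stop_col="stop")
ctv.print_summary()

# Per-unit hazard ratios compound multiplicatively: with HR = 1.178
# per catatonia item, three items give 1.178 ** 3 ~= 1.63.
print(1.178 ** 3)
```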
OBJECTIVES/SPECIFIC AIMS: Delirium is a well-described form of acute brain dysfunction characterized by decreased or increased movement, changes in attention and concentration, perceptual disturbances (i.e., hallucinations) and delusions. Catatonia, a neuropsychiatric syndrome traditionally described in patients with severe psychiatric illness, can be phenotypically similar to delirium and is characterized by increased, decreased and/or abnormal movements, staring, rigidity and mutism. Delirium and catatonia can co-occur in the setting of medical illness, but no studies have explored this relationship by age. Our objective was to assess whether advancing age and the presence of catatonia are associated with delirium. METHODS/STUDY POPULATION: We prospectively enrolled critically ill patients at a single institution who were on a ventilator or in shock and evaluated them daily for delirium using the Confusion Assessment Method for the ICU and for catatonia using the Bush-Francis Catatonia Rating Scale. Measures of association (OR) were assessed with a simple logistic regression model with catatonia as the independent variable and delirium as the dependent variable. Effect-measure modification by age was assessed using a likelihood-ratio test. RESULTS/ANTICIPATED RESULTS: We enrolled 136 medical and surgical critically ill patients with 452 matched (concomitant) delirium and catatonia assessments. Median age was 59 years (IQR: 52–68). In our cohort of 136 patients, 58 patients (43%) had delirium only, 4 (3%) had catatonia only, 42 (31%) had both delirium and catatonia, and 32 (24%) had neither. Age was significantly associated with prevalent delirium (i.e., increasing age was associated with decreased risk for delirium) (p=0.04) after adjusting for catatonia severity. Catatonia was significantly associated with prevalent delirium (p<0.0001) after adjusting for age. Peak delirium risk was seen in patients aged 55 years with 3 or more catatonic signs, who had 53.4 times the odds of delirium (95% CI: 16.06, 176.75) compared with those with no catatonic signs. Patients 70 years and older with 3 or more catatonia features had half this risk. DISCUSSION/SIGNIFICANCE OF IMPACT: Catatonia is significantly associated with prevalent delirium even after controlling for age. These data support an inverted U-shaped risk of delirium by age after adjusting for catatonia. This relationship and its clinical ramifications need to be examined in a larger sample, including patients with dementia. Additionally, we need to assess which acute brain syndrome (delirium or catatonia) develops first.
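The effect-modification analysis described above can be illustrated with a short Python sketch (synthetic data, hypothetical variable names): fit logistic models with and without an age × catatonia interaction and compare them with a likelihood-ratio test.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

# Hypothetical assessment-level data (not the study's): delirium as the
# outcome, catatonia sign count and age as predictors.
rng = np.random.default_rng(3)
n = 450
data = pd.DataFrame({"age": rng.integers(40, 85, n),
                     "catatonia": rng.integers(0, 6, n)})
logit = -2 + 0.6 * data["catatonia"] - 0.01 * (data["age"] - 60) ** 2 / 50
data["delirium"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Effect-measure modification by age: compare models with and without
# the interaction term via a likelihood-ratio test.
reduced = smf.logit("delirium ~ catatonia + age", data=data).fit(disp=0)
full = smf.logit("delirium ~ catatonia * age", data=data).fit(disp=0)
lr = 2 * (full.llf - reduced.llf)
p = stats.chi2.sf(lr, df=full.df_model - reduced.df_model)
print(f"LR = {lr:.2f}, p = {p:.3f}")
```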
Next-generation radio telescopes such as LOFAR and the SKA will provide the high time resolution and high instantaneous sensitivity needed to study slow and fast transients across the whole radio window. Searching for radio transients in large datasets also represents a new signal-processing challenge requiring efficient and robust signal-reconstruction algorithms. Using sparse representations and the general ‘compressed sensing’ framework, we developed a 2D–1D algorithm based on the primal-dual splitting method. We have performed our sparse 2D–1D reconstruction on three-dimensional data sets containing either simulated or real radio transients, at various levels of SNR and integration times. This report presents a summary of the current performance of our method.
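As a generic illustration of primal-dual splitting for sparse reconstruction (a 1D analogue under simple assumptions, not the authors' 2D–1D implementation), the sketch below solves an ℓ1-regularised least-squares problem with the Chambolle–Pock iteration:

```python
import numpy as np

def primal_dual_l1(A, y, lam=0.1, n_iter=500):
    """Minimise 0.5*||A x - y||^2 + lam*||x||_1 with the Chambolle-Pock
    primal-dual splitting scheme (a generic sketch, not the authors' code)."""
    L = np.linalg.norm(A, 2)              # operator norm of A
    tau = sigma = 0.99 / L                # step sizes with sigma*tau*L^2 < 1
    x = np.zeros(A.shape[1])
    x_bar = x.copy()
    z = np.zeros(A.shape[0])
    for _ in range(n_iter):
        # dual ascent: prox of the conjugate of 0.5*||. - y||^2
        z = (z + sigma * (A @ x_bar - y)) / (1 + sigma)
        # primal descent: soft-thresholding (prox of lam*||.||_1)
        x_new = x - tau * (A.T @ z)
        x_new = np.sign(x_new) * np.maximum(np.abs(x_new) - tau * lam, 0)
        x_bar = 2 * x_new - x             # over-relaxation step
        x = x_new
    return x

# Toy sparse-recovery example with a random sensing matrix
rng = np.random.default_rng(4)
A = rng.normal(size=(60, 200)) / np.sqrt(60)
x_true = np.zeros(200)
x_true[rng.choice(200, 8, replace=False)] = 1.0
x_hat = primal_dual_l1(A, A @ x_true + 0.01 * rng.normal(size=60), lam=0.05)
print(np.flatnonzero(np.abs(x_hat) > 0.3))   # recovered support
```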
The deadliest outburst flood from an englacial cavity occurred on Glacier de Tête Rousse in the Mont Blanc area, French Alps, in 1892. A subglacial reservoir was discovered in the same glacier in 2010 and drained artificially in 2010, 2011 and 2012 to protect the 3000 inhabitants downstream. The mechanism leading to the spontaneous refilling of the cavity following these pumping operations has been analyzed. For this purpose, the subglacial water volume changes between 2010 and 2013 were reconstructed. The size of the cavity following the pumping was found to have decreased from 53,500 m³ in 2010 to 12,750 m³ in 2013. Creep and the partial collapse of the cavity roof explain a large part of the volume loss. Analysis of cavity filling showed a strong relationship between measured surface melting and the filling rate, with a time delay of 4–6 hours. A permanent input of 15 m³ d⁻¹, not depending on surface melt, was also found. Meltwater and rain from the surface are conveyed to the bedrock through crevasses and probably through a permeable layer of rock debris at the glacier bed. The drainage pathway permeability was estimated at 0.054 m s⁻¹ from water discharge measurements and dye-tracing experiments.
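Two of the quantitative claims above lend themselves to a quick back-of-envelope check; the path length below is a hypothetical value chosen for illustration, not one reported by the study.

```python
# Back-of-envelope sketches using numbers from the abstract plus
# hypothetical geometry (the path length is an assumption).
path_length_m = 1000.0        # hypothetical drainage-path length
delay_hours = 5.0             # mid-range of the observed 4-6 h lag
velocity = path_length_m / (delay_hours * 3600)
print(f"implied transit velocity ~ {velocity:.4f} m/s")

# With the melt-independent inflow of 15 m3/day, the 2013 cavity
# (12,750 m3) would take years to refill from that source alone.
print(f"refill time ~ {12750 / 15:.0f} days")
```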
The origin of red supergiant mass loss remains to be unveiled. Characterising the formation loci and the dust distribution within the first stellar radii above the surface is key to understanding how the mass loss is initiated. Polarimetric interferometry observations in the near-infrared allowed us to detect an inner dust atmosphere located only 0.5 stellar radii above the photosphere of Betelgeuse. We modelled these observations and compared them with visible polarimetric measurements to discuss the properties of the dust distribution.
Effects of plant maturity on apparent ruminal synthesis and post-ruminal supply of B vitamins were evaluated in two feeding trials. Diets containing alfalfa (Trial 1) or orchardgrass (Trial 2) silages harvested either (1) early cut, less mature (EC), or (2) late cut, more mature (LC), as the sole forage were offered to ruminally and duodenally cannulated lactating Holstein cows in crossover-design experiments. In Trial 1, conducted with 16 cows (569±43 kg of empty BW (ruminal content removed) and 43.7±8.6 kg/day of 3.5% fat-corrected milk yield; mean±SD) in two 17-day treatment periods, both diets provided ~22% forage NDF and 27% total NDF, and the forage-to-concentrate ratios were 53 : 47 and 42 : 58 for EC and LC, respectively. In Trial 2, conducted with 13 cows (588±55 kg of empty BW and 43.7±7.7 kg/day of 3.5% fat-corrected milk yield; mean±SD) in two 18-day treatment periods, both diets provided ~25% forage NDF and 31% total NDF; the forage-to-concentrate ratios were 58 : 42 and 46 : 54 for EC and LC, respectively. Thiamin, riboflavin, niacin, vitamin B6, folates and vitamin B12 were measured in feed and duodenal content. Apparent ruminal synthesis was calculated as duodenal flow minus intake. Diets based on EC alfalfa decreased the amounts of thiamin, niacin and folates reaching the duodenum, whereas diets based on EC orchardgrass increased riboflavin duodenal flow. Daily apparent ruminal synthesis of thiamin, riboflavin, niacin and vitamin B6 was correlated negatively with intake of these vitamins, suggesting microbial regulation of their concentration in the rumen. Vitamin B12 apparent ruminal synthesis was correlated negatively with total volatile fatty acid concentration, but positively with ruminal pH and microbial N duodenal flow.
The contribution of subsidized food commodities to total food consumption is unknown. We estimated the proportion of individual energy intake from food commodities receiving the largest subsidies from 1995 to 2010 (corn, soyabeans, wheat, rice, sorghum, dairy and livestock).
Design
Integrating information from three federal databases (MyPyramid Equivalents, Food Intakes Converted to Retail Commodities, and What We Eat in America) with data from the 2001–2006 National Health and Nutrition Examination Surveys, we computed a Subsidy Score representing the percentage of total energy intake from subsidized commodities. We examined the score’s distribution and the probability of having a ‘high’ (≥70th percentile) v. ‘low’ (≤30th percentile) score, across the population and subgroups, using multivariate logistic regression.
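A minimal sketch of the score construction and the high/low contrast described above, with synthetic intakes standing in for the linked survey data (variable names are hypothetical, not the study's):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical per-participant energy intakes (kcal) by commodity group.
rng = np.random.default_rng(5)
n = 1000
df = pd.DataFrame({
    "kcal_subsidized": rng.gamma(8, 150, n),   # corn, soy, wheat, rice, etc.
    "kcal_total": rng.gamma(20, 120, n),
    "age": rng.integers(18, 65, n),
})
df["kcal_total"] += df["kcal_subsidized"]

# Subsidy Score: percentage of total energy from subsidized commodities
df["score"] = 100 * df["kcal_subsidized"] / df["kcal_total"]

# 'High' (>= 70th percentile) v. 'low' (<= 30th percentile) scores,
# then logistic regression on the dichotomized extremes.
hi, lo = df["score"].quantile([0.7, 0.3])
sub = df[(df["score"] >= hi) | (df["score"] <= lo)].copy()
sub["high"] = (sub["score"] >= hi).astype(int)
print(smf.logit("high ~ age", data=sub).fit(disp=0).summary())
```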
Setting
Community-dwelling adults in the USA.
Subjects
Participants (n 11 811) aged 18–64 years.
Results
Median Subsidy Score was 56·7 % (interquartile range 47·2–65·4 %). Younger adults, those with less education, those with lower income and Mexican Americans had higher scores. After controlling for covariates, age, education and income remained independently associated with the score: compared with individuals aged 55–64 years, individuals aged 18–24 years had a 50 % higher probability of having a high score (P<0·0001). Individuals reporting less than high-school education had a 21 % higher probability of having a high score than individuals reporting college completion or higher (P=0·003); individuals in the lowest tertile of income had an 11 % higher probability of having a high score compared with individuals in the highest tertile (P=0·02).
Conclusions
Over 50 % of energy in US diets is derived from federally subsidized commodities.
Background: A definitive diagnosis of multiple sclerosis (MS), as distinct from a clinically isolated syndrome, requires one of two conditions: a second clinical attack or particular magnetic resonance imaging (MRI) findings as defined by the McDonald criteria. MRI is also important after a diagnosis is made as a means of monitoring subclinical disease activity. While a standardized protocol for diagnostic and follow-up MRI has been developed by the Consortium of Multiple Sclerosis Centres, acceptance and implementation in Canada have been suboptimal. Methods: To improve diagnosis, monitoring, and management of a clinically isolated syndrome and MS, a Canadian expert panel created consensus recommendations about the appropriate application of the 2010 McDonald criteria in routine practice, strategies to improve adherence to the standardized Consortium of Multiple Sclerosis Centres MRI protocol, and methods for ensuring effective communication among health care practitioners, in particular referring physicians, neurologists, and radiologists. Results: This article presents eight consensus statements developed by the expert panel, along with the rationale underlying the recommendations and commentaries on how to prioritize resource use within the Canadian healthcare system. Conclusions: The expert panel calls on neurologists and radiologists in Canada to incorporate the McDonald criteria, the Consortium of Multiple Sclerosis Centres MRI protocol, and other guidance given in this consensus presentation into their practices. By improving communication and general awareness of best practices for MRI use in MS diagnosis and monitoring, we can improve patient care across Canada by providing timely diagnosis, informed management decisions, and better continuity of care.
Nitrogen is an essential nutrient in agriculture. Its reactive forms are a focus of concern because they are responsible for a multitude of impacts on the environment and health. This was highlighted by the 2011 European Nitrogen Assessment, which provided a critical inquiry into nitrogen imbalances due to livestock farming systems. The ambivalent status of nitrogen, both a resource for agriculture and a pollutant for the environment, requires legal systems to find an equilibrium between the fertilising potential of animal wastes and their possible negative effects on the environment. The European policies on nitrate and gaseous pollutants are the subject of much litigation with the European Commission. In this context, the French government asked for a synthesis of scientific knowledge on the flows and fate of nitrogen related to livestock farms. The articles following in this review (i) describe the current situation, (ii) explain the social and economic causes of the territorial variability of nitrogen pressure, (iii) quantify the flows on farms, (iv) examine indicators, (v) review regulation instruments and (vi) identify options for reducing the nitrogen pressure caused by livestock farming. In terms of materials and methods, particular importance was given to peer recognition and to plurality within the panel of experts and in the literature selection.
Cobalamin (CBL), the biologically active form of vitamin B12, and its analogs are produced by bacteria only if the cobalt supply is adequate. The analogs generally differ by the nucleotide moiety of the molecule; in CBL, 5,6-dimethylbenzimidazole (5,6-DMB) is the base in the nucleotide moiety. The present study aimed to determine whether a supplement of 5,6-DMB could increase utilization of dietary cobalt for synthesis of CBL and change ruminal fermentation, nutrient digestibility, omasal flow of nutrients and ruminal protozoa counts. Eight ruminally cannulated multiparous Holstein cows (mean±standard deviation=238±21 days in milk and 736±47 kg of BW) were used in a crossover design. Cows were randomly assigned to a daily supplement of a gelatin capsule containing 1.5 g of 5,6-DMB via the rumen cannula or to no supplement. Each period lasted 29 days and consisted of 21 days for treatment adaptation and 8 days for data and sample collection. Five corrinoids (CBL and four cobamides) were detected in the total mixed ration and the omasal digesta from both treatments. The dietary supplement of 5,6-DMB increased (P=0.02) apparent ruminal synthesis of CBL from 14.6 to 19.6 (s.e.m. 0.8) mg/day but had no effect (P>0.1) on apparent ruminal synthesis of the four analogs. The supplement of 5,6-DMB had no effect (P>0.1) on milk production and composition, or on protozoal count, ruminal pH and concentrations of volatile fatty acids and ammonia nitrogen in rumen content. The supplement also had no effect (P>0.1) on intake, omasal flow and apparent ruminal digestibility of dry matter, organic matter, NDF, ADF and nitrogenous fractions. Plasma concentration of CBL was not affected by treatment (P=0.98). Providing a preformed part of the CBL molecule, that is, 5,6-DMB, increased the apparent ruminal synthesis of CBL by ruminal bacteria by 34% but had no effect on ruminal fermentation or protozoa count, and was not sufficient to increase plasma concentrations of the vitamin. Even though the efficiency of cobalt utilization for apparent synthesis of CBL was increased from 2.0% to 2.7% by the 5,6-DMB supplement, this improved efficiency was still very low. Further research is needed to identify the factors affecting the efficiency of cobalt utilization for synthesis of CBL by the bacterial populations in the rumen.
Physical aggression (PA) tends to have its onset in infancy and to increase rapidly in frequency. Very little is known about the genetic and environmental etiology of PA development during early childhood. We investigated the temporal pattern of genetic and environmental etiology of PA during this crucial developmental period.
Method
Participants were 667 twin pairs, including 254 monozygotic and 413 dizygotic pairs, from the ongoing longitudinal Quebec Newborn Twin Study. Maternal reports of PA were obtained from three waves of data at 20, 32 and 50 months. These reports were analysed using a biometric Cholesky decomposition and linear latent growth curve model.
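The full biometric Cholesky decomposition requires structural-equation modelling software, but the underlying twin logic can be illustrated with Falconer's classical formulas; the sketch below is a simpler stand-in, not the study's model, and the correlations are hypothetical.

```python
def falconer(r_mz, r_dz):
    """Classical ACE decomposition from twin correlations (Falconer's
    formulas) -- a simpler illustration than the article's Cholesky model."""
    a2 = 2 * (r_mz - r_dz)    # additive genetic variance
    c2 = 2 * r_dz - r_mz      # shared-environment variance
    e2 = 1 - r_mz             # non-shared environment + error
    return a2, c2, e2

# Hypothetical twin correlations for PA at one age (not the study's values)
print(falconer(r_mz=0.60, r_dz=0.35))   # -> (0.50, 0.10, 0.40)
```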
Results
The best-fitting Cholesky model revealed developmentally dynamic effects, mostly genetic attenuation and innovation. The contribution of genetic factors at 20 months substantially decreased over time, while new genetic effects appeared later on. The linear latent growth curve model revealed a significant moderate increase in PA from 20 to 50 months. Two separate sets of uncorrelated genetic factors accounted for the variation in initial level and growth rate. Non-shared and shared environments had no effect on the stability, initial status and growth rate in PA.
Conclusions
Genetic factors underlie PA frequency and stability during early childhood; they are also responsible for initial status and growth rate in PA. The contribution of shared environment is modest, and perhaps limited, as it appears only at 50 months. Future research should investigate the complex nature of these dynamic genetic factors through genetic–environment correlation (rGE) and interaction (G × E) analyses.
Nonparametric regression is a powerful tool to estimate nonlinear relations between some predictors and a response variable. However, when the number of predictors is high, nonparametric estimators may suffer from the curse of dimensionality. In this chapter, we show how a dimension reduction method (namely Sliced Inverse Regression) can be combined with nonparametric kernel regression to overcome this drawback. The methods are illustrated both on simulated datasets as well as on an astronomy dataset using the R software.
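The chapter's examples use R; the following Python sketch conveys the same idea under simple assumptions: estimate a single effective dimension-reduction direction with Sliced Inverse Regression, then run a Nadaraya–Watson kernel regression on the reduced predictor.

```python
import numpy as np

def sir(X, y, n_slices=10, n_components=1):
    """Sliced Inverse Regression: estimate effective dimension-reduction
    directions from the covariance of slice means (a minimal sketch)."""
    n, p = X.shape
    # Standardize predictors via the Cholesky factor of the covariance
    mu, cov = X.mean(axis=0), np.cov(X, rowvar=False)
    cov_inv_sqrt = np.linalg.inv(np.linalg.cholesky(cov)).T
    Z = (X - mu) @ cov_inv_sqrt
    # Slice on y and accumulate the weighted covariance of slice means
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    M = sum(len(s) / n * np.outer(Z[s].mean(axis=0), Z[s].mean(axis=0))
            for s in slices)
    # Leading eigenvectors of M, mapped back to the original scale
    w, v = np.linalg.eigh(M)
    beta = cov_inv_sqrt @ v[:, ::-1][:, :n_components]
    return X @ beta  # reduced predictor(s)

def kernel_regression(x_train, y_train, x_eval, h=0.3):
    """Nadaraya-Watson estimator with a Gaussian kernel on the SIR index."""
    d = (x_eval[:, None] - x_train[None, :]) / h
    w = np.exp(-0.5 * d ** 2)
    return (w @ y_train) / w.sum(axis=1)

# Toy single-index model: y depends on X only through one direction
rng = np.random.default_rng(6)
X = rng.normal(size=(500, 8))
y = np.sin(X @ np.array([1, 2, 0, 0, 0, 0, 0, 0]) / 2) \
    + 0.1 * rng.normal(size=500)
t = sir(X, y).ravel()
print(kernel_regression(t, y, np.quantile(t, [0.25, 0.5, 0.75])))
```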
Here we present the installation and successful commissioning of an L'-band Annular Groove Phase Mask (AGPM) coronagraph on VLT/NACO. The AGPM is a vector vortex coronagraph made from diamond subwavelength gratings tuned to the L' band. The vector vortex coronagraph enables high-contrast imaging at a very small inner working angle (here 0″.09, the diffraction limit of the VLT at L'), potentially opening up a new parameter space. During technical and science verification runs, we discovered a late-type companion at two beamwidths from an F0V star (Mawet et al. 2013), and imaged the inner regions of β Pictoris down to the previously unexplored projected radius of 1.75 AU. The circumstellar disk was also resolved from ≃ 1″ to 5″ (see J. Milli et al., these proceedings). These results showcase the potential of the NACO L'-band AGPM over a wide range of spatial scales.
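The quoted inner working angle follows from the λ/D diffraction limit; a quick check (assuming λ ≈ 3.8 µm for the L' band and an 8.2 m VLT unit-telescope aperture):

```python
import math

# Diffraction-limited resolution of the VLT at L' band, lambda / D,
# matching the 0".09 inner working angle quoted above.
lam = 3.8e-6          # L'-band wavelength in metres (~3.8 microns)
D = 8.2               # VLT unit-telescope aperture in metres
rad_to_arcsec = 180 / math.pi * 3600
print(f"lambda/D = {lam / D * rad_to_arcsec:.3f} arcsec")  # ~0.096
```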
The well-known recurrent nova T Pyx brightened by 7 magnitudes starting on 2011 April 14, its first eruption since 1966. T Pyx is unique amongst recurrent novae in being surrounded by a nebula formed of material ejected during previous eruptions. The latest eruption therefore offers the rare opportunity to observe a light echo sweeping through the existing shell, and a new one forming. The sudden exposure of the existing shell to high-energy light is expected to result in a change in the dust morphology as well as in the partial destruction of molecules. We observed this process in the near- and mid-IR during several epochs using ESO's VLT instruments SINFONI, VISIR and ISAAC. Unfortunately, in the data analysed so far we have only a tentative detection of Brα from the shell, so we might in the end have to be content with upper limits for the emission from the various molecular bands and ionised lines.