This report explores key considerations in adopting a dynamic discount rate funding approach and the impacts of doing so in a range of areas, including funding volatility, investment strategy and end game objectives. It considers the advantages and disadvantages of this approach from the perspective of a range of stakeholders, and the challenges that must be overcome to fully implement and support the approach, for example data challenges and the new skills required in the industry. The report includes sample modelling to highlight the practical issues that arise when adopting this approach. It describes a step-by-step approach for assessing the risks to be considered when determining an appropriate level of assets to provide funding for a sample set of pension scheme cash flows, as summarised in the table below.
Steps involved in determining the funding buffer and discount rate:
Step 1: Create an asset portfolio based on best estimate liability cash flows.
Step 2: Adjust for investment costs.
Step 3: Buffer: allowance for asset-side risks.
Step 4: Buffer: allowance for asset-liability mismatch risk (reinvestment and disinvestment risk).
Step 5: Buffer: allowance for liability-side risks.
Step 6: Buffer: consideration of risk diversification when determining the overall buffer.
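To make the sequence concrete, here is a minimal numerical sketch of how the six steps might be composed; all function names, cash flows and risk loadings below are hypothetical illustrations, not figures or methods from the report.

```python
# Hypothetical sketch of the six-step buffer build-up summarised above.
# All inputs (cash flows, cost and risk loadings) are illustrative only.

def funding_requirement(cashflows, discount_rates, cost_loading=0.001,
                        asset_risk=0.05, mismatch_risk=0.03,
                        liability_risk=0.04, diversification=0.8):
    # Step 1: value an asset portfolio matched to best estimate cash flows
    base = sum(cf / (1 + r) ** t
               for t, (cf, r) in enumerate(zip(cashflows, discount_rates),
                                           start=1))
    # Step 2: adjust for investment costs
    base *= 1 + cost_loading
    # Steps 3-5: buffers for asset-side, mismatch and liability-side risks
    buffers = (base * asset_risk, base * mismatch_risk, base * liability_risk)
    # Step 6: allow for diversification between the individual risk buffers
    return base, diversification * sum(buffers)

base, buffer = funding_requirement([100.0] * 20, [0.04] * 20)
print(f"matching assets: {base:.1f}, risk buffer: {buffer:.1f}")
```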
It also considers how a dynamic discount rate approach fits within the proposed future funding regulations. Finally, the report puts forward recommendations for the IFoA, Scheme Actuaries and TPR.
Consequences of schemes adopting a dynamic discount rate approach could include: very different investment strategies, with investment in a wider pool of assets; less use of leveraged Liability Driven Investment; fewer schemes targeting buy-out as their end game strategy; and an increase in technical work for actuaries advising on the optimisation of asset and liability cash flows.
Tight focusing with very small f-numbers is necessary to achieve the highest at-focus irradiances. However, tight focusing imposes strong demands on precise in-focus target positioning to achieve the highest on-target irradiance. We describe several near-infrared, visible, ultraviolet and soft and hard X-ray diagnostics employed in a ∼10²² W/cm² laser–plasma experiment. We used femtosecond laser pulses of nearly 10 J total energy focused into an approximately 1.3 μm focal spot on 5–20 μm thick stainless-steel targets. We discuss the applicability of these diagnostics to determine the best in-focus target position with approximately 5 μm accuracy (i.e., around half of the short Rayleigh length) and show that several diagnostics (in particular, 3ω reflection and on-axis hard X-rays) can ensure this accuracy. We demonstrated target positioning within several micrometers of the focus, ensuring over 80% of the ideal peak laser intensity on-target. Our approach is relatively fast (it requires 10–20 laser shots) and does not rely on the coincidence of low-power and high-power focal planes.
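For context, the quoted tolerances are consistent with ideal Gaussian-beam optics; a short worked relation follows (standard formulas, assuming an ideal Gaussian focus of waist w_0 at wavelength λ, not parameters taken from the experiment):

```latex
% Standard Gaussian-beam relations (ideal focus assumed):
% z_R is the Rayleigh length; I(z) is the on-axis peak intensity
% at a defocus z from the waist.
\[
  z_R = \frac{\pi w_0^2}{\lambda}, \qquad
  I(z) = \frac{I_0}{1 + (z/z_R)^2}
\]
% Requiring I(z) >= 0.8 I_0 gives |z| <= z_R/2, i.e. positioning to
% about half the Rayleigh length preserves over 80% of the ideal
% peak intensity, matching the accuracy target quoted above.
```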
Hospitalizations among skilled nursing facility (SNF) residents in Detroit increased in mid-March 2020 due to the coronavirus disease 2019 (COVID-19) pandemic. Outbreak response teams were deployed from local healthcare systems, the Centers for Disease Control and Prevention (CDC), and the Detroit Health Department (DHD) to understand the infection prevention and control (IPC) gaps in SNFs that may have accelerated the outbreak.
Methods:
We conducted 2 point-prevalence surveys (PPS-1 and PPS-2) at 13 Detroit SNFs from April 8 to May 8, 2020. The DHD and partners conducted facility-wide severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) testing of all residents and staff and collected information regarding resident cohorting, staff cohorting, and personal protective equipment (PPE) utilized during that time.
Results:
Resident cohorting had been implemented in 7 of 13 SNFs (53.8%) prior to PPS-1, and other facilities initiated cohorting after obtaining PPS-1 results. Cohorting protocols for healthcare practitioners and environmental service staff were not established in 4 of 13 facilities (30.8%), and in 3 facilities (23.1%) ancillary staff were not assigned to cohorts. Also, 2 SNFs (15.4%) had an observation unit prior to PPS-1, 2 (15.4%) created one after PPS-1, 4 (30.8%) could not establish an observation unit due to inadequate space, and 5 (38.5%) created an observation unit after PPS-2.
Conclusion:
On-site consultations identified gaps in IPC knowledge and cohorting that may have contributed to ongoing transmission of SARS-CoV-2 among SNF residents despite aggressive testing measures. Infection preventionists (IPs) are critical in guiding ongoing IPC practices in SNFs to reduce spread of COVID-19 through response and prevention.
Accumulating evidence suggests that deficits of visual selective attention may already occur at early stages of Alzheimer's disease (AD), such as the prodromal phase of mild cognitive impairment (MCI).
Our study investigated visual selective attention in amnestic MCI and probable AD patients compared to healthy elderly controls; groups were matched for age, gender and education. Based on Bundesen's 'theory of visual attention' (TVA), two mathematically independent, quantitative parameter estimates were derived from a partial report of briefly presented letter arrays: top-down control of attentional selection, representing task-related attentional weighting for prioritizing relevant visual objects, and the spatial distribution of attentional weights across the left and right hemifields.
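For reference, these two estimates are commonly formalized in TVA-based partial report as follows (standard TVA notation; a sketch, not the study's exact fitting procedure):

```latex
% Standard TVA parameters as commonly derived from partial report
% (generic notation, not study-specific):
\[
  \alpha = \frac{w_{\text{distractor}}}{w_{\text{target}}}, \qquad
  w_{\lambda} = \frac{w_{\text{left}}}{w_{\text{left}} + w_{\text{right}}}
\]
% alpha indexes top-down control (alpha = 0: perfect selection;
% alpha = 1: no selectivity); w_lambda indexes the spatial balance of
% attentional weights (0.5: balanced; > 0.5: leftward bias).
```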
Compared to controls, MCI patients showed significantly reduced top-down controlled selection, which deteriorated further in AD subjects. Moreover, attentional weighting was significantly unbalanced across hemifields in MCI and tended to be even more lateralized in AD. The majority of patients were biased to the left. Across MCI and AD patients, carriers of the apolipoprotein E ɛ4 allele (ApoE4) showed a leftward spatial bias: the younger the ApoE4-positive patients and the earlier the disease onset, the more pronounced the leftward bias. ApoE4-negative subjects showed balanced attentional weighting.
These results indicate that impaired top-down control may be linked to early dysfunction of fronto-parietal cortico-cortical networks. In addition, an early interhemispheric asymmetry in temporo-parietal cortical interactions might cause a pathological spatial bias. As the inheritance of ApoE4 is associated with asymmetric parietal metabolism, a pathological spatial bias may function as an early cognitive marker for detecting probable AD subjects.
Serotonergic neurotransmission plays a key role in seasonal changes of mood and behaviour. Recent studies have reported higher serotonin transporter availability in healthy human subjects in times of less light, and a recent animal study suggested seasonal alterations of postsynaptic serotonin-1A receptors. Accordingly, this study aimed to identify seasonal alterations of serotonin-1A receptor binding in the living human brain.
Methods
Thirty-six healthy, drug-naïve subjects were investigated using PET and the specific tracer [carbonyl-11C]WAY-100635. Regional serotonin-1A receptor binding (5-HT1A BPND) was related to the individual exposure to global radiation. Furthermore, the subjects were divided into two groups depending on individual exposure to global radiation, and the group differences in regional 5-HT1A BPND were determined.
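For readers outside PET, the outcome measure BPND is conventionally defined as follows (standard consensus nomenclature, not specific to this study):

```latex
% Conventional definition of the non-displaceable binding potential;
% V_T and V_ND are the total and non-displaceable distribution volumes,
% f_ND the free fraction, B_avail the available receptor density and
% K_D the radioligand's equilibrium dissociation constant.
\[
  BP_{\text{ND}} = \frac{V_T - V_{\text{ND}}}{V_{\text{ND}}}
                 = f_{\text{ND}}\,\frac{B_{\text{avail}}}{K_D}
\]
```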
Results
Correlation analysis controlled for age and gender revealed highly significant positive correlations between regional postsynaptic 5-HT1A BPND and global radiation accumulated over 5 days (r = .32 to .48, p = .030 to .002). Highly significant differences in 5-HT1A BPND between subjects with low compared to high exposure to global radiation were also revealed (T = -2.63 to -3.77, p = .013 to .001): 5-HT1A BPND was 20% to 30% lower in the subject group exposed to the lower amount of global radiation.
Conclusion
Seasonal factors such as exposure to global radiation influence postsynaptic serotonin-1A receptor binding in various brain regions in healthy human subjects. Combined with the seasonal alterations in serotonin turnover and 5-HTT availability revealed in recent studies, our results make an essential contribution to understanding the molecular mechanisms underlying seasonal changes in human serotonergic neurotransmission.
Regional alterations of serotonergic neurotransmission and of functional activation in the amygdalar region of patients with major depression underscore its important role in affective disorders. In this study we used fMRI and PET to describe functional and molecular alterations associated with an astrocytoma in the left amygdalar region in a patient with organic depressive disorder compared to control subjects.
Methods
Serotonin-1A (5-HT1A) receptor binding (BPND) was quantified with PET (30 frames, 90 min, 4.4 mm FWHM) in 36 subjects using the radioligand [carbonyl-11C]WAY-100635 and a reference tissue model (MRTM2). For fMRI (3 T, EPI with 1.6 × 2.7 mm in-plane resolution, 10 AC-PC-oriented slices, 3 mm slice thickness, TE/TR = 31/1000 ms), 32 participants performed emotion discrimination and sensorimotor control tasks. Statistical analyses with SPM5 and unpaired t-tests were performed on the molecular and functional data separately.
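As a minimal illustration of the group comparison named in the methods, here is a sketch of an unpaired t-test on synthetic regional BPND values (the study's actual analysis was voxel-wise in SPM5; the numbers below are invented):

```python
# Illustrative unpaired t-test on synthetic regional BPND values;
# the study's actual analysis was voxel-wise in SPM5.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_a = rng.normal(1.8, 0.3, size=32)  # hypothetical regional BPND, group A
group_b = rng.normal(1.5, 0.3, size=32)  # hypothetical regional BPND, group B
t, p = stats.ttest_ind(group_a, group_b)
print(f"t = {t:.2f}, p = {p:.4f}")
```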
Results
The astrocytoma was delineated in the serotonin-1A receptor distribution, showing a regional BPND decrease (p < 0.01, uncorrected). The ipsilateral thalamus and bilateral habenula regions displayed a BPND increase (p < 0.001, uncorrected). The fMRI data showed significantly (p < 0.05, uncorrected) reduced activation in the affected amygdalar region, ipsilateral fusiform gyrus, bilateral orbitofrontal cortex and temporal regions, and increased activation in the contralateral temporal pole.
Conclusions
Lower serotonin-1A receptor binding in the left amygdalar region reflects the glial provenance of the tumor. The increased receptor binding in the habenulae might be associated with altered monoaminergic neurotransmission and depressive symptoms, given the influence of the habenulae on monoaminergic nuclei. The functional data demonstrate neuroplastic changes beyond the affected areas and might indicate compensatory mechanisms.
Mental health service delivery in the general health care sector is limited with regard to understanding the magnitude and impact of mental illness in the medically ill (co-morbidity), as well as the significance of current mental health service delivery. A new model, under development in the framework of a Biomed2 grant, is presented. It consists of case-finding through prediction of complexity of hospital care (COMPRI), followed by an integral health service needs assessment (INTERMED). It might serve to develop a more structural relationship with the general health care sector for the management of mentally co-morbid, high-utilizing patients.
While the art and science of disaster triage continue to evolve, the education of US health care students in matters pertaining to disaster preparedness and response remains stifled. Unfortunately, these students will be assuming major decision-making responsibilities regarding catastrophes that will be complicated by climate change, nuclear threats, global terrorism, and pandemics. Meanwhile, Sort, Assess, Life-Saving Interventions, Treatment, and/or Transport (SALT) triage is being advocated over the globally popular Simple Triage and Rapid Treatment (START) algorithm for multiple reasons: (1) it is an all-hazards approach; (2) it includes four medical interventions; and (3) it has an additional triage color for victims with non-survivable injuries.
Hypothesis/Problem
As present-day threats become more ominous and health care education emphasizes the needs of vulnerable populations and palliative care, the authors hypothesize that, when given a choice, health care students will prefer SALT triage.
Methods
A convenience sample of 218 interprofessional, disaster-naïve health care students received just-in-time, unbiased education on both the START and SALT triage systems. Students then completed a survey asking them to decide which triage system they believed would be most effective in their community.
Results
A total of 123 health care students (56.4%) preferred SALT while 95 (43.6%) preferred START; however, only the physician assistant students showed a statistically significant preference (28 versus 6, respectively; P = .042). Interestingly, there was also a statistically significant difference in preference by gender (Chi-square = 5.02; P = .025) for the observed versus expected distribution across SALT and START: females preferred SALT (61.0%) while males preferred START (55.9%).
Among those who preferred START, the most important reason cited was that START was easier to learn. Among those who preferred SALT, the most important reason cited was that the number of patient triage categories seemed more logical, comprehensible, and consistent with traditional medical care.
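The gender comparison reported above can be reproduced in outline. The cell counts below are reconstructed from the reported totals (123 SALT, 95 START) and percentages (61.0% of females preferring SALT, 55.9% of males preferring START), so they are a plausible reconstruction rather than published data:

```python
# Chi-square test of preference by gender; counts reconstructed from the
# reported totals and percentages, not taken from the published data set.
from scipy.stats import chi2_contingency

#         SALT  START
table = [[97,   62],   # females (reconstructed counts)
         [26,   33]]   # males   (reconstructed counts)
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")  # ~5.02 and ~.025, as reported
```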
Conclusion:
While SALT's preference among females and physician assistant students was based on the addition of medical interventions and the provision of palliative care, START's preference was related to expediency. Based on this research, incorporating disaster concepts into US health care students' curricula encourages thoughtful consideration among future health care leaders about the most effective approach to triage care. Further research is critical to determine, without reservation, which triage system will not only save the most lives but also provide the most humane care to victims.
Fink BN, Rega PP, Sexton ME, Wishner C. START versus SALT triage: which is preferred by the 21st-century health care student? Prehosp Disaster Med. 2018;33(4):381–386.
Ammoniation of montmorillonites pillared with polyhydroxo-complexes of Al was performed under flow conditions at 773–1073 K and monitored by IR spectroscopy. Interaction of the clay with ammonia revealed exchange of terminal and bridging -OH groups for NH2-, NH- and N- species, depending on the reaction temperature. Amination begins at 773 K, while formation of NH groups starts at 873 K; transformation of NH- into N-species occurs at 973 K. A mechanism for the ammoniation reaction is proposed, and a series of relative reactivity of the OH groups was established: Si–OH > Mg–OH–Mg > Al–OH–Mg > Al–OH–Al. This series was rationalized in terms of the accessibility of the OH groups.
Deglaciation chronologies for some sectors of former ice sheets are relatively poorly constrained because of the paucity of features or materials traditionally used to constrain the timing of deglaciation. In areas without good deglaciation varve chronologies and/or without widespread occurrence of material that indicates the start of earliest organic radiocarbon accumulations suitable for radiocarbon dating, typically only general patterns and chronologies of deglaciation have been deduced. However, mid-latitude ice sheets that had warm-based conditions close to their margins often produced distinctive deglaciation landform assemblages, including eskers, deltas, meltwater channels and aligned lineation systems. Because these features were formed or significantly altered during the last glaciation, boulder or bedrock samples from them have the potential to yield reliable deglaciation ages using terrestrial cosmogenic nuclides (TCN) for exposure age dating. Here we present the results of a methodological study designed to examine the consistency of TCN-based deglaciation ages from a range of deglaciation landforms at a site in northern Norway. The strong coherence between exposure ages across several landforms indicates great potential for using TCN techniques on features such as eskers, deltas and meltwater channels to enhance the temporal resolution of ice-sheet deglaciation chronologies over a range of spatial scales.
Lateral moraines constructed along west-to-east-sloping outlet glaciers from mountain-centred, pre-last glacial maximum (LGM) ice fields of limited extent remain largely preserved in the northern Swedish landscape despite overriding by continental ice sheets, most recently during the last glacial. From field evidence, including geomorphological relationships and a detailed weathering profile with a buried soil, we have identified seven such lateral moraines that were overridden by the expansion and growth of the Fennoscandian ice sheet. Cosmogenic 10Be and 26Al exposure ages of 19 boulders from the crests of these moraines, combined with the field evidence, are correlated to episodes of moraine stabilisation, Pleistocene surface weathering, and glacial overriding. The last deglaciation event dominates the exposure ages, with 10Be and 26Al data derived from 15 moraine boulders indicating regional deglaciation 9600 ± 200 yr ago. This is the most robust numerical age for the final deglaciation of the Fennoscandian ice sheet. The older apparent exposure ages of the remaining boulders (14,600–26,400 yr) can be explained by cosmogenic nuclide inheritance from previous exposure of the moraine crests during the last glacial cycle. Their potential exposure history, based on local glacial chronologies, indicates that the current moraine morphologies formed at the latest during marine oxygen isotope stage 5. Although numerous deglaciation ages were obtained, this study demonstrates that numerical ages need to be treated with caution and assessed in light of geomorphological evidence, as moraines are not necessarily formed by the event that dominates the cosmogenic nuclide data.
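For readers unfamiliar with TCN dating, here is a minimal sketch of the standard zero-erosion exposure-age calculation (the production rate and concentration below are illustrative, not values from the study; real analyses apply scaling schemes and corrections for shielding, erosion and inheritance):

```python
# Zero-erosion TCN exposure age: N = (P / lam) * (1 - exp(-lam * t)),
# solved for t. Inputs below are illustrative, not values from the study.
import math

LAMBDA_BE10 = math.log(2) / 1.387e6  # 10Be decay constant (1/yr)

def exposure_age(concentration, production_rate, lam=LAMBDA_BE10):
    """Exposure age in years, assuming no erosion and no inheritance."""
    return -math.log(1 - lam * concentration / production_rate) / lam

# e.g. ~4.8e4 atoms/g 10Be at a local production rate of 5 atoms/g/yr
print(f"{exposure_age(4.8e4, 5.0):,.0f} yr")  # roughly 9,600 yr
```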