This editorial considers the value and nature of academic psychiatry by asking what defines the specialty and psychiatrists as academics. We frame academic psychiatry as a way of thinking that benefits clinical services and discuss how to inspire the next generation of academics.
The ability to remotely monitor cognitive skills is increasing with the ubiquity of smartphones. The Mobile Toolbox (MTB) is a new measurement system that includes measures assessing Executive Functioning (EF) and Processing Speed (PS): Arrow Matching, Shape-Color Sorting, and Number-Symbol Match. The purpose of this study was to assess their psychometric properties.
Method:
MTB measures were developed for smartphone administration based on constructs measured in the NIH Toolbox® (NIHTB). Psychometric properties of the resulting measures were evaluated in three studies with participants ages 18 to 90. In Study 1 (N = 92), participants completed MTB measures in the lab and were administered both the equivalent NIHTB measures and other external measures of similar cognitive constructs. In Study 2 (N = 1,021), participants completed the equivalent NIHTB measures in the lab and then took the MTB measures on their own, remotely. In Study 3 (N = 168), participants completed MTB measures twice remotely, two weeks apart.
Results:
All three measures exhibited very high internal consistency and strong test-retest reliability, as well as moderately high correlations with comparable NIHTB tests and moderate correlations with external measures of similar constructs. Phone operating system (iOS vs. Android) had a significant impact on performance for Arrow Matching and Shape-Color Sorting, but no impact on either validity or reliability.
Conclusions:
Results support the reliability and convergent validity of MTB EF and PS measures for use across the adult lifespan in remote, self-administered designs.
The diagnosis of functional constipation (FC) relies on patient-reported outcomes evaluated as criteria based on the clustering of symptoms. Although the ROME IV criteria for FC diagnosis are relevant for a multicultural population(1), how an individual’s lifestyle, environment, and culture may influence the pathophysiology of FC remains a gap in our knowledge. Building on insights into mechanisms underpinning disorders of gut-brain interactions (formerly functional gastrointestinal disorders) in the COMFORT Cohort(2), this study aimed to investigate differences in gastrointestinal (GI) symptom scores between Chinese and non-Chinese New Zealanders with FC, relative to healthy controls. The Gastrointestinal Understanding of Functional Constipation In an Urban Chinese and Urban non-Chinese New Zealander Cohort (GUTFIT) study was a longitudinal cohort study that aimed to determine a comprehensive profile of characteristics and biological markers of FC in Chinese and non-Chinese New Zealanders. Chinese (classified according to maternal and paternal ethnicity) or non-Chinese (mixed ethnicities) adults living in Auckland, classified as with or without FC based on the ROME IV criteria, were enrolled. GI symptoms, anthropometry, quality of life, diet, and biological samples were assessed monthly for three months (March to June 2023). Demographics were obtained through a self-reported questionnaire, and GI symptoms were assessed using the Gastrointestinal Symptom Rating Scale (GSRS) and the Structured Assessment of Gastrointestinal Symptoms Scale (SAGIS). This analysis is a cross-sectional assessment of patient-reported GI symptom outcomes. Of 78 enrolled participants, 66 completed the study (male, n = 10; female, n = 56) and were distributed across four groups: Chinese with FC (Ch-FC; n = 11), Chinese control (Ch-CON; n = 19), non-Chinese with FC (NCh-FC; n = 16), and non-Chinese control (NCh-CON; n = 20).
Mean (SD) age, body mass index, and waist circumference were 40 ± 9 years, 22.7 ± 2.5 kg/m2, and 78.0 ± 7.6 cm, respectively. Ethnicity did not affect SAGIS domain scores for GI symptoms (ethnicity × FC severity interaction, p>0.05). However, the constipation symptoms domain of the GSRS was scored differently depending on ethnicity and FC status (ethnicity × FC interaction, p<0.05). In post hoc comparison, NCh-FC tended to have higher GSRS constipation severity scores than Ch-FC (Ch-FC 3.4 ± 1.0 versus NCh-FC 3.8 ± 0.8, out of 8; p<0.1). Although constipation symptom severity tended to be higher in NCh-FC, on the whole, ethnicity did not explain variation in this cohort. FC status was a more important predictor of GI symptom scores. Future research will assess differences in symptom burden to explore ethnicity-specific characteristics of FC.
Distinct pathophysiology has been identified in disorders of gut-brain interactions (DGBI), including functional constipation (FC)(1,2), yet the causes remain unclear. Identifying how modifiable factors (i.e., diet) differ depending on gastrointestinal health status is important to understand relationships between dietary intake, pathophysiology, and disease burden of FC. Given that dietary choices are culturally influenced, understanding ethnicity-specific diets of individuals with FC is key to informing appropriate symptom management and prevention strategies. Despite distinct genetic and cultural features of Chinese populations with increasing FC incidence(3), DGBI characteristics are primarily described in Caucasian populations(2). We therefore aimed to identify how the dietary intake of Chinese individuals with FC differs from that of non-Chinese individuals with FC, relative to healthy controls. The Gastrointestinal Understanding of Functional Constipation In an Urban Chinese and Urban non-Chinese New Zealander Cohort (GUTFIT) study was a longitudinal case-control study using systems biology to investigate the multi-factorial aetiology of FC. Here we conducted a cross-sectional dietary intake assessment, comparing Chinese individuals with FC (Ch-FC) against three control groups: a) non-Chinese with FC (NCh-FC), b) Chinese without FC (Ch-CON), and c) non-Chinese without FC (NCh-CON). Participants were recruited from Auckland, New Zealand (NZ); Chinese ethnicity was defined by self-identification together with both parents self-identifying as Chinese, and FC was defined using the ROME IV criteria. Dietary intake was captured using 3-day food diaries recorded on consecutive days, including one weekend day. Nutrient analysis was performed using Foodworks 10, and statistical analysis was performed in SPSS using a generalised linear model (with ethnicity and FC status as fixed factors). Of 78 enrolled participants, 66 completed the study and 64 (39.4 ± 9.2 years) completed a 3-day food diary at the baseline assessment.
More participants were female (84%) than male (16%). FC and ethnicity status allocated participants into 1 of 4 groups: Ch-FC (n = 11), Ch-CON (n = 18), NCh-FC (n = 16), NCh-CON (n = 19). Within NCh, ethnicities included NZ European (30%), non-Chinese Asian (11%), Other European (11%), and Latin American (2%). Fibre intake did not differ between Ch-FC and NCh-FC (ethnicity × FC status interaction, p>0.05) but was independently lower overall for FC than CON individuals (21.8 ± 8.7 versus 27.0 ± 9.7 g, p<0.05) and overall for Ch than NCh (22.1 ± 8.0 versus 27.0 ± 10.4 g, p<0.05). Carbohydrate, protein, and fat intakes did not differ across groups (p>0.05 for each). Thus, in terms of fibre and macronutrient intake, there was no difference between Ch-FC and NCh-FC, and fibre and macronutrients are therefore unlikely to contribute to potential pathophysiological differences in FC between ethnic groups. A more detailed assessment of dietary intake concerning micronutrients, types of fibre, or food choices may be indicated to ascertain whether other dietary differences exist.
On the basis of neutron diffraction studies, the two inner-hydroxyl ions in highly ordered kaolinite were recently shown to be differently oriented. One of the inner-hydroxyl ions points generally toward a hole in the octahedral sheet and the other toward a hole in the tetrahedral sheet. These orientations and the locations of the other atoms in the primitive triclinic unit cell have now been determined for a sample of Keokuk kaolinite with improved precision compared with that reported earlier. Rietveld structure refinement was carried out for the entire crystal structure simultaneously (99 atom positional and 17 other parameters) with each of two newly collected sets of high-resolution neutron powder diffraction data. The different orientations of the inner-hydroxyl ions are the most marked evidence that the unit cell is not C centered. The positions of the inner-surface hydrogen atoms provide further evidence in that all differ from a C-centered relationship by six to eight estimated standard deviations in their y coordinates. The cell is, therefore, not centered. The space group is P1.
As part of the Research Domain Criteria (RDoC) initiative, the NIMH seeks to improve experimental measures of cognitive and positive valence systems for use in intervention research. However, many RDoC tasks have not been psychometrically evaluated as a battery of measures. Our aim was to examine the factor structure of 7 such tasks chosen for their relevance to schizophrenia and other forms of serious mental illness. These include the n-back, Sternberg, and self-ordered pointing tasks (measures of the RDoC cognitive systems working memory construct); flanker and continuous performance tasks (measures of the RDoC cognitive systems cognitive control construct); and probabilistic learning and effort expenditure for reward tasks (measures of reward learning and reward valuation constructs).
Participants and Methods:
The sample comprised 286 cognitively healthy participants who completed novel versions of all 7 tasks via an online recruitment platform, Prolific, in the summer of 2022. The mean age of participants was 38.6 years (SD = 14.5, range 18-74), 52% identified as female, and stratified recruitment ensured an ethnoracially diverse sample. Excluding time for instructions and practice, each task lasted approximately 6 minutes. Task order was randomized. We computed scores for each task, including signal detection d-prime measures for the n-back, Sternberg, and continuous performance tasks, mean accuracy for the flanker task, win-stay to win-shift ratio for the probabilistic learning task, and trials completed for the effort expenditure for reward task. We used parallel analysis and a scree plot to determine the number of latent factors measured by the 7 task scores. Exploratory factor analysis with oblimin (oblique) rotation was used to examine the factor loading matrix.
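For readers unfamiliar with the signal-detection scoring mentioned above, d-prime is conventionally the difference between the z-transformed hit and false-alarm rates. The sketch below is a generic illustration of that formula, not the study's actual scoring code; the 1/(2N) correction for extreme rates is one common convention among several:

```python
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float, n_trials: int = 100) -> float:
    """Signal-detection sensitivity: d' = z(hit rate) - z(false-alarm rate)."""
    # Nudge rates of exactly 0 or 1 by 1/(2N) -- a common correction --
    # so the inverse normal CDF stays finite.
    def clamp(p: float) -> float:
        lo = 1.0 / (2 * n_trials)
        return min(max(p, lo), 1.0 - lo)

    z = NormalDist().inv_cdf
    return z(clamp(hit_rate)) - z(clamp(fa_rate))

# e.g., 80% hits with 20% false alarms gives d' of about 1.68
print(d_prime(0.8, 0.2))
```

Higher d' indicates better discrimination of targets from non-targets; d' = 0 corresponds to chance performance.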
Results:
The scree plot and parallel analyses of the 7 task scores suggested three primary factors. The flanker and continuous performance task both strongly loaded onto the first factor, suggesting that these measures are strong indicators of cognitive control. The n-back, Sternberg, and self-ordered pointing tasks strongly loaded onto the second factor, suggesting that these measures are strong indicators of working memory. The probabilistic learning task solely loaded onto the third factor, suggesting that it is an independent indicator of reinforcement learning. Finally, the effort expenditure for reward task modestly loaded onto the second but not the first and third factors, suggesting that effort is most strongly related to working memory.
Conclusions:
Our aim was to examine the factor structure of 7 RDoC tasks. Results support the RDoC formulation of cognitive control, working memory, and reinforcement learning as separable constructs. However, effort is a factorially complex construct that is not uniquely, or even most strongly, related to positive valence. Thus, there is reason to believe that at least 6 of these tasks are appropriate measures of constructs such as working memory, reinforcement learning, and cognitive control.
Translatability of preclinical results remains a major obstacle in neuropsychiatric research. Even when cognitive tests in preclinical models show translational validity for human testing, with sensitivity to clinical deficits, there remains the issue of heterogeneity among human participants. Norming of performance on cognitive tasks enables correction for differences in performance that may arise from the influence of socioeconomic factors, and thus a more direct comparison with preclinical testing results. The 5-choice continuous performance task (5C-CPT) is a test sensitive to changes in sustained attention and cognitive control in rodent manipulations and in clinical populations, including schizophrenia and bipolar disorder. Herein, we present normed results of 5C-CPT data from a cohort of human participants, enabling greater comparison to future clinical and rodent testing.
Participants and Methods:
5C-CPT data were generated from participants in the Translational Methamphetamine AIDS Research Center study (n=82) and a study of bipolar disorder (n=45). Participant demographics were as follows: age M=38.5, SD=16.7; education M=14.5 years, SD=1.9; 45% female; 10% Asian, 17% African American, 27% Hispanic, and 46% non-Hispanic White. We used the test2norm R package to create norms for each of the major outcomes of the 5C-CPT. Non-normally distributed raw scores were transformed to generate the more normally distributed data needed for the norming process. Raw scores were first converted into uniform scaled scores ranging from 0 to 20, where a higher score indicated better performance. We then generated T-score formulas, which standardize residuals and scale them to have a mean of 50 and a standard deviation of 10. The residuals are obtained from regressions, modeled using the multiple fractional polynomial (MFP) method, which regresses scaled scores on the demographic variables a user wishes to control for (gender, age, education, ethnicity, etc.). MFP models allow fitting of non-linear effects for numeric demographic factors (e.g., age), if such effects exist.
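The raw-to-scaled-to-T-score pipeline described above follows a standard norming logic. The sketch below illustrates that logic on simulated data, substituting an ordinary least-squares fit for test2norm's fractional-polynomial regressions; all data and variable names are illustrative only, not study values:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(0)

# Simulated raw scores with an age effect (illustrative only, not study data)
n = 300
age = rng.uniform(18, 75, n)
raw = 120 - 0.5 * age + rng.normal(0, 8, n)

# Step 1: rank-normalize raw scores into scaled scores (mean 10, SD 3)
pct = (raw.argsort().argsort() + 0.5) / n            # mid-rank percentiles
scaled = 10 + 3 * np.array([NormalDist().inv_cdf(p) for p in pct])

# Step 2: regress scaled scores on demographics (OLS here; test2norm uses MFP)
X = np.column_stack([np.ones(n), age])
beta, *_ = np.linalg.lstsq(X, scaled, rcond=None)
resid = scaled - X @ beta

# Step 3: T-scores are standardized residuals rescaled to mean 50, SD 10
t_scores = 50 + 10 * resid / resid.std()

print(round(t_scores.mean(), 1), round(t_scores.std(), 1))
```

A T-score of 50 then denotes performance exactly at the demographic expectation, with each 10 points representing one standard deviation above or below it.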
Results:
New, demographically corrected T-score formulas were calculated for each major outcome of the 5C-CPT: reaction time (MCL), reaction time variability (VarRT), d-prime, hit rate (HR), and false-alarm rate (FAR). MFP models showed that age had a significant effect on MCL, VarRT, d-prime, and HR (all p<0.01), while gender showed a significant effect only for MCL and VarRT (both p<0.05). Interestingly, education and ethnicity did not show a significant effect in any MFP model, and none of the demographic factors (age, education, gender, ethnicity) were significant in the model for FAR. As defined in the test2norm package, all scaled scores had a mean of 10 and SD of 3, and all T-scores had a mean of 50 and SD of 10.
Conclusions:
The 5C-CPT is a test of attention and cognitive control available for human testing, reverse-translated from rodent studies. The normative data generated here will enable future comparisons of data without the need for additional control studies. Furthermore, comparing these normative data to manipulations will enable further comparisons to rodent testing, with manipulations relative to baseline becoming more meaningful. Thus, the 5C-CPT is a viable tool for conducting cross-species translational research toward developing novel therapeutics that treat dysfunction in attention and cognitive control.
We review impacts of climate change, energy scarcity, and economic frameworks on the sustainability of natural and human systems in coastal zones, areas of high biodiversity, productivity, population density, and economic activity. More than 50% of the global population lives within 200 km of a coast, mostly in tropical developing countries. These systems developed during stable Holocene conditions, when the earth warmed and became wetter and more productive. Changes in global forcings are now threatening the sustainability of coastal ecosystems and populations. Climate change is impacting coastal systems via sea level rise, stronger tropical cyclones, changes in basin inputs, and extreme weather events. These impacts are passing tipping points as the fossil fuel-powered industrial-technological-agricultural revolution has overwhelmed the source–sink functions of the biosphere and degraded natural systems. The current status of industrialized society is primarily the result of fossil fuel (FF) use. FFs currently provide more than 80% of global primary energy, a share projected to decline to 50% by mid-century. This has profound implications for societal energy requirements, including the transition to a renewable economy. The development of the industrial economy allowed coastal social systems to become spatially separated from their dominant energy and food sources. This separation will become more difficult to maintain with the fading of cheap energy. It seems inevitable that past growth in energy use, resource consumption, and economic output cannot be sustained, and coastal areas are at the forefront of these challenges. Rapid planning and cooperation are necessary to minimize the impacts of the changes associated with the coming transition.
There is an urgent need for a new economic framework to guide society through the transition as mainstream neoclassical economics is not based on natural sciences and does not adequately consider either the importance of energy or the work of nature.
Introduction. Some medical centers and surgeons require patients to stop smoking cigarettes prior to elective orthopaedic surgeries in an effort to decrease surgical complications. Given higher rates of smoking among rural individuals, rural patients may be disproportionately impacted by these requirements. We assessed the perceptions and experiences of rural-residing Veterans and clinicians related to this requirement. Methods. We conducted qualitative semistructured one-on-one interviews of 26 rural-residing Veterans, 10 VA orthopaedic surgery staff (from two Veterans Integrated Services Networks), 24 PCPs who serve rural Veterans (14 VA; 10 non-VA), and 4 VA pharmacists. Using the knowledge, attitudes, and behavior framework, we performed conventional content analysis. Results. We found three primary themes across respondents: (1) knowledge of, and the evidence base for, the requirement varied widely; (2) respondents held strong personal attitudes toward the requirement; and (3) views differed on implementation and the possible implications of the requirement. All surgery staff reported knowledge of requirements at their institution. VA PCPs reported knowledge of requirements but typically could not recall specifics. Most patients were unaware. The majority of respondents felt this requirement could increase motivation to quit smoking. Some PCPs felt a more thorough explanation of smoking-related complications would result in increased quit attempts. About half of all patients reported belief that the requirement was reasonable, regardless of initial awareness. Respondents expressed little concern that the requirement might increase rural-urban disparities. Most PCPs and patients felt that there should be exceptions for allowing surgery, while surgical staff disagreed. Discussion. Most respondents thought elective surgery was a good motivator to quit smoking, but patients, PCPs, and surgical staff differed on whether there should be exceptions to the requirement that patients quit preoperatively.
Future efforts to augment perioperative smoking cessation may benefit from improving coordination across services and educating patients more about the benefits of quitting.
This article is a clinical guide which discusses the “state-of-the-art” usage of the classic monoamine oxidase inhibitor (MAOI) antidepressants (phenelzine, tranylcypromine, and isocarboxazid) in modern psychiatric practice. The guide is for all clinicians, including those who may not be experienced MAOI prescribers. It discusses indications, drug-drug interactions, side-effect management, and the safety of various augmentation strategies. There is a clear and broad consensus (more than 70 international expert endorsers), based on 6 decades of experience, for the recommendations herein exposited. They are based on empirical evidence and expert opinion—this guide is presented as a new specialist-consensus standard. The guide provides practical clinical advice, and is the basis for the rational use of these drugs, particularly because it improves and updates knowledge, and corrects the various misconceptions that have hitherto been prominent in the literature, partly due to insufficient knowledge of pharmacology. The guide suggests that MAOIs should always be considered in cases of treatment-resistant depression (including those melancholic in nature), and prior to electroconvulsive therapy—while taking into account patient preference. In selected cases, they may be considered earlier in the treatment algorithm than has previously been customary, and should not be regarded as drugs of last resort; they may prove decisively effective when many other treatments have failed. The guide clarifies key points on the concomitant use of incorrectly proscribed drugs such as methylphenidate and some tricyclic antidepressants. It also illustrates the straightforward “bridging” methods that may be used to transition simply and safely from other antidepressants to MAOIs.
Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated the potential drivers of seed shatter in terms of weather conditions, growing degree days, and plant biomass. Based on the results, weather conditions had no consistent impact on weed seed shatter. However, greater individual weed plant biomass was associated with delayed seed shatter at harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs from plants that have escaped early-season management practices and retained seed through harvest. However, smaller individuals within the same population that shatter seed before harvest risk escaping both early-season management and HWSC.
Globally, South Asia has the highest proportion of disabling hearing loss. There is a paucity of data exploring the associated hearing loss and disability caused by chronic middle-ear disease in South Asia in the setting of surgical outreach. This study aimed to measure disability using the World Health Organization Disability Assessment Schedule 2.0 in patients undergoing ear surgery for chronic middle-ear disease in an ear hospital in Nepal.
Method
The World Health Organization Disability Assessment Schedule 2.0 was translated into Nepali and administered by interview to patients before ear surgery, and results were correlated with pre-operative audiograms.
Results
Among the 106 patients (mean age 23 years), the mean World Health Organization Disability Assessment Schedule 2.0 score was 17.7; the highest domain score was for domain 6, ‘participation in society’, at 34. World Health Organization Disability Assessment Schedule 2.0 scores correlated positively with hearing level (r = 0.46).
Conclusion
This study measured disability in patients with ear disease in Nepal using the World Health Organization Disability Assessment Schedule 2.0. It demonstrated a correlation between impaired hearing and disability in a surgical outreach context, an expected but not previously reported finding.
Individuals with treatment-resistant depression (TRD) experience a high burden of illness. Current guidelines recommend a stepped care approach for treating depression, but the extent to which best-practice care pathways are adhered to is unclear.
Aims
To explore the extent and nature of ‘treatment gaps’ (non-adherence to stepped care pathways) experienced by a sample of patients with established TRD (non-response to two or more adequate treatments in the current depressive episode) across three cities in the UK.
Method
Five treatment gaps were considered and compared with guidelines, in a cross-sectional retrospective analysis: delay to receiving treatment, lack of access to psychological therapies, delays to medication changes, delays to adjunctive (pharmacological augmentation) treatment and lack of access to secondary care. We additionally explored participant characteristics associated with the extent of treatment gaps experienced.
Results
Of 178 patients with TRD, 47% had been in the current depressive episode for >1 year before initiating antidepressants, and 53% had received adequate psychological therapy. 47% and 51% had remained on an unsuccessful first and second antidepressant trial, respectively, for >16 weeks, and 24% and 27%, respectively, for >1 year before a medication switch. Further, 54% had tried three or more antidepressant medications within their episode, and only 11% had received adjunctive treatment.
Conclusions
There appears to be a considerable difference between treatment guidelines for depression and the reality of care received by people with TRD. Future research examining representative samples of patients could determine recommendations for optimising care pathways, and ultimately outcomes, for individuals with this illness.
High-quality diets have been found to be beneficial in preventing long-term weight gain. However, concurrent changes in diet quality and body weight over time have rarely been reported. We examined the association between 10-year changes in diet quality and body weight in the Multiethnic Cohort Study. Analyses included 53 977 African Americans, Native Hawaiians, Japanese Americans, Latinos and Whites, who completed both baseline (1993–1996, 45–69 years) and 10-year follow-up (2003–2008) surveys including a FFQ and had no history of heart disease or cancer. Using multivariable regression, weight changes were regressed on changes in four diet quality indexes, Healthy Eating Index-2015, Alternative Healthy Eating Index-2010, alternate Mediterranean Diet and Dietary Approaches to Stop Hypertension scores. Mean weight change over 10 years was 1·2 (sd 6·8) kg in men and 1·5 (sd 7·2) kg in women. Compared with stable diet quality (< 0·5 sd change), the greatest increase (≥ 1 sd increase) in the diet scores was associated with less weight gain (by 0·55–1·17 kg in men and 0·62–1·31 kg in women). Smaller weight gain with improvement in diet quality was found in most subgroups by race/ethnicity, baseline age and baseline BMI. The inverse association was stronger in younger age and higher BMI groups. Ten-year improvement in diet quality was associated with a smaller weight gain, which varied by race/ethnicity and baseline age and BMI. Our findings suggest that maintaining a high-quality diet and improving diet quality over time may prevent excessive weight gain.
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention likely is influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds in southern latitudes and shatter rate increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
A new analysis of the type material of cheralite specifies the individual rare earth elements. Unit cell contents are: (REE1.58Th1.15Ca1.03Pb0.05U0.15)3.96 (P3.67Si0.33)4.01O16. Refinement of the cell parameters based on new XRD data gives a = 6.7515 ± 0.0005 Å, b = 6.9625 ± 0.0005 Å, c = 6.468 ± 0.0005 Å, and β = 103° 53′, corresponding to a cell volume of 295.2 Å3. The space group is P21/n.
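As a quick arithmetic check, the quoted cell volume follows from the standard monoclinic relation V = abc·sin β applied to the refined parameters above:

```python
import math

# Refined cell parameters from the abstract
a, b, c = 6.7515, 6.9625, 6.468        # cell edges, Å
beta_deg = 103 + 53 / 60               # β = 103° 53′ in decimal degrees

# Monoclinic cell (alpha = gamma = 90°): V = a * b * c * sin(beta)
volume = a * b * c * math.sin(math.radians(beta_deg))
print(f"{volume:.1f} Å³")  # → 295.2 Å³
```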