The Grothendieck construction establishes an equivalence between fibrations (also known as fibred categories) and indexed categories, and is one of the fundamental results of category theory. Cockett and Cruttwell introduced the notion of fibrations into the context of tangent categories and proved that the fibres of a tangent fibration inherit a tangent structure from the total tangent category. The main goal of this paper is to provide a Grothendieck construction for tangent fibrations. Our first attempt will focus on providing a correspondence between tangent fibrations and indexed tangent categories, which are collections of tangent categories and tangent morphisms indexed by the objects and morphisms of a base tangent category. We will show that this construction inverts Cockett and Cruttwell’s result, but that it does not provide a full equivalence between these two concepts. In order to understand how to define a genuine Grothendieck equivalence in the context of tangent categories, inspired by Street’s formal approach to monad theory, we introduce a new concept: tangent objects. We show that tangent fibrations arise as tangent objects of a suitable $2$-category, and we employ this characterisation to lift the Grothendieck construction between fibrations and indexed categories to a genuine Grothendieck equivalence between tangent fibrations and tangent indexed categories.
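As a reminder of the classical construction that the paper generalises, the Grothendieck construction assembles an indexed category $F\colon \mathcal{B}^{\mathrm{op}} \to \mathbf{Cat}$ into a total category $\int F$. A standard sketch (the notation here is chosen for illustration):

```latex
% Total category \int F of an indexed category F : B^op -> Cat.
% Objects: pairs (b, x) with b an object of B and x an object of F(b).
\[
  \operatorname{Ob}\!\Big(\textstyle\int F\Big)
    \;=\; \big\{\, (b, x) \;\big|\; b \in \mathcal{B},\ x \in F(b) \,\big\},
\]
% Morphisms (b, x) -> (b', x'): a base morphism f together with a
% fibre morphism phi comparing x with the reindexed object F(f)(x').
\[
  \Big(\textstyle\int F\Big)\big((b, x), (b', x')\big)
    \;=\; \big\{\, (f, \varphi) \;\big|\;
          f \colon b \to b',\ \varphi \colon x \to F(f)(x') \,\big\}.
\]
% The projection \pi : \int F -> B, (b, x) |-> b, is a fibration, and
% this assignment extends to the classical equivalence between indexed
% categories and fibrations over B that the paper lifts to the tangent
% setting.
```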
This study evaluates the relationship between Systemic Immune-Inflammation Index and secondary tonsillar haemorrhage after tonsillectomy.
Methods
Sixty paediatric patients with secondary haemorrhage and 60 without bleeding were grouped for comparative analysis. Laboratory parameters and Systemic Immune-Inflammation Index values were collected preoperatively, on the bleeding day and on the control day, then compared.
Results
Secondary haemorrhage occurred in 60 patients (3.11 per cent), with a mean age of 8.85 ± 3.07 years. Bleeding occurred at 8.63 ± 2.32 days post-operatively (range: 72 hours–21 days). On the tonsillectomy day, neutrophil count and Systemic Immune-Inflammation Index were significantly higher in the haemorrhage group (p < 0.001). Comparing the haemorrhage day with the tonsillectomy day within the haemorrhage group, platelet and neutrophil counts and Systemic Immune-Inflammation Index increased, while lymphocyte counts decreased (p < 0.001). Comparing the haemorrhage day with the control day, neutrophil count and Systemic Immune-Inflammation Index remained significantly higher (p < 0.001).
Conclusion
Systemic Immune-Inflammation Index, a novel inflammatory marker, may help predict post-tonsillectomy haemorrhage risk.
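The Systemic Immune-Inflammation Index referred to throughout is conventionally computed as platelet count × neutrophil count / lymphocyte count. A minimal sketch of that formula (the counts below are illustrative values, not data from this study):

```python
def systemic_immune_inflammation_index(platelets, neutrophils, lymphocytes):
    """SII = platelet count * neutrophil count / lymphocyte count.

    All counts must be in the same units (e.g. 10^9 cells/L); the index
    then carries the units of a single count.
    """
    return platelets * neutrophils / lymphocytes

# Illustrative values only (not taken from the study):
print(systemic_immune_inflammation_index(300.0, 6.0, 2.0))  # 900.0
```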
Direct numerical simulation (DNS) studies of power-law (PL) fluids are performed for purely viscous-shear-thinning ($n\in [0.5,0.75]$), Newtonian ($n=1$) and purely viscous-shear-thickening ($n=2.0$) fluids, considering two Reynolds numbers ($Re_{\tau }\in [395,590]$), and both smooth and rough surfaces. We carefully designed a numerical experiment to isolate key effects and simplify the complex problem of turbulent flow of non-Newtonian fluids over rough surfaces, enabling the development of a theoretical model to explain the observed phenomena and provide predictions. The DNS results of the present work were validated against literature data for smooth and rough Newtonian turbulent flows, as well as smooth shear-thinning cases. A new analytical expression for the mean velocity profile – extending the classical Blasius $1/7$ profile to power-law fluids – was proposed and validated. In contrast to common belief, the decrease in $n$ leads to smaller Kolmogorov length scales and the formation of larger structures, requiring finer grids and longer computational domains for accurate simulations. Our results confirm that purely viscous shear-thinning fluids exhibit drag reduction, while shear-thickening fluids display an opposite trend. Interestingly, we found that viscous-thinning turbulence shares similarities with Newtonian transitional flows, resembling the behaviour of shear-thinning, extensional-thickening viscoelastic fluids. This observation suggests that the extensional and elastic effects in turbulent flows within constant cross-section geometries may not be significant. However, the shear-thickening case exhibits characteristics similar to high-Reynolds-number Newtonian turbulence, suggesting that phenomena observed in such flows could be studied at significantly lower Reynolds numbers, reducing computational costs. 
In the analysis of rough channels, we found that the recirculation bubble between two roughness elements is mildly influenced by the thinning nature of the fluid. Moreover, we observed that shear-thinning alters the flow in the fully rough regime, where the friction factor typically reaches a plateau. Our results indicate the possibility that, at sufficiently high Reynolds numbers, this plateau may not exist for shear-thinning fluids. Finally, we provide detailed turbulence statistics for different rheologies, allowing, for the first time, an in-depth study of the effects of rheology on turbulent flow over rough surfaces.
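The power-law (Ostwald–de Waele) rheology underlying these simulations models the apparent viscosity as $K\dot{\gamma}^{n-1}$, so $n<1$ gives shear-thinning and $n>1$ shear-thickening behaviour. A minimal sketch (the consistency index `K` and the sample shear rates are illustrative assumptions):

```python
def apparent_viscosity(K, n, shear_rate):
    """Power-law (Ostwald-de Waele) fluid: mu_app = K * |gamma_dot|**(n - 1)."""
    return K * abs(shear_rate) ** (n - 1)

# With K = 1: for n = 0.5 (shear-thinning) viscosity falls as shear rate
# grows, for n = 2.0 (shear-thickening) it rises, and n = 1 recovers a
# constant Newtonian viscosity.
for n in (0.5, 1.0, 2.0):
    print(n, [apparent_viscosity(1.0, n, g) for g in (1.0, 4.0)])
```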
A common feature of public policy in Australia in recent decades has been the use of wage caps to restrain public sector wage growth. In this paper we explore the nature of the relationship between wage growth in the public and private sectors, and thereby whether wage caps have also influenced private sector wage growth. Despite the differences in wage setting institutions and mechanisms, the analysis presented reveals that private and public sector wage growth are closely entwined at the aggregate level for Australia, and in all states and territories. Naïve Vector Error Correction Models identify the private sector as the long run wage leader for Australia and half the states and territories. However, after controlling for a structural break occurring during the COVID-19 era, our results indicate that joint or bi-directional wage leadership between both sectors is the norm. Findings suggest that wage caps implemented after the GFC to suppress public sector wage growth likely spilled over to the private sector, contributing to the widespread wage stagnation experienced throughout the 2010s. More recently, these public sector wage caps stifled the ability of public sector wages to adjust to rapid private sector wage growth. These findings have important policy implications for public sector wage policy as a key contributor to governments’ labour market and macroeconomic management.
Patient involvement enhances transparency, legitimacy, and responsiveness in pharmaceutical reimbursement decisions. Guided by the mosaic model, this study recognizes that effective patient engagement requires diverse, context-specific approaches. Despite the policies implemented by Taiwan’s National Health Insurance Administration (NHIA), gaps remain between intent and practice. This study evaluates NHIA’s incorporation of patient inputs into reimbursement decisions and examines factors influencing involvement.
Methods
We analyzed pharmaceutical company-initiated reimbursement submissions for catastrophic illnesses reviewed by the Pharmaceutical Benefit and Reimbursement Scheme Joint Committee (PBRS) from 2016 to 2023. Data sources included PBRS meeting records, the Online Patient Opinion Platform (OPOP), and NHIA notification E-mails. Generalized linear models identified predictors of patient involvement. The association between patient involvement and PBRS decisions was also explored.
Results
Patient involvement occurred in 28.4 percent (80/282) of all submissions, increasing from 17 percent (2016) to 44 percent (2023). Despite aligning with OPOP criteria, patient involvement remained incomplete. Discussion-type submissions, oncology drugs, and new drug applications showed higher involvement, whereas autoimmune diseases and new indication submissions had lower involvement. Budget impact and innovation categories were not significant predictors in adjusted models. The presence of patient involvement was not significantly associated with the PBRS approval rate. Ad hoc analysis revealed increased involvement for new indications following policy expansion.
Conclusions
Despite NHIA’s efforts, patient involvement implementation remains suboptimal. Structured mechanisms and expanded patient involvement beyond high-profile submissions and PBRS are crucial to broaden patient involvement. This study provides practical insights for East Asian healthcare systems advancing patient involvement amid limited empirical research.
The overall objective of this study is to shed light on the disaster preparedness status of geriatric patients visiting tertiary hospitals in Istanbul while assessing the relationship between frailty scores, self-efficacy, and independence among geriatric patients.
Methods
This prospective cross-sectional study was conducted in the Emergency Medicine Departments of 2 tertiary centers in Istanbul. In the survey, health and frailty status, demographics, and earthquake preparedness and planning were assessed. The Clinical Frailty Scale (CFS), Tilburg Frailty Indicator (TFI), and PRISMA-7 score were administered. Contingency tables were constructed to examine the associations between frailty categories and categorical outcomes related to disaster preparedness, self-efficacy, and independence.
Results
A small portion (5.4%) of patients had received earthquake preparedness training. Regarding emergency preparedness, 32.4% had easy access to a list of emergency contacts, and 32.1% knew the location of the emergency kit. A relationship was found between the presence of an earthquake preparedness kit and the CFS and TFI (P<0.005). All the self-efficacy and independence parameters needed during disasters were found to be significantly higher among frailer patients (P<0.005).
Conclusions
Inadequate disaster preparedness, characterized by low self-efficacy and high external dependence, is influenced by frailty. Enhancing disaster preparedness requires identifying and supporting frail individuals.
The green lacewing Chrysoperla zastrowi sillemi (Esben–Peterson) (Neuroptera: Chrysopidae), a polyphagous predator, is an effective biocontrol agent against various aphid species. Its efficacy was assessed against Pterochloroides persicae (Hemiptera: Aphididae), a major pest of peach and nectarine orchards. This study investigates the developmental biology, population growth parameters, host-kill dynamics, and aphid consumption of C. zastrowi sillemi when fed on P. persicae. The development of C. zastrowi sillemi stages was recorded, with egg and first, second and third larval instar durations averaging 2.21, 3.71, 2.29, and 3.21 days, respectively. Adult longevity was 34.33 days for males and 42.12 days for females. The female pre-ovipositional period was 6.25 days, with a total ovipositional period of 21.88 days. Population growth parameters indicated a true generation time of 35.39 ± 0.322 days, an intrinsic rate of increase of 0.110 and a net reproductive rate of 52.64. A total fecundity of 131.77 eggs per female was recorded. The consumption of P. persicae by the first, second and third larval instars of C. zastrowi sillemi was 18.36, 25.07, and 85.21 aphids, respectively, with the third instar being the most voracious. The net predation rate was 90.868 aphids per day, with a transformation rate of 1.84 aphids per offspring produced. These results highlight the potential of C. zastrowi sillemi as a biocontrol agent for P. persicae management in agro-ecosystems, offering insights into its predation behaviour and reproductive parameters, and will be useful in conducting further field evaluations before incorporating it into integrated pest management programmes.
Good welfare is of inherent value to all captive animals and promotes species conservation objectives. Concern for animal welfare is growing globally, and research shows that animal welfare is a top priority for zoo visitors. There is, therefore, an urgent need for zoos to develop and validate species-specific welfare assessment tools with a shift in focus away from avoiding negative affective states, and towards promoting positive ones. This shift in emphasis requires the development of comprehensive and robust welfare assessment protocols incorporating species-specific indicators. This study aimed to identify and propose welfare indicators for captive chimpanzees (Pan troglodytes) that could be used to adapt the EU Welfare Quality® protocol for this species. A literature review was carried out according to the Preferred Reporting Items for Systematic Reviews and Meta-analyses (PRISMA) guidelines and the authors followed the principles of a systematic review to identify a comprehensive set of welfare indicators for this species. Overall, 14 animal-based and 16 resource-based indicators are proposed to assess the 12 criteria and four principles of Welfare Quality®. This study represents the first effort to adapt the EU Welfare Quality® protocol to assess captive chimpanzee welfare and illustrates how this protocol can be adapted to develop a taxon-specific welfare assessment tool once species-specific natural history and biology are considered.
Theology has traditionally been understood as a speculative discipline centered on God. However, the increasing dominance of historical methods in modern theological inquiry raises a fundamental question: Has theology shifted from being a science of God to a study of historical documents? This article examines how four early 20th-century Dominicans from the Saulchoir—Antoine Lemonnyer, Mannès Jacquin, Marie-Benoît Schwalm, and Ambroise Gardeil—responded to this challenge. Writing in the context of the Modernist crisis, they defended the primacy of speculative theology while integrating historical studies within a Thomistic framework. Their work articulated a synthesis in which historical research serves theology without displacing its speculative and supernatural character. These insights remain relevant for contemporary theological discourse, offering a model for balancing historical inquiry with the contemplative and systematic study of God.
To evaluate early postoperative complaints using the Palate Postoperative Problems Score in patients undergoing modified barbed reposition pharyngoplasty with tonsillectomy versus tonsillectomy alone.
Methods
The study included 40 patients who underwent modified barbed reposition pharyngoplasty with tonsillectomy and 18 patients who had tonsillectomy alone. Patients completed the Palate Postoperative Problems Score questionnaire at the first, third and sixth months post-operatively, and changes in their complaints were observed. Additional data included the Epworth Sleepiness Scale and sleep parameters (apnoea-hypopnoea index, body mass index and oxygen saturation).
Results
In the modified barbed reposition pharyngoplasty group, Palate Postoperative Problems Scores decreased significantly from 8.85 (month 1) to 4.07 (month 6). The tonsillectomy group also showed significant improvement (from 5.28 to 2.61 by month 3).
Conclusion
The Palate Postoperative Problems Score questionnaire is an effective tool for assessing post-operative symptoms after palate surgery. Repeated use enables monitoring of patient recovery, and the impact of tonsillectomy should be considered in Palate Postoperative Problems Score-based evaluations.
This text accompanies the performance A Foot, A Mouth, A Hundred Billion Stars, which premiered at the Lapworth Museum of Geology in the United Kingdom on 18 March 2023, as part of the Flatpack film festival. It includes both the text and a film version, developed during a residency at the museum. Over 18 months, I had full access to the collection and archives, selecting objects that served as prompts for stories about time and memory. A central theme of the work is slippage – misremembering and misunderstanding – as a generative methodology for exploring the connection between the collection, our past, and possible futures.
A Foot, A Mouth, A Hundred Billion Stars combines analogue media and digital technologies to examine our understanding of remembering and forgetting. I used a live digital feed and two analogue slide projectors to explore the relationships between image and memory. This article does not serve as a guide to the performance but instead reflects on the process and the ideas behind the work. My goal is to share my practice of rethinking memory through direct engagement with materials. In line with the performance’s tangential narrative, this text weaves together diverse references, locations, thoughts, and ideas, offering a deeper look into the conceptual framework of the work.
The next-generation radio astronomy instruments are providing a massive increase in sensitivity and coverage, largely through increasing the number of stations in the array and the frequency span sampled. The two primary problems encountered when processing the resultant avalanche of data are the need for abundant storage and the constraints imposed by I/O, as I/O bandwidths drop significantly on cold storage. An example of this is the data deluge expected from the SKA Telescopes of more than 60 PB per day, all to be stored on the buffer filesystem. While compressing the data is an obvious solution, the impacts on the final data products are hard to predict. In this paper, we chose an error-controlled compressor – MGARD – and applied it to simulated SKA-Mid and real pathfinder visibility data, in noise-free and noise-dominated regimes. As the data have an implicit error level in the system temperature, using an error bound in compression provides a natural metric for compression. MGARD ensures the compression-incurred errors adhere to the user-prescribed tolerance. To measure the degradation of images reconstructed using the lossy compressed data, we proposed a list of diagnostic measures, exploring the trade-off between these error bounds and the corresponding compression ratios, as well as the impact on science quality derived from the lossy compressed data products through a series of experiments. We studied the global and local impacts on the output images for continuum and spectral line examples. We found that relative error bounds of as much as 10%, which provide compression ratios of about 20, have a limited impact on the continuum imaging, as the increased noise is less than the image RMS, whereas a 1% error bound (compression ratio of 8) introduces an increase in noise of about an order of magnitude less than the image RMS.
For extremely sensitive observations and for very precious data, we would recommend a $0.1\%$ error bound with compression ratios of about 4. These have noise impacts two orders of magnitude less than the image RMS levels. At these levels, the limits are due to instabilities in the deconvolution methods. We compared the results to the alternative compression tool DYSCO, in both the impacts on the images and in the relative flexibility. MGARD provides better compression for similar error bounds and has a host of potentially powerful additional features.
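The error-controlled guarantee discussed above can be stated simply: for a relative error bound, every reconstructed value must lie within the prescribed tolerance scaled by the data's magnitude. A minimal sketch of the check (this is the generic criterion, not MGARD's API; the function name and normalisation by the maximum absolute value are illustrative assumptions):

```python
def within_relative_bound(original, reconstructed, rel_tol):
    """Check the guarantee of an error-bounded lossy compressor:
    max pointwise |error| <= rel_tol * max |original value|."""
    scale = max(abs(v) for v in original)
    max_err = max(abs(a - b) for a, b in zip(original, reconstructed))
    return max_err <= rel_tol * scale

# A reconstruction off by at most 0.05 on data of magnitude 10 satisfies
# a 1% relative bound but violates a 0.1% bound.
print(within_relative_bound([0.0, 10.0, -10.0], [0.05, 10.0, -9.95], 0.01))   # True
print(within_relative_bound([0.0, 10.0, -10.0], [0.05, 10.0, -9.95], 0.001))  # False
```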
Accelerator-driven systems (ADSs) may offer a promising technology for energy production and transmutation of nuclear waste. Here we introduce the concept of utilizing high-intensity laser acceleration technology in realizing an ADS, with a focus on the use of thorium fuel in subcritical systems. We explore state-of-the-art laser-driven particle sources for neutron generation by nuclear fusion, spallation or photonuclear reactions and the prospect of reaching the flux of ${10}^{15}$ n/s required to drive a subcritical reactor. We review recent advances in high-power laser amplification and assess their technological readiness in view of integration in an ADS. Finally, we present a risk analysis of a laser-driven ADS in terms of laser and target development, radiation safety and operational stability. Our conclusion highlights the potential of laser-driven ADSs as a transformative approach to nuclear fission energy. With continued research and development, technological hurdles can be overcome to fully realize sustainable, green energy production that can meet global energy demands while addressing safety and environmental concerns.
The training of Artificial Intelligence (AI) models relies on extensive amounts of “data,” often sourced from content protected by copyright, related and sui generis rights. The discussion of whether and how to strike a balance between licensing and exceptions under copyright law is one of global relevance. While some countries have adopted or considered adopting specific exceptions to allow text and data mining (TDM), others (most) have not introduced any new legislation. In Europe, much of the attention has so far centred on Article 4 of Directive 2019/790 (DSMD), including in the context of a potential UK reform.
The starting point of this contribution is the following four-fold observation. First, TDM may be part of AI training processes, but it is neither synonymous with AI training nor is it all that AI training entails, including in terms of acts restricted by copyright and related rights. Second, from a European (thus including both the EU and the UK) perspective, limiting the attention to Article 4 DSMD is myopic, as national case law demonstrates. Third, calls have recently been made to relax EU copyright rules to facilitate “research,” including, it seems, by the President of the European Commission herself, who announced forthcoming legislative proposals “to make Europe the home of innovation again.” Fourth, the UK Government’s Copyright and AI consultation has recently ended: should no reform be ultimately undertaken, the application of the existing TDM exception will depend to a large extent on how courts construe the notion of “research” and the “non-commercial” requirement thereof.
Moving from the above, this study investigates whether and to what extent unlicensed AI training activities could be undertaken by relying, not on Article 4 DSMD as transposed into national law or a hypothetical reform of the UK system of exceptions, but rather on what appear to be so far potentially overlooked defences. Reference is made specifically to research and education exceptions, notably Article 3 DSMD and Article 5(3)(a) of Directive 2001/29 (InfoSoc Directive), also read in light of Article 5 DSMD. The discussion of other jurisdictions – including the US and countries, like South Korea and Singapore, which have adopted open-ended fair use-style defences – is also undertaken. This is done to determine whether unlicensed AI training, including training seemingly done for the purpose of research or education/learning, might be considered lawful.
In light of the context summarized above, the study tackles two key questions: (a) whether unlicensed AI training may be classified as “research” or even “learning” in the context of “teaching,” and (b) whether commercial AI developers may take advantage of the provisions above. Ultimately, both questions are answered in the negative, finding that no exception or open-ended defence fully covers unlicensed AI training activities. As a result, a licensing approach (and culture) appears to be the way for AI training to be undertaken lawfully, including when this is done for “research” and “learning.”
This editorial essay describes what phenomenon-based research is and why it is important for conducting indigenous Chinese management research. Grounded in the Chinese context, the author identifies emerging new organizational phenomena in the digital age that call for new theoretical explanations and empirical validation. Adopting an evolution-of-theories perspective, the author outlines the various paths that can move a new theory explaining an indigenous phenomenon toward becoming a universal theory that can transcend time and space.
The 2007 adoption of the United Nations Declaration on the Rights of Indigenous Peoples (UNDRIP) marked a critical juncture in the area of Indigenous rights. As a nonbinding agreement, its adoption is at the discretion of each state, resulting in significant state-level variation. Importantly, within-state variations remain underexplored. These differences are potentially significant in federal, decentralized countries such as Canada. This article examines why some provinces and territories lead in implementing the key principles embedded in UNDRIP, whereas others have dragged their feet. We collected 230 Canadian regulations introduced at the subnational level between 2007 and 2023, and assessed the impact of three key variables (i.e. political ideology, resource politics and issue voting). We found that none of these variables explained within-state variations on their own. To further explore the role of these variables, we subsequently compared two provinces at different stages of the UNDRIP implementation spectrum (Québec and British Columbia).