In individuals with irritable bowel syndrome (IBS), eliminating dietary triggers can alleviate symptoms but may lead to nutrient deficiencies and overall health decline. Although various nutritional supplements show promising results in relieving IBS symptoms due to their potential to alter the microbiome, conclusive scientific evidence remains lacking. This exploratory study aimed to assess the bifidogenic properties of four nutritional supplement interventions and their impact on IBS symptoms, faecal microbiota composition, faecal short-chain fatty acid (SCFA) concentrations, stool pattern, and quality of life (QoL), compared with a placebo control. Seventy subjects with IBS, meeting the Rome IV criteria, participated in this randomised, double-blind, placebo-controlled parallel intervention study. Subjects were assigned to one of five groups, receiving resistant starch, pea fibre, chondroitin sulfate, protein hydrolysate, or placebo daily for four weeks. Daily reports on stool pattern and gastrointestinal complaints were collected. Stool samples and questionnaires on dietary intake, symptom severity, QoL, and anxiety and depression were collected at baseline and after the 4-week intervention. The results show no significant increase in Bifidobacterium abundance or faecal SCFA levels after the 4-week intervention with any of the four nutritional supplements. While some improvements in symptom severity and QoL were observed within groups, these were not significantly different from changes observed with placebo. In conclusion, the tested nutritional supplements did not increase Bifidobacterium abundance in subjects with IBS within four weeks. Furthermore, future studies should consider a run-in period and a larger sample size to detect improvements in IBS symptoms.
Legume lectins represent a broad class of environmental toxicants that bind to cell surface glycoproteins. Raw red kidney beans (RRKB), a widely consumed source of dietary protein, are rich in the lectin phytohemagglutinin (PHA). Consumption of improperly cooked red kidney beans (proper preparation may require overnight presoaking and boiling at 100°C for at least 45 min) causes severe gastrointestinal symptoms. Since the relationship between lectin toxicity and the cellular chaperone machinery remains unknown, this study aimed to determine the effects of heat-denatured PHA on epithelial barrier function and on heat shock protein 70 (HSP70) expression and its function as a molecular chaperone in PHA-treated Caco-2 cells and animals. Twelve male Sprague-Dawley rats were randomised to an ad libitum diet of either standard rat chow or chow containing 26% crude red kidney beans. We measured HSP70 and heat shock factor 1 gene expression in the small intestine and HSP70 protein expression in Caco-2 cells. In Caco-2 cells, luciferase activity was measured to assess protein folding. Fluorescein-5-isothiocyanate (FITC)-labelled lectin was used to study its intracellular uptake by Caco-2 cells. PHA reduced transepithelial electrical resistance in Caco-2 cells. FITC-labelled PHA entered Caco-2 cells within 3 h of treatment. PHA treatment significantly reduced HSP70 levels and luciferase activity in Caco-2 cells, which was prevented by HSP70 overexpression. In rats fed RRKB chow containing legume lectins, we found reduced levels of HSP70 and heat shock factor 1. These observations suggest that lectins counter the protective function of HSP70 in maintaining intestinal barrier function.
Inflammation and infections such as malaria affect concentrations of many micronutrient biomarkers and hence estimates of nutritional status. We aimed to assess the relationship between malaria infection and micronutrient biomarker concentrations in pre-school children (PSC), school-age children (SAC) and women of reproductive age (WRA) in Malawi and examine the potential role of malarial immunity on the relationship between malaria and micronutrient biomarkers. Data from the 2015/2016 Malawi micronutrient survey were used. The associations between current or recent malaria infection, detected by rapid diagnostic test, and concentrations of serum ferritin, soluble transferrin receptor (sTfR), zinc, serum folate, red blood cell folate and vitamin B12 were estimated using multivariable linear regression. Factors related to malarial immunity including age, altitude and presence of haemoglobinopathies were examined as effect modifiers. Serum ferritin, sTfR and zinc were adjusted for inflammation using the BRINDA method. Malaria infection was associated with 68 % (95 % CI 51, 86), 28 % (18, 40) and 34 % (13, 45) greater inflammation-adjusted ferritin in PSC, SAC and WRA, respectively (P < 0·001 for each). In PSC, the positive association was stronger in younger children, at higher altitudes, and in children who were not carriers of the sickle cell trait. In PSC and SAC, sTfR was elevated (+ 25 % (16, 29) and + 15 % (9, 22) respectively, P < 0·001). Serum folate and erythrocyte folate were elevated in WRA with malaria (+ 18 % (3, 35) and + 11 % (1, 23), P = 0·01 and P = 0·003 respectively). Malaria affects the interpretation of micronutrient biomarker concentrations, and examining factors related to malarial immunity may be informative.
Coronavirus disease 2019 (COVID-19) precipitated the rapid deployment of novel therapeutics, which led to operational and logistical challenges for healthcare organizations. Four health systems participated in a qualitative study to abstract lessons learned, challenges, and promising practices from implementing neutralizing monoclonal antibody (nMAb) treatment programs. Lessons are summarized under three themes that serve as critical building blocks for health systems to rapidly deploy novel therapeutics during a pandemic: (1) clinical workflows, (2) data infrastructure and platforms, and (3) governance and policy. Health systems must be sufficiently agile to quickly scale programs and resources in times of uncertainty. Real-time monitoring of programs, policies, and processes can help support better planning and improve program effectiveness. The lessons and promising practices shared in this study can be applied by health systems for distribution of novel therapeutics beyond nMAbs and toward future pandemics and public health emergencies.
Inflammation and infections such as malaria affect micronutrient biomarker concentrations and hence estimates of nutritional status. It is unknown whether correction for C-reactive protein (CRP) and α1-acid glycoprotein (AGP) fully captures the modification in ferritin concentrations during a malaria infection, or whether environmental and sociodemographic factors modify this association. Cross-sectional data from eight surveys in children aged 6–59 months (Cameroon, Côte d’Ivoire, Kenya, Liberia, Malawi, Nigeria and Zambia; n 6653) from the Biomarkers Reflecting Inflammation and Nutritional Determinants of Anaemia (BRINDA) project were pooled. Ferritin was adjusted using the BRINDA adjustment method, with values < 12 μg/l indicating iron deficiency. The association between current or recent malaria infection, detected by microscopy or rapid test kit, and inflammation-adjusted ferritin was estimated using pooled multivariable linear regression. Age, sex, malaria endemicity profile (defined by the Plasmodium falciparum infection prevalence) and malaria diagnostic methods were examined as effect modifiers. Unweighted pooled malaria prevalence was 26·0 % (95 % CI 25·0, 27·1) and unweighted pooled iron deficiency was 41·9 % (95 % CI 40·7, 43·1). Current or recent malaria infection was associated with a 44 % (95 % CI 39·0, 52·0; P < 0·001) increase in inflammation-adjusted ferritin after adjusting for age and study identifier. In children, ferritin increased less with malaria infection as age and malaria endemicity increased. Adjustment for malaria increased the prevalence of iron deficiency, but the effect was small. Additional information would help elucidate the underlying mechanisms of the role of endemicity and age in the association between malaria and ferritin.
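The BRINDA adjustment mentioned above corrects ferritin for inflammation by regressing log-transformed ferritin on log-transformed CRP and AGP and subtracting the estimated inflammation effect above a reference level. The following is a minimal illustrative sketch only, assuming a simple OLS fit with the 10th percentile of each marker as the reference level; the published method uses survey-specific reference deciles and additional refinements not reproduced here:

```python
import numpy as np

def brinda_adjust_ferritin(ferritin, crp, agp):
    """Sketch of a BRINDA-style inflammation adjustment for ferritin.

    Regresses ln(ferritin) on ln(CRP) and ln(AGP), then subtracts the
    estimated inflammation effect above a reference level (here the 10th
    percentile of each marker; an assumption for illustration).
    """
    ln_f, ln_c, ln_a = np.log(ferritin), np.log(crp), np.log(agp)
    X = np.column_stack([np.ones_like(ln_f), ln_c, ln_a])
    beta, *_ = np.linalg.lstsq(X, ln_f, rcond=None)   # OLS coefficients
    ref_c = np.percentile(ln_c, 10)                   # reference levels
    ref_a = np.percentile(ln_a, 10)
    adj = ln_f - beta[1] * np.maximum(ln_c - ref_c, 0) \
               - beta[2] * np.maximum(ln_a - ref_a, 0)
    return np.exp(adj)                                # back to original units
```

Observations whose inflammation markers sit at or below the reference level are left unadjusted, which is the intended behaviour of reference-level approaches.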
To investigate the efficacy and safety of non-invasive ventilation (NIV) with high PEEP levels in patients with COVID-19-related acute respiratory distress syndrome (ARDS).
Methods:
This is a retrospective cohort study of 95 patients who received NIV as part of their treatment in the COVID-19 intensive care unit (ICU) at University Hospital Centre Zagreb between October 2021 and February 2022. The primary outcome was NIV failure.
Results:
High PEEP NIV was applied in all 95 patients; 54 (56.84%) patients could be kept solely on NIV, while 41 (43.16%) required intubation. ICU mortality of patients solely on NIV was 3.70%, while total ICU mortality was 35.79%. The most significant difference in the dynamics of respiratory parameters between the two patient groups was visible on Day 3 of ICU stay: by that day, patients kept solely on NIV required significantly lower PEEP levels and showed greater improvement in PaO2, P/F ratio, and HACOR score.
Conclusion:
High PEEP applied by NIV was a safe option for the initial respiratory treatment of all patients, regardless of ARDS severity. For some patients, it also proved to be the only necessary form of oxygen supplementation.
The polymerase chain reaction (PCR) is the subject of Chapter 9. The basic principle is outlined, and the standard end-point PCR technique is described to illustrate how DNA amplification from defined primers is achieved. The design of primers for PCR is detailed, and the effect of redundancy of the genetic code noted when working from amino acid sequence data. The use of thermostable DNA polymerases in enabling automation of the PCR process using thermal cycling is outlined. Many different applications have been developed for the PCR, with variants of the basic protocol becoming more complex and sophisticated. PCR from mRNA templates is described, with other variants, including nested PCR, inverse PCR, quantitative and digital PCR, outlined. The extensive range of PCR variants is listed for comparison, and used to illustrate how the original technique of sequential amplification of DNA has become a key technique for the detection, analysis and quantification of DNA.
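The primer-design considerations mentioned above can be illustrated with a toy calculation. The sketch below implements the classic Wallace (2 + 4) rule for estimating a short primer's melting temperature, plus a GC-content check; this is a quick screen suitable only for short oligonucleotides, and practical primer-design software uses nearest-neighbour thermodynamic models instead:

```python
def wallace_tm(primer: str) -> int:
    """Estimate primer melting temperature with the Wallace rule.

    Tm is approximated as 2 degrees C per A/T base plus 4 degrees C per
    G/C base; valid only as a rough screen for ~14-20 nt primers.
    """
    p = primer.upper()
    at = p.count("A") + p.count("T")
    gc = p.count("G") + p.count("C")
    return 2 * at + 4 * gc

def gc_content(primer: str) -> float:
    """Fraction of G/C bases; roughly 40-60 % is a common design target."""
    p = primer.upper()
    return (p.count("G") + p.count("C")) / len(p)
```

For example, `wallace_tm("ATGCATGCATGCATGC")` gives 48, since the 16-mer has eight A/T and eight G/C bases.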
In Chapter 8, various strategies that can be used to clone DNA fragments are described. Cloning genomic DNA and complementary DNA (cDNA) to generate libraries of cloned fragments remain two of the most common methods for primary library construction. Fragments may also be generated by polymerase chain reaction (PCR), or may be designed from a sequence database and synthesised in vitro. The choice of vector (plasmid, bacteriophage, virus or artificial chromosome) depends on the intended outcome, the size and origin of fragments and whether it is a primary cloning or a sub-cloning protocol. Restriction-dependent and restriction-independent methods can be used to join fragments to vectors. Techniques such as Golden Gate cloning, Gateway technology and Gibson assembly have mostly replaced earlier methods and can be used to assemble several fragments into a multi-fragment construct.
In India, the restriction of genetically modified (GM) crops and derived products not approved in the country necessitates surveillance for transgene(s) in plant material/products imported into the country. CDC Triffid, expressing acetolactate synthase (ALS) conferring tolerance to sulphonylurea herbicide, is the only GM flax event; it was approved in Canada in the 1990s and subsequently deregistered in 2001. Despite deregistration, the unexpected and unauthorized detection of traces of GM flax in consignments imported from Canada to Europe has further necessitated stringent monitoring of flax shipments from Canada for suspected GM incidents. This study reports the screening of 123 flaxseed accessions, imported from Canada for research purposes, for transgenic elements of GM flax employing polymerase chain reaction assays. Based on the tests conducted, none of the transgenic elements present in GM flax CDC Triffid, namely the nos promoter (P-nos), nos terminator (T-nos), nptII marker gene, and ALS transgene, were detected in any of the tested accessions. The well-known herbicide tolerance gene cp4-epsps, employed in Roundup® Ready events of other crops, was also not detected in these samples. This case study demonstrates the importance of monitoring imported flaxseed for transgene(s); as part of a precautionary approach, such studies should be carried out on seeds imported from countries where GM events of the respective crop have been approved but have not been approved in the importing country.
On August 4, 2020, a massive explosion struck the Beirut Harbor in Lebanon. Approximately 220 people were killed and around 7,000 were injured, of whom 12% were hospitalized. Despite being weakened by an economic crisis and increasing numbers of coronavirus disease 2019 (COVID-19) cases, the national health care system responded promptly. Within a day, international health care assistance in the form of International Emergency Medical Teams (I-EMTs) started arriving. Previous studies have found that I-EMTs arrive late and are not adapted to the context and dominating health care needs. The aim of this study was to document the organization, type, activity, and timing of I-EMTs deployed to Beirut and to discuss their relevance in relation to medical needs.
Methods:
Data on all deployed I-EMTs were retrieved from all available sources, including internet searches, I-EMT contacts, and from the World Health Organization (WHO) EMT coordination cell (EMT CC) in Lebanon. The WHO EMT classification was used to categorize deployed teams. Information on characteristics, timing, and activities was retrieved and systematically assessed.
Results:
Nine I-EMTs were deployed to Beirut following the explosion. Five were equivalent to EMT Type 2 (field hospitals), of which three were military. The first EMT Type 2 arrived within 24 hours, while the last was set up one month after the explosion. Four civilian I-EMTs provided non-clinical support as EMT Specialized Care Teams. A majority of the I-EMTs were focused on trauma care. Three of the four I-EMT Specialized Care Teams were rapidly re-tasked to support COVID-19 care in public hospitals.
Conclusion:
A majority of the deployed I-EMT Type 2 were military and focused on trauma care rather than the normal burden of disease, including COVID-19. Re-tasking requires flexible EMTs. To be better adapted, the I-EMT response should be guided by a pre-deployment systematic assessment of both the health care capacities of the affected country and the varying health effects of hazards.
The Howard Springs Quarantine Facility (HSQF) is located in tropical Northern Australia and has 875 blocks of four rooms (3,500 rooms in total) spread over 67 hectares. The HSQF requires a large outdoor workforce walking outdoor pathways to provide individual care in the ambient climate. The personal protective equipment (PPE) required for the safety of quarantine workers varies between workgroups and limits body heat dissipation, anecdotally contributing to excessive sweating which, combined with heat stress symptoms of fatigue, headache, and irritability, likely increases the risk of workplace injuries, including infection control breaches.
Study Objective:
The purpose of this study was to describe a qualitative and quantitative assessment of HSQF workers exposed to tropical environmental conditions and to provide evidence-based strategies to mitigate the risk of heat stress in an outdoor quarantine and isolation workforce.
Methods:
The study comprised two components: a cross-sectional physiological monitoring study of 18 workers (eight males/ten females; means: 41.4 years; 1.69 m; 80.6 kg) during a single shift in November 2020, and a subjective heat health survey completed by participants on a minimum of four occasions across the wet season/summer period from November 2020 through February 2021. The physiological monitoring included continuous core temperature monitoring and assessment of fluid balance.
Results:
The mean apparent temperature across the first and second halves of the shift was 34.7°C (SD = 0.8) and 35.6°C (SD = 1.9), respectively. Across the work shift (mean duration 10.1 hours), the mean core temperature of participants was 37.3°C (SD = 0.2), with a range of 37.0°C-37.7°C. The mean maximal core temperature of participants was 37.7°C (SD = 0.3). In the survey, 57% of the workforce in full PPE reported feeling moderately, severely, or unbearably hot, compared with 49% of those in non-contact PPE; the level of fatigue was reported as moderate to severe by just over 25% of the workforce in both groups.
Conclusion:
Heat stress is a significant risk for outdoor workers in the tropics and is amplified in the coronavirus disease 2019 (COVID-19) frontline workforce required to wear PPE in outdoor settings. A heat health program aimed at mitigating risk, including workplace education, limiting exposure times, encouraging hydration, a buddy system, active cooling, and monitoring, is recommended to limit PPE breaches and other workplace injuries in this workforce.
There is evidence to suggest that patients delayed seeking urgent medical care during the first wave of the coronavirus disease 2019 (COVID-19) pandemic. A delay in health-seeking behavior could increase the disease severity of patients in the prehospital setting. The combination of COVID-19-related missions and augmented disease severity in the prehospital environment could result in an increase in the number and severity of physician-staffed prehospital interventions, potentially putting a strain on this highly specialized service.
Study Objective:
The aim was to investigate if the COVID-19 pandemic influences the frequency of physician-staffed prehospital interventions, prehospital mortality, illness severity during prehospital interventions, and the distribution in the prehospital diagnoses.
Methods:
A retrospective, multicenter cohort study was conducted on prehospital charts from March 14, 2020 through April 30, 2020, compared to the same period in 2019, in an urban area. Recorded data included demographics, prehospital diagnosis, physiological parameters, mortality, and COVID-19 status. A modified National Health Service (NHS) National Early Warning Score (NEWS) was calculated for each intervention to assess disease severity. Data were analyzed with univariate and descriptive statistics.
Results:
There was a 31% decrease in physician-staffed prehospital interventions during the period under investigation in 2020 compared with 2019 (2019: 644 missions; 2020: 446 missions), with an increase in prehospital mortality (OR = 0.646; 95% CI, 0.435 – 0.959). During the study period, there was a marked decrease in the low and medium NEWS groups, with ORs of 1.366 (95% CI, 1.036 – 1.802) and 1.376 (95% CI, 0.987 – 1.920), respectively. A small increase was seen in the high NEWS group, with an OR of 0.804 (95% CI, 0.566 – 1.140); 2019: 80 (13.67%) and 2020: 69 (16.46%). Against an overall decrease in cases in all diagnostic categories, a significant increase was observed for respiratory illness (31%; P = .004) and cardiac arrest (54%; P < .001), combined with a significant decrease for intoxications (-58%; P = .007). Due to the national test strategy at that time, a COVID-19 polymerase chain reaction (PCR) result was available in only 125 (30%) patients, of whom 20 (16%) were positive.
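The odds ratios with 95% confidence intervals reported above come from 2×2 comparisons of the two study periods. A generic sketch of how an odds ratio and its Wald confidence interval are computed from 2×2 counts follows; the counts in the usage example are illustrative, not the study's data, and the study's exact estimation method is not specified in the abstract:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table.

    a, b = events / non-events in group 1; c, d = events / non-events in
    group 2. The CI uses the standard error of log(OR):
    SE = sqrt(1/a + 1/b + 1/c + 1/d).
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

For illustrative counts of 10/90 events in one period versus 5/95 in the other, `odds_ratio_ci(10, 90, 5, 95)` returns an OR of about 2.11 with its Wald interval.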
Conclusion:
The frequency of physician-staffed prehospital interventions decreased significantly. There was a marked reduction in interventions for lower illness severity and an increase in higher illness severity and mortality. Further investigation is needed to fully understand the reasons for these changes.
Coronavirus disease 2019 (COVID-19) temporary hospitals, also called “alternate care sites” (ACS), have had uneven use as support to the health network. The World Health Organization (WHO) has published different recommendations in this regard. World-wide, many health services have improved their surge capacity by implementing new temporary hospital structures, but there have been few experiences of use over time, despite such structures representing an important element of support to the hospital network in the management of COVID-19 patients. This article describes the experience of designing, building, and operating the temporary COVID-19 Hospital H144 of the Health Service of the Principality of Asturias (Sespa), with 144 beds, which was in operation from April 1 through July 1, 2020 (without admitting patients) and from November 12, 2020 through March 5, 2021, admitting a total of 334 COVID-19 patients (66% women; 34% men) and generating 3,149 hospital stays. Maximum occupancy was 74 patients. Mean stay was 9.42 days (MD = 3.99; [1-34]). At discharge, 126 patients (38%) went to a nursing home, 112 (33%) to their home, 40 (12%) were transferred to another hospital, and 56 (17%) died. The mean age of the admitted patients was 82.79 years (MD = 8.68; [29-104]) and was higher in women (85.09; MD = 7.57; P < .001) than in men (78.28; MD = 9.22).
Some aspects to consider for future use are the following: teamwork across fields of knowledge (ie, architecture, engineering, medicine, and nursing) is essential for success; integration into the health system must be fully developed from different perspectives (ie, information systems, logistics, medical records, and clinical procedures, among others); clear procedures for patient admission from different settings (ie, home, hospitals, nursing homes, or the primary health care network) must be combined with flexibility of use to adapt to new and unknown circumstances; and such facilities must not compromise the availability of specialized staff in other health facilities.
Bipolar disorder (BD) may be associated with accelerated aging, a marker of which can be shorter telomere length (TL). Some data suggest that lithium may exert a protective effect against telomere shortening. This study aimed to compare TL between patients with BD and control subjects. The effect of long-term lithium treatment was also assessed.
Methods:
The study group comprised 41 patients with BD, including 29 patients treated long-term with lithium (mean 16.5 years), and 20 healthy controls. TL was assessed by quantitative polymerase chain reaction (qPCR).
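qPCR-based telomere length assessment typically reports a relative T/S ratio: the telomere (T) signal normalised to a single-copy gene (S), relative to a reference DNA sample. The sketch below shows the 2^-ΔΔCt form of this calculation, assuming roughly 100% amplification efficiency for both assays; the study's exact protocol is not specified in the abstract:

```python
def relative_tl(ct_telo, ct_scg, ref_ct_telo, ref_ct_scg):
    """Relative telomere length (T/S ratio) from qPCR Ct values.

    Computes 2^-(delta-delta Ct): the telomere signal normalised to a
    single-copy gene, relative to a reference sample. Assumes ~100 %
    amplification efficiency for both assays (an assumption here).
    """
    d_ct_sample = ct_telo - ct_scg        # delta Ct for the sample
    d_ct_ref = ref_ct_telo - ref_ct_scg   # delta Ct for the reference
    return 2 ** -(d_ct_sample - d_ct_ref)  # T/S ratio
```

A sample whose telomere assay crosses threshold one cycle earlier than the reference (with an identical single-copy-gene Ct) has a T/S ratio of 2, i.e., roughly twice the relative telomere content.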
Results:
In the control group, the TL was significantly longer in males than in females. Male bipolar patients had significantly shorter TL compared with the control male group. In bipolar patients, there was no correlation between TL and duration of treatment. The TL was negatively correlated with age in male bipolar patients.
Conclusions:
The study did not confirm an effect of lithium on TL in bipolar patients. TL showed gender differences, being shorter in BD males compared with control males, and longer in control males compared with control females.
Although first responders (FRs) represent a high-risk group for exposure, little information is available regarding their risk of coronavirus disease 2019 (COVID-19) infection. The purpose of the current study was to determine the serological prevalence of past COVID-19 infection in a cohort of municipal law enforcement (LE) and firefighters (FFs).
Methods:
Descriptive analysis of de-identified data reporting Severe Acute Respiratory Syndrome Coronavirus-2 (SARS-CoV-2) immunoglobulin G (IgG), or COR2G, serology results for municipal FRs. As part of the serology process, FRs were surveyed for COVID-19-like symptoms since February 2020 and asked to report any prior COVID-19 nasal swab testing. Descriptive statistics and two-sided Chi-square tests with Yates correction were used to compare groups.
Results:
Of 318 FRs, 255 (80.2%) underwent serology testing (LE: 163/207 [78.7%]; FF: 92/111 [82.9%]). The prevalence of positive serology for all FRs tested was 3/255 (1.2%). Two LE (1.2%) and one FF (1.1%) had positive serology (P = 1.0). Two hundred and twenty-four FRs responded to a survey regarding prior symptoms and testing. Fifty-eight (25.9%) FRs (44 LE; 14 FFs) reported the presence of COVID-19-like symptoms. Of these, only nine (15.5%) received reverse transcriptase-polymerase chain reaction (RT-PCR) testing; none were positive. Two of the three FRs with positive serology reported no COVID-19-like symptoms, and none of these responders had received prior nasal RT-PCR swabs. The overall community positive RT-PCR rate was 0.36%, representing a three-fold higher rate of positive seroprevalence amongst FRs compared with the general population (P = .07).
Conclusions:
Amongst a cohort of municipal FRs in a community with low COVID-19 prevalence, the seroprevalence of SARS-CoV-2 IgG antibodies was three-fold greater than in the general community. Two-thirds of seropositive FRs reported no symptoms. Only 15.5% of FRs with COVID-19-like symptoms received RT-PCR testing. In addition to workplace control measures, increased testing availability for FRs is critical to limiting infection spread and ensuring response capability.
Deaths are frequently under-estimated during emergencies, times when accurate mortality estimates are crucial for emergency response. This study estimates excess all-cause, pneumonia and influenza mortality during the coronavirus disease 2019 (COVID-19) pandemic using the 11 September 2020 release of weekly mortality data from the United States (U.S.) Mortality Surveillance System (MSS) from 27 September 2015 to 9 May 2020, using semiparametric and conventional time-series models in 13 states with high reported COVID-19 deaths and apparently complete mortality data: California, Colorado, Connecticut, Florida, Illinois, Indiana, Louisiana, Massachusetts, Michigan, New Jersey, New York, Pennsylvania and Washington. We estimated greater excess mortality than official COVID-19 mortality in the U.S. (excess mortality 95% confidence interval (CI) 100 013–127 501 vs. 78 834 COVID-19 deaths) and in nine states: California (excess mortality 95% CI 3338–6344 vs. 2849 COVID-19 deaths); Connecticut (95% CI 3095–3952 vs. 2932 COVID-19 deaths); Illinois (95% CI 4646–6111 vs. 3525 COVID-19 deaths); Louisiana (95% CI 2341–3183 vs. 2267 COVID-19 deaths); Massachusetts (95% CI 5562–7201 vs. 5050 COVID-19 deaths); New Jersey (95% CI 13 170–16 058 vs. 10 465 COVID-19 deaths); New York (95% CI 32 538–39 960 vs. 26 584 COVID-19 deaths); and Pennsylvania (95% CI 5125–6560 vs. 3793 COVID-19 deaths). Conventional model results were consistent with semiparametric results but less precise. Significant excess pneumonia deaths were also found for all locations, and we estimated hundreds of excess influenza deaths in New York. We find that official COVID-19 mortality substantially understates actual mortality; excess deaths cannot be explained entirely by official COVID-19 death counts. Mortality reporting lags appeared to worsen during the pandemic, when timeliness in surveillance systems was most crucial for improving pandemic response.
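The excess-mortality logic above reduces to "observed minus expected" deaths over a period of interest. The toy sketch below uses a historical week-of-year mean as the expected baseline; this is a deliberately simplified stand-in for the semiparametric and time-series models fitted in the study, which it does not reproduce:

```python
import numpy as np

def excess_mortality(history, observed):
    """Excess deaths versus a simple seasonal baseline.

    history:  (n_years, n_weeks) array of pre-pandemic weekly deaths.
    observed: (n_weeks,) weekly deaths for the period of interest.
    The baseline is the historical week-of-year mean (a toy assumption);
    returns (weekly_excess, total_excess).
    """
    expected = history.mean(axis=0)      # week-of-year baseline
    weekly_excess = observed - expected
    return weekly_excess, weekly_excess.sum()
```

With three historical years averaging 100 and 110 deaths in two weeks, observed counts of 130 and 150 yield weekly excesses of 30 and 40, or 70 total. Real estimates additionally require uncertainty intervals and corrections for reporting lags, which this sketch omits.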
It is shown that W UMa-type and SX Phe-type stellar populations are both perfectly and uniquely suited for maintaining hyper-effective biopolymer chain reactions (BCR) on their planets once a planet is in the stellar habitable zone. W UMa-type stars are known to be contact binaries, and SX Phe-type stars are presumably post-binaries, i.e., products of stellar mergers. In the case of the contact binaries, the eclipse-driven periodic heating/cooling of planetary surfaces has period-amplitude parameters that satisfy the stringent conditions for maintaining BCR-like reactions. In the case of the post-binaries, the stars pulsate with periods and amplitudes also well suited for maintaining the reactions. Therefore, the ‘W UMa – SX Phe’ metamorphosis (from a contact binary to a post-binary, via the merger) seems to provide a potential biosystem reboot on planets in these systems.
To investigate whether toll-like receptor (TLR) 4/nuclear factor-kappa B (NF-κB) signaling pathways mediate crush injury-induced acute kidney injury (AKI) in rats, and whether TAK-242 (a specific inhibitor of TLR4) attenuates the injury by inhibiting these signaling pathways.
Methods:
This study was divided into two parts. (1) Establishing the crush injury model: 50 rats were randomly divided into a control group and four crush injury groups (n = 10/group). Crush injury groups were subjected to 3 kg of pressure for eight hours and were sacrificed at 0 h, 6 h, 12 h, and 24 h after the pressure was relieved. (2) The group with the most obvious injury (the 12 h group) was selected for drug intervention: 30 rats were randomly divided into a control group, a 12 h group, and a 12 h + TAK-242 group (n = 10/group). Measurements for both parts were as follows: pathological changes in kidney tissue were observed with Haematoxylin and Eosin (HE) staining; serum creatinine, blood urea nitrogen (BUN), myoglobin (Mb), and blood potassium were examined with an automatic biochemical analyser; interleukin-6 (IL-6) and tumor necrosis factor-α (TNF-α) were measured by enzyme-linked immunosorbent assay (ELISA); and TLR4 messenger ribonucleic acid (mRNA), TLR4, and P65 were detected by real-time polymerase chain reaction (PCR), western blot, and immunohistochemistry staining.
Results:
Compared with the control group, kidney tissues were damaged in the crush injury groups, most obviously in the 12 h group. The levels of serum creatinine, BUN, Mb, blood potassium, IL-6, TNF-α, and TLR4 mRNA were increased in the crush injury groups and significantly increased in the 12 h group (P < .05). TLR4 and P65 were significantly increased in the 12 h group (P < .05). Compared with the 12 h group, kidney tissue damage was significantly reduced in the TAK-242 group (P < .05), as were the levels of serum creatinine, BUN, Mb, blood potassium, IL-6, TNF-α, TLR4 mRNA, TLR4, and P65 (P < .05).
Conclusion:
These findings indicate that TLR4/NF-κB signaling pathways mediate crush injury-induced AKI in rats, and that TAK-242 attenuates the injury by inhibiting these signaling pathways.
Since the beginning of the coronavirus disease 2019 (COVID-19) pandemic, an exponentially growing amount of data has been published describing the pathology, clinical presentations, and outcomes of patients infected with severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). Although COVID-19 has been shown to cause systemic inflammation predisposing to the involvement of multiple organs, its effects on the urogenital system have not been well-documented. This case report presents the clinical course of two male patients with COVID-19 who developed sexual dysfunction, in the form of anorgasmia, following recovery from the infection. Although no evidence of viral replication or inflammatory involvement could be identified in these patients’ urogenital organs, a lack of other known risk factors for anorgasmia points to COVID-19 as the contributing factor.