The fossil record of dinosaurs in Scotland mostly comprises isolated, highly fragmentary bones from the Great Estuarine Group (Bajocian–Bathonian) in the Inner Hebrides. Here we report the historically first definite dinosaur body fossil found in Scotland, discovered in 1973 but not collected until 45 years later; it is the most complete partial dinosaur skeleton currently known from the country. NMS G.2023.19.1 was recovered from a challenging foreshore location on the Isle of Skye and transported to harbour in a semi-rigid inflatable boat towed by a motor boat. After manual preparation, micro-CT scanning was carried out, but this did not aid in identification. Among many unidentifiable elements, a neural arch, two ribs and part of the ilium are described herein; their features indicate that this was a cerapodan or ornithopodan dinosaur. Histological thin sections of one of the ribs support this identification and indicate an individual at least eight years of age that was growing slowly at the time of death. If ornithopodan, as our data suggest, it could represent the world's oldest body fossil of this clade.
Objective:
To examine the perspectives of caregivers who are not part of the antibiotic stewardship program (ASP) leadership team (eg, physicians, nurses, and clinical pharmacists) but who interact with ASPs in their role as frontline healthcare workers.
Design:
Qualitative semistructured interviews.
Setting:
The study was conducted in 2 large national healthcare systems including 7 hospitals in the Veterans’ Health Administration and 4 hospitals in Intermountain Healthcare.
Participants:
We interviewed 157 participants. The current analysis includes 123 nonsteward clinicians: 47 physicians, 26 pharmacists, 29 nurses, and 21 hospital leaders.
Methods:
Interviewers utilized a semistructured interview guide based on the Consolidated Framework for Implementation Research (CFIR), which was tailored to the participant’s role in the hospital as it related to ASPs. Qualitative analysis was conducted using a codebook based on the CFIR.
Results:
We identified 4 primary perspectives regarding ASPs: (1) non-ASP pharmacists considered antibiotic stewardship activities to be a high priority despite the added burden to their work duties; (2) nurses acknowledged limited understanding of ASP activities or involvement with these programs; (3) physicians criticized ASPs for their restrictions on clinical autonomy and questioned the ability of antibiotic stewards to make recommendations without the full clinical picture; and (4) hospital leaders expressed support for ASPs and recognized the unique challenges faced by non-ASP clinical staff.
Conclusion:
Further understanding of these differing perspectives will inform ways to improve ASP implementation across clinical roles.
Objective:
As part of a project to implement antimicrobial dashboards at select facilities, we assessed physician attitudes and knowledge regarding antibiotic prescribing.
Design:
An online survey explored attitudes toward antimicrobial use and assessed respondents’ management of four clinical scenarios: cellulitis, community-acquired pneumonia, non–catheter-associated asymptomatic bacteriuria, and catheter-associated asymptomatic bacteriuria.
Setting:
This study was conducted across 16 Veterans’ Affairs (VA) medical centers in 2017.
Participants:
Physicians working in inpatient settings, specializing in infectious diseases (ID), hospital medicine, or internal medicine (neither ID nor hospital medicine).
Methods:
Scenario responses were scored by assigning +1 for answers most consistent with guidelines, 0 for less guideline-concordant but acceptable answers, and −1 for guideline-discordant answers. Scores were normalized to a scale ranging from 100% guideline concordant to 100% guideline discordant across all questions within a scenario, and mean scores were calculated across respondents by specialty. Differences in mean score per scenario were tested using analysis of variance (ANOVA).
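To make the scoring concrete, here is a minimal sketch of the +1/0/−1 scheme and the across-specialty ANOVA. The answers, groups, and function names are hypothetical illustrations, not the study's instrument or analysis code.

```python
# Illustrative sketch of the +1/0/-1 scoring scheme and ANOVA described above.
# All answers, groups, and names below are hypothetical, not study data.
import numpy as np
from scipy import stats

ANSWER_SCORES = {"concordant": 1, "acceptable": 0, "discordant": -1}

def scenario_score(answers):
    """Mean score across one respondent's scenario questions, scaled to a
    -100% (fully discordant) .. +100% (fully concordant) range."""
    return 100.0 * np.mean([ANSWER_SCORES[a] for a in answers])

# Hypothetical cellulitis-scenario responses for three specialty groups.
id_docs      = [scenario_score(a) for a in (["concordant", "concordant"],
                                            ["concordant", "acceptable"])]
hospitalists = [scenario_score(a) for a in (["acceptable", "concordant"],
                                            ["discordant", "acceptable"])]
internists   = [scenario_score(a) for a in (["acceptable", "discordant"],
                                            ["discordant", "acceptable"])]

# One-way ANOVA comparing mean scenario scores across specialties.
f_stat, p_value = stats.f_oneway(id_docs, hospitalists, internists)
print(f"F = {f_stat:.2f}, P = {p_value:.4f}")
```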
Results:
Overall, 139 physicians completed the survey (19 ID physicians, 62 hospitalists, and 58 other internists). Attitudes were similar across the 3 groups. We detected a significant difference in cellulitis scenario scores (concordance: ID physicians, 76%; hospitalists, 58%; other internists, 52%; P = .0087). Scores were numerically but not significantly different across groups for community-acquired pneumonia (concordance: ID physicians, 75%; hospitalists, 60%; other internists, 56%; P = .0914), for non–catheter-associated asymptomatic bacteriuria (concordance: ID physicians, 65%; hospitalists, 55%; other internists, 40%; P = .322), and for catheter-associated asymptomatic bacteriuria (ID physicians, 27% concordant; hospitalists, 8% discordant; other internists, 13% discordant; P = .12).
Conclusions:
Significant differences in performance regarding management of cellulitis and low overall performance regarding asymptomatic bacteriuria point to these conditions as being potentially high-yield targets for stewardship interventions.
Objective:
To examine how individual steward characteristics (eg, steward role, sex, and specialized training) are associated with their views of antimicrobial stewardship program (ASP) implementation at their institution.
Design:
Descriptive survey from a mixed-methods study.
Setting:
Two large national healthcare systems: the Veterans’ Health Administration (VA; n = 134 hospitals) and Intermountain Healthcare (IHC; n = 20 hospitals).
Participants:
We sent the survey to 329 antibiotic stewards serving in 154 hospitals; 152 were physicians and 177 were pharmacists. In total, 118 pharmacists and 64 physicians from 126 hospitals responded.
Methods:
The survey was grounded in constructs of the Consolidated Framework for Implementation Research, and it assessed stewards’ views on the development and implementation of antibiotic stewardship programs (ASPs) at their institutions. We then examined differences in stewards’ views by demographic factors.
Results:
Regardless of individual factors, stewards agreed that the ASP added value to their institution and was advantageous to patient care. Stewards also reported high levels of collegiality and self-efficacy. Stewards who had specialized training or who volunteered for the role were less likely to think that the ASP had been implemented due to a mandate. Similarly, volunteers and those with specialized training felt that they had authority in the antibiotic decisions made at their facility.
Conclusions:
Given the importance of ASPs, it may be beneficial for healthcare institutions to recruit and train individuals with a true interest in stewardship.
The Australian prime lamb industry is seeking to improve lean meat yield (LMY) as a means of increasing efficiency and profitability across the whole value chain. The LMY of prime lambs is the total muscle weight relative to the total carcass weight, and it is affected by genetics and by on-farm nutrition from birth to slaughter. Under the production conditions of south-eastern Australia, many ewe flocks experience a moderate reduction in nutrition in mid to late pregnancy due to a decrease in pasture availability and quality. Correcting nutritional deficits throughout gestation requires the feeding of supplements, which enables the pregnant ewe to meet condition score (CS) targets at lambing. However, limited on-farm resources often make it difficult to effectively manage nutritional supplementation of the pregnant ewe flock. The impact of reduced ewe nutrition in mid to late pregnancy on the body composition of finishing lambs and subsequent carcass composition remains unknown. This study investigated, on a commercial scale, the effect of moderately reducing ewe nutrition in mid to late gestation on the body composition of finishing lambs and on carcass composition at slaughter. Multiple-born lambs from CS2.5 target ewes were lighter at birth and weaning and had lower feedlot entry and exit weights, with lower pre-slaughter and carcass weights, compared with lambs from CS3.0 and CS3.5 target ewes. These lambs also had significantly lower eye muscle depth and fat depth when measured by ultrasound prior to slaughter, and lower carcass subcutaneous fat depth measured 110 mm from the spine along the 12th rib (GR 12th) and at the C-site (C-fat). Although carcasses were ~5% lighter, results showed that male progeny born to ewes with reduced nutrition from day 50 of gestation to a target CS2.5 at lambing had a higher percentage of lean tissue mass, as measured by dual energy X-ray absorptiometry, and a lower percentage of fat during finishing and at slaughter, with the multiple-born progeny from CS3.0 and CS3.5 target ewes being similar. These data suggest that lambs produced from multiple-bearing ewes that have had a moderate reduction in nutrition during pregnancy are less mature. This effect was also independent of lamb finishing system. The 5% reduction in carcass weight observed in this study would have commercially relevant consequences for prime lamb producers, despite a small gain in LMY.
The COllaborative project of Development of Anthropometrical measures in Twins (CODATwins) project is a large international collaborative effort to analyze individual-level phenotype data from twins in multiple cohorts from different environments. The main objective is to study factors that modify genetic and environmental variation of height, body mass index (BMI, kg/m2) and size at birth, and additionally to address other research questions such as long-term consequences of birth size. The project started in 2013 and is open to all twin projects in the world having height and weight measures on twins with information on zygosity. Thus far, 54 twin projects from 24 countries have provided individual-level data. The CODATwins database includes 489,981 twin individuals (228,635 complete twin pairs). Since many twin cohorts have collected longitudinal data, there is a total of 1,049,785 height and weight observations. For many cohorts, we also have information on birth weight and length, own smoking behavior and own or parental education. We found that the heritability estimates of height and BMI systematically changed from infancy to old age. Remarkably, only minor differences in the heritability estimates were found across cultural–geographic regions, measurement time and birth cohort for height and BMI. In addition to genetic epidemiological studies, we looked at associations of height and BMI with education, birth weight and smoking status. Within-family analyses examined differences within same-sex and opposite-sex dizygotic twins in birth size and later development. The CODATwins project demonstrates the feasibility and value of international collaboration to address gene-by-exposure interactions that require large sample sizes and address the effects of different exposures across time, geographical regions and socioeconomic status.
Patients with chronic obstructive pulmonary disease (COPD) who experience acute exacerbations usually require treatment with oral steroids or antibiotics, depending on the etiology of the exacerbation. Current management is based on clinicians’ assessment and judgement, which lacks diagnostic accuracy and results in overtreatment. A test to guide these decisions in primary care is in development. We developed an early decision model to evaluate the cost-effectiveness of this treatment stratification test in the primary care setting in the United Kingdom.
Methods
A combined decision tree and Markov model of COPD progression and the exacerbation care pathway was developed. Sensitivity analysis was carried out to guide technology development and inform evidence generation requirements.
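For illustration only, the sketch below shows the Markov half of such a structure: a three-state COPD cohort (stable, exacerbation, dead) cycled annually while accumulating discounted costs and QALYs. Every transition probability, cost, and utility here is a placeholder rather than a parameter from this model; the decision-tree arm would feed the test and no-test cohorts into models of this kind.

```python
# Minimal Markov cohort sketch of COPD progression (hypothetical parameters).
import numpy as np

# States: 0 = stable, 1 = exacerbation, 2 = dead. Placeholder annual
# transition probabilities; rows sum to 1. Not parameters from the study.
P = np.array([
    [0.80, 0.15, 0.05],   # from stable
    [0.60, 0.30, 0.10],   # from exacerbation
    [0.00, 0.00, 1.00],   # dead is absorbing
])
COSTS     = np.array([500.0, 2000.0, 0.0])   # GBP per year in state (hypothetical)
UTILITIES = np.array([0.75, 0.55, 0.0])      # QALY weights (hypothetical)

def run_cohort(p_matrix, cycles=20, discount=0.035):
    """Cycle the whole cohort annually, accumulating discounted costs/QALYs."""
    dist = np.array([1.0, 0.0, 0.0])          # everyone starts in 'stable'
    total_cost = total_qalys = 0.0
    for t in range(cycles):
        df = 1.0 / (1.0 + discount) ** t
        total_cost  += df * (dist @ COSTS)
        total_qalys += df * (dist @ UTILITIES)
        dist = dist @ p_matrix                # advance one Markov cycle
    return total_cost, total_qalys

cost, qalys = run_cohort(P)
print(f"Discounted cost: GBP {cost:.0f}; QALYs: {qalys:.2f}")
```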
Results
The base case test strategy cost GBP 423 (USD 542) less and resulted in a health gain of 0.15 quality-adjusted life-years per patient compared with not testing. Testing reduced antibiotic prescriptions by 30 percent, potentially lowering the risk of antimicrobial resistance developing. In sensitivity analysis, the result depended on the clinical effects of treating patients according to the test result, as opposed to treating according to clinical judgement alone, for which there is limited evidence. The results were less sensitive to the accuracy of the test.
Conclusions
Testing may be cost-saving in primary care, but this conclusion requires robust evidence on whether test-guided treatment is effective. High-quality evidence on the clinical utility of testing is required for the early modeling of diagnostic tests in general.
Campylobacteriosis, the most frequent bacterial enteric disease, shows a clear yet unexplained seasonality. The purpose of this study was to explore how seasonal fluctuations in the contamination of, and in behavioural exposures to, two important sources of Campylobacter influence the seasonality of campylobacteriosis. Time series analyses were applied to data collected through an integrated surveillance system in Canada in 2005–2010. Data included sporadic, domestically acquired cases of Campylobacter jejuni infection, contamination of retail chicken meat and of surface water by C. jejuni, and exposure to each source through barbequing and swimming in natural waters. Seasonal patterns were evident for all variables, with a peak in summer for human cases and for both exposures, in fall for chicken meat contamination, and in late fall for water contamination. Time series analyses showed that the observed summer peak in campylobacteriosis could be significantly linked only to behavioural exposures rather than to source contamination (swimming rather than water contamination, and barbequing rather than chicken meat contamination). The results indicate that the observed summer increase in human cases may be more the result of amplification through more frequent risky exposures than of an increase in Campylobacter source contamination.
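A common way to quantify seasonal peaks like these is harmonic regression of weekly counts on annual sine/cosine terms. The sketch below runs on simulated counts and merely illustrates the approach; it is not the study's actual time series analysis.

```python
# Harmonic-regression sketch for a seasonal weekly case series (simulated).
import numpy as np

rng = np.random.default_rng(0)
weeks = np.arange(6 * 52)                    # six years of weekly counts (2005-2010)
theta = 2 * np.pi * weeks / 52.0
# Simulated counts with a mid-year (summer) peak plus noise -- illustrative only.
cases = 20 + 8 * np.sin(theta - np.pi / 2) + rng.normal(0, 2, weeks.size)

# Ordinary least squares on intercept + one annual sine/cosine pair.
X = np.column_stack([np.ones_like(theta), np.sin(theta), np.cos(theta)])
beta, *_ = np.linalg.lstsq(X, cases, rcond=None)

amplitude = np.hypot(beta[1], beta[2])       # size of the seasonal swing
peak_week = ((np.pi / 2 - np.arctan2(beta[2], beta[1])) / (2 * np.pi)) * 52 % 52
print(f"Seasonal amplitude ~{amplitude:.1f} cases; peak near week {peak_week:.0f}")
```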
Whether monozygotic (MZ) and dizygotic (DZ) twins differ from each other in a variety of phenotypes is important for genetic twin modeling and for inferences made from twin studies in general. We analyzed whether there were differences in individual, maternal and paternal education between MZ and DZ twins in a large pooled dataset. Information was gathered on individual education for 218,362 adult twins from 27 twin cohorts (53% females; 39% MZ twins), and on maternal and paternal education for 147,315 and 143,056 twins, respectively, from 28 twin cohorts (52% females; 38% MZ twins). Together, we had information on individual or parental education from 42 twin cohorts representing 19 countries. The original education classifications were transformed to education years and analyzed using linear regression models. Overall, MZ males had 0.26 (95% CI [0.21, 0.31]) years and MZ females 0.17 (95% CI [0.12, 0.21]) years longer education than DZ twins. The zygosity difference became smaller in more recent birth cohorts for both males and females. Parental education was somewhat longer for fathers of DZ twins in cohorts born in 1990–1999 (0.16 years, 95% CI [0.08, 0.25]) and 2000 or later (0.11 years, 95% CI [0.00, 0.22]), compared with fathers of MZ twins. The results show that the years of both individual and parental education are largely similar in MZ and DZ twins. We suggest that the socio-economic differences between MZ and DZ twins are so small that inferences based upon genetic modeling of twin data are not affected.
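The pooled analysis described above reduces to regressing education years on a zygosity indicator. A purely illustrative sketch follows, using simulated data and effect sizes rather than the CODATwins dataset.

```python
# Sketch of the zygosity comparison as a linear regression (simulated data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5_000
is_mz = rng.integers(0, 2, n).astype(float)   # 1 = monozygotic, 0 = dizygotic
# Simulated education years with a small MZ excess, echoing the result above.
edu_years = 12.0 + 0.25 * is_mz + rng.normal(0, 3, n)

X = sm.add_constant(is_mz)                    # intercept + MZ indicator
fit = sm.OLS(edu_years, X).fit()
print(fit.params)      # [DZ mean, MZ-DZ difference in education years]
print(fit.conf_int())  # 95% CI for each coefficient
```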
The hypothesis that plants treated with 2,4-dichlorophenoxyacetic acid (2,4-D) die as a consequence of abnormal growth has been examined. Cocklebur (Xanthium sp.) plants spot-treated on one leaf with 2,4-D show three phases of growth toward death. For the first 2 days after treatment, there is a net weight increase, largely due to abnormal growth of the axis (apex, stem, and tap root). Root and leaf growth are drastically curtailed. Between 2 and 7 days, axis growth continues primarily at the expense of leaf tissue, which is induced to senesce, especially the cotyledons. The last phase, between 7 and 10 days, leads to collapse and withering of the plants and was not examined in detail.
Analyses for nitrogenous constituents show the usual mobilization to the axis, with early and large increases in nucleic acid. Both photosynthesis and ion absorption are initially stimulated but decline sharply after the first day. Translocation to leaves and roots is drastically reduced in favor of the proliferating axis.
The death of the plant appears to be due to suppression of normal apical growth coupled with induction of abnormal axis growth. Failure to produce new root and leaf tissue, together with inadequate nutritional maintenance of existing roots and leaves, leads to a lack of autotrophism and eventual death. The biochemical basis for these responses appears to lie in aberrant nucleic acid metabolism.
Studies were conducted to examine over time the effects of propachlor (2-chloro-N-isopropylacetanilide) on the growth of cucumber (Cucumis sativus L. ‘Straight Eight’) roots and associated biosynthetic reactions. Complete inhibition of root elongation occurred within 16 hr after exposure to propachlor. Inhibition of growth was not a result of an effect on ATP formation or respiration. Protein biosynthesis was reduced several hours before the observed inhibition of growth, thereby implicating it as the causal factor. Inhibition of protein synthesis occurred prior to an observed reduction in RNA synthesis, suggesting that the primary effect of propachlor is on protein biosynthesis and that its effect on nucleic acid synthesis is secondary. It is concluded that the primary mechanism of action of propachlor is its effect on nascent protein biosynthesis.
Compared to corn (Zea mays L.) (resistant), oats (Avena sativa L.) (susceptible), and giant foxtail (Setaria faberii Herrm.) (susceptible), fall panicum (Panicum dichotomiflorum Michx.) and large crabgrass (Digitaria sanguinalis (L.) Scop.) metabolized 2-chloro-4-(ethylamino)-6-(isopropylamino)-s-triazine (atrazine) at an intermediate rate. The order of tolerance of these five species (corn > fall panicum and large crabgrass > giant foxtail > oats) is identical to the order of their ability to metabolize atrazine. In 6 hr, corn, fall panicum, large crabgrass, giant foxtail, and oats metabolized 96, 44, 50, 17, and 2%, respectively, of the 14C-atrazine absorbed from a 10 ppm solution and translocated to the foliage, leaving concentrations of 2.2, 34.8, 30.1, 59.8, and 66.3 mμmoles, respectively, of atrazine per g of fresh weight of shoots. Hydroxyatrazine [2-hydroxy-4-(ethylamino)-6-(isopropylamino)-s-triazine] was found in the shoots of corn and giant foxtail. Corn shoots also contained a more hydrophilic metabolite, presumably a peptide conjugate. Hydrophilic metabolites found in the shoots of giant foxtail, fall panicum, and large crabgrass were chromatographically identical to the hydrophilic metabolite found in corn.
Broomsedge control studies were conducted on six broomsedge-infested pastures in southeastern Oklahoma from 1995 to 1997. Glyphosate applied in spring at 2.24 kg ai/ha decreased broomsedge plant density 3 mo after treatment (MAT) by 58% on areas where the previous year's forage had been grazed, and by 95% where spring fire had removed the old top-growth before glyphosate application. Broomsedge plant density was not affected where glyphosate was applied in spring to sites with old standing top-growth. Paraquat applied in spring at 0.56 kg ai/ha and spring burning without a herbicide treatment had no effect on broomsedge plant density. Glyphosate at 0.56 and 1.12 kg ai/ha applied in late summer reduced the number of broomsedge stems 1 yr after treatment (YAT) by an average of 65 and 80%, respectively. Paraquat at 0.56 kg/ha applied in late summer of 1995, followed by burning 1 wk after treatment (WAT), decreased broomsedge stem density by more than 60% 1 YAT at four of six locations when compared with mowing in late summer. Burning in November after an October frost decreased broomsedge stem density by more than 47% 1 YAT at four locations. Two consecutive years of burning after frost, and paraquat applied in late summer followed by burning 1 WAT, reduced broomsedge dry matter production by 68 and 96%, respectively, when compared with mowing in late summer. These data suggest that good to excellent control of established broomsedge is possible with herbicides alone, with a combination of herbicides and late-summer burning, and with fall burning after an early frost in a dry fall. However, broomsedge control was short-lived with all the treatments because of the establishment of new broomsedge seedlings. Thus, it will be important to integrate the destruction of broomsedge plants with proper fertility and grazing management in order to provide satisfactory broomsedge control.
Various medications and devices are available for facilitation of emergent endotracheal intubations (EETIs). The objective of this study was to survey which medications and devices are being utilized for intubation by Canadian physicians.
Methods
A clinical scenario-based survey was developed to determine which medications physicians would administer to facilitate EETI, their first choice of intubation device, and backup strategy should their first choice fail. The survey was distributed to Canadian emergency medicine (EM) and intensive care unit (ICU) physicians using web-based and postal methods. Physicians were asked questions based on three scenarios (trauma; pneumonia; heart failure) and responded using a 5-point scale ranging from “always” to “never” to capture usual practice.
Results
The survey response rate was 50.2% (882/1,758). Most physicians indicated a Macintosh blade with direct laryngoscopy would “always/often” be their first choice of intubation device in the three scenarios (mean 85% [79%-89%]) followed by video laryngoscopy (mean 37% [30%-49%]). The most common backup device chosen was an extraglottic device (mean 59% [56%-60%]). The medications most physicians would “always/often” administer were fentanyl (mean 45% [42%-51%]) and etomidate (mean 38% [25%-50%]). EM physicians were more likely than ICU physicians to paralyze patients for EETI (adjusted odds ratio 3.40; 95% CI 2.90-4.00).
Conclusions
Most EM and ICU physicians utilize direct laryngoscopy with a Macintosh blade as a primary device for EETI and an extraglottic device as a backup strategy. This survey highlights variation in Canadian practice patterns for some aspects of intubation in critically ill patients.
A technique for the prediction of buffeting response in flight from wind-tunnel tests on models of conventional construction is described and assessed. Results are presented from tests on models of a Gnat T Mk 2 trainer aircraft, and predictions are compared with flight measurements of buffeting made during high incidence manoeuvres. Flight/tunnel comparisons of buffeting response measurements on the TACT F-111 and SAAB 105 aircraft are also discussed. In general, good agreement is demonstrated between measured and predicted responses. Some remarks are also made on the determination of damping ratios from accelerometer or strain gauge readings recorded under buffeting conditions.
The usefulness of boundary-layer control (B.L.C.) at the knee of a trailing-edge flap, over the wing nose close to the leading-edge or at the knee of a leading-edge flap is first noted. Various methods of providing B.L.C. are outlined, comprising slot blowing, slot suction, area suction, inclined air-jets, and specially-designed aerofoil shapes. The aerodynamic aspects of slot blowing over trailing-edge flaps and the wing nose are then examined in detail and both slot suction and area suction are also considered. The associated practical design features required for good performance are discussed and some flight-handling implications are mentioned.
A trend toward greater body size in dizygotic (DZ) than in monozygotic (MZ) twins has been suggested by some but not all studies, and this difference may also vary by age. We analyzed zygosity differences in mean values and variances of height and body mass index (BMI) among male and female twins from infancy to old age. Data were derived from an international database of 54 twin cohorts participating in the COllaborative project of Development of Anthropometrical measures in Twins (CODATwins), and included 842,951 height and BMI measurements from twins aged 1 to 102 years. The results showed that DZ twins were consistently taller than MZ twins, with differences of up to 2.0 cm in childhood and adolescence and up to 0.9 cm in adulthood. Similarly, a greater mean BMI of up to 0.3 kg/m2 in childhood and adolescence and up to 0.2 kg/m2 in adulthood was observed in DZ twins, although the pattern was less consistent. DZ twins were up to 1.7% taller and had up to 1.9% greater BMI than MZ twins; these percentage differences were largest in middle and late childhood and decreased with age in both sexes. The variance of height was similar in MZ and DZ twins at most ages. In contrast, the variance of BMI was significantly higher in DZ than in MZ twins, particularly in childhood. In conclusion, DZ twins were generally taller and had greater BMI than MZ twins, but the differences decreased with age in both sexes.
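The two comparisons above (means and variances by zygosity) map onto a t test and a Levene-type variance test. The toy sketch below uses simulated heights whose ~0.9 cm difference merely echoes the adult result above; it is not the CODATwins data or analysis.

```python
# Sketch of mean and variance comparisons between MZ and DZ twins (simulated).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
mz_height = rng.normal(170.0, 7.0, 1_000)   # hypothetical MZ adult heights (cm)
dz_height = rng.normal(170.9, 7.0, 1_000)   # DZ set ~0.9 cm taller, as above

t_stat, t_p = stats.ttest_ind(dz_height, mz_height)   # compare means
w_stat, w_p = stats.levene(dz_height, mz_height)      # compare variances
print(f"Mean difference: {dz_height.mean() - mz_height.mean():.2f} cm (P = {t_p:.3g})")
print(f"Levene test for equal variances: P = {w_p:.3g}")
```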
For over 100 years, the genetics of human anthropometric traits has attracted scientific interest. In particular, height and body mass index (BMI, calculated as kg/m2) have been under intensive genetic research. However, it is still largely unknown whether and how heritability estimates vary between human populations. Opportunities to address this question have increased recently because of the establishment of many new twin cohorts and the increasing accumulation of data in established twin cohorts. We started a new research project to systematically analyze (1) the variation in heritability estimates of height and BMI, and in their trajectories over the life course, between birth cohorts, ethnicities and countries, and (2) the effects of birth-related factors, education and smoking on these anthropometric traits and whether these effects vary between twin cohorts. We identified 67 twin projects, including both monozygotic (MZ) and dizygotic (DZ) twins, using various sources. We asked for individual-level data on height and weight including repeated measurements, birth-related traits, background variables, education and smoking. By the end of 2014, 48 projects participated. Together, we have 893,458 height and weight measures (52% females) from 434,723 twin individuals, including 201,192 complete twin pairs (40% monozygotic, 40% same-sex dizygotic and 20% opposite-sex dizygotic) representing 22 countries. This project demonstrates that large-scale international twin studies are feasible and can promote the use of existing data for novel research purposes.
In Australia, hepatitis B (HBV) vaccination is recommended for injecting drug users (IDUs), Indigenous adults and prisoners. We compared immunity to HBV in prisoners and the general population obtained from national serosurveys in 2007. Individuals with HBV surface antibody (HBsAb) positive sera were considered immune from past infection [HBV core antibody (HBcAb) positive] or from vaccination (HBcAb negative). Male prisoners aged 18–58 years had a higher HBsAb seroprevalence than the general population (46·4% vs. 39·4%, P = 0·061). Comparison of HBcAb results was possible for males aged 18–29 years. In this group, higher HBsAb seroprevalence was due to past infection (12·9% vs. 3·0%, P < 0·001), rather than vaccine-conferred immunity (35·3% vs. 43·4%, P = 0·097). All prisoner groups, but especially IDUs, those of Indigenous heritage or those with a previous episode of imprisonment had higher levels of immunity from past infection than the general population (19·3%, 33·0%, 17·1%, respectively, vs. 3·0%, P < 0·05). Indigenous prisoners, non-IDUs and first-time entrants had significantly lower levels of vaccine-conferred immunity than the general population (26·4%, 26·2% and 20·7% respectively vs. 43·4%, P < 0·05). Improving prison-based HBV vaccination would prevent transmission in the prison setting and protect vulnerable members of the community who are at high risk of both infection and entering the prison system.
Background:
Stereotactic radiosurgery offers a unique and effective means of controlling cavernous sinus meningiomas with a low rate of complications.
Methods:
We retrospectively reviewed all cavernous sinus meningiomas treated with Gamma Knife (GK) radiosurgery between November 2003 and April 2011 at our institution.
Results:
Thirty patients were treated; four were lost to follow-up. Presenting symptoms included headache (9), trigeminal nerve dysesthesias/paresthesias (13), abducens nerve palsy (11), oculomotor nerve palsy (8), Horner's syndrome (2), blurred vision (9), and relative afferent pupillary defect (1). One patient was asymptomatic with documented tumor growth. There were 8 males (26.7%) and 22 females (73.3%); the mean age was 55.1 years. Treatment planning consisted of MRI and CT in 17 of 30 patients (56.7%); the remainder were planned with MRI alone (43.3%). Twelve patients had undergone surgical debulking prior to radiosurgery. Average diameter and volume at the time of radiosurgery were 3.4 cm and 7.9 cm3, respectively. The average dose at the 50% isodose line was 13.5 Gy. Follow-up was available for 26 patients; average follow-up was 36.1 months. Tumor size post GK decreased in 9 patients (34.6%), remained stable in 15 patients (57.7%), and continued to grow in 2 (7.7%). Minor transient complications occurred in 12 patients, all of which resolved. Serious permanent complications occurred in 5 patients: new-onset trigeminal neuropathic pain (2), frame-related occipital neuralgia (1), worsening of pre-GK seizures (1), and panhypopituitarism (1).
Conclusion:
GK offers an effective treatment method for halting meningioma progression in the cavernous sinus, with an acceptable permanent complication rate.