In this work we present a Bayesian approach to recovering the non-rotating stellar spectrum from an observed spectrum of a rotating star. This is our first attempt to solve an inverse problem expressed in terms of a Fredholm integral. Our preliminary results with synthetic spectra are promising; further studies are required to compare our Bayesian approach with the standard method on real spectra.
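For context, the inverse problem described here has the form of a Fredholm integral equation of the first kind, in which rotational broadening acts as a convolution kernel. A minimal statement (the notation below is illustrative, not necessarily the authors'):

$$ F_{\mathrm{obs}}(\lambda) = \int K(\lambda - \lambda';\, v\sin i)\, F_{0}(\lambda')\,\mathrm{d}\lambda' $$

Here $F_{\mathrm{obs}}$ is the observed spectrum of the rotating star, $F_{0}$ is the non-rotating spectrum to be recovered, and $K$ is the rotational broadening kernel set by the projected rotational velocity $v\sin i$; the Bayesian task is to infer $F_{0}$ given $F_{\mathrm{obs}}$ and a noise model.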
We show that the U-Net neural network architecture provides an efficient and effective way of locating sources in SKA Data Challenge datasets. The improved performance relative to PyBDSF is quantified and U-Net is proposed as an efficient source finder for real radio surveys.
In this paper, we present a convolutional neural network (CNN)-based architecture trained on two datasets: one of meteorites and terrestrial rocks, and another of meteors and other light sources. For meteorites, the dataset comprises augmented images from the meteorite collection at the Sharjah Academy for Astronomy, Space Sciences, and Technology (SAASST). For meteors, the images are taken from the United Arab Emirates (UAE) Meteor Monitoring Network (MMN). The project's significance lies in expanding machine learning applications in astronomy to the solar system's small bodies upon contact with the Earth's atmosphere. This, in turn, serves as deep learning research examining a computer's ability to mimic the human brain in distinguishing meteorites from rocks, and meteors from airplanes and other noise sources. When testing the CNN models, both the meteorite and meteor models reached accuracies above 80%.
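As a rough illustration of the kind of model described, here is a minimal binary CNN classifier in PyTorch; the input size, layer widths, and the meteorite-vs-rock labelling are assumptions for the sketch, not the architecture used by the authors:

    import torch
    import torch.nn as nn

    class SmallCNN(nn.Module):
        # Minimal binary image classifier (e.g. meteorite vs. terrestrial rock).
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Sequential(
                nn.Flatten(),
                nn.Linear(64 * 16 * 16, 64), nn.ReLU(),
                nn.Linear(64, 1),  # single logit; sigmoid gives P(meteorite)
            )

        def forward(self, x):
            return self.classifier(self.features(x))

    model = SmallCNN()
    batch = torch.randn(4, 3, 128, 128)   # four augmented 128x128 RGB images
    probs = torch.sigmoid(model(batch))   # > 0.5 -> meteorite, else rock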
The Gaia mission DR3 provides accurate data for around two billion stars in the Galaxy, including a classification of objects into astronomical classes. In this work we present a web visualization tool to analyze one of the products published in DR3, the Outlier Analysis Self-Organizing Map.
In a recent search (Kim et al. 2022), we looked for microlensing signatures in gravitational waves using spectrograms of the binary black hole events in the first and second gravitational-wave transient catalogs. For the search, we implemented a deep learning-based method (Kim et al. 2021) and found that one event out of forty-six, GW190707_093326, was classified into the lensed class. However, upon estimating the p-value of this event, we observed that the uncertainty of the p-value still admits the possibility that the event is unlensed. Therefore, we concluded that no significant evidence of beating patterns was found among the evaluated binary black hole events. As a follow-up study, we discuss the distinguishability between microlensed GWs and signals from precessing black hole binaries.
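To make the p-value step concrete, a common recipe is to rank the candidate's classifier statistic against simulated unlensed events and quote the empirical tail fraction with its binomial uncertainty. A toy sketch (all numbers are illustrative, not from Kim et al.):

    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in classifier scores for simulated unlensed (background) events;
    # in practice these would come from injection studies.
    background = rng.normal(0.0, 1.0, size=2000)
    observed = 2.8                      # candidate event's score (made up)

    # Empirical p-value: fraction of background scoring at least as high.
    k = int(np.sum(background >= observed))
    n = background.size
    p_hat = k / n

    # Binomial (1-sigma) uncertainty; if the interval still reaches typical
    # false-alarm thresholds, the unlensed hypothesis cannot be excluded.
    sigma_p = np.sqrt(p_hat * (1 - p_hat) / n)
    print(f"p = {p_hat:.4f} +/- {sigma_p:.4f}")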
As the scale of cosmological surveys increases, so does the complexity of their analyses. This complexity can often obscure the underlying principles, necessitating statistically rigorous testing to ensure the results of an analysis are consistent and reasonable. This is particularly important in multi-probe cosmological analyses like those used in the Dark Energy Survey (DES) and the upcoming Legacy Survey of Space and Time, where accurate uncertainties are vital. In this paper, we present a statistically rigorous method to test the consistency of the contours produced in these analyses, and apply this method to the Pippin cosmological pipeline used for type Ia supernova cosmology with the DES. We make use of the Neyman construction, a frequentist methodology that leverages extensive simulations to calculate confidence intervals, to perform this consistency check. A true Neyman construction is too computationally expensive for supernova cosmology, so we develop a method for approximating a Neyman construction with far fewer simulations. We find that, for a simulated dataset, the 68% contour reported by the Pippin pipeline and the 68% confidence region produced by our approximate Neyman construction differ by less than a percent near the input cosmology; however, they show more significant differences far from the input cosmology, with a maximal difference of 0.05 in $\Omega_{M}$ and 0.07 in $w$. This divergence is most impactful for analyses of cosmological tensions, but its impact is mitigated when combining supernovae with other cross-cutting cosmological probes, such as the cosmic microwave background.
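The belt-inversion logic of a Neyman construction is compact enough to sketch. The toy below uses a one-dimensional parameter and a cheap stand-in for the pipeline; the grid, simulation count, and Gaussian scatter are all assumptions for illustration:

    import numpy as np

    rng = np.random.default_rng(1)

    w_grid = np.linspace(-1.4, -0.6, 41)   # hypothesised true values of w
    n_sims = 200                           # simulations per grid point

    def simulate_estimator(w_true, n):
        # Stand-in for running the full cosmology pipeline end to end.
        return w_true + rng.normal(0.0, 0.05, size=n)

    # Central 68% acceptance interval of the estimator at each true w.
    lo, hi = [], []
    for w in w_grid:
        w_hats = simulate_estimator(w, n_sims)
        lo.append(np.quantile(w_hats, 0.16))
        hi.append(np.quantile(w_hats, 0.84))

    # Invert the belt: the 68% confidence region for an observed estimate is
    # the set of true values whose acceptance interval contains it.
    w_hat_obs = -1.02
    keep = (np.array(lo) <= w_hat_obs) & (w_hat_obs <= np.array(hi))
    region = w_grid[keep]
    print(f"68% confidence region: [{region.min():.3f}, {region.max():.3f}]")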
With the advent of deep, all-sky radio surveys, the need for ancillary data to make the most of the new, high-quality radio data from surveys like the Evolutionary Map of the Universe (EMU), GaLactic and Extragalactic All-sky Murchison Widefield Array survey eXtended, Very Large Array Sky Survey, and LOFAR Two-metre Sky Survey is growing rapidly. Radio surveys detect significant numbers of Active Galactic Nuclei (AGNs) and have a significantly higher average redshift than optical and infrared all-sky surveys. Thus, traditional methods of estimating redshift are challenged, with spectroscopic surveys not reaching the redshift depth of radio surveys, and AGNs making it difficult for template-fitting methods to accurately model the source. Machine Learning (ML) methods have been used, but efforts have typically been directed towards optically selected samples, or samples at significantly lower redshift than expected from upcoming radio surveys. This work compiles and homogenises a radio-selected dataset from both the northern hemisphere (making use of Sloan Digital Sky Survey optical photometry) and southern hemisphere (making use of Dark Energy Survey optical photometry). We then test commonly used ML algorithms such as k-Nearest Neighbours (kNN), Random Forest, ANNz, and GPz on this monolithic radio-selected sample. We show that kNN has the lowest percentage of catastrophic outliers, providing the best match for the majority of science cases in the EMU survey. We note that the wider redshift range of the combined dataset allows for the estimation of redshifts for sources up to $z = 3$ before random scatter begins to dominate. When binning the data into redshift bins and treating the problem as a classification problem, we are able to correctly identify $\approx$76% of the highest-redshift sources (those at redshift $z > 2.51$) as being in either the highest bin ($z > 2.51$) or the second highest ($z = 2.25$).
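As a concrete sketch of the kNN estimator and the catastrophic-outlier metric, the snippet below uses scikit-learn on synthetic stand-in photometry; the feature set, neighbour count, and the $|\Delta z|/(1+z) > 0.15$ outlier cut are common choices for illustration rather than the exact configuration of this work:

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsRegressor

    rng = np.random.default_rng(2)

    # Synthetic stand-in for the homogenised radio-selected sample:
    # photometric features vs. spectroscopic redshift.
    n = 5000
    X = rng.uniform(15, 25, size=(n, 6))   # e.g. optical magnitudes + radio flux
    z_spec = np.clip(0.1 * (X[:, 0] - 15) + rng.normal(0, 0.1, n), 0, None)

    X_tr, X_te, z_tr, z_te = train_test_split(X, z_spec, random_state=0)

    knn = KNeighborsRegressor(n_neighbors=10, weights="distance")
    knn.fit(X_tr, z_tr)
    z_photo = knn.predict(X_te)

    # Catastrophic outliers under the common |dz|/(1+z) > 0.15 criterion.
    dz = np.abs(z_photo - z_te) / (1 + z_te)
    print(f"catastrophic outlier fraction: {100 * np.mean(dz > 0.15):.1f}%")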
We present the third data release from the Parkes Pulsar Timing Array (PPTA) project. The release contains observations of 32 pulsars obtained using the 64-m Parkes ‘Murriyang’ radio telescope. The data span is up to 18 yr with a typical cadence of 3 weeks. This data release is formed by combining an updated version of our second data release with $\sim$3 yr of more recent data primarily obtained using an ultra-wide-bandwidth receiver system that operates between 704 and 4032 MHz. We provide calibrated pulse profiles, flux density dynamic spectra, pulse times of arrival, and initial pulsar timing models. We describe methods for processing such wide-bandwidth observations and compare this data release with our previous release.
The putative host galaxy of FRB 20171020A was first identified as ESO 601-G036 in 2018, but as no repeat bursts have been detected, direct confirmation of the host remains elusive. In light of recent developments in the field, we re-examine this host and determine a new association confidence level of 98%. At 37 Mpc, this makes ESO 601-G036 the third closest FRB host galaxy to be identified to date and the closest to host an apparently non-repeating FRB (with an estimated repetition rate limit of $<$$0.011$ bursts per day above $10^{39}$ erg). Due to its close distance, we are able to perform detailed multi-wavelength analysis on the ESO 601-G036 system. Follow-up observations confirm ESO 601-G036 to be a typical star-forming galaxy with H i and stellar masses of $\log_{10}\!(M_{\rm{H\,{\small I}}} / M_\odot) \sim 9.2$ and $\log_{10}\!(M_\star / M_\odot) = 8.64^{+0.03}_{-0.15}$, and a star formation rate of $\text{SFR} = 0.09 \pm 0.01\,{\rm M}_\odot\,\text{yr}^{-1}$. We detect, for the first time, a diffuse gaseous tail ($\log_{10}\!(M_{\rm{H\,{\small I}}} / M_\odot) \sim 8.3$) extending to the south-west that suggests recent interactions, likely with the confirmed nearby companion ESO 601-G037. ESO 601-G037 is a stellar shred located to the south of ESO 601-G036 that has an arc-like morphology, is about an order of magnitude less massive, and has a lower gas metallicity that is indicative of a younger stellar population. The properties of the ESO 601-G036 system indicate an ongoing minor merger event, which is affecting the overall gaseous component of the system and the stars within ESO 601-G037. Such activity is consistent with current FRB progenitor models involving magnetars and the signs of recent interactions in other nearby FRB host galaxies.
Next-generation astronomical surveys naturally pose challenges for human-centred visualisation and analysis workflows that currently rely on the use of standard desktop display environments. While a significant fraction of the data preparation and analysis will be taken care of by automated pipelines, crucial steps of knowledge discovery can still only be achieved through various levels of human interpretation. As the number of sources in a survey grows, there is a need to both modify and simplify repetitive visualisation processes that must be completed for each source. As tasks such as per-source quality control, candidate rejection, and morphological classification all share a single instruction, multiple data (SIMD) work pattern, they are amenable to a parallel solution. Selecting extragalactic neutral hydrogen (H i) surveys as a representative example, we use system performance benchmarking and the visual data and reasoning methodology from the field of information visualisation to evaluate a bespoke comparative visualisation environment: the encube visual analytics framework deployed on the 83 Megapixel Swinburne Discovery Wall. Through benchmarking using spectral cube data from existing H i surveys, we are able to perform interactive comparative visualisation via texture-based volume rendering of 180 three-dimensional (3D) data cubes at a time. The time to load a configuration of spectral cubes scales linearly with the number of voxels, with independent samples of 180 cubes (8.4 Gigavoxels or 34 Gigabytes) each loading in under 5 min. We show that parallel comparative inspection is a productive and time-saving technique which can reduce the time taken to complete SIMD-style visual tasks currently performed at the desktop by at least two orders of magnitude, potentially rendering some labour-intensive desktop-based workflows obsolete.
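The quoted benchmark figures imply the following back-of-envelope sustained rates (a quick check using only the numbers above):

    # 180 cubes = 8.4 Gigavoxels / 34 Gigabytes, loaded in under 5 minutes.
    voxels, nbytes, seconds = 8.4e9, 34e9, 5 * 60

    print(f"{voxels / seconds / 1e6:.0f} Mvoxel/s minimum sustained load rate")
    print(f"{nbytes / seconds / 1e6:.0f} MB/s minimum sustained I/O rate")
    print(f"{nbytes / voxels:.1f} bytes/voxel (consistent with 32-bit floats)")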
Investigating rare and new objects has always been an important direction in astronomy. Cataclysmic variables (CVs) are ideal natural celestial bodies for studying the accretion processes of semi-detached binaries. However, the sample size of CVs must increase, because a large gap exists between the observed population and theoretical expectations. Astronomy has entered the big data era and can provide massive numbers of images containing CV candidates. CVs, as a type of faint celestial object, are highly challenging to identify directly from images by automated means. Deep learning has rapidly developed in intelligent image processing and has been widely applied in several astronomical fields with excellent detection results. YOLOX, the latest YOLO framework, is advantageous in detecting small and dark targets. This work proposes an improved YOLOX-based framework, tailored to the characteristics of CVs and Sloan Digital Sky Survey (SDSS) photometric images, to train and verify a model for CV detection. We use the Convolutional Block Attention Module (CBAM) to increase the number of output features from the feature extraction network, and adjust the feature fusion network to obtain fused features; the loss function is modified accordingly. Experimental results demonstrate that the improved model produces satisfactory results, with an average accuracy (mean Average Precision at 0.5) of 92.0%, Precision of 92.9%, Recall of 94.3%, and $F_1$-score of 93.6% on the test set. The proposed method can efficiently identify CVs in test samples and search for CV candidates in unlabeled images. The image data vastly outnumber the spectra in the SDSS-released data. With supplementary follow-up observations or spectra, the proposed model can help astronomers seek and detect CVs in a new manner, ensuring that a more extensive CV catalog can be built. The proposed model may also be applied to the detection of other kinds of celestial objects.
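For readers unfamiliar with the attention module mentioned above, here is a minimal PyTorch implementation of CBAM itself (Woo et al. 2018): channel attention followed by spatial attention. Where and how it is inserted into the YOLOX backbone, and the hyperparameters used in this work, are not reproduced here:

    import torch
    import torch.nn as nn

    class CBAM(nn.Module):
        # Convolutional Block Attention Module: channel attention then
        # spatial attention, reweighting a backbone feature map.
        def __init__(self, channels, reduction=16, spatial_kernel=7):
            super().__init__()
            # Channel attention: shared MLP over global avg- and max-pooled maps.
            self.mlp = nn.Sequential(
                nn.Conv2d(channels, channels // reduction, 1, bias=False),
                nn.ReLU(),
                nn.Conv2d(channels // reduction, channels, 1, bias=False),
            )
            # Spatial attention: 7x7 conv over channel-wise avg and max maps.
            self.spatial = nn.Conv2d(2, 1, spatial_kernel,
                                     padding=spatial_kernel // 2, bias=False)

        def forward(self, x):
            avg = self.mlp(x.mean(dim=(2, 3), keepdim=True))
            mx = self.mlp(x.amax(dim=(2, 3), keepdim=True))
            x = x * torch.sigmoid(avg + mx)              # channel attention
            s = torch.cat([x.mean(dim=1, keepdim=True),
                           x.amax(dim=1, keepdim=True)], dim=1)
            return x * torch.sigmoid(self.spatial(s))    # spatial attention

    feat = torch.randn(1, 256, 32, 32)   # a backbone feature map
    out = CBAM(256)(feat)                # same shape, attention-reweighted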
Gamma-ray bursts (GRBs) and double neutron star merger gravitational-wave events are followed by afterglows that shine from X-rays to radio, and these broadband transients are generally interpreted using analytical models. Such models are relatively fast to execute, and thus easily allow estimates of the energy and geometry parameters of the blast wave through many trial-and-error model calculations. One problem, however, is that such analytical models do not capture the underlying physical processes as well as more realistic relativistic numerical hydrodynamic (RHD) simulations do. Ideally, those simulations would be used for parameter estimation instead, but their computational cost makes this intractable. To this end, we present DeepGlow, a highly efficient neural network architecture trained to emulate a computationally costly RHD-based model of GRB afterglows to within a few percent accuracy. As a first scientific application, we compare both the emulator and a different analytical model calibrated to RHD simulations to estimate the parameters of a broadband GRB afterglow. We find consistent results between these two models, and also give further evidence for a stellar wind progenitor environment around this GRB source. DeepGlow thus brings simulations that are otherwise too costly to run across the full parameter space to bear on real broadband data of current and future GRB afterglows.
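The emulation idea reduces to a regression network mapping physical parameters to fluxes on a fixed output grid. A minimal sketch (the eight inputs, layer widths, and 117 output nodes are placeholders, not DeepGlow's published architecture):

    import torch
    import torch.nn as nn

    # Fully connected emulator: afterglow parameters -> log flux densities
    # sampled on a fixed grid of (time, frequency) points.
    emulator = nn.Sequential(
        nn.Linear(8, 256), nn.ReLU(),
        nn.Linear(256, 256), nn.ReLU(),
        nn.Linear(256, 117),     # log10 flux at 117 fixed output nodes
    )

    theta = torch.rand(64, 8)    # batch of normalised parameter vectors
    log_flux = emulator(theta)   # microseconds per call vs. hours for RHD

Once trained on RHD outputs, such a network is fast enough to sit inside a standard MCMC or nested-sampling loop.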
The International VLBI Service for Geodesy and Astrometry (IVS) regularly provides high-quality data to produce Earth Orientation Parameters (EOP), and for the maintenance and realisation of the International Terrestrial and Celestial Reference Frames, ITRF and ICRF. The first iteration of the celestial reference frame (CRF) at radio wavelengths, the ICRF1, was adopted by the International Astronomical Union (IAU) in 1997 to replace the FK5 optical frame. Soon after, the IVS began official operations, and in 2009 there was a significant increase in data, sufficient to warrant a second iteration of the CRF, ICRF2. The most recent, ICRF3, was adopted by the IAU in 2018. However, because the geographic distribution of observing stations is concentrated in the Northern Hemisphere, CRFs are generally weaker in the South, where there are fewer observations. To increase the number of Southern Hemisphere observations, and the density and precision of the southern sources, a series of deep South observing sessions was initiated in 1995; in 2004 this initiative became the IVS Celestial Reference Frame Deep South (IVS-CRDS) observing programme. This paper covers the evolution of the CRDS observing programme over the period 1995–2021, details the data products and results, and concludes with a summary of upcoming improvements to this ongoing project.
We present a comparison of the performance of a selection of source finders (SFs) using a new software tool called Hydra. The companion paper, Paper I, introduced the Hydra tool and demonstrated its performance using simulated data. Here we apply Hydra to assess the performance of different source finders by analysing real observational data taken from the Evolutionary Map of the Universe (EMU) Pilot Survey. EMU is a wide-field radio continuum survey whose primary goal is to make a deep ($20\,\mu$Jy/beam RMS noise), intermediate angular resolution ($15^{\prime\prime}$), 1 GHz survey of the entire sky south of $+30^{\circ}$ declination, and it is expected to detect and catalogue up to 40 million sources. For the main EMU survey, it is highly desirable to understand the performance of radio image SF software and to identify an approach that optimises source-detection capabilities. Hydra has been developed to refine this process, as well as to deliver a range of metrics and source finding data products from multiple SFs. We present the performance of the five SFs tested here in terms of their completeness and reliability statistics, their flux density and source size measurements, and an exploration of case studies to highlight finder-specific limitations.
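For reference, the two headline metrics reduce to simple bookkeeping once detections are cross-matched to a reference catalogue; the snippet below uses synthetic positions and a nearest-neighbour match (Hydra's actual matching criteria are described in Paper I):

    import numpy as np

    rng = np.random.default_rng(3)

    truth = rng.uniform(0, 3600, size=(1000, 2))    # reference positions (arcsec)
    detections = np.vstack([
        truth[:870] + rng.normal(0, 1, (870, 2)),   # 870 recovered sources
        rng.uniform(0, 3600, size=(60, 2)),         # 60 spurious detections
    ])

    # Nearest-neighbour match within a 5 arcsec radius (duplicate matches
    # ignored for brevity; a real pipeline deduplicates them).
    d2 = ((detections[:, None, :] - truth[None, :, :]) ** 2).sum(-1)
    matched = d2.min(axis=1) < 5.0 ** 2

    completeness = matched.sum() / len(truth)  # fraction of real sources found
    reliability = matched.mean()               # fraction of detections that are real
    print(f"completeness = {completeness:.2f}, reliability = {reliability:.2f}")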