Abstract: Reionization is one of the least understood processes in the evolutionary history of the Universe, largely because of the numerous astrophysical processes occurring simultaneously, about which we still lack a clear picture. In this article, we use the Gaussian Process Regression (GPR) method to learn the reionization history and infer the associated astrophysical parameters. We reconstruct the UV luminosity density function using the Hubble Frontier Fields (HFF) and early JWST data. From the reconstructed reionization history, we compute the global differential brightness temperature fluctuation during this epoch. We perform an MCMC analysis of the global 21-cm signal using the instrumental specifications of SARAS, in combination with Lyman-$\alpha$ ionization fraction data, Planck optical depth measurements and UV luminosity data. Our analysis reveals that GPR can help infer the astrophysical parameters in a more model-agnostic way than conventional methods. Additionally, we analyze the 21-cm power spectrum using the reconstructed reionization history and demonstrate how the future 21-cm mission SKA, in combination with Planck and Lyman-$\alpha$ forest data, improves the bounds on the reionization astrophysical parameters through a joint MCMC analysis over the astrophysical parameters plus the six cosmological parameters of the $\Lambda$CDM model. These results establish the GPR-based reconstruction as a robust learning technique, and the inferences on the astrophysical parameters obtained from it are reliable enough to be used in future analyses.
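As an illustration of the kind of GPR reconstruction described above, the following is a minimal sketch using scikit-learn. The data points, errors and kernel choice are illustrative assumptions standing in for the actual HFF and JWST measurements, not the paper's pipeline.

```python
# Minimal GPR sketch: reconstructing log10 of the UV luminosity density
# rho_UV(z). All data values, errors and the kernel are illustrative
# assumptions, not the HFF/JWST measurements used in the paper.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

z_obs = np.array([4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0])          # redshifts (hypothetical)
log_rho = np.array([26.5, 26.4, 26.2, 26.0, 25.7, 25.4, 25.1])  # log10(rho_UV), made up
sigma = np.full_like(log_rho, 0.1)                               # assumed 1-sigma errors

# Per-point noise variances enter through `alpha`; kernel hyperparameters
# are optimized by maximizing the marginal likelihood.
kernel = ConstantKernel(1.0) * RBF(length_scale=2.0)
gpr = GaussianProcessRegressor(kernel=kernel, alpha=sigma**2,
                               normalize_y=True, n_restarts_optimizer=10)
gpr.fit(z_obs[:, None], log_rho)

z_grid = np.linspace(4.0, 12.0, 100)
mean, std = gpr.predict(z_grid[:, None], return_std=True)        # reconstruction + error
```

In a pipeline of this type, the reconstructed luminosity density and its uncertainty would feed into the ionizing emissivity and hence the reionization history, from which the global 21-cm signal follows; the MCMC step would then sample the astrophysical parameters against the SARAS, Planck and Lyman-$\alpha$ likelihoods.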
Abstract: We investigate the prospect of reconstructing the ``cosmic distance ladder'' of the Universe using a novel deep learning framework called LADDER (Learning Algorithm for Deep Distance Estimation and Reconstruction). LADDER is trained on the apparent magnitude data from the Pantheon Type Ia supernova compilation, incorporating the full covariance information among data points, to produce predictions along with the corresponding errors. After subjecting a number of deep learning models to several validation tests, we pick LADDER as the best-performing one. We then demonstrate applications of our method in the cosmological context, which include serving as a model-independent tool for consistency checks of other datasets such as baryon acoustic oscillations, calibrating high-redshift datasets such as gamma-ray bursts, and acting as a model-independent mock catalog generator for future probes, among others. Our analysis advocates an interesting yet cautious consideration of machine learning applications in these contexts.
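The abstract does not spell out the LADDER architecture; as a generic illustration of the one element it does specify, training on apparent magnitudes while incorporating the full covariance among data points, here is a sketch in PyTorch with a toy dataset and a simple MLP (both assumptions), minimizing a full-covariance $\chi^2$ loss.

```python
# Toy sketch of covariance-aware training; the data, network and training
# details are assumptions, NOT the actual LADDER architecture or Pantheon data.
import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(0)
n = 200
z = np.sort(rng.uniform(0.01, 2.3, n))                    # Pantheon-like redshift range
m_true = 5.0 * np.log10(z * (1.0 + z)) + 38.0             # smooth toy magnitude curve
cov = 0.02 * np.exp(-np.abs(z[:, None] - z[None, :]) / 0.5) + 0.05 * np.eye(n)
m_obs = rng.multivariate_normal(m_true, cov)              # correlated toy "observations"

z_t = torch.tensor(z, dtype=torch.float32)[:, None]
m_t = torch.tensor(m_obs, dtype=torch.float32)
cinv = torch.tensor(np.linalg.inv(cov), dtype=torch.float32)

net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(),
                    nn.Linear(64, 64), nn.Tanh(),
                    nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for _ in range(2000):
    opt.zero_grad()
    r = net(z_t).squeeze(-1) - m_t        # residual vector
    loss = r @ cinv @ r                   # chi^2 with the full covariance matrix
    loss.backward()
    opt.step()
```

A deterministic network like this one yields predictions but not errors; the prediction uncertainties the abstract mentions could come, for instance, from an ensemble of such networks or from a probabilistic output head, though which mechanism LADDER actually uses is not stated here.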
Abstract: We study the prospects of machine learning algorithms such as Gaussian processes (GP) as a tool to reconstruct the Hubble parameter $H(z)$ with two upcoming gravitational wave missions, namely the evolved Laser Interferometer Space Antenna (eLISA) and the Einstein Telescope (ET). We perform non-parametric reconstructions of $H(z)$ with GP using realistically generated catalogues, assuming various background cosmological models, for each mission. We also take into account the effects of early-time and late-time priors separately on the reconstruction, and hence on the Hubble constant ($H_0$). Our analysis reveals that GPs are quite robust in reconstructing the expansion history of the Universe within the observational window of the specific mission under study. We further confirm that both eLISA and ET would be able to constrain $H(z)$ and $H_0$ to a much higher precision than is possible today, and we assess their possible role in addressing the Hubble tension for each model, on a case-by-case basis.
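As a sketch of this kind of non-parametric reconstruction, the following generates a mock $H(z)$ catalogue from a fiducial flat $\Lambda$CDM model and reconstructs it with a GP, imposing an $H_0$ prior as an extra point at $z=0$. The fiducial cosmology, error model, kernel and prior are all illustrative assumptions, not the realistic eLISA or ET catalogues of the paper.

```python
# Sketch of GP reconstruction of H(z) from a mock catalogue; the fiducial
# cosmology, error model and H0 prior point are illustrative assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, ConstantKernel

H0_fid, Om = 70.0, 0.3                                    # fiducial flat LCDM (assumed)
Hz = lambda z: H0_fid * np.sqrt(Om * (1.0 + z)**3 + 1.0 - Om)

rng = np.random.default_rng(1)
z_obs = np.sort(rng.uniform(0.1, 5.0, 30))                # illustrative redshift coverage
sigma = 0.03 * Hz(z_obs)                                  # assumed ~3% errors
H_obs = Hz(z_obs) + rng.normal(0.0, sigma)

# An early-time prior can be imposed as an extra point at z = 0, e.g. a
# Planck-like H0 = 67.4 +/- 0.5 km/s/Mpc (chosen here for illustration).
z_all = np.concatenate([[0.0], z_obs])
H_all = np.concatenate([[67.4], H_obs])
s_all = np.concatenate([[0.5], sigma])

kernel = ConstantKernel(1.0) * Matern(length_scale=2.0, nu=2.5)
gp = GaussianProcessRegressor(kernel=kernel, alpha=s_all**2,
                              normalize_y=True, n_restarts_optimizer=5)
gp.fit(z_all[:, None], H_all)

z_grid = np.linspace(0.0, 5.0, 200)
H_mean, H_std = gp.predict(z_grid[:, None], return_std=True)
print(f"Reconstructed H0 = {H_mean[0]:.1f} +/- {H_std[0]:.1f} km/s/Mpc")
```

Swapping the prior point for a late-time (e.g. SH0ES-like) value, or removing it, shows how the choice of prior propagates into the reconstructed $H_0$, which is the comparison at the heart of the early-time versus late-time analysis described above.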