Abstract: Reionization is one of the least understood processes in the evolutionary history of the Universe, largely because of the numerous astrophysical processes that occur simultaneously and remain poorly understood. In this article, we use the Gaussian Process Regression (GPR) method to learn the reionization history and infer the associated astrophysical parameters. We reconstruct the UV luminosity density function using the Hubble Frontier Fields (HFF) and early JWST data. From the reconstructed reionization history, we compute the global differential brightness temperature fluctuation during this epoch. We perform an MCMC analysis of the global 21-cm signal using the instrumental specifications of SARAS, in combination with Lyman-$\alpha$ ionization fraction data, Planck optical depth measurements and UV luminosity data. Our analysis reveals that GPR can infer the astrophysical parameters in a more model-agnostic way than conventional methods. Additionally, we analyze the 21-cm power spectrum using the reconstructed reionization history and demonstrate how the future 21-cm mission SKA, in combination with Planck and Lyman-$\alpha$ forest data, improves the bounds on the reionization astrophysical parameters through a joint MCMC analysis of the astrophysical parameters plus the six cosmological parameters of the $\Lambda$CDM model. These results establish the GPR-based reconstruction as a robust learning technique, and the inferences on the astrophysical parameters obtained therefrom are reliable enough to be used in future analyses.
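To make the central technique concrete, the following is a minimal sketch of a GPR reconstruction of a quantity against redshift, in the spirit of the UV luminosity density reconstruction described above. It is not the authors' pipeline; the data arrays, kernel choice, and error bars are hypothetical placeholders, not actual HFF/JWST values.

```python
# Minimal GPR sketch with scikit-learn: reconstruct a smooth function of
# redshift from noisy points, propagating the measurement errors via alpha.
# All numbers below are hypothetical stand-ins for HFF/JWST-like data.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

z = np.array([6.0, 7.0, 8.0, 9.0, 10.0])                # redshifts of data points
log_rho_uv = np.array([26.2, 26.0, 25.7, 25.4, 25.1])   # hypothetical log10 UV luminosity density
sigma = np.full_like(log_rho_uv, 0.1)                   # hypothetical 1-sigma errors

kernel = ConstantKernel(1.0) * RBF(length_scale=2.0)
gp = GaussianProcessRegressor(kernel=kernel, alpha=sigma**2, normalize_y=True)
gp.fit(z[:, None], log_rho_uv)

z_grid = np.linspace(5.0, 12.0, 100)
mean, std = gp.predict(z_grid[:, None], return_std=True)  # reconstruction + uncertainty band
```

The reconstructed mean and its uncertainty band are what would then feed the downstream 21-cm and MCMC analyses.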
Abstract: In this work, we implement Gaussian process regression to reconstruct the expansion history of the Universe in a model-agnostic manner, using the Pantheon-Plus SN-Ia compilation in combination with two different BAO measurements (SDSS-IV and DESI DR1). In both reconstructions, the $\Lambda$CDM model is always included within the 95\% confidence intervals. We find evidence that the DESI LRG data point at $z_{\text{eff}} = 0.51$ is not an outlier within our model-independent framework. We study the $\mathcal{O}m$-diagnostic and the evolution of the total equation of state (EoS) of our Universe, which hint at the possibility of a quintessence-like dark energy scenario with a very slowly varying EoS, and a phantom crossing at higher $z$. The exercise is then complemented by considering two more SN-Ia compilations, DES-5YR and Union3, in combination with DESI BAO. Reconstruction with the DESI BAO + DES-5YR SN data predicts that the $\Lambda$CDM model lies outside the 3$\sigma$ confidence levels, whereas with DESI BAO + Union3 data, the $\Lambda$CDM model is always included within 1$\sigma$. We also report constraints on $H_0 r_d$ from our model-agnostic analysis, independent of the pre-recombination physics. Our results point towards an $\approx 2\sigma$ discrepancy between the DESI + Pantheon-Plus and DESI + DES-5YR data sets, which calls for further investigation.
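The $\mathcal{O}m$-diagnostic mentioned above has a simple closed form, $\mathcal{O}m(z) = \left[(H(z)/H_0)^2 - 1\right] / \left[(1+z)^3 - 1\right]$, which is constant and equal to $\Omega_m$ for flat $\Lambda$CDM, so any redshift dependence signals dynamical dark energy. Below is a minimal sketch of its evaluation; the $H(z)$ values stand in for a GP reconstruction and are hypothetical.

```python
# Minimal sketch of the Om diagnostic: Om(z) = (E^2 - 1) / ((1+z)^3 - 1),
# with E(z) = H(z)/H0. For flat LCDM this is constant (= Omega_m), so a
# z-dependent Om(z) from a reconstruction hints at evolving dark energy.
import numpy as np

def om_diagnostic(z, H, H0):
    E2 = (H / H0) ** 2
    return (E2 - 1.0) / ((1.0 + z) ** 3 - 1.0)

z_grid = np.linspace(0.05, 2.0, 50)
H0 = 70.0
H_rec = H0 * np.sqrt(0.3 * (1 + z_grid) ** 3 + 0.7)   # hypothetical stand-in for a GP-reconstructed H(z)
print(om_diagnostic(z_grid, H_rec, H0))               # ~0.3 everywhere for this LCDM stand-in
```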
Abstract: The strengthening of tensions in the cosmological parameters has led to a reconsideration of fundamental aspects of standard cosmology. The tension in the Hubble constant can also be viewed as a tension between local and early-Universe constraints on the absolute magnitude $M_B$ of Type Ia supernovae. In this work, we reconsider the possibility of a variation of this parameter in a model-independent way. We employ neural networks to agnostically constrain the value of the absolute magnitude, as well as to assess the impact and statistical significance of a variation in $M_B$ with redshift, using the Pantheon+ compilation, together with a thorough analysis of the neural network architecture. We find an indication of a transition redshift in the region $z\approx 1$.
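As an illustration of the kind of regression involved, here is a minimal sketch of a small neural network mapping redshift to $M_B = m_B - \mu(z)$ to probe a possible $z$-dependence. Everything in it is an assumption for illustration: the mock data, the crude fiducial distance modulus, and the architecture are not the paper's actual setup.

```python
# Minimal sketch: fit a small neural network to M_B(z) = m_B - mu(z) and
# inspect it for a drift/transition with redshift. Mock data and the crude
# fiducial mu(z) below are hypothetical, not the Pantheon+ analysis itself.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
z = rng.uniform(0.01, 2.0, 500)
mu_fid = 5 * np.log10((3e5 * z / 70.0) * (1 + z) * 1e5)  # crude fiducial distance modulus
m_B = mu_fid - 19.3 + rng.normal(0, 0.15, z.size)        # mock apparent magnitudes
M_B = m_B - mu_fid                                        # inferred absolute magnitudes

net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000)
net.fit(z[:, None], M_B)
print(net.predict(np.array([[0.5], [1.0], [1.5]])))       # look for a drift around z ~ 1
```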
Abstract: We investigate the prospect of reconstructing the ``cosmic distance ladder'' of the Universe using a novel deep learning framework called LADDER (Learning Algorithm for Deep Distance Estimation and Reconstruction). LADDER is trained on the apparent magnitude data from the Pantheon Type Ia supernovae compilation, incorporating the full covariance information among data points, to produce predictions along with the corresponding errors. After subjecting several deep learning models to a number of validation tests, we pick LADDER as the best-performing one. We then demonstrate applications of our method in the cosmological context, including its use as a model-independent tool for consistency checks of other data sets such as baryon acoustic oscillations, for the calibration of high-redshift data sets such as gamma-ray bursts, and as a model-independent mock catalog generator for future probes. Our analysis advocates for an interesting yet cautious consideration of machine learning applications in these contexts.
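One of the applications named above, mock catalog generation with full covariance, can be sketched as drawing correlated realizations around a learned mean prediction. The mean function and covariance below are hypothetical stand-ins, not LADDER's trained outputs.

```python
# Minimal sketch: generate one correlated mock SN catalog from a mean
# prediction m(z) and a full covariance matrix, in the spirit of the
# mock-generator use case. Mean and covariance here are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
z = np.linspace(0.01, 1.5, 100)
m_pred = 5 * np.log10((3e5 * z / 70.0) * (1 + z) * 1e5) - 19.3  # stand-in mean m_B(z)

# Hypothetical covariance: diagonal noise plus a smooth correlated part
diag = 0.12 ** 2 * np.eye(z.size)
corr = 0.05 ** 2 * np.exp(-0.5 * (z[:, None] - z[None, :]) ** 2 / 0.1 ** 2)
cov = diag + corr

mock_mB = rng.multivariate_normal(m_pred, cov)  # one mock realization with correlated scatter
```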
Abstract: We study the prospects of machine learning algorithms like Gaussian processes (GP) as a tool to reconstruct the Hubble parameter $H(z)$ with two upcoming gravitational wave missions, namely the evolved Laser Interferometer Space Antenna (eLISA) and the Einstein Telescope (ET). We perform non-parametric reconstructions of $H(z)$ with GP using realistically generated catalogues, assuming various background cosmological models, for each mission. We also take into account the effects of early-time and late-time priors separately on the reconstruction, and hence on the Hubble constant $H_0$. Our analysis reveals that GPs are quite robust in reconstructing the expansion history of the Universe within the observational window of the specific mission under study. We further confirm that both eLISA and ET would be able to constrain $H(z)$ and $H_0$ to a much higher precision than is possible today, and we also assess their possible role in addressing the Hubble tension for each model, on a case-by-case basis.
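A minimal sketch of the GP reconstruction step follows, with an $H_0$ prior imposed as an extra data point at $z = 0$, which is one common trick; whether it matches the authors' exact prior treatment is an assumption here, and the mock catalogue is hypothetical rather than an eLISA/ET simulation.

```python
# Minimal sketch: GP reconstruction of H(z) from a hypothetical mock
# catalogue, with an H0 prior added as a data point at z = 0 so the
# reconstruction is anchored at the present epoch.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(2)
z = np.sort(rng.uniform(0.1, 5.0, 30))              # hypothetical mission-like redshifts
H_true = 70.0 * np.sqrt(0.3 * (1 + z) ** 3 + 0.7)   # fiducial LCDM expansion history
sig = 0.03 * H_true                                 # hypothetical 3% measurement errors
H_obs = H_true + rng.normal(0, sig)

# Prepend a late-time-like H0 prior (value and error are illustrative)
z_all = np.concatenate(([0.0], z))
H_all = np.concatenate(([73.0], H_obs))
sig_all = np.concatenate(([1.0], sig))

gp = GaussianProcessRegressor(ConstantKernel() * RBF(2.0), alpha=sig_all**2, normalize_y=True)
gp.fit(z_all[:, None], H_all)
H0_rec, H0_std = gp.predict(np.array([[0.0]]), return_std=True)  # reconstructed H0 and its error
```

Swapping the prepended prior (e.g. an early-time versus a late-time value) is how one would probe the prior dependence of the reconstructed $H_0$ in this kind of exercise.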