MBU / CAS
Abstract: Two-dimensional mass spectrometry (2D MS) is a tandem mass spectrometry method that correlates precursor and fragment ions without the need for ion isolation. On a Fourier transform ion cyclotron resonance mass spectrometer, the phase correction functions for absorption mode data processing were found to be linear in the precursor ion dimension and quadratic in the fragment ion dimension. Absorption mode data processing on limited data sets has previously shown improvements in signal-to-noise ratio and resolving power by a factor of 2. Here, we have extended absorption mode data processing to 2D mass spectra regardless of size and frequency range. We have applied absorption mode 2D MS to the top-down analysis of variously oxidized ubiquitin proteoforms generated by fast photochemical oxidation of proteins (FPOP) and to an extract of ergot alkaloids. We show that, compared with standard magnitude mode, absorption mode data processing significantly improves both the signal-to-noise ratio and the resolving power of the 2D mass spectrum, which translates into higher sequence coverage in top-down proteomics and more accurate precursor-fragment correlation in metabolomics.
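
As a rough illustration only (not the authors' implementation), the Python sketch below applies a phase correction that is linear along the precursor-ion frequency axis and quadratic along the fragment-ion frequency axis to a complex 2D spectrum, keeping the real part as the absorption-mode spectrum. The function name `phase_correct_2d`, the coefficient names `a0`-`b2`, and the toy data are all hypothetical.

```python
import numpy as np

def phase_correct_2d(spectrum, f_precursor, f_fragment,
                     a0=0.0, a1=0.0, b0=0.0, b1=0.0, b2=0.0):
    """Return the absorption-mode (real) part of a phase-corrected 2D spectrum.

    spectrum    : complex 2D array; axis 0 = precursor-ion dim, axis 1 = fragment-ion dim
    f_precursor : 1D array of precursor-ion frequencies (one per row)
    f_fragment  : 1D array of fragment-ion frequencies (one per column)
    """
    phi_precursor = a0 + a1 * f_precursor                      # linear in the precursor dimension
    phi_fragment = b0 + b1 * f_fragment + b2 * f_fragment**2   # quadratic in the fragment dimension
    phase = phi_precursor[:, None] + phi_fragment[None, :]     # total phase surface
    return (spectrum * np.exp(-1j * phase)).real               # absorption mode = real part

# Toy usage with random complex data standing in for a measured 2D FT-ICR spectrum.
rng = np.random.default_rng(0)
S = rng.standard_normal((256, 1024)) + 1j * rng.standard_normal((256, 1024))
f1 = np.linspace(1e4, 1e5, 256)    # precursor-ion frequency axis (illustrative values, Hz)
f2 = np.linspace(1e4, 1e6, 1024)   # fragment-ion frequency axis (illustrative values, Hz)
absorption = phase_correct_2d(S, f1, f2, a1=1e-6, b1=2e-6, b2=1e-12)
```

In practice the phase coefficients would be fitted to the instrument's excitation parameters rather than chosen by hand; the point of the sketch is only the linear/quadratic structure of the correction in the two dimensions.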




Abstract: The aim of this paper is two-fold: firstly, to present subspace embedding properties for $s$-hashing sketching matrices, with $s\geq 1$, that are optimal in the projection dimension $m$ of the sketch, namely $m=\mathcal{O}(d)$, where $d$ is the dimension of the subspace. A diverse set of results is presented that addresses the case when the input matrix has sufficiently low coherence (thus removing the $\log^2 d$ factor dependence in $m$ present in the low-coherence result of Bourgain et al. (2015), at the expense of requiring a smaller coherence); how this coherence requirement changes with the number $s$ of nonzeros per column (allowing the coherence bound to scale by $\sqrt{s}$); and how coherence can be reduced through suitable transformations (when considering hashed -- instead of subsampled -- coherence-reducing transformations such as the randomised Hadamard transform). Secondly, we apply these general hashing sketching results to the special case of Linear Least Squares (LLS) and develop Ski-LLS, a generic software package for these problems that builds upon and improves the Blendenpik solver on dense input and the (sequential) LSRN performance on sparse problems. In addition to the hashing sketching improvements, we add suitable linear algebra tools for rank-deficient and for sparse problems that lead Ski-LLS to outperform not only sketching-based routines on randomly generated input, but also the state-of-the-art direct solver SPQR and the iterative code HSL on certain subsets of the Florida sparse matrix collection, namely on least squares problems that are significantly overdetermined, moderately sparse, or difficult.
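
As a rough illustration of the sketching approach (not the Ski-LLS code itself), the Python sketch below builds an $s$-hashing matrix with $s$ nonzeros of value $\pm 1/\sqrt{s}$ per column and uses it in a Blendenpik-style sketch-and-precondition solve of $\min_x \|Ax-b\|_2$. The function names, the oversampling choice $m = 4d$, and the dense handling of the preconditioned matrix are illustrative assumptions.

```python
import numpy as np
import scipy.sparse as sp
from scipy.linalg import solve_triangular
from scipy.sparse.linalg import lsqr

def s_hashing(m, n, s=2, rng=None):
    """m x n s-hashing matrix: each column holds s nonzeros of value
    +/- 1/sqrt(s), placed in s distinct rows chosen uniformly at random."""
    rng = np.random.default_rng(rng)
    rows = np.concatenate([rng.choice(m, size=s, replace=False) for _ in range(n)])
    cols = np.repeat(np.arange(n), s)
    vals = rng.choice([-1.0, 1.0], size=s * n) / np.sqrt(s)
    return sp.csr_matrix((vals, (rows, cols)), shape=(m, n))

def sketch_precondition_lls(A, b, s=2, oversample=4, rng=None):
    """Blendenpik-style solve of min ||Ax - b||_2: QR-factorise the sketch S @ A
    and use R as a right preconditioner for LSQR on the full problem."""
    n, d = A.shape
    S = s_hashing(oversample * d, n, s=s, rng=rng)   # m = oversample * d sketch rows
    _, R = np.linalg.qr(S @ A)                       # thin QR of the m x d sketched matrix
    A_prec = solve_triangular(R, A.T, trans='T').T   # A @ inv(R): well-conditioned w.h.p.
    y = lsqr(A_prec, b)[0]                           # iterative solve in the preconditioned variable
    return solve_triangular(R, y)                    # recover x = inv(R) @ y

# Toy usage on a random, significantly overdetermined dense problem.
rng = np.random.default_rng(0)
A = rng.standard_normal((5000, 50))
b = rng.standard_normal(5000)
x = sketch_precondition_lls(A, b, s=3)
print(np.linalg.norm(A.T @ (A @ x - b)))  # normal-equations residual, ~0 at the LLS solution
```

The subspace embedding property is what makes this work: with $m=\mathcal{O}(d)$ rows, $SA$ preserves the geometry of the column space of $A$ up to small distortion, so the triangular factor of $SA$ preconditions $A$ well and LSQR converges in a few iterations.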