A natural problem in high-dimensional inference is to decide whether a classifier $f:\mathbb{R}^n \rightarrow \{-1,1\}$ depends on a small number of linear directions of its input data. Call a function $g: \mathbb{R}^n \rightarrow \{-1,1\}$ a linear $k$-junta if it is completely determined by some $k$-dimensional subspace of the input space. A recent work of the authors showed that linear $k$-juntas are testable: there exists an algorithm, with query complexity independent of the ambient dimension $n$, which distinguishes between
1. $f: \mathbb{R}^n \rightarrow \{-1,1\}$ is a linear $k$-junta with surface area $s$,
2. $f$ is $\epsilon$-far from any linear $k$-junta.
Following the surge of interest in noise-tolerant property testing, in this paper we prove a noise-tolerant (or robust) version of this result. Namely, we give an algorithm which, given any $c>0$ and $\epsilon>0$, distinguishes between
1. $f: \mathbb{R}^n \rightarrow \{-1,1\}$ has correlation at least $c$ with some linear $k$-junta with surface area $s$,
2. $f$ has correlation at most $c-\epsilon$ with any linear $k$-junta.
The query complexity of our algorithm is qualitatively the same: it remains independent of $n$ and depends polynomially on $k$. A major motivation for studying linear junta testing comes from statistical models where it is crucial to allow noise. In the language of model compression, our results show that statistical models can be "compressed" with query complexity that depends only on the size of the desired compression, whenever the compression is a linear junta.
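For concreteness, one standard way to formalize the notions of correlation and distance used above, a sketch assuming the standard Gaussian measure on $\mathbb{R}^n$ (the usual setting when surface area is involved; the paper's exact normalization may differ, and $\mathrm{corr}$, $\mathrm{dist}$ are illustrative notation), is
$$ \mathrm{corr}(f,g) \;=\; \mathbb{E}_{x \sim \mathcal{N}(0, I_n)}\bigl[f(x)\,g(x)\bigr], \qquad \mathrm{dist}(f,g) \;=\; \Pr_{x \sim \mathcal{N}(0, I_n)}\bigl[f(x) \neq g(x)\bigr], $$
so that $f$ is $\epsilon$-far from the class of linear $k$-juntas if $\mathrm{dist}(f,g) \ge \epsilon$ for every linear $k$-junta $g$.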