Recognizing human locomotion is a critical step toward fluent control of wearable robots, such as transtibial prostheses. In particular, classifying the intended locomotion mode and estimating the gait phase are key. In this work, a novel, interpretable, and computationally efficient algorithm is presented for simultaneously predicting locomotion mode and gait phase. Using able-bodied (AB) and transtibial prosthesis (PR) data, seven locomotion modes are tested, including slow, medium, and fast level walking (0.6, 0.8, and 1.0 m/s), ramp ascent/descent ($5^{\circ}$), and stair ascent/descent (20 cm height). Overall classification accuracy was 99.1\% and 99.3\% for the AB and PR conditions, respectively. The average gait phase error across all data was less than 4\%. By exploiting the structure of the data, the algorithm requires only 2.91 $\mu$s of computation per time step. Its time complexity scales as $O(N \cdot M)$, where $M$ is the number of locomotion modes and $N$ is the number of samples per gait cycle. This combination of efficiency and accuracy could accommodate a much larger set of locomotion modes ($\sim$700 on the Open-Source Leg prosthesis) to handle the wide range of activities pursued by individuals during daily living.
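
To make the stated $O(N \cdot M)$ per-time-step cost concrete, the sketch below shows one plausible realization: a nearest-template scan in which each of the $M$ locomotion modes stores $N$ feature vectors over the gait cycle, and the best-matching (mode, sample) pair yields both the mode label and the phase estimate. The template layout, the feature dimension \texttt{D}, and the function \texttt{classify\_step} are illustrative assumptions for this sketch, not the paper's actual implementation.

\begin{verbatim}
#include <float.h>

#define M 7    /* locomotion modes, per the abstract */
#define N 200  /* samples per gait cycle (assumed value) */
#define D 4    /* sensor feature dimension (assumed value) */

/* Hypothetical per-mode gait templates: N feature vectors per cycle,
 * e.g. learned averages of joint angles and velocities. */
static float templates[M][N][D];

/* One update: scan all M*N template entries and return the mode whose
 * entry is closest (squared Euclidean distance) to the current feature
 * vector; *phase receives the matched index mapped to [0, 1).  With a
 * fixed feature dimension D, the cost per time step is O(N*M). */
int classify_step(const float feature[D], float *phase)
{
    int best_mode = 0, best_idx = 0;
    float best_dist = FLT_MAX;

    for (int m = 0; m < M; ++m) {
        for (int n = 0; n < N; ++n) {
            float dist = 0.0f;
            for (int d = 0; d < D; ++d) {
                float diff = feature[d] - templates[m][n][d];
                dist += diff * diff;
            }
            if (dist < best_dist) {
                best_dist = dist;
                best_mode = m;
                best_idx = n;
            }
        }
    }
    *phase = (float)best_idx / (float)N;  /* gait phase in [0, 1) */
    return best_mode;
}
\end{verbatim}

In such a scheme the templates would be fit to training data, and the fixed-size scan is branch-light and trivially vectorizable, which is consistent with microsecond-scale per-step timing on embedded prosthesis controllers.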