In challenging environments where traditional sensing modalities struggle, in-air sonar offers resilience to optical interference. Placing a priori known landmarks in these environments can reduce accumulated drift errors in autonomous mobile systems performing tasks such as Simultaneous Localization and Mapping (SLAM) and autonomous navigation. We present a novel approach that uses a convolutional neural network (CNN) to detect and classify ten different reflector landmarks of varying radii from in-air 3D sonar measurements. Additionally, the network predicts the orientation angle of the detected landmarks. The network is trained on cochleograms, which represent the echoes received by the sensor in the time-frequency domain. Experimental results in cluttered indoor settings show promising performance. The CNN achieves a 97.3% classification accuracy on the test dataset, accurately detecting both the presence and absence of landmarks. Moreover, the network predicts landmark orientation angles with a root-mean-square error (RMSE) below 10 degrees, enhancing its utility in SLAM and autonomous navigation applications. This advancement improves the robustness and accuracy of autonomous systems in challenging environments.
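To make the described setup concrete, the sketch below shows one way a multi-task CNN of this kind could be structured in PyTorch: a shared convolutional backbone over cochleogram inputs feeding a classification head (ten landmark classes plus an assumed "no landmark" class) and a regression head for orientation. The layer sizes, the single input channel, the input resolution, and the (sin, cos) angle encoding are illustrative assumptions, not the architecture reported in the paper.

```python
# Hypothetical sketch: multi-task CNN for landmark classification and
# orientation regression from cochleograms. All dimensions are assumptions.
import torch
import torch.nn as nn


class LandmarkCNN(nn.Module):
    def __init__(self, n_classes: int = 11, in_channels: int = 1):
        # n_classes = 10 landmark types + 1 "no landmark" class (assumed encoding)
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((4, 4)),
            nn.Flatten(),
        )
        feat_dim = 32 * 4 * 4
        # Classification head: which landmark (if any) produced the echo.
        self.cls_head = nn.Linear(feat_dim, n_classes)
        # Regression head: orientation predicted as (sin, cos) to avoid the
        # wrap-around discontinuity at +/-180 degrees (an assumed choice).
        self.angle_head = nn.Linear(feat_dim, 2)

    def forward(self, cochleogram: torch.Tensor):
        feats = self.backbone(cochleogram)
        return self.cls_head(feats), self.angle_head(feats)


if __name__ == "__main__":
    model = LandmarkCNN()
    # Dummy batch of cochleograms: (batch, channel, frequency bins, time bins).
    x = torch.randn(8, 1, 64, 128)
    class_logits, angle_sincos = model(x)
    print(class_logits.shape, angle_sincos.shape)  # (8, 11), (8, 2)
```

Such a model would typically be trained with a cross-entropy loss on the class logits and a mean-squared-error loss on the angle encoding; the weighting between the two objectives is a design choice not specified here.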