Despite many successes, machine learning (ML) methods such as neural nets often struggle to learn from small training sets. In contrast, biological neural nets (BNNs) excel at fast learning, so we can look to BNNs for tools to improve the performance of ML methods in this low-data regime. The insect olfactory network, though simple, can learn new odors very rapidly. Its two key structures are a competitive inhibition layer (the Antennal Lobe, AL) followed by a high-dimensional, sparse, plastic layer (the Mushroom Body, MB). This AL-MB network can rapidly learn not only odors but also handwritten digits, in fact outperforming standard ML methods in the few-shot regime. In this work, we deploy the AL-MB network as an automatic feature generator, using the outputs of its Readout Neurons as additional input features for standard ML classifiers. We hypothesize that the AL-MB structure has a strong intrinsic clustering ability, and that its Readout Neurons, used as input features, will boost the performance of ML methods. We find that these "insect cyborgs", i.e. classifiers that are part moth and part ML method, deliver significantly better performance than the baseline ML methods alone on a generic (non-spatial) 85-feature, 10-class task derived from the MNIST dataset. Accuracy improves on average by 6% to 33% for N < 15 training samples per class, and by 6% to 10% for N > 15. Remarkably, these moth-generated features increase ML accuracy even when the ML method's baseline accuracy already exceeds what the AL-MB can attain on its own. The two structures in the AL-MB, a competitive inhibition layer and a high-dimensional sparse layer with Hebbian plasticity, are novel in the context of artificial NNs but endemic to BNNs. We believe they can be deployed either prepended as feature generators or inserted as layers into deep NNs, potentially improving ML performance.
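The cyborg construction described above reduces to a simple feature-concatenation step. The sketch below illustrates that plumbing only, not the paper's model: `moth_readouts` is a hypothetical stand-in for a trained AL-MB simulation's Readout Neuron responses (replaced here by a fixed random projection so the code runs), and the nearest-neighbor classifier is an arbitrary choice of baseline ML method.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier


def moth_readouts(X, n_classes=10, seed=0):
    """Stand-in for the AL-MB's Readout Neurons (one response per class).

    In the paper these responses come from a trained moth (AL-MB)
    simulation; here a fixed random projection plus rectification keeps
    the example runnable. Firing rates are non-negative.
    """
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_classes))
    return np.maximum(X @ W, 0.0)


def fit_baseline_and_cyborg(X_train, y_train, X_test):
    """Train the same ML method with and without moth-generated features."""
    baseline = KNeighborsClassifier(n_neighbors=1).fit(X_train, y_train)

    # Cyborg: append the 10 readout responses to the 85 raw features,
    # giving a 95-feature input to the same classifier.
    Xc_train = np.hstack([X_train, moth_readouts(X_train)])
    Xc_test = np.hstack([X_test, moth_readouts(X_test)])
    cyborg = KNeighborsClassifier(n_neighbors=1).fit(Xc_train, y_train)

    return baseline.predict(X_test), cyborg.predict(Xc_test)


# Toy usage on synthetic data shaped like the paper's task
# (85 features, 10 classes, few training samples per class).
rng = np.random.default_rng(1)
X_train = rng.random((10 * 5, 85))   # N = 5 samples per class
y_train = np.repeat(np.arange(10), 5)
X_test = rng.random((100, 85))
base_pred, cyborg_pred = fit_baseline_and_cyborg(X_train, y_train, X_test)
```

Any classifier that accepts a feature matrix can be substituted for the nearest-neighbor baseline; the cyborg differs from it only in the appended readout columns, which is what makes the AL-MB usable as a drop-in feature generator.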