Abstract: Deep learning models have set benchmark results in various Natural Language Processing tasks. However, these models require an enormous amount of training data, which is infeasible in many practical problems. While techniques such as domain adaptation and few-shot learning address this problem, we introduce a new technique of actively infusing external knowledge into learning to solve problems in the low data regime. We propose a technique called ActKnow that actively infuses knowledge from Knowledge Graphs (KG) "on-demand" into learning for Question Answering (QA). By infusing world knowledge from ConceptNet, we show significant improvements on the ARC Challenge-set benchmark over purely text-based transformer models like RoBERTa in the low data regime. For example, using only 20% of the training examples, we demonstrate a 4% improvement in accuracy on both ARC-Challenge and OpenBookQA.