Abstract: Knowledge graph-based dialogue systems can generate more informative responses and implement sophisticated reasoning mechanisms. However, these models do not account for the sparseness and incompleteness of knowledge graphs (KGs), and current dialogue models cannot be applied to dynamic KGs. This paper proposes a dynamic knowledge graph-based dialogue generation method with improved adversarial meta-learning (KDAD). KDAD formulates dynamic knowledge triples as an adversarial-attack problem and incorporates the objective of quickly adapting to dynamic knowledge-aware dialogue generation. We train a knowledge graph-based dialogue model with improved ADML using minimal training samples. The model learns an initialization of its parameters that adapts to previously unseen knowledge, so that training can be completed quickly from only a few knowledge triples. We show that our model significantly outperforms other baselines and demonstrate that it adapts both fast and well to dynamic knowledge graph-based dialogue generation.
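As a point of reference (a generic formulation, not necessarily the paper's exact objective), a MAML-style meta-objective over dialogue-generation tasks can be written as

$$\min_{\theta} \sum_{\mathcal{T}_i \sim p(\mathcal{T})} \mathcal{L}_{\mathcal{T}_i}\big(\theta - \alpha \nabla_{\theta} \mathcal{L}_{\mathcal{T}_i}(\theta)\big),$$

where each task $\mathcal{T}_i$ is dialogue generation conditioned on one set of knowledge triples, $\mathcal{L}_{\mathcal{T}_i}$ is the generation loss, and $\alpha$ is the inner-loop learning rate. Adversarial meta-learning (ADML) variants, roughly speaking, additionally perform inner-loop updates on adversarially perturbed samples, which is how treating dynamic knowledge triples as adversarial attacks fits this framework.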
Abstract: Knowledge graph-based dialogue systems can narrow down knowledge candidates for generating informative and diverse responses by using prior information, e.g., triple attributes or graph paths. However, most current knowledge graphs (KGs) cover only incomplete domain-specific knowledge. To overcome this drawback, we propose a knowledge graph-based proactive dialogue generation model (KgDg) with three components: an improved model-agnostic meta-learning (MAML) algorithm, knowledge selection over knowledge triplet embeddings, and a knowledge-aware proactive response generator. We formulate knowledge triplet embedding and selection as a sentence embedding problem to better capture semantic information. Our improved MAML algorithm is capable of learning general features from a limited number of knowledge graphs and can quickly adapt to dialogue generation with unseen knowledge triplets. Extensive experiments are conducted on a knowledge-aware dialogue dataset (DuConv). The results show that KgDg adapts both fast and well to knowledge graph-based dialogue generation and outperforms state-of-the-art baselines.
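The sketch below illustrates the first-order MAML adaptation loop that improved-MAML approaches like KgDg's build on. It is a minimal toy example under stated assumptions: the linear-regression tasks, learning rates, and task-sampling scheme are placeholders for the actual knowledge-aware dialogue-generation tasks and losses, and are not the paper's implementation.

```python
# A minimal, first-order MAML sketch on toy regression tasks (illustrative only:
# KgDg applies an improved MAML to a knowledge-aware dialogue generator, not to
# the linear model used here; all hyper-parameters below are made up).
import numpy as np

rng = np.random.default_rng(0)

def make_task(n_support=8, n_query=8):
    """One 'task' = 1-D linear regression y = a*x + b with task-specific (a, b),
    standing in for dialogue generation over one set of knowledge triplets."""
    a, b = rng.uniform(-2, 2, size=2)
    x = rng.uniform(-5, 5, size=n_support + n_query)
    y = a * x + b
    return (x[:n_support], y[:n_support]), (x[n_support:], y[n_support:])

def loss_and_grad(theta, x, y):
    """Mean squared error of theta[0]*x + theta[1] and its gradient w.r.t. theta."""
    err = theta[0] * x + theta[1] - y
    return np.mean(err**2), np.array([2 * np.mean(err * x), 2 * np.mean(err)])

theta = np.zeros(2)                     # meta-initialisation shared across tasks
alpha, beta, n_tasks = 0.02, 0.002, 4   # inner/outer learning rates, tasks per meta-batch

for step in range(2000):
    meta_grad = np.zeros_like(theta)
    for _ in range(n_tasks):
        (xs, ys), (xq, yq) = make_task()
        _, g_support = loss_and_grad(theta, xs, ys)
        adapted = theta - alpha * g_support          # one inner-loop step on the support set
        _, g_query = loss_and_grad(adapted, xq, yq)  # evaluate adapted params on the query set
        meta_grad += g_query                         # first-order approximation of the meta-gradient
    theta -= beta * meta_grad / n_tasks              # outer (meta) update of the initialisation

# After meta-training, one gradient step on a new task's support set should lower its query loss.
(xs, ys), (xq, yq) = make_task()
_, g = loss_and_grad(theta, xs, ys)
print("query loss before adaptation:", loss_and_grad(theta, xq, yq)[0])
print("query loss after one step:   ", loss_and_grad(theta - alpha * g, xq, yq)[0])
```

The design point this illustrates is the one both abstracts rely on: the meta-learned initialization is optimized not for any single KG, but for how well it performs after a handful of gradient steps on a new, previously unseen set of knowledge triplets.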