Continual learning (CL) is an emerging learning paradigm that aims to emulate the human capability of learning and accumulating knowledge continually without forgetting previously learned knowledge, and of transferring that knowledge to new tasks so they can be learned better. This survey presents a comprehensive review of recent progress on CL in NLP. It covers (1) all CL settings, with a taxonomy of existing techniques, and, beyond dealing with forgetting, it also focuses on (2) knowledge transfer, which is of particular importance to NLP. Neither (1) nor (2) is covered in existing surveys. Finally, a list of future directions is discussed.