Non-intrusive load monitoring (NILM) decomposes an aggregate load reading into appliance-level load signals. Many deep learning-based methods have been developed for NILM, but training deep neural networks (DNNs) requires massive load data covering different types of appliances. For local data owners who have inadequate load data yet expect promising model performance, effective NILM co-modelling is increasingly important. However, when local data owners cooperate, data exchange and centralized data storage may increase the risk of power-consumer privacy breaches. To eliminate these potential risks, this paper proposes a novel NILM method named Fed-NILM that applies Federated Learning (FL). In Fed-NILM, local model parameters, rather than load data, are shared among local data owners, and the global model is obtained by weighted averaging of those parameters. In the experiments, Fed-NILM is validated on two real-world datasets, and it is compared with locally-trained NILM models and a centrally-trained one in both residential and industrial scenarios. The experimental results show that Fed-NILM outperforms the locally-trained models and approximates the centrally-trained NILM, which is trained on the entire load dataset without privacy preservation.
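The weighted parameter averaging described above can be sketched as follows. This is a minimal FedAvg-style illustration, not the paper's implementation; the function name, the use of per-client sample counts as weights, and the dict-of-arrays parameter layout are assumptions for the example.

```python
import numpy as np

def federated_average(client_params, client_sizes):
    """Aggregate one round of local parameters into a global model.

    client_params: list of dicts mapping layer name -> np.ndarray,
                   one dict per local data owner (assumed layout).
    client_sizes:  local training-set size per client, used here as
                   the averaging weight (a common FedAvg choice).
    """
    total = sum(client_sizes)
    weights = [n / total for n in client_sizes]
    global_params = {}
    for name in client_params[0]:
        # Weighted sum of each layer's parameters across clients
        global_params[name] = sum(
            w * p[name] for w, p in zip(weights, client_params))
    return global_params

# Two hypothetical clients sharing a single-layer "model"
a = {"w": np.array([1.0, 2.0])}
b = {"w": np.array([3.0, 4.0])}
g = federated_average([a, b], client_sizes=[100, 300])
print(g["w"])  # weights 0.25 and 0.75 -> [2.5, 3.5]
```

Only these parameter tensors cross the network in each round; the raw load measurements never leave their owners, which is the privacy argument made above.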