Semantic communications learned on background knowledge bases (KBs) have been identified as a promising technology for communications between intelligent agents. Existing works assume that the transceivers of semantic communications share the same KB. However, intelligent transceivers may be unwilling to exchange the data in their KBs due to the communication burden or concerns about privacy leakage. Besides, the transceivers may independently learn from the environment and dynamically update their KBs, making timely sharing of the KBs infeasible. All of these factors cause a mismatch between the KBs, which may result in a semantic-level misunderstanding on the receiver side. To address this issue, we propose a transceiver cooperative learning-assisted semantic communication (TCL-SC) scheme against mismatched KBs. In TCL-SC, the transceivers cooperatively train semantic encoder and decoder neural networks (NNs) of the same structure based on their own KBs, and periodically share the NN parameters. To reduce the communication overhead of parameter sharing, parameter quantization is adopted. Moreover, we discuss the impact of the number of communication rounds on the performance of semantic communication systems. Experiments on real-world data demonstrate that our proposed TCL-SC reduces the semantic-level misunderstanding on the receiver side caused by the mismatch between the KBs, especially in the low signal-to-noise ratio (SNR) regime.
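The periodic sharing of quantized NN parameters described above can be illustrated with a minimal sketch. This is not the paper's implementation; the function names (`quantize`, `average_params`), the uniform 8-bit quantizer, and the simple element-wise averaging of the shared parameters are all illustrative assumptions.

```python
import numpy as np

def quantize(params, num_bits=8):
    """Uniformly quantize each parameter array to 2**num_bits levels
    (an assumed quantizer; the paper does not specify one)."""
    q = {}
    for name, w in params.items():
        lo, hi = float(w.min()), float(w.max())
        if hi == lo:  # constant array: nothing to quantize
            q[name] = w.copy()
            continue
        step = (hi - lo) / (2 ** num_bits - 1)
        q[name] = np.round((w - lo) / step) * step + lo
    return q

def average_params(shared):
    """Combine the quantized parameters received from all transceivers
    by element-wise averaging (one plausible aggregation rule)."""
    keys = shared[0].keys()
    return {k: np.mean([p[k] for p in shared], axis=0) for k in keys}

# Hypothetical round of cooperative learning: two transceivers, trained
# on their own KBs, exchange quantized parameters and aggregate them.
tx_a = {"w": np.array([0.0, 0.5, 1.0])}
tx_b = {"w": np.array([1.0, 0.5, 0.0])}
merged = average_params([quantize(tx_a), quantize(tx_b)])
```

In a full system, each transceiver would continue local training from `merged` and repeat this exchange every few communication rounds, so that the encoder and decoder NNs stay aligned despite the mismatched KBs.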