While knowledge graphs contain rich semantic knowledge of various entities and the relational information among them, temporal knowledge graphs (TKGs) further capture the interactions of entities over time. To better model TKGs, automatic temporal knowledge graph completion (TKGC) has attracted great interest. Recent TKGC methods integrate advanced deep learning techniques, e.g., attention mechanisms and Transformers, to boost model performance. However, we find that, rather than adopting various kinds of complex modules, it is more beneficial to better utilize the temporal information along the whole time axis. In this paper, we propose a simple but powerful graph encoder for TKGC, namely TARGCN. TARGCN is parameter-efficient, and it extensively exploits information from the whole temporal context. We perform experiments on three benchmark datasets. Our model achieves a relative improvement of more than 42% on the GDELT dataset compared with the state-of-the-art model. Meanwhile, it outperforms the strongest baseline on the ICEWS05-15 dataset with around 18.5% fewer parameters.