The rewriting method for text summarization combines extractive and abstractive approaches, using an abstractive model to improve the conciseness and readability of extractive summaries. Existing rewriting systems take each extracted sentence as the only input, which keeps the rewriter focused but can discard necessary background knowledge and discourse context. In this paper, we investigate contextualized rewriting, which consumes the entire document and considers the summary context. We formalize contextualized rewriting as a seq2seq problem with group-tag alignments, introducing group tags as a solution for modeling the alignments and identifying the extracted sentences through content-based addressing. Results show that our approach significantly outperforms non-contextualized rewriting systems without requiring reinforcement learning, achieving strong ROUGE improvements on top of multiple extractors.
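To make the group-tag idea concrete, below is a minimal illustrative sketch (not the authors' implementation) of how tokens of a full document might be tagged so that a seq2seq rewriter can distinguish extractor-selected sentences from surrounding context. The sentence splitting, tokenization, and tag convention (0 for context, k for the k-th extracted sentence) are assumptions made for illustration only.

```python
from typing import List

def make_group_tags(doc_sentences: List[List[str]],
                    extracted_ids: List[int]) -> List[int]:
    """Tag every document token: tokens of the k-th extracted sentence
    receive tag k+1; all other tokens receive tag 0 (context only)."""
    id_to_group = {sent_id: k + 1 for k, sent_id in enumerate(extracted_ids)}
    tags = []
    for sent_id, sent in enumerate(doc_sentences):
        group = id_to_group.get(sent_id, 0)
        tags.extend([group] * len(sent))
    return tags

# Example: the extractor selected sentences 0 and 2 of a 3-sentence document.
doc = [["The", "cat", "sat", "."],
       ["It", "was", "sunny", "."],
       ["Birds", "sang", "."]]
print(make_group_tags(doc, extracted_ids=[0, 2]))
# -> [1, 1, 1, 1, 0, 0, 0, 0, 2, 2, 2]
```

In a setup like this, the tag sequence would be fed to the rewriter alongside the full document, letting context tokens (tag 0) inform the rewriting of each tagged sentence group.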