Small semantic segmentation (SS) models have recently been shown to err in maintaining boundary-region completeness and preserving target-region connectivity, even though they segment the main object regions effectively. To address these errors, we propose a targeted boundary and relation distillation (BRD) strategy that distills knowledge from large teacher models into small student models. Specifically, boundary distillation extracts explicit object boundaries from the hierarchical feature maps of the backbone network, improving the quality of the student model's masks in boundary regions. Concurrently, relation distillation transfers implicit relations from the teacher to the student using pixel-level self-relation as a bridge, ensuring that the student's masks exhibit strong target-region connectivity. BRD is tailored to SS and is simple and efficient. Experiments on multiple SS datasets, including Pascal VOC 2012, Cityscapes, ADE20K, and COCO-Stuff 10K, demonstrate that BRD significantly outperforms current methods without increasing inference cost, producing the crisp region boundaries and smoothly connected regions that are challenging for small models.
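To make the two distillation signals concrete, the following minimal PyTorch sketch illustrates one plausible form of each: a pixel-level self-relation (affinity) loss and a soft boundary-map loss. The function names (`self_relation`, `relation_distill_loss`, `boundary_map`, `boundary_distill_loss`), the cosine-similarity affinities, and the MSE/L1 penalties are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch of the two distillation signals described above.
# Assumptions (not from the paper): cosine-similarity affinities,
# MSE/L1 penalties, and matching channel widths between the models.
import torch
import torch.nn.functional as F


def self_relation(feat: torch.Tensor) -> torch.Tensor:
    """Pixel-level self-relation: (B, C, H, W) -> (B, HW, HW) cosine affinities."""
    x = feat.flatten(2).transpose(1, 2)        # (B, HW, C) pixel embeddings
    x = F.normalize(x, dim=-1)                 # unit-length, so bmm gives cosine sim
    return torch.bmm(x, x.transpose(1, 2))     # pairwise pixel similarities


def relation_distill_loss(feat_s: torch.Tensor, feat_t: torch.Tensor) -> torch.Tensor:
    """Match student and teacher self-relations (teacher is frozen)."""
    # Note: the HW x HW affinity matrix is quadratic in spatial size;
    # in practice features would be taken from a downsampled stage.
    return F.mse_loss(self_relation(feat_s), self_relation(feat_t.detach()))


def boundary_map(logits: torch.Tensor) -> torch.Tensor:
    """Soft boundary map: spatial gradient magnitude of class probabilities."""
    p = logits.softmax(dim=1)
    dx = F.pad(p[..., :, 1:] - p[..., :, :-1], (0, 1, 0, 0))  # horizontal diffs
    dy = F.pad(p[..., 1:, :] - p[..., :-1, :], (0, 0, 0, 1))  # vertical diffs
    return (dx.abs() + dy.abs()).sum(dim=1, keepdim=True)     # (B, 1, H, W)


def boundary_distill_loss(logits_s: torch.Tensor, logits_t: torch.Tensor) -> torch.Tensor:
    """Align the student's boundary map with the teacher's."""
    return F.l1_loss(boundary_map(logits_s), boundary_map(logits_t.detach()))


if __name__ == "__main__":
    # Toy shapes: 19-class Cityscapes-like logits and 256-channel backbone features.
    fs, ft = torch.randn(2, 256, 16, 16), torch.randn(2, 256, 16, 16)
    ls, lt = torch.randn(2, 19, 64, 64), torch.randn(2, 19, 64, 64)
    loss = relation_distill_loss(fs, ft) + boundary_distill_loss(ls, lt)
    print(float(loss))
```

In a full training setup, these terms would presumably be added to the standard cross-entropy segmentation loss with weighting hyperparameters; since both operate only on training-time features and logits, they leave the student's inference cost unchanged, consistent with the claim above.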