Recently, several nonconvex sparse regularizers that preserve the convexity of the cost function have received increasing attention. This paper proposes a general class of such convexity-preserving (CP) regularizers, termed the partially smoothed difference-of-convex (pSDC) regularizer. The pSDC regularizer is formulated as a structured difference-of-convex (DC) function, in which the landscape of the subtrahend function can be adjusted by a parameterized smoothing function so as to attain overall convexity. Equipped with proper building blocks, the pSDC regularizer reproduces existing CP regularizers and opens the way to a large number of promising new ones. For the resultant nonconvexly regularized convex (NRC) model, we derive a series of overall-convexity conditions that naturally subsume the conditions established in previous works. Moreover, we develop a unified framework based on DC programming for solving the NRC model. Compared with previously reported proximal-splitting-type approaches, the proposed framework imposes less stringent assumptions. We establish the convergence of the proposed framework to a global minimizer. Numerical experiments demonstrate the power of the pSDC regularizers and the efficiency of the proposed DC algorithm.
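
To make the described structure concrete, the following schematic sketches one way a DC regularizer with a smoothed subtrahend and the associated regularized least-squares model could be written. The notation here ($\varphi$ for a convex sparsity-inducing penalty, $s_\mu$ for its parameterized smoothing, and $y$, $A$, $\lambda$ for the observation, measurement operator, and regularization weight) is illustrative and not fixed by this abstract; the exact pSDC formulation in the paper may differ.

% Schematic convexity-preserving DC regularizer (illustrative notation):
% \varphi is a convex sparsity-inducing penalty (e.g., the \ell_1 norm) and
% s_\mu is a parameterized convex smoothing of \varphi whose curvature is
% controlled by \mu.
\[
  \Psi_\mu(x) \;=\; \varphi(x) \;-\; s_\mu(x),
\]
% The resulting nonconvexly regularized least-squares model then reads
\[
  \min_{x}\; F(x) \;=\; \tfrac{1}{2}\,\lVert y - A x\rVert_2^2 \;+\; \lambda\,\Psi_\mu(x),
\]
% where \mu is chosen so that the concavity contributed by -\lambda\, s_\mu(x)
% is compensated by the quadratic data-fidelity term, keeping F convex overall.

In this sketch, the role of an overall-convexity condition is to balance the curvature of $s_\mu$ against that of the quadratic data-fidelity term, which is the type of condition the abstract refers to.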