Fully symmetric learning rules for principal component analysis can be derived from a novel objective function suggested in our previous work. We observed that these learning rules suffer from slow convergence for covariance matrices where some principal eigenvalues are close to each other. Here we describe a modified objective function with an additional term that mitigates this convergence problem. We show that the learning rule derived from the modified objective function inherits all fixed points from the original learning rule (but may introduce additional ones) and that the stability of the inherited fixed points remains unchanged. Only the steepness of the objective function is increased in some directions. Simulations confirm that the convergence speed can be noticeably improved, depending on the weight factor of the additional term.
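
To make the qualitative claim concrete, the following minimal sketch illustrates the mechanism under stated assumptions. It is not the objective function or learning rule of the paper, which the abstract does not spell out; as a stand-in it uses a Brockett-style weighted PCA objective plus a hypothetical penalty on the off-diagonal of W^T C W, whose weight `alpha` plays the role of the weight factor of the additional term. All symbols and parameter values (`N`, `mu`, `alpha`, the learning rate) are assumptions for illustration only.

```python
# Illustrative sketch only -- NOT the authors' objective or learning rule.
# A Brockett-style weighted PCA objective plus a hypothetical off-diagonal
# penalty (weight `alpha`) stands in for the paper's additional term.  The
# penalty's gradient vanishes wherever W^T C W is diagonal, so the eigenvector
# fixed points are inherited; it only steepens the objective in the
# rotational directions where convergence is slow for close eigenvalues.
import numpy as np

rng = np.random.default_rng(0)
n, k = 3, 2

# Covariance with two close principal eigenvalues (3.0 vs 2.9) -- the regime
# where convergence is slow.
lam = np.array([3.0, 2.9, 1.0])
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
C = Q @ np.diag(lam) @ Q.T

N = np.diag([2.0, 1.0])   # distinct weights -> eigenvector (not subspace) fixed points
mu = 50.0                 # orthonormality penalty weight (assumed)

def grad(W, alpha):
    """Gradient of the illustrative objective
       J(W) = tr(N W^T C W) - (mu/4)||W^T W - I||_F^2
              - (alpha/4)||offdiag(W^T C W)||_F^2."""
    M = W.T @ C @ W
    M_off = M - np.diag(np.diag(M))
    return (2 * C @ W @ N
            - mu * W @ (W.T @ W - np.eye(k))
            - alpha * C @ W @ M_off)

def misalign(W):
    """Worst |cosine| gap between columns of W and the principal eigenvectors
    of C (invariant to column permutation and sign)."""
    Wn = W / np.linalg.norm(W, axis=0)
    cos = np.abs(Wn.T @ Q[:, :k])
    return 1.0 - cos.max(axis=1).min()

W0 = 0.1 * rng.standard_normal((n, k))
for alpha in (0.0, 10.0, 40.0):       # weight factor of the additional term
    W = W0.copy()
    for _ in range(5000):
        W += 0.002 * grad(W, alpha)   # plain gradient ascent
    print(f"alpha={alpha:5.1f}  misalignment after 5000 steps: {misalign(W):.2e}")
```

Under these assumptions, the residual misalignment after a fixed number of steps shrinks as `alpha` grows, mirroring the abstract's observation that the convergence speedup depends on the weight factor of the additional term.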