In this study, a novel machine learning algorithm is presented for disaggregation of satellite soil moisture (SM), based on self-regularized regressive models (SRRM) that use high-resolution correlated information from auxiliary sources. The algorithm consists of a regularized clustering step that assigns soft memberships to each fine-scale pixel, followed by a kernel regression step that estimates SM at every pixel. Coarse-scale remotely sensed SM was disaggregated from 10 km to 1 km using land cover, precipitation, land surface temperature, leaf area index, and in-situ observations of SM. The algorithm was evaluated using multi-scale synthetic observations over heterogeneous agricultural land covers in north-central Florida. The root mean square error (RMSE) was less than 0.02 $m^3/m^3$ for 96% of the pixels. The clusters generated represented the data well and reduced the RMSE by up to 40% during periods of high heterogeneity in land cover and meteorological conditions. The Kullback-Leibler divergence (KLD) between the true SM and the disaggregated estimates was close to 0 for both vegetated and bare-soil land covers. The disaggregated estimates were compared to those generated by the Principle of Relevant Information (PRI) method. The RMSE of the PRI estimates was higher than that of the SRRM estimates on every day of the season, the KLD of the SRRM estimates was at least four orders of magnitude lower than that of the PRI estimates, and the computational time was reduced by a factor of three. These results indicate that the SRRM can accurately disaggregate SM exhibiting complex non-linear correlations with auxiliary variables on a grid.
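To make the two-step structure (soft-membership clustering followed by kernel regression) concrete, the following is a minimal sketch, not the authors' SRRM implementation. It assumes scikit-learn stand-ins: GaussianMixture as a proxy for the regularized clustering that produces soft memberships, and KernelRidge for the kernel regression; all data, feature names, and the synthetic SM target are hypothetical illustrations.

```python
# Hedged sketch of the SRRM-style two-step pipeline; all data are synthetic
# and the clustering/regression models are generic stand-ins, not the paper's.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)

# Synthetic fine-scale (1 km) auxiliary features for one 10 km coarse pixel:
# e.g., land cover, precipitation, land surface temperature, LAI per pixel.
n_fine = 100                       # 10 x 10 fine pixels per coarse pixel
X = rng.normal(size=(n_fine, 4))   # auxiliary features at 1 km

# Step 1: soft-membership clustering of fine pixels
# (proxy for the regularized clustering in SRRM).
gmm = GaussianMixture(n_components=3, random_state=0).fit(X)
memberships = gmm.predict_proba(X)           # (n_fine, 3) soft memberships

# Step 2: kernel regression from [features, memberships] to SM, trained on
# the fine pixels that have in-situ SM (here: a random subset with a
# synthetic "true" SM signal).
Z = np.hstack([X, memberships])
train = rng.choice(n_fine, size=20, replace=False)
sm_true = 0.25 + 0.05 * np.tanh(X[:, 0])     # synthetic target, m^3/m^3
model = KernelRidge(kernel="rbf", alpha=1e-2).fit(Z[train], sm_true[train])

# Disaggregated 1 km SM estimates and the RMSE metric used in the study.
sm_fine = model.predict(Z)
rmse = np.sqrt(np.mean((sm_fine - sm_true) ** 2))
print(f"RMSE: {rmse:.4f} m^3/m^3")
```

The soft memberships are appended to the feature vector so that the regression can fit different responses within each cluster while still sharing information across cluster boundaries; this mirrors, at a sketch level, how clustering regularizes the regression step described above.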