This paper builds on recent developments in Bayesian network (BN) structure learning under the controversial assumption that the input variables are dependent, an assumption geared towards real-world datasets where such dependence is expected. It addresses the problem of algorithms learning multiple disjoint subgraphs, which prevent full propagation of evidence. A novel hybrid structure learning algorithm, called SaiyanH, is presented for this purpose. The results show that SaiyanH discovers satisfactorily accurate connected DAGs in cases where all other algorithms produce multiple disjoint subgraphs over dependent variables. This problem is highly prevalent when the sample size of the input data is low relative to the dimensionality of the model, which is often the case when working with real data. Across six case studies, five sample sizes, three evaluation metrics, and comparisons against state-of-the-art and well-established constraint-based, score-based, and hybrid learning algorithms, SaiyanH ranks 4th out of 13 algorithms for overall performance.
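To make the disjoint-subgraph problem concrete: if a learned DAG splits into separate components, evidence entered in one component can never influence variables in another. The sketch below is purely illustrative (it is not SaiyanH or any algorithm from the paper); it checks whether a learned structure, given as an edge list, forms a single connected graph when edge directions are ignored. The variable names and edge lists are hypothetical examples.

```python
def is_fully_connected(edges):
    """Treat the DAG's directed edges as undirected and check whether
    all variables fall into a single connected component (union-find).
    Returns False when the structure splits into disjoint subgraphs,
    in which case evidence cannot propagate between the components."""
    parent = {}

    def find(x):
        # Find the component root of x, with path compression.
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for a, b in edges:
        # Merge the components containing each endpoint of the edge.
        parent[find(a)] = find(b)

    roots = {find(v) for v in parent}
    return len(roots) <= 1

# Hypothetical learned structures over four variables:
disjoint = [("A", "B"), ("C", "D")]                  # two subgraphs
connected = [("A", "B"), ("B", "C"), ("C", "D")]     # one connected DAG

print(is_fully_connected(disjoint))   # False: evidence on A never reaches C or D
print(is_fully_connected(connected))  # True
```

A check of this kind only detects the problem; the paper's contribution is an algorithm whose learned structures avoid it in the first place.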