TY - JOUR
T1 - DICE
T2 - exploiting all bivariate dependencies in binary and multary search spaces
AU - Lane, Fergal
AU - Azad, R. Muhammad Atif
AU - Ryan, Conor
N1 - Publisher Copyright:
© 2017, Springer-Verlag GmbH Germany, part of Springer Nature.
PY - 2018/9/1
Y1 - 2018/9/1
N2 - Although some of the earliest Estimation of Distribution Algorithms (EDAs) utilized bivariate marginal distribution models, up to now all discrete bivariate EDAs have had one serious limitation: they were constrained to exploiting only a limited O(d) subset of all possible O(d²) bivariate dependencies. As a first, we present a family of discrete bivariate EDAs that can learn and exploit all O(d²) dependencies between variables, yet have the same run-time complexity as their more limited counterparts. This family of algorithms, which we label DICE (DIscrete Correlated Estimation of distribution algorithms), is rigorously based on sound statistical principles, and particularly on a modelling technique from statistical physics: dichotomised multivariate Gaussian distributions. Initially (Lane et al. in European Conference on the Applications of Evolutionary Computation, Springer, 2017), DICE was trialled on a suite of combinatorial optimization problems over binary search spaces. Our proposed dichotomised Gaussian (DG) model in DICE significantly outperformed existing discrete bivariate EDAs; crucially, the performance gap widened as the dimensionality of the problems increased. In this comprehensive treatment, we generalise DICE by extending it to multary search spaces that also allow for categorical variables. Because correlation is not wholly meaningful for categorical variables, interactions between such variables cannot be fully modelled by correlation-based approaches such as the original formulation of DICE. Therefore, here we extend our original DG model to deal with such situations. We test DICE on a challenging suite of combinatorial optimization problems, defined mostly on multary search spaces. While the two versions of DICE each outperform the other on different problem instances, both outperform all the state-of-the-art bivariate EDAs on almost all of the problem instances. This further illustrates that these innovative DICE methods constitute a significant step change in the domain of discrete bivariate EDAs.
AB - Although some of the earliest Estimation of Distribution Algorithms (EDAs) utilized bivariate marginal distribution models, up to now all discrete bivariate EDAs have had one serious limitation: they were constrained to exploiting only a limited O(d) subset of all possible O(d²) bivariate dependencies. As a first, we present a family of discrete bivariate EDAs that can learn and exploit all O(d²) dependencies between variables, yet have the same run-time complexity as their more limited counterparts. This family of algorithms, which we label DICE (DIscrete Correlated Estimation of distribution algorithms), is rigorously based on sound statistical principles, and particularly on a modelling technique from statistical physics: dichotomised multivariate Gaussian distributions. Initially (Lane et al. in European Conference on the Applications of Evolutionary Computation, Springer, 2017), DICE was trialled on a suite of combinatorial optimization problems over binary search spaces. Our proposed dichotomised Gaussian (DG) model in DICE significantly outperformed existing discrete bivariate EDAs; crucially, the performance gap widened as the dimensionality of the problems increased. In this comprehensive treatment, we generalise DICE by extending it to multary search spaces that also allow for categorical variables. Because correlation is not wholly meaningful for categorical variables, interactions between such variables cannot be fully modelled by correlation-based approaches such as the original formulation of DICE. Therefore, here we extend our original DG model to deal with such situations. We test DICE on a challenging suite of combinatorial optimization problems, defined mostly on multary search spaces. While the two versions of DICE each outperform the other on different problem instances, both outperform all the state-of-the-art bivariate EDAs on almost all of the problem instances. This further illustrates that these innovative DICE methods constitute a significant step change in the domain of discrete bivariate EDAs.
KW - Bivariate estimation of distribution algorithms
KW - Combinatorial optimization
KW - Dichotomised Gaussian models
UR - http://www.scopus.com/inward/record.url?scp=85038384118&partnerID=8YFLogxK
U2 - 10.1007/s12293-017-0246-1
DO - 10.1007/s12293-017-0246-1
M3 - Article
AN - SCOPUS:85038384118
SN - 1865-9284
VL - 10
SP - 245
EP - 255
JO - Memetic Computing
JF - Memetic Computing
IS - 3
ER -