TY - JOUR
T1 - On effective and inexpensive local search techniques in genetic programming regression
AU - Lane, Fergal
AU - Muhammad Atif Azad, R.
AU - Ryan, Conor
N1 - Publisher Copyright:
© Springer International Publishing Switzerland 2014.
PY - 2014
Y1 - 2014
N2 - Local search methods can work harmoniously with global search methods such as Evolutionary Algorithms (EAs); however, particularly in Genetic Programming (GP), concerns remain about the additional cost of local search (LS). One such successful system is Chameleon, which tunes internal GP nodes and addresses cost concerns by employing a number of strategies to make its LS both effective and inexpensive. Expense is reduced by an innovative approach to parsimony pressure whereby smaller trees are rewarded with local search opportunities more often than bigger trees. This paper investigates three new extensions to Chameleon’s original simple setup, seeking ways to make its local search even more effective. These are: trying alternative, more cost-reflective parsimony measures, such as visitation length instead of tree size; varying the reward function to incentivize parsimony more gently than in the original setup; and doing more tuning earlier in runs, when smaller trees can be tuned more cheaply and effectively. These strategies were tested on a varied suite of 16 difficult artificial and real-world regression problems. All of these techniques improved performance, and we show that, combined, they cumulatively improve average test RMSE by 31% over the original and already effective Chameleon tuning scheme. A minimum of 64 simulations was run on every problem/tuning-setup combination.
AB - Local search methods can work harmoniously with global search methods such as Evolutionary Algorithms (EAs); however, particularly in Genetic Programming (GP), concerns remain about the additional cost of local search (LS). One such successful system is Chameleon, which tunes internal GP nodes and addresses cost concerns by employing a number of strategies to make its LS both effective and inexpensive. Expense is reduced by an innovative approach to parsimony pressure whereby smaller trees are rewarded with local search opportunities more often than bigger trees. This paper investigates three new extensions to Chameleon’s original simple setup, seeking ways to make its local search even more effective. These are: trying alternative, more cost-reflective parsimony measures, such as visitation length instead of tree size; varying the reward function to incentivize parsimony more gently than in the original setup; and doing more tuning earlier in runs, when smaller trees can be tuned more cheaply and effectively. These strategies were tested on a varied suite of 16 difficult artificial and real-world regression problems. All of these techniques improved performance, and we show that, combined, they cumulatively improve average test RMSE by 31% over the original and already effective Chameleon tuning scheme. A minimum of 64 simulations was run on every problem/tuning-setup combination.
UR - http://www.scopus.com/inward/record.url?scp=84921757432&partnerID=8YFLogxK
U2 - 10.1007/978-3-319-10762-2_44
DO - 10.1007/978-3-319-10762-2_44
M3 - Article
AN - SCOPUS:84921757432
SN - 0302-9743
VL - 8672
SP - 444
EP - 453
JO - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
JF - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
ER -