CLOP for Noisy Black-Box Parameter Optimization

Discussion of chess software programming and technical issues.


Zlaire
Posts: 62
Joined: Mon Oct 03, 2011 7:40 pm

Re: CLOP for Noisy Black-Box Parameter Optimization

Post by Zlaire » Sat Nov 05, 2011 10:49 pm

Oh, awesome stuff. I never really managed to build my own executable, so I ran it through the Qt Creator interface.

Thank you very much.

ilari
Posts: 750
Joined: Mon Mar 27, 2006 5:45 pm
Location: Finland
Contact:

Re: CLOP for Noisy Black-Box Parameter Optimization

Post by ilari » Sat Nov 05, 2011 11:17 pm

Rémi Coulom wrote:I uploaded a new version. I included a Windows version this time.
Big thanks. A couple of problems with compiling on Ubuntu 11.10 though:

- GCC says: "rclib/src/util/userflag.h:27:9: error: ‘size_t’ does not name a type". I fixed this by inserting "#include <cstring>" at the beginning of the file.

- In programs/_general/options.mk only the Python 2.5 and 2.6 include paths are used. I had to add 2.7 to make it compile.

Zlaire
Posts: 62
Joined: Mon Oct 03, 2011 7:40 pm

Re: CLOP for Noisy Black-Box Parameter Optimization

Post by Zlaire » Sun Nov 06, 2011 12:06 am

I ran into an interesting "anomaly". I'm not sure if this is a CLOP thing or general tuning (mis)behaviour.

I used one parameter to tune a number which was then used by a bunch more parameters. I.e. something like:

p1 * a
p2 * p1 * b
p3 * p1 * c
p4 * p1 * d

etc.

CLOP rushed p1 very close to 0 (within a few hundred games it was firmly set on minuscule values for p1 and didn't change for the next 20,000 games), and then tried to assign huge values to p2, p3, etc.

Removing p1 (and setting it to some arbitrary positive constant) gave much more sensible numbers for the rest of the parameters.

I guess the end result was about the same. Maybe my use of p1 was so diverse that keeping it close to insignificant was the best way to go...
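The anomaly can be reproduced with a toy sketch (the function `eval_terms` and the constants a, b, c are made up for illustration, not anyone's actual engine code): since only the products p1*pN enter the evaluation, the scale of p1 is nearly unidentifiable, so a tiny p1 with huge p2..p4 evaluates almost the same as a fixed p1 with sensible values.

```python
def eval_terms(p1, p2, p3, a=1.0, b=2.0, c=3.0):
    """Toy evaluation: apart from the lone p1*a term, each term
    only sees the product p1*pN, never p1 or pN alone."""
    return (p1 * a, p2 * p1 * b, p3 * p1 * c)

# Two very different parameter vectors...
small_p1 = eval_terms(p1=0.001, p2=1000.0, p3=2000.0)
fixed_p1 = eval_terms(p1=1.0,   p2=1.0,    p3=2.0)

# ...give the same evaluation terms except for the lone p1*a term,
# so the optimizer has almost no pressure to keep p1 away from 0.
print(small_p1)  # (0.001, 2.0, 6.0)
print(fixed_p1)  # (1.0, 2.0, 6.0)
```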

Just thinking aloud here, but this tuning business is really fascinating to me. :)

Rémi Coulom
Posts: 404
Joined: Mon Apr 24, 2006 6:06 pm
Contact:

Re: CLOP for Noisy Black-Box Parameter Optimization

Post by Rémi Coulom » Sun Nov 06, 2011 8:00 am

Zlaire wrote:I ran into an interesting "anomaly". I'm not sure if this is a CLOP thing or general tuning (mis)behaviour.

I used one parameter to tune a number which was then used by a bunch more parameters. I.e. something like:

p1 * a
p2 * p1 * b
p3 * p1 * c
p4 * p1 * d

etc.
Note that if you are going to optimize positive parameters that multiply or divide each other, it might be a good idea to declare them as "GammaParameter", i.e. perform regression on their logarithm. The performance landscape is then likely to be closer to quadratic.

It might be a good idea to declare at least p1 as GammaParameter here. Make sure the minimum is > 0 (0.00001, for instance).

It might be an even better idea to not multiply p2, p3, and p4 by p1.

Having two parameters that multiply each other is likely to make the function Rosenbrock-ish, which is difficult to optimize with quadratic regression.
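A toy sketch of why the log reparameterization helps (the objective f below is made up for illustration; it is not CLOP's actual model): when only the product p1*p2 matters, the optimum in raw space is a curved ridge, but in log space it becomes a straight line, which a quadratic fit approximates much better.

```python
import math

def f(p1, p2, target=6.0):
    """Raw space: the optimum is the curved ridge p1 * p2 = target."""
    return -(p1 * p2 - target) ** 2

def f_log(x1, x2, target=6.0):
    """Log space (x = log p): the optimal ridge x1 + x2 = log(target)
    is a straight line, friendlier for quadratic regression."""
    return -(math.exp(x1 + x2) - target) ** 2

# Very different (p1, p2) pairs sit on the same optimal ridge:
print(f(2.0, 3.0) == f(1.0, 6.0))                         # True
print(abs(f_log(math.log(2.0), math.log(3.0))) < 1e-12)   # True
```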

Rémi

mcostalba
Posts: 2679
Joined: Sat Jun 14, 2008 7:17 pm

Re: CLOP for Noisy Black-Box Parameter Optimization

Post by mcostalba » Sun Nov 06, 2011 8:24 am

Rémi Coulom wrote: Also: when using "Correlations none", you can try to de-correlate your variables. For instance, values of Knight(N) and Bishop(B) are strongly correlated. Instead of optimizing N and B, you can optimize N+B and N-B: they are almost independent.
I thought about this statement for some time, and I would like to ask: if de-correlation is a good thing, why don't you de-correlate the variables yourself, inside the algorithm?

I mean, the user asks you to optimize p1 and p2. Instead of "playing" with p1 and p2 directly, you internally tune two other variables c1 and c2 that are bound to p1 and p2 by a de-correlation function:

(p1, p2) = D(c1, c2)

Of course this is hidden from the user, who just sees the final values (p1, p2), but inside CLOP all the tuning is performed on (c1, c2).

What do you think ?
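The Knight/Bishop example from the quoted post can be written down concretely (a minimal sketch; the function names `encode`/`decode` are illustrative, not CLOP's API): the tuner works on c1 = N + B and c2 = N - B, and a decode step recovers (N, B).

```python
def decode(c1, c2):
    """(N, B) = D(c1, c2) for the sum/difference transform."""
    n = (c1 + c2) / 2.0
    b = (c1 - c2) / 2.0
    return n, b

def encode(n, b):
    """Inverse mapping: from piece values to the de-correlated pair."""
    return n + b, n - b

# Round trip with typical centipawn values:
c1, c2 = encode(320, 330)   # c1 = 650, c2 = -10
print(decode(c1, c2))       # (320.0, 330.0)
```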

Rémi Coulom
Posts: 404
Joined: Mon Apr 24, 2006 6:06 pm
Contact:

Re: CLOP for Noisy Black-Box Parameter Optimization

Post by Rémi Coulom » Sun Nov 06, 2011 8:30 am

mcostalba wrote:
Rémi Coulom wrote: Also: when using "Correlations none", you can try to de-correlate your variables. For instance, values of Knight(N) and Bishop(B) are strongly correlated. Instead of optimizing N and B, you can optimize N+B and N-B: they are almost independent.
I thought about this statement for some time, and I would like to ask: if de-correlation is a good thing, why don't you de-correlate the variables yourself, inside the algorithm?

I mean, the user asks you to optimize p1 and p2. Instead of "playing" with p1 and p2 directly, you internally tune two other variables c1 and c2 that are bound to p1 and p2 by a de-correlation function:

(p1, p2) = D(c1, c2)

Of course this is hidden from the user, who just sees the final values (p1, p2), but inside CLOP all the tuning is performed on (c1, c2).

What do you think ?
This is what CLOP does by default (with "Correlations all"). The problem is that the number of pairs of variables grows like the square of the number of variables. So it is costly when there are many variables.
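The quadratic growth Rémi mentions is just "n choose 2" pairs; a trivial sketch of the counts (the name `pair_count` is made up for illustration):

```python
def pair_count(n):
    """Number of distinct variable pairs among n variables."""
    return n * (n - 1) // 2

# Each pair needs its own correlation coefficient in the model:
for n in (2, 8, 20, 50):
    print(n, pair_count(n))  # 2 1 / 8 28 / 20 190 / 50 1225
```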

Rémi

mcostalba
Posts: 2679
Joined: Sat Jun 14, 2008 7:17 pm

Re: CLOP for Noisy Black-Box Parameter Optimization

Post by mcostalba » Sun Nov 06, 2011 8:50 am

Rémi Coulom wrote: More generally, you can try to be creative to reduce the dimensionality of the optimization problem.
Thanks for your quick answer to the previous question. Here is the next one ;-)

Why don't you reduce the dimensionality by yourself ?

I mean the user asks to tune p1, ..., p8 and also sets a new parameter, "dimensionality": given dimensionality = 2, you tune two derived values c1 and c2 that are derived from p1, ..., p8, for instance (but here you are much more creative than me) by a linear combination of p1, ..., p8.

If you remember the ampli+bias idea that Joona and I reported, this is a kind of generalization of that idea.

What do you think ?


P.S: After many months of tuning SF I have made up my mind that the secret of a good tuning is the choice of the starting variables to tune. So a mapping of P1,...,Pn to C1,...,Ck, tuning the Ck, could yield, if done properly, a much faster and better tune.
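A minimal sketch of this linear-combination idea (the matrix M and the function `expand` are hypothetical; choosing a good M is exactly the hard, creative part): tune k derived values c1..ck and map them back to the n real parameters.

```python
def expand(c, M):
    """p_j = sum_i c_i * M[i][j]: map k tuned values to n parameters."""
    n = len(M[0])
    return [sum(c[i] * M[i][j] for i in range(len(c))) for j in range(n)]

# k = 2 derived values driving n = 4 parameters:
# c[0] scales everything (an "ampli" direction),
# c[1] adds contrast between the first and second half (a "bias").
M = [
    [1, 1, 1, 1],
    [1, 1, -1, -1],
]
print(expand([10, 2], M))  # [12, 12, 8, 8]
```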

Zlaire
Posts: 62
Joined: Mon Oct 03, 2011 7:40 pm

Re: CLOP for Noisy Black-Box Parameter Optimization

Post by Zlaire » Sun Nov 06, 2011 10:20 am

Another question: how does a parameter that can't be tuned (say it doesn't affect the outcome of the game at all) interfere with other parameters in the same suite?

Would removing that ineffective parameter improve the result, or is it disregarded anyway?

petero2
Posts: 561
Joined: Mon Apr 19, 2010 5:07 pm
Location: Sweden
Contact:

Re: CLOP for Noisy Black-Box Parameter Optimization

Post by petero2 » Sun Nov 06, 2011 1:11 pm

mcostalba wrote:
Rémi Coulom wrote: More generally, you can try to be creative to reduce the dimensionality of the optimization problem.
Thanks for your quick answer to the previous question. Here is the next one ;-)

Why don't you reduce the dimensionality by yourself ?

I mean the user asks to tune p1, ..., p8 and also sets a new parameter, "dimensionality": given dimensionality = 2, you tune two derived values c1 and c2 that are derived from p1, ..., p8, for instance (but here you are much more creative than me) by a linear combination of p1, ..., p8.
It seems to me that you would get a similar effect if you let your CLOP parameters represent deltas from your current parameter values, and then project the solution vector onto the subspace corresponding to the k smallest eigenvalues of the hessian.
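One concrete reading of this suggestion, sketched in numpy (the Hessian H is assumed already known, e.g. estimated from CLOP's quadratic model, and the function name is made up): keep only the components of the delta vector along the k eigendirections with the smallest-magnitude eigenvalues, i.e. the flattest directions.

```python
import numpy as np

def project_to_flat_subspace(delta, H, k):
    """Project delta onto the span of the k eigenvectors of H whose
    eigenvalues are smallest in magnitude (the flattest directions)."""
    eigvals, eigvecs = np.linalg.eigh(H)   # symmetric eigendecomposition
    order = np.argsort(np.abs(eigvals))    # sort directions by |lambda|
    V = eigvecs[:, order[:k]]              # k flattest eigenvectors
    return V @ (V.T @ delta)               # orthogonal projection

# Toy Hessian: one stiff direction (x), one flat direction (y).
H = np.diag([100.0, 0.01])
delta = np.array([1.0, 1.0])
print(project_to_flat_subspace(delta, H, k=1))  # keeps only the y part
```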

mcostalba
Posts: 2679
Joined: Sat Jun 14, 2008 7:17 pm

Re: CLOP for Noisy Black-Box Parameter Optimization

Post by mcostalba » Sun Nov 06, 2011 4:53 pm

petero2 wrote: It seems to me that you would get a similar effect if you let your CLOP parameters represent deltas from your current parameter values, and then project the solution vector onto the subspace corresponding to the k smallest eigenvalues of the hessian.
My CLOP parameters already represent deltas because, to get a uniform tuning interface, I have defined in the CLOP file:

Code: Select all

parameter p0 -100 100
parameter p1 -100 100
parameter p2 -100 100
......
And then I sum pN to the actual default/current value of the parameter to tune.
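That delta scheme can be sketched as follows (the parameter names and default values here are made up for illustration): CLOP proposes each pN in [-100, 100], and the engine adds it to its current default.

```python
defaults = {"p0": 320, "p1": 330, "p2": 500}

def apply_deltas(defaults, deltas):
    """Add each CLOP-proposed delta to the engine's default value;
    parameters without a delta keep their defaults."""
    return {name: defaults[name] + deltas.get(name, 0)
            for name in defaults}

print(apply_deltas(defaults, {"p0": -15, "p1": 7}))
# {'p0': 305, 'p1': 337, 'p2': 500}
```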

Regarding the second part of your sentence, I have not understood what I should do "in practice".
