sje wrote: Here's a question about null move pruning in Crafty. I mention it in the forum since others may have used the code for inspiration.
Consider the code in search.c:
Code:
/* null-move search at reduced depth, or quiescence if nothing remains */
if (depth - null_depth - 1 > 0)
  value =
      -Search(tree, -beta, 1 - beta, Flip(wtm), depth - null_depth - 1,
      ply + 1, NO_NULL);
else
  value = -QuiesceChecks(tree, -beta, 1 - beta, Flip(wtm), ply + 1);
HashKey = save_hash_key;
if (abort_search || tree->stop)
  return (0);
if (value >= beta) {
  /* fail high: store as a lower bound with the full draft "depth" */
  HashStore(tree, ply, depth, wtm, LOWER, value, tree->curmv[ply]);
  return (value);
}
The value returned by the null-move search is stored in the transposition table as a lower bound with a draft of "depth". However, isn't it true that the null-move search really only had a draft of "depth - null_depth - 1", a value with somewhat less confidence? In fact, if recursive null-move searching is in place, the actual draft may be even smaller.
Here's the way to think about it. You reach this position with remaining depth D. The first thing you do is a hash probe. If you have previously reached this same position, with the same draft, and the null-move search failed high, don't you want to fail high now without even doing the null-move search again?
Positions below the current ply will certainly be stored with a very shallow draft, since R=3 in Crafty. But at the current ply, if you store the reduced depth instead, then when you reach this position again you will do the null search again, it will fail high again, and you have just wasted that effort...
BTW, if you think about it, this is an optimization, not a new idea. If I didn't store the entry there, what would happen? I would go one ply deeper and then get a hash hit from the reduced-depth entry stored there. But I went through the extra effort of another recursive search level...