Daniel Shawul wrote: Do you mean that you try the null move iteratively, starting from the lowest depth (say from depth = 1 up to depth = d - 3) until all of them fail high, implemented like IID or via an explicit for-loop over the depths?
Code:
int CalculateNullDepth(int Depth)
{
    /* With 8 or fewer pieces for the side to move, reduce by only 4 plies. */
    if (board->NumOfPieces[board->SideToMove] <= 8) return Depth - 4 * INCPLY;
    /* If reducing by 5 plies would leave less than one ply, reduce by 4 instead. */
    if (Depth - 5 * INCPLY < INCPLY) return Depth - 4 * INCPLY;
    /* Otherwise reduce the null-move search depth by 5 plies. */
    return Depth - 5 * INCPLY;
}
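The NullDepth used in the snippet below presumably comes from this function; a minimal call site (assumed for illustration, not shown in the original post) would be:

Code:
/* Assumed, for illustration only: compute the null-move search depth
   once at the current node before the do_null block below. */
int NullDepth = CalculateNullDepth(Depth);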
Code:
if (do_null)
{
    /* Make the null move: simply hand the move to the opponent. */
    board->SideToMove = ChangeSide(board->SideToMove);
    if (!QuickRefute(static_eval - beta) && (spm >= 0))
    {
        /* Start from the lowest depth with the same fractional part as
           NullDepth and work upwards one ply (INCPLY) at a time. */
        int cur_depth = NullDepth % INCPLY;
        undo[spm++].moves = ZERO_MOVE;
        tree[gl + 1].in_check = FALSE;
        tree[gl].extended_reason = 0;
        gl++;
        do
        {
            /* Zero-window search around beta + ZERO_SAFETY. */
            best = -NextNullSearch(-(beta + ZERO_SAFETY), -(beta + ZERO_SAFETY) + 1, cur_depth, FALSE);
            if (board->stopped || best < beta)
            {
                /* Stop as soon as an iteration fails low or the search is aborted. */
                break;
            }
            cur_depth += INCPLY;
        } while (cur_depth <= NullDepth);
        gl--;
        /* Remember the opponent's refutation as a threat move. */
        tree[gl].ThreatMove = tree[gl + 1].LastThreatMove;
        /* Undo the null move. */
        board->SideToMove = ChangeSide(board->SideToMove);
        spm--;
        if (board->stopped) return 0;
        if (best >= beta + ZERO_SAFETY)
        {
            /* All iterations failed high: store the result and take the cutoff. */
            PutInHash(Depth, ZERO_MOVE, beta, LOWER, mate_threat, flags);
            return beta;
        }
        /* A mate score against us after the null move signals a mate threat. */
        if (best == -INFINITY + gl + 2) mate_threat = TRUE;
    }
    else
    {
        /* Null move skipped: restore the side to move. */
        board->SideToMove = ChangeSide(board->SideToMove);
    }
}
Probably I should experiment with it. Are you sure that the percentage of nodes spent in null-move subtrees is negligible? Have you tried to measure it?

Daniel Shawul wrote: I think that the current top programs use depths big enough to make null-move searches negligible, using depth reductions that are a function of the depth itself, say R = depth/4. From my limited perft analysis of null-move trees, even a constant reduction of 3 makes the cost negligible when you factor in that the reduction is applied recursively.
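To make the depth-dependent scheme concrete, here is a minimal sketch in the style of CalculateNullDepth above. The R = depth/4 formula comes from the quote; the floor of R = 3 plies and the exact arithmetic are my own assumptions, not taken from any engine.

Code:
/* Sketch only: null-move reduction growing with the remaining depth,
   roughly R = depth/4 as in the quote.  The minimum of R = 3 is an
   assumed safeguard for shallow depths. */
int CalculateNullDepthByFraction(int Depth)
{
    int R = (Depth / INCPLY) / 4;   /* R = depth/4, measured in whole plies */
    if (R < 3) R = 3;               /* assumed lower bound on the reduction */
    return Depth - R * INCPLY;      /* remaining depth for the null search  */
}

With that sketch, a 24-ply remaining depth would reduce by R = 6 plies, while anything below 12 plies falls back to R = 3.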