First, for the last two questions, the answer is that null-move is not doing exactly what you appear to think it is doing. It is a quick way to bail out of positions where you are so far ahead that, even if you give your opponent the ability to make two moves in a row, he can't catch up. This idea is unrelated to the actual material balance. Null-move still helps when you start off a queen ahead, because it will only cause cutoffs if you get way _more_ than a queen ahead; it is a relative algorithm, not dependent on the initial position's material balance. Most of us are not really doing null-move on a 30-ply search, as that is usually a pawn ending where null-move is not so safe unless you resort to a verification search. And testing has shown that idea to not be useful (it loses a couple of Elo, in fact).

Dann Corbit wrote:
Consider the following:
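For readers unfamiliar with the idea, here is a control-flow sketch of null-move pruning on a deliberately toy position (just a static score; `Position`, `search`, and everything else here are hypothetical names, not Stockfish's internals). The point is that the cutoff test compares against beta, so it depends only on the margin relative to beta, not on absolute material:

```cpp
// Toy negamax with null-move pruning. "Position" is just a static score
// here (a simplification); the control flow is the point: let the
// opponent move twice (the null move), searched at reduced depth, and if
// the score still comes back >= beta, cut off immediately.
struct Position { int staticEval; };

int search(Position pos, int depth, int alpha, int beta, bool nullAllowed) {
    if (depth <= 0)
        return pos.staticEval;

    const int R = 3;  // classic fixed null-move reduction
    if (nullAllowed && depth > R) {
        // "Make" the null move: the side to move passes, so the score is
        // negated; search reduced by R and forbid back-to-back null moves.
        Position after = { -pos.staticEval };
        int nullScore = -search(after, depth - 1 - R, -beta, -beta + 1, false);
        if (nullScore >= beta)
            return beta;  // opponent got a free move and we are still >= beta
    }

    // ... the normal move loop would go here; this toy just returns the eval.
    return pos.staticEval;
}
```

Whether the cutoff fires depends only on how far the score sits above beta, which is why being a queen up at the root does not disable the mechanism.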
#ifdef SMOOTH_REDUCTION
double delta = approximateEval - beta;
delta = max(delta, 1.0);
double ddepth = double(depth);
double r = 0.18 * ddepth + 3.1 + log(delta)/5.0;
r = r > ddepth ? ddepth : r;
int R = int(r);
#else
// Null move dynamic reduction based on depth
int R = (depth >= 5 * OnePly ? 4 : 3);
// Null move dynamic reduction based on value
if (approximateEval - beta > PawnValueMidgame)
R++;
#endif
If depth = 5, then 0.18 * 5 = 0.9.
So 0.9 + 3.1 gives 4.0, which is the same R = 4 the original formula uses at depth >= 5.
In value.h, we find this:
const Value PawnValueMidgame = Value(0x0C6);
In decimal, 0xC6 is 198.
log (base e) of 198 is 5.288.
Dividing 5.288 by 5 gives approximately 1 (1.058, to be precise).
So, at the boundary conditions of the old formula, my curve gives the same answers (or very nearly the same). When you fit this idea to your code, I suggest you calibrate the curve accordingly, especially if you have carefully tuned your null-move reduction constants by experiment.
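The boundary check above can be sketched as follows, assuming depth is measured in plies (i.e. with OnePly folded out) and taking PawnValueMidgame = 198 from value.h:

```cpp
#include <algorithm>
#include <cmath>

// Smooth-scaling reduction, as in the SMOOTH_REDUCTION branch above.
int smoothR(int depth, int approximateEval, int beta) {
    double delta = std::max(double(approximateEval - beta), 1.0);
    double ddepth = double(depth);
    double r = 0.18 * ddepth + 3.1 + std::log(delta) / 5.0;
    return int(r > ddepth ? ddepth : r);
}

// Original stepped reduction (PawnValueMidgame == 0x0C6 == 198).
int steppedR(int depth, int approximateEval, int beta) {
    int R = (depth >= 5 ? 4 : 3);
    if (approximateEval - beta > 198)
        R++;
    return R;
}

// At the calibration points the two agree:
//   depth 5, eval == beta       -> both give R = 4
//   depth 5, eval - beta == 199 -> both give R = 5
```

Between those points the stepped version jumps while the smooth version drifts, which is the whole difference being described.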
The difference is that when we bend away from these answers, the solution I propose changes slowly: instead of slamming the door, we are slowly closing it.
The reason I came up with smooth scaling is that null move's abrupt nature really bothered me. Why do we reduce the same in a 30 ply search as we do in a 12 ply search? That does not make sense. Why do we reduce the same with a full queen advantage as with a knight advantage? That also seemed very strange. So I just did a simple extrapolation to smooth out the reductions in order that they made sense to me.
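To see the depth scaling concretely, here is the smooth formula at a fixed margin (delta = 1, i.e. approximateEval == beta, so the log term vanishes), again assuming depth is counted in plies:

```cpp
// Smooth reduction at delta = 1: R grows with depth instead of staying
// pinned at the stepped rule's R = 4 for every depth >= 5.
int smoothRAtEvenEval(int depth) {
    double r = 0.18 * depth + 3.1;  // log(1)/5 == 0
    return int(r > depth ? double(depth) : r);
}
// depth  5 -> int(4.0)  = 4   (same as the stepped rule)
// depth 12 -> int(5.26) = 5
// depth 30 -> int(8.5)  = 8
```

So a 30-ply search reduces noticeably more than a 12-ply search, which is exactly the asymmetry the stepped formula ignores.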
As far as deeper reductions (beyond R=3) go, I have not had any success with that, and I tested it an absolute ton before 23.0 was released. I tried linear interpolation, and exponential in both directions (tapering off rapidly or tapering off very slowly). Once I have a chance to test the old/new Stockfish on the cluster, that will quickly show whether this idea is a good one or not. But it certainly isn't +150, and I'd have a hard time imagining it even reaching +50 with this kind of change, since basic null-move doesn't give that big a boost by itself, and this is just an enhancement.
More when we get our A/C up so the cluster can wake up.