Which is likely to be the strongest engine 6 months from now?

Discussion of anything and everything relating to chess playing software and machines.

Moderators: hgm, Rebel, chrisw

zullil
Posts: 6442
Joined: Tue Jan 09, 2007 12:31 am
Location: PA USA
Full name: Louis Zulli

Re: Which is likely to be the strongest engine 6 months from now

Post by zullil »

Lyudmil Tsvetkov wrote:
zullil wrote:
MikeGL wrote: I like McBrain too. :)
I noticed the style of McBrain changes once I enter the config window
and click the tickbox of No_Null_Moves; this enables McBrain to
just crunch those difficult zugzwang puzzles like peanuts : )

McBrain would announce a mate in 25 or Mate in 30 with null_move disabled, while SF8 would take forever on the same position.

But as for the original poster's question, I guess an engine which can
implement a dynamic internal switch for null_move would rule. As a
thought experiment, imagine one of those difficult zugzwang puzzles
where all engines favor the losing side. What happens if you have this
null_move trick on your engine and steer the position into that seemingly
losing position and then trigger your "null_move dynamically disabled"
trick to knock out all engines. In theory your engine could score 100%
against all current top engines. But I am not sure if I missed something.


regards
Engines already have internal criteria for modifying/disabling null-move. For example, from Stockfish:

Code: Select all

// Step 8. Null move search with verification search (is omitted in PV nodes)
    if (   !PvNode
        &&  eval >= beta
        && (ss->staticEval >= beta - 35 * (depth / ONE_PLY - 6) || depth >= 13 * ONE_PLY)
        &&  pos.non_pawn_material(pos.side_to_move()))
    {
So, for example, null-move appears to be disabled when searching positions where the side to move has only pawns.
How do they know it is 35*(depth/ONE_PLY - 6) and not 36*(depth/ONE_PLY - 6)?

Judging from this formulistic approach, it will be very difficult for SF to stay number 1 in a couple of years.
I assume that some tuning was done to arrive at these particular constants, although I'm not a Stockfish developer and I don't follow the process as closely as I once did.

I also find this formulaic approach somehow unappealing, but Stockfish plays damn good chess! And anyone who wants to improve on it is welcome to contribute. :D
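To make that last condition a bit more concrete, here is a minimal standalone sketch of the same gating idea (my own simplified illustration, not Stockfish code; the null_move_allowed helper and its parameters are invented). With nothing but king and pawns for the side to move, the non-pawn-material term is zero and the null move is simply never tried:

Code: Select all

#include <iostream>

// Simplified stand-ins for Stockfish's types and constants (illustration only).
constexpr int ONE_PLY = 1;

// Hypothetical helper mirroring the quoted Step 8 condition.
bool null_move_allowed(bool pvNode, int eval, int staticEval, int beta,
                       int depth, int nonPawnMaterial)
{
    return !pvNode
        &&  eval >= beta
        && (staticEval >= beta - 35 * (depth / ONE_PLY - 6) || depth >= 13 * ONE_PLY)
        &&  nonPawnMaterial > 0;   // only pawns left -> possible zugzwang, skip null move
}

int main()
{
    // King-and-pawn endgame: non-pawn material is 0, so null move is skipped.
    std::cout << null_move_allowed(false, 50, 50, 0, 20 * ONE_PLY, 0) << '\n';      // prints 0
    // Ordinary middlegame node with eval above beta: null move is tried.
    std::cout << null_move_allowed(false, 120, 120, 0, 10 * ONE_PLY, 3000) << '\n'; // prints 1
}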
MikeGL
Posts: 1010
Joined: Thu Sep 01, 2011 2:49 pm

Re: Which is likely to be the strongest engine 6 months from now

Post by MikeGL »

zullil wrote:
Lyudmil Tsvetkov wrote:
zullil wrote:
MikeGL wrote: I like McBrain too. :)
I noticed the style of McBrain changes once I enter the config window
and click the tickbox of No_Null_Moves; this enables McBrain to
just crunch those difficult zugzwang puzzles like peanuts : )

McBrain would announce a mate in 25 or Mate in 30 with null_move disabled, while SF8 would take forever on the same position.

But as for the original poster's question, I guess an engine which can
implement a dynamic internal switch for null_move would rule. As a
thought experiment, imagine one of those difficult zugzwang puzzles
where all engines favor the losing side. What happens if you have this
null_move trick on your engine and steer the position into that seemingly
losing position and then trigger your "null_move dynamically disabled"
trick to knock out all engines. In theory your engine could score 100%
against all current top engines. But I am not sure if I missed something.


regards
Engines already have internal criteria for modifying/disabling null-move. For example, from Stockfish:

Code: Select all

// Step 8. Null move search with verification search (is omitted in PV nodes)
    if (   !PvNode
        &&  eval >= beta
        && (ss->staticEval >= beta - 35 * (depth / ONE_PLY - 6) || depth >= 13 * ONE_PLY)
        &&  pos.non_pawn_material(pos.side_to_move()))
    {
So, for example, null-move appears to be disabled when searching positions where the side to move has only pawns.
How do they know it is 35*(depth/ONE_PLY - 6) and not 36*(depth/ONE_PLY - 6)?

Judging from this formulistic approach, it will be very difficult for SF to stay number 1 in a couple of years.
I assume that some tuning was done to arrive at these particular constants, although I'm not a Stockfish developer and I don't follow the process as closely as I once did.

I also find this formulaic approach somehow unappealing, but Stockfish plays damn good chess! And anyone who wants to improve on it is welcome to contribute. :D
You just copy-pasted the conditional check, not the real meat of the code : )

Code: Select all


 {
        ss->currentMove = MOVE_NULL;
        ss->counterMoves = nullptr;

        assert(eval - beta >= 0);

        // Null move dynamic reduction based on depth and value
        Depth R = ((823 + 67 * depth / ONE_PLY) / 256 + std::min((eval - beta) / PawnValueMg, 3)) * ONE_PLY;

        pos.do_null_move(st);
        (ss+1)->skipEarlyPruning = true;
        nullValue = depth-R < ONE_PLY ? -qsearch<NonPV, false>(pos, ss+1, -beta, -beta+1, DEPTH_ZERO)
                                      : - search<NonPV>(pos, ss+1, -beta, -beta+1, depth-R, !cutNode);
        (ss+1)->skipEarlyPruning = false;
        pos.undo_null_move();

        if (nullValue >= beta)
        {
            // Do not return unproven mate scores
            if (nullValue >= VALUE_MATE_IN_MAX_PLY)
                nullValue = beta;

            if (depth < 12 * ONE_PLY && abs(beta) < VALUE_KNOWN_WIN)
                return nullValue;

            // Do verification search at high depths
            ss->skipEarlyPruning = true;
            Value v = depth-R < ONE_PLY ? qsearch<NonPV, false>(pos, ss, beta-1, beta, DEPTH_ZERO)
                                        :  search<NonPV>(pos, ss, beta-1, beta, depth-R, false);
            ss->skipEarlyPruning = false;

            if (v >= beta)
                return nullValue;
        }
regards
zullil
Posts: 6442
Joined: Tue Jan 09, 2007 12:31 am
Location: PA USA
Full name: Louis Zulli

Re: Which is likely to be the strongest engine 6 months from now

Post by zullil »

MikeGL wrote: You just copy-pasted the conditional check, not the real meat of the code : )


regards
Sure, because you were discussing dynamically disabling null move, not asking about the details of null move reduction! :D
Milos
Posts: 4190
Joined: Wed Nov 25, 2009 1:47 am

Re: Which is likely to be the strongest engine 6 months from now

Post by Milos »

Lyudmil Tsvetkov wrote: How do they know it is 35*(depth/ONE_PLY - 6) and not 36*(depth/ONE_PLY - 6)?

Judging from this formulistic approach, it will be very difficult for SF to stay number 1 in a couple of years.
Gee, how come you are 43 years old when you reason as if you were 10???
Lyudmil Tsvetkov
Posts: 6052
Joined: Tue Jun 12, 2012 12:41 pm

Re: Which is likely to be the strongest engine 6 months from now

Post by Lyudmil Tsvetkov »

zullil wrote:
Lyudmil Tsvetkov wrote:
zullil wrote:
MikeGL wrote: I like McBrain too. :)
I noticed the style of McBrain changes once I enter the config window
and click the tickbox of No_Null_Moves; this enables McBrain to
just crunch those difficult zugzwang puzzles like peanuts : )

McBrain would announce a mate in 25 or Mate in 30 with null_move disabled, while SF8 would take forever on the same position.

But as for the original poster's question, I guess an engine which can
implement a dynamic internal switch for null_move would rule. As a
thought experiment, imagine one of those difficult zugzwang puzzles
where all engines favor the losing side. What happens if you have this
null_move trick on your engine and steer the position into that seemingly
losing position and then trigger your "null_move dynamically disabled"
trick to knock out all engines. In theory your engine could score 100%
against all current top engines. But I am not sure if I missed something.


regards
Engines already have internal criteria for modifying/disabling null-move. For example, from Stockfish:

Code: Select all

// Step 8. Null move search with verification search (is omitted in PV nodes)
    if (   !PvNode
        &&  eval >= beta
        && (ss->staticEval >= beta - 35 * (depth / ONE_PLY - 6) || depth >= 13 * ONE_PLY)
        &&  pos.non_pawn_material(pos.side_to_move()))
    {
So, for example, null-move appears to be disabled when searching positions where the side to move has only pawns.
How do they know it is 35*(depth/ONE_PLY - 6) and not 36*(depth/ONE_PLY - 6)?

Judging from this formulistic approach, it will be very difficult for SF to stay number 1 in a couple of years.
I assume that some tuning was done to arrive at these particular constants, although I'm not a Stockfish developer and I don't follow the process as closely as I once did.

I also find this formulaic approach somehow unappealing, but Stockfish plays damn good chess! And anyone who wants to improve on it is welcome to contribute. :D
So you want to say the correct formula is formulaic and not formulistic? :)

On one thing we agree: SF tuning values are a mess. But SF playing good chess?
Lyudmil Tsvetkov
Posts: 6052
Joined: Tue Jun 12, 2012 12:41 pm

Re: Which is likely to be the strongest engine 6 months from now

Post by Lyudmil Tsvetkov »

MikeGL wrote:
zullil wrote:
Lyudmil Tsvetkov wrote:
zullil wrote:
MikeGL wrote: I like McBrain too. :)
I noticed the style of McBrain changes once I enter the config window
and click the tickbox of No_Null_Moves; this enables McBrain to
just crunch those difficult zugzwang puzzles like peanuts : )

McBrain would announce a mate in 25 or Mate in 30 with null_move disabled, while SF8 would take forever on the same position.

But as for the original poster's question, I guess an engine which can
implement a dynamic internal switch for null_move would rule. As a
thought experiment, imagine one of those difficult zugzwang puzzles
where all engines favor the losing side. What happens if you have this
null_move trick on your engine and steer the position into that seemingly
losing position and then trigger your "null_move dynamically disabled"
trick to knock out all engines. In theory your engine could score 100%
against all current top engines. But I am not sure if I missed something.


regards
Engines already have internal criteria for modifying/disabling null-move. For example, from Stockfish:

Code: Select all

// Step 8. Null move search with verification search (is omitted in PV nodes)
    if (   !PvNode
        &&  eval >= beta
        && (ss->staticEval >= beta - 35 * (depth / ONE_PLY - 6) || depth >= 13 * ONE_PLY)
        &&  pos.non_pawn_material(pos.side_to_move()))
    {
So, for example, null-move appears to be disabled when searching positions where the side to move has only pawns.
How do they know it is 35*(depth/ONE_PLY - 6) and not 36*(depth/ONE_PLY - 6)?

Judging from this formulistic approach, it will be very difficult for SF to stay number 1 in a couple of years.
I assume that some tuning was done to arrive at these particular constants, although I'm not a Stockfish developer and I don't follow the process as closely as I once did.

I also find this formulaic approach somehow unappealing, but Stockfish plays damn good chess! And anyone who wants to improve on it is welcome to contribute. :D
You just copy-pasted the conditional check, not the real meat of the code : )

Code: Select all


 {
        ss->currentMove = MOVE_NULL;
        ss->counterMoves = nullptr;

        assert(eval - beta >= 0);

        // Null move dynamic reduction based on depth and value
        Depth R = ((823 + 67 * depth / ONE_PLY) / 256 + std::min((eval - beta) / PawnValueMg, 3)) * ONE_PLY;

        pos.do_null_move(st);
        (ss+1)->skipEarlyPruning = true;
        nullValue = depth-R < ONE_PLY ? -qsearch<NonPV, false>(pos, ss+1, -beta, -beta+1, DEPTH_ZERO)
                                      : - search<NonPV>(pos, ss+1, -beta, -beta+1, depth-R, !cutNode);
        (ss+1)->skipEarlyPruning = false;
        pos.undo_null_move();

        if (nullValue >= beta)
        {
            // Do not return unproven mate scores
            if (nullValue >= VALUE_MATE_IN_MAX_PLY)
                nullValue = beta;

            if (depth < 12 * ONE_PLY && abs(beta) < VALUE_KNOWN_WIN)
                return nullValue;

            // Do verification search at high depths
            ss->skipEarlyPruning = true;
            Value v = depth-R < ONE_PLY ? qsearch<NonPV, false>(pos, ss, beta-1, beta, DEPTH_ZERO)
                                        :  search<NonPV>(pos, ss, beta-1, beta, depth-R, false);
            ss->skipEarlyPruning = false;

            if (v >= beta)
                return nullValue;
        }
regards
the real meat of the code is that I don't understand what happens here:

Code: Select all

Depth R = ((823 + 67 * depth / ONE_PLY) / 256 + std::min((eval - beta) / PawnValueMg, 3)) * ONE_PLY;
It is easy to fill in the numbers, but does anyone have a clue what exactly the above formula boils down to in specific game situations, with changing depth, eval and beta?

I mean, to establish a relationship between the different positions and the changing values?

As I see it, they just oblivion-tune something without having a clue what it is doing?

Why not std::max((eval + beta) / PawnValueEg, 5)?
Lyudmil Tsvetkov
Posts: 6052
Joined: Tue Jun 12, 2012 12:41 pm

Re: Which is likely to be the strongest engine 6 months from now

Post by Lyudmil Tsvetkov »

Milos wrote:
Lyudmil Tsvetkov wrote: How do they know it is 35*(depth/ONE_PLY - 6) and not 36*(depth/ONE_PLY - 6)?

Judging from this formulistic approach, it will be very difficult for SF to stay number 1 in a couple of years.
Gee, how come you are 43 years old when you reason as if you were 10???
look at the top of the ratings when I turn 45. :)
Uri
Posts: 473
Joined: Thu Dec 27, 2007 9:34 pm

Re: Which is likely to be the strongest engine 6 months from now

Post by Uri »

I feel there is a lot of corruption in chess culture where top programmers don't want to publish their engines to the public.

In Playchess I see a lot of very strong engines which don't get published to the public.

Chess has become corrupt and that's why I'm leaving the game, probably for good.
MikeGL
Posts: 1010
Joined: Thu Sep 01, 2011 2:49 pm

Re: Which is likely to be the strongest engine 6 months from now

Post by MikeGL »

Lyudmil Tsvetkov wrote:
MikeGL wrote:
zullil wrote:
Lyudmil Tsvetkov wrote:
zullil wrote:
MikeGL wrote: I like McBrain too. :)
I noticed the style of McBrain changes once I enter the config window
and click the tickbox of No_Null_Moves; this enables McBrain to
just crunch those difficult zugzwang puzzles like peanuts : )

McBrain would announce a mate in 25 or Mate in 30 with null_move disabled, while SF8 would take forever on the same position.

But as for the original poster's question, I guess an engine which can
implement a dynamic internal switch for null_move would rule. As a
thought experiment, imagine one of those difficult zugzwang puzzles
where all engines favor the losing side. What happens if you have this
null_move trick on your engine and steer the position into that seemingly
losing position and then trigger your "null_move dynamically disabled"
trick to knock out all engines. In theory your engine could score 100%
against all current top engines. But I am not sure if I missed something.


regards
Engines already have internal criteria for modifying/disabling null-move. For example, from Stockfish:

Code: Select all

// Step 8. Null move search with verification search (is omitted in PV nodes)
    if (   !PvNode
        &&  eval >= beta
        && (ss->staticEval >= beta - 35 * (depth / ONE_PLY - 6) || depth >= 13 * ONE_PLY)
        &&  pos.non_pawn_material(pos.side_to_move()))
    {
So, for example, null-move appears to be disabled when searching positions where the side to move has only pawns.
How do they know it is 35*(depth/ONE_PLY - 6) and not 36*(depth/ONE_PLY - 6)?

Judging from this formulistic approach, it will be very difficult for SF to stay number 1 in a couple of years.
I assume that some tuning was done to arrive at these particular constants, although I'm not a Stockfish developer and I don't follow the process as closely as I once did.

I also find this formulaic approach somehow unappealing, but Stockfish plays damn good chess! And anyone who wants to improve on it is welcome to contribute. :D
You just copy-pasted the conditional check, not the real meat of the code : )

Code: Select all


 {
        ss->currentMove = MOVE_NULL;
        ss->counterMoves = nullptr;

        assert(eval - beta >= 0);

        // Null move dynamic reduction based on depth and value
        Depth R = ((823 + 67 * depth / ONE_PLY) / 256 + std::min((eval - beta) / PawnValueMg, 3)) * ONE_PLY;

        pos.do_null_move(st);
        (ss+1)->skipEarlyPruning = true;
        nullValue = depth-R < ONE_PLY ? -qsearch<NonPV, false>(pos, ss+1, -beta, -beta+1, DEPTH_ZERO)
                                      : - search<NonPV>(pos, ss+1, -beta, -beta+1, depth-R, !cutNode);
        (ss+1)->skipEarlyPruning = false;
        pos.undo_null_move();

        if (nullValue >= beta)
        {
            // Do not return unproven mate scores
            if (nullValue >= VALUE_MATE_IN_MAX_PLY)
                nullValue = beta;

            if (depth < 12 * ONE_PLY && abs(beta) < VALUE_KNOWN_WIN)
                return nullValue;

            // Do verification search at high depths
            ss->skipEarlyPruning = true;
            Value v = depth-R < ONE_PLY ? qsearch<NonPV, false>(pos, ss, beta-1, beta, DEPTH_ZERO)
                                        :  search<NonPV>(pos, ss, beta-1, beta, depth-R, false);
            ss->skipEarlyPruning = false;

            if (v >= beta)
                return nullValue;
        }
regards
the real meat of the code is that I don't understand what happens here:

Code: Select all

Depth R = ((823 + 67 * depth / ONE_PLY) / 256 + std::min((eval - beta) / PawnValueMg, 3)) * ONE_PLY;
It is easy to fill in the numbers, but does anyone have a clue what exactly the above formula boils down to in specific game situations, with changing depth, eval and beta?

I mean, to establish a relationship between the different positions and the changing values?

As I see it, they just oblivion-tune something without having a clue what it is doing?

Why not std::max((eval + beta) / PawnValueEg, 5)?
I think your question was already explained and answered by Louis Zulli on the previous page.
How about you just build your own engine and set Depth R to some static constant, so as to save some clock cycles and make your engine faster? :lol:
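For what it's worth, just plugging numbers into that formula gives some feel for it. Taking ONE_PLY as 1 and PawnValueMg as roughly 200 (both assumptions on my part, purely for illustration), the base reduction works out to about 3 + depth/4 plies, plus up to 3 extra plies when eval is far above beta:

Code: Select all

#include <algorithm>
#include <cstdio>

// Rough, non-Stockfish illustration of the reduction formula quoted above.
int reduction(int depth, int eval, int beta)
{
    const int PawnValueMg = 200;                      // assumed value, illustration only
    return (823 + 67 * depth) / 256                   // base part: grows slowly with depth
         + std::min((eval - beta) / PawnValueMg, 3);  // bonus: up to 3 plies when eval >> beta
}

int main()
{
    std::printf("depth 8,  eval = beta     -> R = %d\n", reduction(8, 0, 0));    // R = 5
    std::printf("depth 16, eval = beta     -> R = %d\n", reduction(16, 0, 0));   // R = 7
    std::printf("depth 16, eval = beta+600 -> R = %d\n", reduction(16, 600, 0)); // R = 10
}
So at depth 16 with the static eval three pawns above beta, the null-move search runs at only depth 16-10 = 6. Whether 823, 67 and 256 are the best possible constants is presumably settled by testing rather than by derivation.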

regards
MikeGL
Posts: 1010
Joined: Thu Sep 01, 2011 2:49 pm

Re: Which is likely to be the strongest engine 6 months from now

Post by MikeGL »

Uri wrote:I feel there is a lot of corruption in chess culture where top programmers don't want to publish their engines to the public.

In Playchess I see a lot of very strong engines which don't get published to the public.

Chess has become corrupt and that's why I'm leaving the game, probably for good.
How did you conclude that their code doesn't get published? Maybe those were just clones with minor tweaks in the search or eval module, running on more powerful hardware.

regards