Search-algorithm

Discussion of chess software programming and technical issues.

Moderators: hgm, Rebel, chrisw

diep
Posts: 1822
Joined: Thu Mar 09, 2006 11:54 pm
Location: The Netherlands

Re: Search-algorithm

Post by diep »

lara wrote:No problem :lol:

Thanks a lot for your help and links!
I'll have a look at it, perhaps there will be some more questions ;)
The basic ideas of search didn't change much:

- PVS, starting with the window (-infinite, +infinite)
- Double nullmove with R=3
- The hashtable is very important. Currently doing 8 sequential probes
- Efficient rules for what to try in qsearch
- Popular now is what they nowadays call LMR-type reductions,
yet in the 90s everyone was already busy with stuff like that;
I'd call my stuff Fischer Pruning. He can comment himself :)
- Extensions are important to do; especially, don't do too many
- There are mixed experiences with forward pruning in the last few plies,
as nullmove is already really effective there.

There are another zillion algorithms, and every scientist claims they work, yet what I describe above is really what's in Diep. No lies there, unlike some others.
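The first bullet, PVS starting from a full window, can be sketched as follows. This is a hypothetical minimal negamax over an abstract game tree, not Diep's actual code; `Node`, `pvs`, and the demo tree are invented for illustration:

```python
INF = 10 ** 9

class Node:
    """Tiny abstract game-tree node: either a leaf with a value
    (from the side to move's perspective) or a list of children."""
    def __init__(self, value=0, children=()):
        self.value = value
        self.kids = list(children)

def pvs(node, depth, alpha, beta):
    """Principal Variation Search: the first move is searched with the
    full window, the rest with a zero-width window around alpha; a fail
    high on the zero window triggers a re-search with the real window."""
    if depth == 0 or not node.kids:
        return node.value
    first = True
    for child in node.kids:
        if first:
            score = -pvs(child, depth - 1, -beta, -alpha)
            first = False
        else:
            # zero-window probe: "is this move better than alpha?"
            score = -pvs(child, depth - 1, -alpha - 1, -alpha)
            if alpha < score < beta:
                # yes, and no cutoff yet: re-search with the real window
                score = -pvs(child, depth - 1, -beta, -score)
        if score > alpha:
            alpha = score
        if alpha >= beta:
            break  # beta cutoff
    return alpha

# Demo tree: two depth-1 subtrees with leaf values (negamax convention).
a = Node(children=[Node(3), Node(5)])
b = Node(children=[Node(-2), Node(7)])
root = Node(children=[a, b])
```

The root call then uses the full window from the bullet list: `pvs(root, depth, -INF, +INF)`.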

Vincent
User avatar
Don
Posts: 5106
Joined: Tue Apr 29, 2008 4:27 pm

Re: Search-algorithm

Post by Don »

diep wrote:
[…]
Vincent, what is double null move?

- Don
diep
Posts: 1822
Joined: Thu Mar 09, 2006 11:54 pm
Location: The Netherlands

Re: Search-algorithm

Post by diep »

Don wrote:
[…]
Vincent, what is double null move?

- Don
Don't tell me you see this the first time Don :)
User avatar
Don
Posts: 5106
Joined: Tue Apr 29, 2008 4:27 pm

Re: Search-algorithm

Post by Don »

diep wrote:
[…]
Don't tell me you see this the first time Don :)
No, I know you have advocated it for many years as being zugzwang-proof.

But I do not know your exact definition of it. I have heard many variations, including:

1. Do not allow more than 2 null moves in any line.

2. Do not allow 3 consecutive null moves but always do null move.

Some programs only do null move if the static score >= beta, and some do not allow even 2 consecutive null moves. If you use the beta test then you cannot do 2 consecutive null moves, let alone 3!

I just wanted to hear your definition. I assume you mean always do null move if it's legal, where legal means you are not in check and the last 2 moves were not null moves. Is that what you mean?
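The difference between the two variants above can be made concrete with a small sketch. This is purely illustrative (the function names and the convention of recording nullmoves as the string "null" are invented), not anyone's engine code:

```python
def variant1_allows_null(line):
    """Variant 1: no more than 2 nullmoves anywhere in the line,
    so a third null is forbidden even with real moves in between."""
    return line.count("null") < 2

def variant2_allows_null(line):
    """Variant 2: always allow nullmove unless the previous two moves
    of the line were both nullmoves (no 3 consecutive nulls)."""
    return line[-2:] != ["null", "null"]
```

On the line e4 null d4 null, for example, variant 1 already forbids a further null while variant 2 still allows one, since the two nulls are not consecutive.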
diep
Posts: 1822
Joined: Thu Mar 09, 2006 11:54 pm
Location: The Netherlands

Re: Search-algorithm

Post by diep »

Don wrote:
[…]
NO, I know you advocate it for many years as being zugzwang-proof.

But I do not know what your exact definition is for this. I have heard many variations including:

1. Do not allow more than 2 null moves in any line.

2. Do not allow 3 consecutive null moves but always do null move.

Some programs only do null move if the static score >= beta and some do not allow even 2 consecutive null moves. If you use the beta test then you cannot do 2 consecutive null moves let alone 3!

I just wanted to hear your definition. I assume you mean always do null move if it's legal, where legal means you are not in check and the last 2 moves were not null moves. Is that what you mean?
Let's go point by point.

Of course when in check you can't nullmove, as you'd lose your king.

Several programs indeed seem to do nullmove only when the static score is at or above beta. If I try that in Diep, I lose 1 ply of search depth, so I'm not sure why some engines do this. Note that most of those engines seem to change this. The authors are not very talkative on this subject, as we know. Trying to draw statements out of SMK is kind of tough, even after you offer him a 0.5 liter German beer (me having the advantage of never drinking).

Limiting nullmove in any way really hurts search depth, so restrictions on nullmove should always be kept to a minimum. For example, if you want to detect zugzwang and simply don't nullmove when the side to move has 2 or fewer pieces on the board, then you'll lose a lot of plies, sometimes even in the middlegame, as with today's search depths you're hitting far endgames all the time.

So from a low-level optimization viewpoint, the best is to always nullmove. That, however, misses obvious zugzwangs. Let's now look at when and where zugzwangs occur most.

Pawn endgames are all about zugzwangs, so any form of reducing the search by R=3+1 there is really risky. There are whole loads of zugzwangs in pawn endgames.

So in pawn endgames I'm NOT doing nullmove at all.

A position is a pawn endgame ONLY when both sides have no pieces left, just 0 or more pawns.

Now, in order to detect zugzwang, I forbid the third nullmove if and only if the previous 2 moves were both nullmoves:

e4 null null null <== forbidden
e4 null null e5 null null Nf3 <== legal
e4 null d4 null Nf3 null null Nf6 e5 <== legal
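The rule behind these worked lines fits in a small predicate. A minimal sketch, assuming the moves of the current line are recorded as strings with nullmoves stored as "null"; the names are invented for illustration, this is not Diep's code:

```python
def null_allowed(line, in_check, pawn_endgame):
    """Double-nullmove legality as described above: nullmove is always
    allowed, except when in check, in a pawn endgame (where nullmove is
    switched off entirely), or when the previous two moves of the line
    were both nullmoves."""
    if in_check or pawn_endgame:
        return False
    return line[-2:] != ["null", "null"]
```

Checked against the lines above: after e4 null null a third null is refused, while after e4 null null e5 null another null is fine, since the last two moves are e5 and null.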

As Christophe Theron noticed years ago, it is a kind of implicit way of doing a reduced search. Yet it hardly requires extra code and it is very safe.

Also, it sometimes serves 'by chance' as an IID (internal iterative deepening).

It DOES eat extra nodes compared to always nullmoving and just forbidding pawn endgames. Still, with today's search depths, and also those of a few years ago, double nullmove can easily detect a zugzwang in rook endgames, even at blitz.

Usually there is at most 1 zugzwang there; nearly never do 2 need to be found in a single search line. In queen endgames there is sometimes 1 zugzwang, very very seldom, but you would hate to miss it, while of course you still want to always nullmove.

Doing a special search to detect zugzwangs is over the top IMHO if you can detect them in this implicit way.

And yes, there are test sets where double nullmove loses extra plies compared to simple tricks that limit nullmove here and there. Yet I found that in games this double nullmove works great.

I also know some who use double nullmove only when there are few pieces left on the board, so they apply no limitation to nullmove with a lot of pieces.

That's an alternative to it.

By the way, the most important reason to invent double nullmove was not all this. It was simply to turn a search using nullmove into a correct form of search. Correctness of search is simply important and might get more important in the future, though of course for the last X years the dominating thing seems to be correctness of the evaluation function (limiting your number of bugs per second, to quote Johan de Koning).

Vincent

p.s. double nullmove has been recognized by Frans Morsch. Not sure whether I should say that publicly though :)
User avatar
Don
Posts: 5106
Joined: Tue Apr 29, 2008 4:27 pm

Re: Search-algorithm

Post by Don »

diep wrote: Let's go point by point.
[…]
Very well explained - I will now do some experimenting in my own program.

I only do null move when the static score is at or above beta - I have not tested this recently, but for some reason I have never benefited from doing it at all points. I will have to change this, though, if I want to experiment with double null move.

My endgame condition is different from yours: in order to do null move, the current side must have at least 1 piece that is not a knight or pawn, but the opponent can have anything. I also do this check dynamically.
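The two material conditions can be put side by side in a small sketch. The board representation here is invented for illustration (each side's pieces as a list of letters, kings omitted); neither function is actual engine code:

```python
def diep_is_pawn_endgame(white, black):
    """Vincent's condition: a pawn endgame only when BOTH sides have no
    pieces at all, just 0 or more pawns; nullmove is then fully off."""
    return all(p == "P" for p in white) and all(p == "P" for p in black)

def don_null_material_ok(side_to_move):
    """Don's condition: the side to move needs at least one piece that
    is not a knight or pawn (i.e. a bishop, rook, or queen)."""
    return any(p in ("B", "R", "Q") for p in side_to_move)
```

Note the asymmetry: Vincent's rule looks at both sides, Don's only at the side to move.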

If null move fails to cut off, do you upgrade alpha?
diep
Posts: 1822
Joined: Thu Mar 09, 2006 11:54 pm
Location: The Netherlands

Re: Search-algorithm

Post by diep »

Don wrote:
[…]
If null move fails to cut off, do you upgrade alpha?
Obviously you can't improve alpha, neither by means of the hashtable nor by nullmove. That's too dubious.

Additionally, consider this: we have a great working hashtable. The first line you get out of the hashtable is your mainline, so we already have a GREAT bound to search with. In short, there is no benefit from improving alpha either.
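The rule that the nullmove score may only produce a cutoff, never a raised alpha, can be sketched with a hypothetical helper (invented name, not Diep's code):

```python
def apply_null_result(null_score, alpha, beta):
    """Use the nullmove search score only as a fail-high test: if it
    reaches beta we cut off; otherwise alpha is deliberately left
    untouched, even when null_score > alpha."""
    if null_score >= beta:
        return "cutoff", null_score
    return "continue", alpha
```

The second branch is the whole point: a nullmove score between alpha and beta is simply discarded.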

In Diep's statistics mode I print out how many nodes have alpha != beta-1, and that number of nodes is really tiny: a few thousand out of a total of hundreds of millions.

Bart Weststrate already observed in the 90s that when your best move gives you a major fail low at the root, using a narrow window around the root is not so clever. With a well-working hashtable, starting with alpha = -infinite really helps to resolve that fail low a lot faster than toying with root windows by hand. The hashtable really is a great invention there that boosts the KISS idea: using the best moves from the hashtable nearly always means you very quickly have a great new bound. The average case in games is totally unbeatable by other methods using root-type windows. In games you're obviously not just interested in winning; even more important is what to do when your root score drops a tad each iteration. LMR has a severe worst case there; I'm not really sure yet how to fully cure that.

Vincent
User avatar
mclane
Posts: 18748
Joined: Thu Mar 09, 2006 6:40 pm
Location: US of Europe, germany
Full name: Thorsten Czub

Re: Search-algorithm

Post by mclane »

even after you offer him a 0.5 liter German beer (me having the advantage of never drinking).
Vincent, you cannot make a German drunk by offering him 0.5 liters of German beer.

5 bottles, maybe. But with 0.5 liters it is only possible to make somebody drunk who never drinks beer.

btw.:

I found a video from the Paderborn championship 1999. That was 10 years ago and we were all much younger :-)

You are on the video too, and Don also, as well as Mark Lefler, Christophe Theron, Stefan Meyer-Kahlen, Gerd Isenberg, and even Frank Quisinsky and Marcus Kästner. Some people seen on the video have since died, like Jan Louwmann and Elvis (= Detlef Pordzik).

If somebody is interested in this MPEG4 video, I would ask somebody with enough server space to put it on his server for anybody to download.

There is also a game Ferret - Fritz, with Fritz doing its usual 0.00 when giving away the game.

Frans watches the game too, and Frederic has to take Frans aside again (as usual).
Stefan's Shredder came out of book well versus Junior (although, as he said, the book line was played by mistake).
Whatever.

One can see Elvis' motorbike and Bruce Moreland in action.
I think I have also seen Werner Schuele (CEGT) and many many others.
What seems like a fairy tale today may be reality tomorrow.
Here we have a fairy tale of the day after tomorrow....
diep
Posts: 1822
Joined: Thu Mar 09, 2006 11:54 pm
Location: The Netherlands

Re: Search-algorithm

Post by diep »

Thanks Thorsten:

ftp passive to 213.84.245.71
user: ftp
no password needed

02-08-2009 02:02 198.639.308 padb1.mp4
02-08-2009 02:23 116.872.896 padb2.mp4
2 File(s) 315.512.204 bytes

p.s. it's read only. If you want to upload something let me know.

Note that Vista's DOS command prompt nowadays no longer supports passive FTP (would be funny to hear how they managed THAT). Exploder does, Linux does by default, and XP also does (but you need to type pasv there).

The passive data ports used are 1024 and 1025 (if I remember correctly offhand).

A good FTP client is, for example, the FileZilla client: free, open source, no spyware, works great.

Vincent
User avatar
Don
Posts: 5106
Joined: Tue Apr 29, 2008 4:27 pm

Re: Search-algorithm

Post by Don »

Wow, that really brings back memories. Almost makes me want to play in another tournament :-)
diep wrote: […]