Stockfish and optimizing options for SMP search

Discussion of chess software programming and technical issues.

Moderators: hgm, Rebel, chrisw

sje
Posts: 4675
Joined: Mon Mar 13, 2006 7:43 pm

Re: Stockfish and optimizing options for SMP search

Post by sje »

bob wrote:My claim was they were the first I was aware of that used them. I graduated in 1970 (BS computer science) and my AI class did _not_ look at Samuel's checker program or his papers. For board games, the only ones mentioned were the early chess projects at places like the USSR and Los Alamos. Some of them were not even playing real chess, but using a 6x6 board instead...
Well, I guess I just read different textbooks, and all of those that covered game AI talked about the Samuel Checker Player. One of my instructors back in the 1970s thought it was important enough to hand out photocopies of Samuel's 1959 paper. A little while later I got a copy of the fresh-off-the-press Chess Skill in Man and Machine (1st ed.), and while reading the chapter on Chess 4.x I said to myself, "Gosh, I've seen much of this already".

Really, everyone should read the paper. Here's the revised version that came out in 1967:

https://researcher.ibm.com/researcher/f ... eckers.pdf
Milos
Posts: 4190
Joined: Wed Nov 25, 2009 1:47 am

Re: Stockfish and optimizing options for SMP search

Post by Milos »

bob wrote:Actually, in any post you write here, "fact" is one thing nobody will find...
You seem to know what everybody else assumes as fact. I wonder when you developed that mind-reading skill???
I did _not_ say your idea was good. I said it was flawed, and explained why. If you don't get that...

To judge whether an idea is good or not, you first need to understand it. Your "explanation" was about a totally unrelated thing, so the only conclusion is that you didn't get it.
Milos
Posts: 4190
Joined: Wed Nov 25, 2009 1:47 am

Re: Stockfish and optimizing options for SMP search

Post by Milos »

sje wrote:Really, everyone should read the paper. Here's the revised version that came out in 1967:

https://researcher.ibm.com/researcher/f ... eckers.pdf
Thanks, Steven. Great paper. I had read mentions of Samuel's checkers program in a couple of books but had never seen the paper itself. So many modern concepts in a paper from more than half a century ago. Simply amazing...
bob
Posts: 20943
Joined: Mon Feb 27, 2006 7:30 pm
Location: Birmingham, AL

Re: Stockfish and optimizing options for SMP search

Post by bob »

Milos wrote:
bob wrote:Actually, in any post you write here, "fact" is one thing nobody will find...
You seem to know what everybody else assumes as fact. I wonder when you developed that mind-reading skill???
I did _not_ say your idea was good. I said it was flawed, and explained why. If you don't get that...

To judge whether an idea is good or not, you first need to understand it. Your "explanation" was about a totally unrelated thing, so the only conclusion is that you didn't get it.
How hard is it to understand? Perhaps you are the one that doesn't understand the idea, which would explain why you don't get the point that the idea is not a good one. Not horrible, but certainly not good.

One simple flaw is that when you reach a position with remaining depth of N, you have _no idea_ what the tree below each of the remaining moves looks like. Might be quite simple and narrow. Might be quite deep and bushy. But one thing is for sure, the more moves that have been searched, the more likely it is that all moves need to be searched, which is exactly where we'd like to split the tree to avoid search overhead caused by an unexpected fail high.
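
To make that concrete, here is a minimal sketch of the usual YBW-style split test (illustrative only; the names and the depth constant are made up, not Crafty's or Stockfish's actual code):

Code: Select all

// Illustrative sketch of a Young Brothers Wait (YBW) split decision.
constexpr int MIN_SPLIT_DEPTH = 4;   // hypothetical placeholder value

bool ok_to_split(int depth, int movesSearched, int idleThreads)
{
    // Never split before the first ("eldest brother") move has been searched:
    // if that move fails high, any parallel work on its siblings is wasted.
    if (movesSearched < 1)
        return false;

    // Skip splits near the leaves, where the remaining subtrees are too small
    // to amortize the cost of setting up a split point and synchronizing.
    if (depth < MIN_SPLIT_DEPTH)
        return false;

    // Only split if an idle thread is available to help.
    return idleThreads > 0;
}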

But of course you don't understand any of that...
Milos
Posts: 4190
Joined: Wed Nov 25, 2009 1:47 am

Re: Stockfish and optimizing options for SMP search

Post by Milos »

bob wrote:One simple flaw is that when you reach a position with remaining depth of N, you have _no idea_ what the tree below each of the remaining moves looks like. Might be quite simple and narrow. Might be quite deep and bushy. But one thing is for sure, the more moves that have been searched, the more likely it is that all moves need to be searched, which is exactly where we'd like to split the tree to avoid search overhead caused by an unexpected fail high.
Your logic is faulty. But since it's "easier for a camel to go through the eye of a needle than for Bob to admit he's wrong", it's pointless to expect you to admit it.
The later the move (a CUT-node child of the subtree, in case you don't understand the terminology), the higher the chance of a fail high and of its tree being small, and therefore the higher the probability of search overhead compared to the early moves in the list. Therefore, raising the necessary depth for a split on late moves is not faulty logic at all. The only faulty thing is your ego.
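
For illustration, the kind of rule being proposed might look roughly like this (a sketch with made-up names and numbers, not actual Stockfish code):

Code: Select all

// Sketch: require a larger remaining depth before splitting on later moves.
constexpr int BASE_SPLIT_DEPTH = 4;   // hypothetical placeholder value

bool ok_to_split_late(int depth, int moveCount, int idleThreads)
{
    // The further down the move list we are, the deeper the remaining
    // search must be before a split is considered worthwhile.
    int requiredDepth = BASE_SPLIT_DEPTH + (moveCount > 8) + (moveCount > 16);

    return depth >= requiredDepth && idleThreads > 0;
}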
jwes
Posts: 778
Joined: Sat Jul 01, 2006 7:11 am

Re: Stockfish and optimizing options for SMP search

Post by jwes »

bob wrote:
Milos wrote:
bob wrote:Actually, in any post you write here, "fact" is one thing nobody will find...
You seem to know what everybody else assumes as fact. I wonder when you developed that mind-reading skill???
I did _not_ say your idea was good. I said it was flawed, and explained why. If you don't get that...

To judge whether an idea is good or not, you first need to understand it. Your "explanation" was about a totally unrelated thing, so the only conclusion is that you didn't get it.
How hard is it to understand? Perhaps you are the one that doesn't understand the idea, which would explain why you don't get the point that the idea is not a good one. Not horrible, but certainly not good.

One simple flaw is that when you reach a position with remaining depth of N, you have _no idea_ what the tree below each of the remaining moves looks like. Might be quite simple and narrow. Might be quite deep and bushy. But one thing is for sure, the more moves that have been searched, the more likely it is that all moves need to be searched, which is exactly where we'd like to split the tree to avoid search overhead caused by an unexpected fail high.

But of course you don't understand any of that...
An interesting corollary to this is that the more sure you are that this is a good split point, the better it would be to reduce or prune and save even more work.
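
As a rough sketch of what that could mean in practice (hypothetical names and numbers, not any engine's real rule), the same "confident ALL node" signal that makes a split attractive could also buy an extra reduction on the remaining late moves:

Code: Select all

#include <algorithm>

// Sketch: an LMR-style reduction that is bumped up when the node already
// looks like an ALL node (many moves searched, no fail high yet).
int late_move_reduction(int depth, int moveCount, bool looksLikeAllNode)
{
    int reduction = 0;

    if (moveCount > 3 && depth > 2)
        reduction = 1 + moveCount / 12;        // ordinary LMR-style reduction

    if (looksLikeAllNode)
        reduction += 1;                        // extra trust: reduce a bit more

    return std::min(reduction, depth - 1);     // reduced depth stays at least one ply
}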
Evert
Posts: 2929
Joined: Sat Jan 22, 2011 12:42 am
Location: NL

Re: Stockfish and optimizing options for SMP search

Post by Evert »

Milos wrote: The later the move (a CUT-node child of the subtree, in case you don't understand the terminology), the higher the chance of a fail high and of its tree being small, and therefore the higher the probability of search overhead compared to the early moves in the list.
Surely that depends on how good your move ordering is?
If the move ordering is good, you'd normally expect the cutoff to be caused by one of the first few moves (probably the first, but possibly, say, one of the first three; it's easy to get that number for different engines). So the further down the list you go, the greater the probability that you will not get any fail-high at all. Occasionally it'll happen that a move way down at the bottom of the list fails high, but that should be sufficiently rare that you get a net gain by assuming that it won't and act accordingly.
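
(As a sketch of how one might collect that number, just histogram the index of the cutoff move; the counters below are hypothetical instrumentation, not any engine's real code.)

Code: Select all

#include <array>
#include <cstdio>

std::array<long, 64> cutoffAtIndex{};   // cutoffAtIndex[i]: fail highs on move i+1
long totalCutoffs = 0;

// Call this wherever the search detects score >= beta (moveIndex is 0-based).
void record_cutoff(int moveIndex)
{
    if (moveIndex < 64)
        ++cutoffAtIndex[moveIndex];
    ++totalCutoffs;
}

void print_cutoff_stats()
{
    for (int i = 0; i < 5 && totalCutoffs > 0; ++i)
        std::printf("cutoff on move %d: %.1f%%\n",
                    i + 1, 100.0 * cutoffAtIndex[i] / totalCutoffs);
}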
Isn't that also the whole idea behind late move reductions?

Or do I misunderstand what you're saying?
Milos
Posts: 4190
Joined: Wed Nov 25, 2009 1:47 am

Re: Stockfish and optimizing options for SMP search

Post by Milos »

Evert wrote:Surely that depends on how good your move ordering is?
If the move ordering is good, you'd normally expect the cutoff to be caused by one of the first few moves (probably the first, but possibly, say, one of the first three; it's easy to get that number for different engines). So the further down the list you go, the greater the probability that you will not get any fail-high at all. Occasionally it'll happen that a move way down at the bottom of the list fails high, but that should be sufficiently rare that you get a net gain by assuming that it won't and act accordingly.
Isn't that also the whole idea behind late move reductions?

Or do I misunderstand what you're saying?
The crucial sentence is the one that is wrong: that the further down the list you go, the greater the probability that you will not get any fail-high at all. Since we are talking about split points that cause overhead, we are talking about small remaining depths, and the predecessor node is an ALL node. There we know very little about ordering (these kinds of branches might even have been pruned in the previous iteration), so the move order is most probably wrong. So when you pass, let's say, the first 3 or 5 moves on the list, what you are saying (and probably Bob too, though he has difficulties articulating it) is that from that point there is a higher chance of no cut-off at all than of there being one. That is what I claim is wrong. And if there is a cut-off, splitting the work would only cause unnecessary overhead.
Of course the moveCount vs. depth curve needs to be nicely tuned, but saying a priori that any raising of the split-depth condition on later moves is a bad idea is just the product of a famous professor's ego that always needs to be right, and in no way a product of facts!
Milos
Posts: 4190
Joined: Wed Nov 25, 2009 1:47 am

Re: Stockfish and optimizing options for SMP search

Post by Milos »

Milos wrote:
Evert wrote:Surely that depends on how good your move ordering is?
If the move ordering is good, you'd normally expect the cutoff to be caused by one of the first few moves (probably the first, but possibly, say, one of the first three; it's easy to get that number for different engines). So the further down the list you go, the greater the probability that you will not get any fail-high at all. Occasionally it'll happen that a move way down at the bottom of the list fails high, but that should be sufficiently rare that you get a net gain by assuming that it won't and act accordingly.
Isn't that also the whole idea behind late move reductions?

Or do I misunderstand what you're saying?
The crucial sentence is the one that is wrong: that the further down the list you go, the greater the probability that you will not get any fail-high at all. Since we are talking about split points that cause overhead, we are talking about small remaining depths, and the predecessor node is an ALL node. There we know very little about ordering (these kinds of branches might even have been pruned in the previous iteration), so the move order is most probably wrong. So when you pass, let's say, the first 3 or 5 moves on the list, what you are saying (and probably Bob too, though he has difficulties articulating it) is that from that point there is a higher chance of no cut-off at all than of there being one. That is what I claim is wrong. And if there is a cut-off, splitting the work would only cause unnecessary overhead.
Of course the moveCount vs. depth curve needs to be nicely tuned, but saying a priori that any raising of the split-depth condition on later moves is a bad idea is just the product of a famous professor's ego that always needs to be right, and in no way a product of facts!
Let me elaborate a little more, since people here (university professors included :)) are obviously bad with statistics.
Let's say you are in an ALL node and you have 20 CUT-node successors. And let's say the probability of a cut-off in one of these 20 CUT nodes is 95%.
You can treat this as similar to the Monty Hall problem.
So you search the first 3 CUT nodes and you don't find a cut-off. What is the probability of a cut-off when you search the fourth CUT node: smaller or bigger than before (or even bigger than the probability of a cut-off when you searched the first node)?
If you don't get it, try reading about the Monty Hall problem. As I've said, it's a common misconception that obviously not even university professors are immune to.
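
For concreteness, here is the toy calculation (a sketch under the simplifying assumption that at most one of the 20 CUT nodes produces the cutoff, equally likely to be any of them):

Code: Select all

#include <cstdio>

int main()
{
    const double pAny  = 0.95;          // prior: some child gives a cutoff
    const double pEach = pAny / 20.0;   // prior per child (uniform placement)

    for (int searched = 0; searched <= 5; ++searched) {
        double pNoCutYet     = 1.0 - searched * pEach;            // no cutoff in the first 'searched' children
        double pNextCuts     = pEach / pNoCutYet;                 // next child cuts, given that
        double pAnyRemaining = (pAny - searched * pEach) / pNoCutYet;

        std::printf("after %d quiet children: P(next cuts)=%.3f  P(any remaining cuts)=%.3f\n",
                    searched, pNextCuts, pAnyRemaining);
    }
}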
bob
Posts: 20943
Joined: Mon Feb 27, 2006 7:30 pm
Location: Birmingham, AL

Re: Stockfish and optimizing options for SMP search

Post by bob »

Milos wrote:
bob wrote:One simple flaw is that when you reach a position with remaining depth of N, you have _no idea_ what the tree below each of the remaining moves looks like. Might be quite simple and narrow. Might be quite deep and bushy. But one thing is for sure, the more moves that have been searched, the more likely it is that all moves need to be searched, which is exactly where we'd like to split the tree to avoid search overhead caused by an unexpected fail high.
Your logic is faulty. But since it's "easier for a camel to go through the eye of a needle than for Bob to admit he's wrong", it's pointless to expect you to admit it.
The later the move (a CUT-node child of the subtree, in case you don't understand the terminology), the higher the chance of a fail high and of its tree being small, and therefore the higher the probability of search overhead compared to the early moves in the list. Therefore, raising the necessary depth for a split on late moves is not faulty logic at all. The only faulty thing is your ego.
What a crock. First, we do _not_ split at CUT nodes. At least not intentionally. That is the very principle behind YBW. Since most fail highs occur on the first move at any node, we never split until at least the first move has been searched. Read a bit and you will get it. The more moves you search without a fail high at _ANY_ node, the greater the probability that this is an ALL (OR PV) node, and no cutoff will occur.

Once again, you show the extent of your ignorance. But don't let me stop you. After all, only _one_ of us has actually studied this problem and implemented various solutions to it. Hint: that would _not_ be you...

This is not about "ego" which you seem to have plenty of. This is about knowledge and experience, of which you have none in this particular area.