Two suggestions for Stockfish

Discussion of chess software programming and technical issues.

Moderator: Ras

mcostalba
Posts: 2684
Joined: Sat Jun 14, 2008 9:17 pm

Re: Two suggestions for Stockfish

Post by mcostalba »

sje wrote: Besides, I always compile with maximum warning checking and I want to have a clean output. Also, I don't want to have cruft accumulation.
Which compiler triggers the warnings with atoi? I am not able to reproduce it
User avatar
sje
Posts: 4675
Joined: Mon Mar 13, 2006 7:43 pm

Re: Two suggestions for Stockfish

Post by sje »

mcostalba wrote:Which compiler triggers the warnings with atoi? I am not able to reproduce it
For some unknown reason, I can't seem to reproduce it either. It may not be reproducible on Linux, or at least on older versions. I may have seen it only on the OS/X compilation.

The OS/X man page for atoi says "deprecated", but no mention of this on the corresponding Linux man page.

With g++, try the -Wall flag or the -Wdeprecated flag.
jdart
Posts: 4420
Joined: Fri Mar 10, 2006 5:23 am
Location: http://www.arasanchess.org

Re: Two suggestions for Stockfish

Post by jdart »

atoi is bad because it doesn't give you any error on failure to parse (it returns 0, which could also be a valid parsed integer).

Since Stockfish is C++ you could also use streams.

stringstream s(numberString);
long x;
s >> x;
if (s.fail()) {
    // handle error: numberString did not parse as a long
}

Another way is with istream_iterator. An example:

stringstream nums(minutesStr);
istream_iterator<float> it2(nums);
time1 = *it2++;
time2 = *it2;

--Jon
User avatar
Don
Posts: 5106
Joined: Tue Apr 29, 2008 4:27 pm

Re: A third suggestion

Post by Don »

sje wrote:
mcostalba wrote:Too clever for my taste
Yet not nearly as clever as some of what's in Stockfish.

Symbolic runs on machines with RAM sizes from 758 MB to 24 GB and will use as much memory as it can without screwing another program which uses the same technique. I don't have to have special versions for different machines and I don't have to remember how much RAM is in a given machine.
This is just my own sensibility, but I like that Komodo always comes up in the same predictable configuration on any machine: the hash table size is fixed and the number of threads is fixed to 1, all set to conservative, machine-friendly values.

Some programs figure out how many cores are supported and, evidently as in your case, also figure out how much memory is available and configure the program accordingly. But this is often done incorrectly: a 4-core machine will come up as 8 cores due to hyper-threading when I don't want that, so I am forced to set it anyway, and many times I don't want multi-core at all. Also, you generally want the hash table to be an optimal size, meaning as small as possible while still being enough, to get the most performance. Even our default of 64 MB is too much for hyper-blitz 3-second tests: more memory usage, slower program. Hash table size should follow your usage pattern; if you are playing any sort of long time control you usually do want very large hash tables, but that is not always the case. On some low-end machines this may be even more important.

I know there are reasonable counter-arguments - so I am not trying to impose my methodology on everyone else, it's probably more of a preference than anything.
Capital punishment would be more effective as a preventive measure if it were administered prior to the crime.
lucasart
Posts: 3243
Joined: Mon May 31, 2010 1:29 pm
Full name: lucasart

Re: Two suggestions for Stockfish

Post by lucasart »

sje wrote:
zullil wrote:So for each type of piece, moves the increase centralization appear ahead of moves that don't? To help with move ordering and thus improve the search?
Yes. This is a very old heuristic which appeared in a version of Sargon many years ago and was reported in the ICCA Journal. The technique can alternatively be applied in move ordering via a tropismbonus[frsq][tosq] 4,096-element array of bonus/penalty offsets. The latter is used in Symbolic.
I really don't think this can work, and if it does, it will not scale well. The history-table method used by Stockfish (inherited from Glaurung with minor modifications) is much superior to what you suggest, which is basically a particular PST-based ordering.
Theory and practice sometimes clash. And when that happens, theory loses. Every single time.
lucasart
Posts: 3243
Joined: Mon May 31, 2010 1:29 pm
Full name: lucasart

Re: A third suggestion

Post by lucasart »

sje wrote:A third suggestion:

The program currently sets its transposition table size according to a UCI option command. What if the UCI command is not supplied for some reason? What should the default transposition table size be?

In Symbolic, the program sets the transposition table size and other tables by first asking how much physical RAM is available and then initializing the table memory allocation such that about 40% of RAM is used for the entire program.

To fetch the number of bytes in RAM:

Code: Select all

// Needs <sys/types.h> and <sys/sysctl.h> on OS X, <unistd.h> on Linux.
ui64 FetchRamByteCount(void)
{
  ui64 count;

#if HostOsApple
  {
    int mib[2];
    size_t countlen = sizeof(count);

    mib[0] = CTL_HW;
    mib[1] = HW_MEMSIZE;
    sysctl(mib, 2, &count, &countlen, NULL, 0);
  }
#endif

#if HostOsLinux
  {
    const ui pagecount = (ui) sysconf(_SC_PHYS_PAGES);
    const ui pagesize = (ui) sysconf(_SC_PAGE_SIZE);

    count = ((ui64) pagecount) * ((ui64) pagesize);
  }
#endif

#if HostOsUnknown
  {
    count = 1ull << 30; /* fall back to assuming 1 GiB */
  }
#endif

  return count;
}
This is horrible.

A UCI engine is supposed to receive the Hash value from the GUI, and the default should be a fixed number for reproducibility.

Besides, not only is it useless and bad, it also introduces more platform-dependent code, and more ifdefs and complexity in the Makefile.

As Don says, just use a conservative fixed value (like 16 MB, for example, or even 1 MB; it doesn't matter, as the GUI will override it). In fact, a well-designed engine should not allocate the hash table before receiving the value from the GUI, or at least not before the first isready (or "go") command. That's how I do it in DiscoCheck. So it will not crash with a default of 16 MB on a machine with less memory when the GUI sets a lower value, because I don't allocate and then reallocate.
Theory and practice sometimes clash. And when that happens, theory loses. Every single time.
User avatar
michiguel
Posts: 6401
Joined: Thu Mar 09, 2006 8:30 pm
Location: Chicago, Illinois, USA

Re: Two suggestions for Stockfish

Post by michiguel »

lucasart wrote:
sje wrote:
zullil wrote:So for each type of piece, moves the increase centralization appear ahead of moves that don't? To help with move ordering and thus improve the search?
Yes. This is a very old heuristic which appeared in a version of Sargon many years ago and was reported in the ICCA Journal. The technique can alternatively be applied in move ordering via a tropismbonus[frsq][tosq] 4,096-element array of bonus/penalty offsets. The latter is used in Symbolic.
I really don't think this can work, and if it does, it will not scale well. The history-table method used by Stockfish (inherited from Glaurung with minor modifications) is much superior to what you suggest, which is basically a particular PST-based ordering.
You can do both. In fact, that is what I do: I reserve the lower bits for a centertropic [from][to] table. The reason I did it at the time was not related to strength but to make the sorting stable. SF had some issues with this in the past, getting node counts that differed across environments (because the libraries had different qsort implementations). I figure they may have solved the problem already, but this would be one alternative approach.

Miguel
mcostalba
Posts: 2684
Joined: Sat Jun 14, 2008 9:17 pm

Re: A third suggestion

Post by mcostalba »

Don wrote: so the hash table size is fixed and the number of threads is fixed to 1. All set to conservative, machine-friendly values.

Some programs figure out how many cores are supported
Stockfish is one of those programs. It inherited the CPU-count machinery from Glaurung, but I am strongly tempted to drop it and follow your approach of always starting with 1 thread and letting the user increase it to the required number of cores.

I am also a strong believer that software should not try to "guess" what the user wants; it should make no assumptions, be predictable, default to the simplest configuration, and let the user decide.
mcostalba
Posts: 2684
Joined: Sat Jun 14, 2008 9:17 pm

Re: Two suggestions for Stockfish

Post by mcostalba »

jdart wrote: Since Stockfish is C++ you could also use streams.
Thanks Jon, I will give it a look.
User avatar
Don
Posts: 5106
Joined: Tue Apr 29, 2008 4:27 pm

Re: A third suggestion

Post by Don »

mcostalba wrote:
Don wrote: so the hash table size is fixed and the number of threads is fixed to 1. All set to conservative, machine-friendly values.

Some programs figure out how many cores are supported
Stockfish is one of those programs. It inherited the CPU-count machinery from Glaurung, but I am strongly tempted to drop it and follow your approach of always starting with 1 thread and letting the user increase it to the required number of cores.

I am also a strong believer that software should not try to "guess" what the user wants; it should make no assumptions, be predictable, default to the simplest configuration, and let the user decide.
Yes, that is exactly my thinking too.

I think in general it would be really good if the top programs standardized how they do things, to the extent that it's reasonable to do so. Stockfish could take the lead here, since it is the strongest open-source program and could serve as a "de facto standard" or reference for things that were not standardized by UCI. The two primary things are the default hash table size and the default number of threads.

I noticed that the default Hash size in Stockfish is 32 MB; it's 64 in Komodo and 128 in Houdini!

If you are in agreement and want to change to 1 thread, I'll follow Stockfish's lead in setting the default hash table size, and would change our default to 32 in future versions. Do you plan to keep it at 32 for a while?
Capital punishment would be more effective as a preventive measure if it were administered prior to the crime.