SuneF wrote: 6) the starting FENs and their multiplicity were confirmed by 3 or 4 people.
I'll comment some more on this, because it's an important point and because the work units I've posted or which others may post for various depths might be used for other perft() calculations.
If any of these are incorrect, then all subsequent efforts are a total waste.
That's why I took great care in preparing the unique() files and have posted the files for unique(0) through unique(6). The unique(7) file can be made by concatenating all of the raw work units; the work units themselves were made by the Unix split program working on the unique(7) file with orders to produce 964 files of 100,000 lines each and a single 68-line straggler file.
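The exact split invocation isn't given in the post; a minimal sketch of how split(1) produces fixed-size work units might look like the following, with `demo.fen` and the `wu_` prefix as illustrative stand-ins (for the real unique(7) file this would be something like `split -l 100000 -a 3 unique7.fen wu_`, where `-a 3` ensures enough suffixes for the 965 output files):

```shell
# Tiny demonstration with a 10-line stand-in file; real work units
# would use -l 100000 on the full unique(7) file.
seq 1 10 > demo.fen
split -l 3 -a 3 demo.fen wu_
wc -l wu_*   # three 3-line units plus a 1-line straggler, mirroring
             # the 964 full units and the 68-line straggler at scale
```

Concatenating the resulting `wu_*` files in order (`cat wu_* > rebuilt.fen`) reproduces the original input exactly, which is why the unique(7) file can be reconstructed from the work units.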
All of the unique() files have passed a number of tests by both Symbolic and Oscar. The files should match those produced by others when allowing for unimportant differences due to ordering and non-distinctive differences due to possible variant half-move counter values in some records.
(There are no two records in any unique() file which are identical apart from having different half-move counter values; I have tested this with sort, cut, and uniq. However, there are pairs of records which are identical except for a difference in the en passant target square; if the target is non-nil, then the half-move counter is, and must be, zero. If I were to re-write the unique() generator, I'd have it output each FEN with the half-move counter set to the minimum of all such values seen for that position with a nil en passant target. I just didn't think of this at the time.)
All of the unique() files themselves were produced by Symbolic using a multithreaded algorithm which performs no disk operations until the final results are written. This was done in part to avoid the possibility of I/O errors, particularly those of the undetected variety, and in part for speed of generation. The main disadvantage of the algorithm is that it's not suitable for higher order unique() runs because of memory size limitations. Should I ever need to make any higher order unique() files, I'll have to write a new generator.
--------
So based on heritage and test results, I have no concerns about the validity of the work units. And for similar reasons, I have no concerns about the perft() calculation abilities of bitboard Symbolic and mailbox Oscar.
But I do have a concern about possible CPU/GPU/memory random errors due to unavoidable, temporary disruptions caused by high energy cosmic rays. These do happen, and they happen much more frequently than you might think -- this is why server grade computers employ ECC memory. Fortunately, these errors usually do not affect program operation, as most of the time the induced bit-flip lands in an unimportant memory location. But not always, and the only practical defense is to re-run each work unit multiple times and compare the results; this is how BOINC handles the problem.
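The redundant-computation idea can be sketched as follows. All filenames and the result format here are illustrative, not from any actual distributed perft() setup: each work unit is computed twice (ideally on different machines) and the result is accepted only if the two outputs agree byte for byte.

```shell
# Two stand-in result files modeling independent runs of the same work
# unit; in a real setup these would come from separate computations.
echo "perft(7) wu_000 = 123456789" > run_a.txt
echo "perft(7) wu_000 = 123456789" > run_b.txt

# cmp -s exits 0 only when the files are byte-identical, so a silent
# memory error in either run would surface here as a mismatch.
if cmp -s run_a.txt run_b.txt; then
    echo "results agree"
else
    echo "mismatch: re-run required"
fi
```

This is essentially BOINC's quorum approach reduced to its simplest form: agreement between independent runs makes an undetected bit-flip in both runs producing the same wrong answer vanishingly unlikely.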