I would be interested to know how the times compare with others'.
I'm also curious how depths 14 and 15 were computed... I'm guessing that took resources that wouldn't be feasible on a single machine? I know perft 15 was done with a GPU implementation, but even then it probably took several devices?
chessbit wrote: ↑Wed Sep 24, 2025 5:25 pm
I'm also curious how depths 14 and 15 were computed... I'm guessing that took resources that wouldn't be feasible on a single machine?
Interesting, thanks. That computation took far more time than I thought anyone would care to spend. What's up with the 8TB HDDs? How do you use them to your advantage to compute perft?
Also, how do my numbers compare to yours for the engine you used?
chessbit wrote: ↑Thu Sep 25, 2025 5:04 pm
Also, how do my numbers compare to yours for the engine you used?
My program needs 18.630 seconds to compute perft 9 from the starting position on a 7950X3D 16-core CPU. The "gperftd" program mentioned in one of the links was around twice as fast on the old hardware available to me back then.
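For scale: perft 9 from the starting position is 2,439,530,234,167 leaf nodes, so 18.630 seconds works out to roughly 1.3 × 10^11 leaves per second; presumably a rate like that comes from transposition tables and bulk counting rather than literally visiting every leaf.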
chessbit wrote: ↑Thu Sep 25, 2025 5:04 pm
What's up with the 8TB HDDs? How do you use them to your advantage to compute perft?
The first part of the computation determined all positions that can be reached from the starting position by playing 11 half-moves, together with a count of how many move sequences lead to each position. The size of this data set is around 35TB uncompressed, so it has to be stored on disk.
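Presumably that data set is then consumed via the standard split: perft 14 is the sum, over every stored position, of its sequence count times perft 3 from that position (and perft 15 with perft 4), so each unique position's subtree is searched only once no matter how many move sequences transpose into it. Here is a minimal sketch of the idea at toy depths, in Python with the python-chess library; the function names are mine, not from the actual implementation:

```python
# Toy illustration of the split perft(n) = sum over unique positions p after
# k half-moves of paths(p) * perft(p, n - k). The real computation used
# k = 11, with the position/count records stored on disk.
from collections import Counter

import chess

def perft(board, depth):
    """Count leaf nodes of the legal-move tree to the given depth."""
    if depth == 0:
        return 1
    total = 0
    for move in board.legal_moves:
        board.push(move)
        total += perft(board, depth - 1)
        board.pop()
    return total

def count_positions(board, plies, counts):
    """Tally how many move sequences reach each distinct position after `plies`."""
    if plies == 0:
        counts[board.fen()] += 1  # FEN as a stand-in for a compact position key
        return
    for move in board.legal_moves:
        board.push(move)
        count_positions(board, plies - 1, counts)
        board.pop()

def perft_split(n, k):
    """perft(n) computed as a weighted sum of sub-perfts from depth-k positions."""
    counts = Counter()
    count_positions(chess.Board(), k, counts)
    return sum(c * perft(chess.Board(fen), n - k) for fen, c in counts.items())

# Both lines should print 197281, the well-known perft 4 of the start position.
print(perft(chess.Board(), 4))
print(perft_split(4, 2))
```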
To understand the algorithm used to compute this data set, I suggest studying the Wikipedia article on external sorting.
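For a flavor of that step, here is a standard-library-only sketch of the aggregation external sorting enables: memory-sized batches of (position key, count) records are sorted and spilled to disk as runs, and a k-way merge then collapses duplicate positions into single records with summed counts. All names are illustrative; a real implementation would use a compact binary position encoding rather than pickled tuples.

```python
import heapq
import pickle
import tempfile

def write_sorted_run(records):
    """Sort one memory-sized batch of (key, count) records and spill it to disk."""
    run = tempfile.TemporaryFile()
    for rec in sorted(records):
        pickle.dump(rec, run)
    run.seek(0)
    return run

def read_run(run):
    """Stream the records of one run back in sorted order."""
    while True:
        try:
            yield pickle.load(run)
        except EOFError:
            return

def merge_runs(runs):
    """K-way merge of sorted runs; equal keys collapse into one summed record."""
    pending_key, pending_count = None, 0
    for key, count in heapq.merge(*(read_run(r) for r in runs)):
        if key == pending_key:
            pending_count += count
        else:
            if pending_key is not None:
                yield pending_key, pending_count
            pending_key, pending_count = key, count
    if pending_key is not None:
        yield pending_key, pending_count

# Demo: two runs sharing the key "posA" merge into a single count of 5.
runs = [write_sorted_run([("posA", 2), ("posB", 1)]),
        write_sorted_run([("posA", 3), ("posC", 4)])]
print(list(merge_runs(runs)))  # [('posA', 5), ('posB', 1), ('posC', 4)]
```

With tens of terabytes of records, every pass of such a merge streams the disks sequentially, which would be exactly the access pattern spinning HDDs handle well.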