What is the best way to obtain the 7-piece tablebases?


yurikvelo
Posts: 710
Joined: Sat Dec 06, 2014 1:53 pm

Re: What is the best way to obtain the 7-piece tablebases?

Post by yurikvelo »

Some supercomputers provide transparent access (via kernel-related special software) to all memory as one globally addressable block, though this is not the fastest possible approach (the penalty can be 10-20x, in some cases 100-1000x).
A specially written algorithm can be faster, but any cluster-unaware program will work.

The Cray XMT Eldorado (an old 2005 architecture, using first-generation Opterons) has 256 TB of globally addressable memory (32 GB per node).
duncan
Posts: 12038
Joined: Mon Jul 07, 2008 10:50 pm

Re: What is the best way to obtain the 7-piece tablebases?

Post by duncan »

yurikvelo wrote: Mon Jun 22, 2020 11:40 am Some supercomputers provide transparent access (via kernel-related special software) to all memory as one globally addressable block, though this is not the fastest possible approach (the penalty can be 10-20x, in some cases 100-1000x).
A specially written algorithm can be faster, but any cluster-unaware program will work.

The Cray XMT Eldorado (an old 2005 architecture, using first-generation Opterons) has 256 TB of globally addressable memory (32 GB per node).
So Syzygy could run on a Cray XMT Eldorado?
Nordlandia
Posts: 2821
Joined: Fri Sep 25, 2015 9:38 pm
Location: Sortland, Norway

Re: What is the best way to obtain the 7-piece tablebases?

Post by Nordlandia »

How capable is the Lomonosov II supercomputer of taking on the 8-men problem?

The Russians said on the Rybka forum that they'd consider it in 2019-2020.
Dann Corbit
Posts: 12541
Joined: Wed Mar 08, 2006 8:57 pm
Location: Redmond, WA USA

Re: What is the best way to obtain the 7-piece tablebases?

Post by Dann Corbit »

syzygy wrote: Sun Jun 21, 2020 10:44 pm Around 2,000 TB of disk space would probably be enough to store the full 8-piece set (WDL+DTZ).
Today, storage is $20/TB
https://edwardbetts.com/price_per_tb/
So only $40,000 for the disk.

Server memory currently runs about $10K/TB (unless you need the greasy fast stuff)
$640,000 for RAM.

In five years it should be on the order of (back of the envelope wild stab):
$1000-2000 for disk
$20,000 - $40,000 for RAM.
So RAM cost will dominate when it becomes feasible for ordinary {albeit a bit nuts} humans to do it.
I might even be able to give it a go. But who knows.
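The arithmetic above can be checked in a few lines. The prices are the thread's quoted figures (2,000 TB of disk at $20/TB, 64 TB of RAM at roughly $10K/TB), not current ones:

```python
# Back-of-envelope cost estimate for storing/building the full 8-piece set,
# using the figures quoted in this thread.
disk_tb = 2_000          # WDL + DTZ storage, per syzygy's estimate
ram_tb = 64              # shared RAM needed by the current generator

disk_price_per_tb = 20      # USD per TB, spinning disk
ram_price_per_tb = 10_000   # USD per TB, ordinary server memory

print(f"disk: ${disk_tb * disk_price_per_tb:,}")   # disk: $40,000
print(f"RAM:  ${ram_tb * ram_price_per_tb:,}")     # RAM:  $640,000
```

At a 10x price drop over five years, the same sums give $4,000 for disk and $64,000 for RAM, in the ballpark of the estimate above; RAM dominates either way.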
Taking ideas is not a vice, it is a virtue. We have another word for this. It is called learning.
But sharing ideas is an even greater virtue. We have another word for this. It is called teaching.
syzygy
Posts: 5566
Joined: Tue Feb 28, 2012 11:56 pm

Re: What is the best way to obtain the 7-piece tablebases?

Post by syzygy »

duncan wrote: Mon Jun 22, 2020 1:20 am
syzygy wrote: Sun Jun 21, 2020 10:44 pm
duncan wrote: Sun Jun 21, 2020 2:25 am
yurikvelo wrote: Fri Jun 19, 2020 11:42 pm
This article from the Lomonosov tablebase page estimates that an 8-man tablebase will be ~100x larger than a 7-man tablebase and will take a minimum of 10,000 TB of disk space and 50 TB of RAM, limiting its completion to a top-10 supercomputer.
The Syzygy 7-piece set was about 1 TB of RAM and 18 TB of hard disk.

Let's say you need 100 times as much RAM and hard drive for each extra piece, so 9 pieces would need 10 PB of RAM and 180 PB of hard drive.
64 TB of RAM (shared by all CPU cores) would be enough for the current generator (with some minor tweaking).

Around 2,000 TB of disk space would probably be enough to store the full 8-piece set (WDL+DTZ).
This seems to be within the capability of the Summit supercomputer.
Supercomputers typically are clusters of computing nodes, with each computing node having direct access to only a small portion of total RAM memory. The current generator cannot work with that. It was designed to generate 6-piece TBs on a single machine with at least 16GB of RAM and preferably a bit more (which would have been a ridiculous requirement when Nalimov wrote his generator, but was quite reasonable in 2012). It would need 64 TB of RAM shared by all cores to generate 8-piece TBs. To tackle 8 piece, it is probably best to first write a new generator designed for the hardware that one may hope to be available 5-10 years from now.
Thanks for this.

The HPE Integrity MC990 X Server has a maximum of 48 TB. If this is upgraded in the future to 64 TB, would it run your software for 8 pieces?


https://buy.hpe.com/us/en/servers/missi ... 1008798952
Yes, that machine, when upgraded to 64 TB, looks like something that should be able to generate the 8-piece TBs with the current code. Probably only a few trivial modifications would be needed to make it run.

However, it is not clear how well the current code would scale on a machine with 32 sockets, each with up to 192 threads. There were quite a few problems with many threads already on Bojun Guo's machines, and I'm not sure how well these have been overcome. I suspect that such problems can be overcome, though (with extra coding effort but without a rewrite from scratch).

If the generator could be made to make efficient use of such a monster machine with such a huge amount of threads and such a machine were available, there wouldn't seem to be much need for a new generator designed to run on a supercomputer cluster. (Still, it would probably help to rewrite the generator to make better use of symmetry for tables with multiple identical pieces like KQQRvKQRB, which in principle should need only half the RAM that KQRBvKQRN needs. Even if the RAM is available, it would cut down on generation time.)
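The point above about identical pieces can be made concrete with a quick count: treating the two queens in KQQRvKQRB as interchangeable halves the number of distinct piece placements, and hence (roughly) the index space the generator must hold in RAM. A minimal counting sketch, illustrative only; the real Syzygy indexing scheme is more involved:

```python
from math import comb, perm

def placements(piece_groups, distinct):
    """Count ways to put the pieces on 64 squares. piece_groups lists
    the size of each group of identical pieces, e.g. KQQRvKQRB ->
    [1, 2, 1, 1, 1, 1, 1]. With distinct=True every piece is treated
    as unique; with distinct=False identical pieces are interchangeable."""
    squares, total = 64, 1
    for k in piece_groups:
        total *= perm(squares, k) if distinct else comb(squares, k)
        squares -= k
    return total

kqqr_vs_kqrb = [1, 2, 1, 1, 1, 1, 1]   # two identical white queens

ratio = placements(kqqr_vs_kqrb, True) / placements(kqqr_vs_kqrb, False)
print(ratio)  # 2.0: the queen pair halves the index space
```

In general the reduction factor is the product of k! over each group of k identical pieces, so tables with more duplicated material gain even more.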
syzygy
Posts: 5566
Joined: Tue Feb 28, 2012 11:56 pm

Re: What is the best way to obtain the 7-piece tablebases?

Post by syzygy »

Nordlandia wrote: Mon Jun 22, 2020 10:33 pm How capable is the Lomonosov II supercomputer of taking on the 8-men problem?

The Russians said on the Rybka forum that they'd consider it in 2019-2020.
The Lomonosov generator was designed to run on a cluster. Compared with my generator running on a cluster with transparent globally addressable memory (see Yuri's post), their algorithm takes more care to reduce inter-node communication (at least that's what I suspect). I don't know what minimum amount of per-node RAM it would need to generate 8-piece TBs efficiently.
syzygy
Posts: 5566
Joined: Tue Feb 28, 2012 11:56 pm

Re: What is the best way to obtain the 7-piece tablebases?

Post by syzygy »

Dann Corbit wrote: Tue Jun 23, 2020 5:10 am
syzygy wrote: Sun Jun 21, 2020 10:44 pm Around 2,000 TB of disk space would probably be enough to store the full 8-piece set (WDL+DTZ).
Today, storage is $20/TB
https://edwardbetts.com/price_per_tb/
So only $40,000 for the disk.

Server memory currently runs about $10K/TB (unless you need the greasy fast stuff)
$640,000 for RAM.

In five years it should be on the order of (back of the envelope wild stab):
$1000-2000 for disk
$20,000 - $40,000 for RAM.
So RAM cost will dominate when it becomes feasible for ordinary {albeit a bit nuts} humans to do it.
I might even be able to give it a go. But who knows.
The good thing is that ordinary users won't have to generate the tables (but they'll need a really fast internet connection to download 2,000 TB of data ;-)).

The bad thing is that ordinary users wanting to use such tables for analysis would need about 1,000 TB of SSD for the WDL part. And they would probably still need many TBs of RAM to make things run smoothly.
duncan
Posts: 12038
Joined: Mon Jul 07, 2008 10:50 pm

Re: What is the best way to obtain the 7-piece tablebases?

Post by duncan »

syzygy wrote: Wed Jun 24, 2020 5:10 pm
Dann Corbit wrote: Tue Jun 23, 2020 5:10 am
syzygy wrote: Sun Jun 21, 2020 10:44 pm Around 2,000 TB of disk space would probably be enough to store the full 8-piece set (WDL+DTZ).
Today, storage is $20/TB
https://edwardbetts.com/price_per_tb/
So only $40,000 for the disk.

Server memory currently runs about $10K/TB (unless you need the greasy fast stuff)
$640,000 for RAM.

In five years it should be on the order of (back of the envelope wild stab):
$1000-2000 for disk
$20,000 - $40,000 for RAM.
So RAM cost will dominate when it becomes feasible for ordinary {albeit a bit nuts} humans to do it.
I might even be able to give it a go. But who knows.
The good thing is that ordinary users won't have to generate the tables (but they'll need a really fast internet connection to download 2,000 TB of data ;-)).

The bad thing is that ordinary users wanting to use such tables for analysis would need about 1,000 TB of SSD for the WDL part. And they would probably still need many TBs of RAM to make things run smoothly.
Let's say you just wanted the 30 most common positions; would you still need so much RAM?
Dann Corbit
Posts: 12541
Joined: Wed Mar 08, 2006 8:57 pm
Location: Redmond, WA USA

Re: What is the best way to obtain the 7-piece tablebases?

Post by Dann Corbit »

You don't need enormous RAM to use the tablebase files.
You need enormous RAM to build them.
But even accessing 8-men files, you will need a lot of RAM.
Either that, or there will be a speed trade-off.
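The speed trade-off comes down to how much of the tablebase files the operating system can keep cached in RAM: probers typically access the files via memory-mapped I/O, so hot pages are served from the page cache while cold ones hit the disk. A toy sketch of that access pattern; the file name is hypothetical, and real probing also involves index computation and block decompression:

```python
import mmap

def probe_byte(path, offset):
    """Read one byte from a file via memory-mapped I/O. The first
    access to a page may hit disk; repeated accesses to hot pages
    are served from the OS page cache, i.e. from RAM."""
    with open(path, "rb") as f:
        with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
            return m[offset]

# Hypothetical usage:
# probe_byte("KQRBvKQRN.rtbw", 123456)
```

With ~1,000 TB of WDL data and only a few TB of RAM, most probes during deep analysis would miss the cache, which is why an SSD (and as much RAM as possible) matters so much for smooth use.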
duncan
Posts: 12038
Joined: Mon Jul 07, 2008 10:50 pm

Re: What is the best way to obtain the 7-piece tablebases?

Post by duncan »

Dann Corbit wrote: Thu Jun 25, 2020 6:20 pm You don't need enormous RAM to use the tablebase files.
You need enormous RAM to build them.
But even accessing 8-men files, you will need a lot of RAM.
Either that, or there will be a speed trade-off.
And by "a lot of RAM" to access 8-men files, do you mean many TBs of RAM?