Hi guys,
A little newbie question, but what is the difference between 16 GB and 4 GB? I know that a small table can fill up quickly, but every time that happens the engine is supposed to just replace old entries with new ones, isn't it? So why is a small size generally recommended for blitz play (depending on one's specific hardware), while for long games or analysis you can safely allocate even half of your available RAM?
The bigger the hashtable, the longer old results can stay inside the table. So why is it important to make it smaller for engine blitz play?
Thx.
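The "replace old entries with new ones" part of the question can be made concrete with a toy sketch. This is not any specific engine's code; real engines use packed buckets, aging, and bound flags. It is just a minimal illustration, under those simplifying assumptions, of a fixed-size table with depth-preferred replacement:

```python
# Minimal sketch of a transposition table with depth-preferred
# replacement: on a slot collision, the deeper (more expensive)
# result is kept. Illustrative only; real engines are more elaborate.

class TT:
    def __init__(self, n_entries):
        self.slots = [None] * n_entries

    def store(self, key, depth, score):
        i = key % len(self.slots)
        old = self.slots[i]
        # Replace only if the slot is empty or the new search is at
        # least as deep as the stored one.
        if old is None or depth >= old[1]:
            self.slots[i] = (key, depth, score)

    def probe(self, key):
        e = self.slots[key % len(self.slots)]
        return e if e is not None and e[0] == key else None

tt = TT(4)
tt.store(17, depth=8, score=30)
tt.store(21, depth=3, score=-10)  # 21 % 4 == 17 % 4: shallower, rejected
print(tt.probe(17))  # (17, 8, 30) survives the collision
print(tt.probe(21))  # None: its shallow result was never stored
```

The point for the question above: a smaller table just means collisions like the one above happen sooner and more often, so more results get overwritten or rejected.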
Effect of Hashtable size
Moderators: hgm, Rebel, chrisw
-
- Posts: 20
- Joined: Fri Sep 04, 2020 9:20 pm
- Full name: Assaf Patishi
-
- Posts: 6808
- Joined: Wed Nov 18, 2009 7:16 pm
- Location: Gutweiler, Germany
- Full name: Frank Quisinsky
Re: Effect of Hashtable size
Hi Patishi,
More hash = better when fewer pieces are on the board (endgames).
More hash = does not produce weaker results with many pieces on the board.
= Generally, more hash can't be bad!
Example:
1 core, 5-man tablebases.
Time control 40 moves in 10 minutes repeatedly produced, without resign mode, an average (TOP-41 engines) of 86.5 moves per game; a game needs around 43 minutes.
Time control game in 20 minutes + 5 seconds Fischer increment produced, without resign mode, an average (TOP-41 engines) of 86.5 moves per game; a game needs around 41 minutes.
The average per move is ~15 seconds!
Most games go into endgames!
The maximum hashtable size (full hash) should be reached at 4 times the average move time = 1 minute!
My principle:
What interests me is the time at which the hashtables are full with 14 pieces on the board, at 4 times the average move time of the time control I want to use!
"Hash full" can be quite different for each engine!
512 MB or 768 MB hash is fully OK for the time controls I mentioned above with 1 core.
Again, if you give 1 GB for hash you will not get worse results with many pieces on the board, but it is not necessary for the time controls I mentioned above with 1 core.
Hope this helps!
Best
Frank
For blitz games you need at most 256 MB, or maybe 128 MB (1 core).
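Frank's sizing principle (the table should fill at about 4x the average move time) can be turned into a back-of-the-envelope calculation. All numbers below are illustrative assumptions, not measurements: the nodes-per-second figure, the entry size, and the fraction of searched nodes that produce a new table entry all vary widely between engines and hardware.

```python
# Rough sketch: estimate how long a hash table of a given size takes
# to fill, given an assumed search speed. Every parameter here is an
# illustrative guess, not a property of any particular engine.

def seconds_to_fill(hash_mb, nps, entry_bytes=16, store_fraction=0.5):
    """Estimated seconds until the table is full.

    entry_bytes:     assumed size of one table entry (engines differ).
    store_fraction:  assumed share of searched nodes that create a
                     new entry (rather than hitting an existing one).
    """
    entries = hash_mb * 1024 * 1024 // entry_bytes
    return entries / (nps * store_fraction)

# Example: 1 core at an assumed ~1.5 Mnps, target fill time ~60 s
# (4 x the ~15 s average move time from the post):
for mb in (128, 256, 512, 768):
    print(mb, "MB ->", round(seconds_to_fill(mb, 1_500_000), 1), "s")
```

Under these assumptions 512 MB fills in roughly the targeted minute, which is at least consistent with the 512-768 MB recommendation above; in practice you would measure the engine's own "hashfull" report instead of guessing.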
-
- Posts: 1364
- Joined: Sat Jul 21, 2018 7:43 am
- Location: Szentendre, Hungary
- Full name: Gabor Szots
Re: Effect of Hashtable size
Patishi wrote: ↑Sun Jan 31, 2021 7:04 am
A very long time ago I did a little experiment and found that doubling the hash increased rating by about 5 points. That may not be valid today.
Decreasing the hash has only one justification, as far as I know: available memory. For example, I use 256 MB per engine (running 6 engines simultaneously) because I have only 8 GB RAM and much of that is used for other purposes.
Gabor Szots
CCRL testing group
-
- Posts: 3291
- Joined: Wed Mar 08, 2006 8:15 pm
Re: Effect of Hashtable size
One test is here: http://www.fastgm.de/hash.html. Stockfish testing uses 64 MB at short time control!
Jouni
-
- Posts: 2204
- Joined: Sat Jan 18, 2014 10:24 am
- Location: Andorra
Re: Effect of Hashtable size
Jouni wrote: ↑Sun Jan 31, 2021 10:27 am
One of the reasons to use little RAM in testing is to check how well the hash replacement scheme works. The engine also runs a bit faster with less RAM. But those things are sometimes interesting for development, not really for playing.
Daniel José - http://www.andscacs.com
-
- Posts: 20
- Joined: Fri Sep 04, 2020 9:20 pm
- Full name: Assaf Patishi
Re: Effect of Hashtable size
Thx for the replies everyone,
I mainly use Fritz 17 to analyse my games. I have 32 GB RAM (10900K CPU) and right now the default hashtable size is 4900 MB... or something in that area.
I wanted to know whether, for analysis purposes (infinite analysis), it would be beneficial to increase the hash size to 16 GB.
I guess it's worth a try; it can't harm performance in any way.
-
- Posts: 27808
- Joined: Fri Mar 10, 2006 10:06 am
- Location: Amsterdam
- Full name: H G Muller
Re: Effect of Hashtable size
Gabor Szots wrote: ↑Sun Jan 31, 2021 9:20 am
I once compared time-to-depth measurements with various hash sizes, and my conclusion was that the speed increases as the twelfth root of the hash size, so to double the speed you need a 4096 times larger hash. A doubling of the hash size would then give a speedup of 2^(1/12) ≈ 1.06, i.e. 6%. And I always use the rule of thumb that 1% faster = 1 more Elo. So that would be +6 Elo.
I would call that in excellent agreement with your result.
I think that when you increase the size beyond a certain limit (which on older processors was 256 MB, but is probably larger today), random memory accesses such as TT probes become intrinsically slower, because you overflow a cache in the paging unit (the TLB): the CPU first has to do a memory access to find where the memory page is before it can actually fetch the data from it. Since the time to access the TT is a non-negligible part of the time needed to process a node, this could easily spoil the beneficial effect of the next few doublings.
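The twelfth-root rule above is easy to check numerically. This is just the arithmetic of the rule as stated in the post, not an independent measurement:

```python
# HGM's time-to-depth observation: effective speed scales as the
# twelfth root of the hash size. One doubling then gives a ~6%
# speedup, and 12 doublings (4096x the hash) double the speed.

def speedup(hash_ratio):
    """Speed multiplier predicted for a given hash-size ratio."""
    return hash_ratio ** (1 / 12)

print(round(speedup(2), 3))     # one doubling -> 1.059 (~6% faster)
print(round(speedup(4096), 3))  # 4096x hash -> 2.0 (speed doubled)
```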
-
- Posts: 3291
- Joined: Wed Mar 08, 2006 8:15 pm
Re: Effect of Hashtable size
"And I always use the rule of thumb that 1% faster = 1 more Elo." Definitely not anymore. So 100% is 100 ELO? In current 3600 level doubling speed gives maybe 20 ELO I think. Meaning 1% is 0.2 ELO only.
Jouni
-
- Posts: 27808
- Joined: Fri Mar 10, 2006 10:06 am
- Location: Amsterdam
- Full name: H G Muller
Re: Effect of Hashtable size
The dependence of Elo on thinking time is logarithmic (because it is the depth that counts, and the tree size grows exponentially with depth). So the formula I apply is rating = constant + 100*ln(T), where ln is the natural (base-e) logarithm. For small x, ln(1+x) ≈ x, so that leads to the mentioned rule of thumb. For a doubling, ln(2) ≈ 0.69, so that would give only 69 Elo rather than 100. This was indeed for Elo that was not polluted by a high draw rate; I usually don't test with a high draw rate anyway, due to sufficiently randomized openings. I don't think Gabor tested this on 3600-rated engines either.
But yeah, for 3600 engines, measuring Elo from balanced openings, you would then probably only earn 1 Elo per hash-size doubling. Isn't that wonderful? By using 256 GB hash instead of 256 MB you would have gained 10 whole Elo.
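The logarithmic rating model and the twelfth-root speed rule combine neatly. The sketch below only reproduces the arithmetic of the posts above (rating = constant + 100*ln(T), speed ~ hash^(1/12)); the 100-Elo coefficient is the low-draw-rate figure from the post, and as noted it shrinks heavily at the 3600 level:

```python
import math

# Rating model from the post: rating = constant + 100 * ln(T),
# so multiplying thinking time (or speed) by `factor` gains
# 100 * ln(factor) Elo at the low-draw-rate level discussed.

def elo_gain(factor):
    """Elo gained from a `factor`x speed/time increase."""
    return 100 * math.log(factor)

print(round(elo_gain(2), 1))            # doubling time: ~69.3 Elo
print(round(elo_gain(1.01), 2))         # 1% faster: ~1.0 Elo (the rule of thumb)
print(round(elo_gain(2 ** (1 / 12)), 1))  # one hash doubling: ~5.8 Elo
```

So each hash doubling is worth ln(2)/12 of a time doubling, about 5.8 Elo by this model before draw-rate deflation, matching the "+6 Elo" estimate earlier in the thread.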