1024 MB hashtable memory OK on 2 GB physical RAM computer?

Discussion of anything and everything relating to chess playing software and machines.

Moderators: hgm, Rebel, chrisw

mrerk

1024 MB hashtable memory OK on 2 GB physical RAM computer?

Post by mrerk »

I own a Dell Inspiron 1545 laptop with a Pentium Dual-Core T4200 @ 2.00 GHz, 2 GB of physical RAM, and Windows Vista Home Basic. I run Rybka 3 in the Arena 2.0.1 chess GUI.

For long analysis or play, say averaging 3 or more minutes per move, is 1024 MB of hash-table memory OK? Is it too much? Does it slow Rybka down?

Thanks in advance,
Eran
Matthias Gemuh
Posts: 3245
Joined: Thu Mar 09, 2006 9:10 am

Re: 1024 MB hashtable memory OK on 2 GB physical RAM computer?

Post by Matthias Gemuh »

mrerk wrote: I own a Dell Inspiron 1545 laptop with a Pentium Dual-Core T4200 @ 2.00 GHz, 2 GB of physical RAM, and Windows Vista Home Basic. I run Rybka 3 in the Arena 2.0.1 chess GUI.

For long analysis or play, say averaging 3 or more minutes per move, is 1024 MB of hash-table memory OK? Is it too much? Does it slow Rybka down?

Thanks in advance,
Eran
1024 for analysis
512 for playing

Of course not both at once.

Matthias.
My engine was quite strong till I added knowledge to it.
http://www.chess.hylogic.de
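Matthias's split fits a simple memory budget: on a 2 GB machine you have to leave room for Vista, the GUI, and the engine's own working memory, and many engines of that era round the hash down to a power of two anyway. A rough sketch of that arithmetic (the ~768 MB overhead is a guessed figure, not a measured one):

```python
def safe_hash_mb(total_ram_mb, overhead_mb=768):
    """Largest power-of-two hash size (MB) that fits in the RAM left
    after the OS, GUI, and engine itself. The default overhead is a
    rough assumption, not a measurement."""
    free = total_ram_mb - overhead_mb
    size = 1
    while size * 2 <= free:
        size *= 2
    return size

print(safe_hash_mb(2048))        # 1024 with the assumed ~768 MB overhead
print(safe_hash_mb(2048, 1100))  # 512 if the rest of the system needs more
```

The point is only that 1024 MB is near the edge of what 2 GB can carry, which is why a smaller hash is safer when other programs compete for memory during play.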
bob
Posts: 20943
Joined: Mon Feb 27, 2006 7:30 pm
Location: Birmingham, AL

Re: 1024 MB hashtable memory OK on 2 GB physical RAM computer?

Post by bob »

mrerk wrote: I own a Dell Inspiron 1545 laptop with a Pentium Dual-Core T4200 @ 2.00 GHz, 2 GB of physical RAM, and Windows Vista Home Basic. I run Rybka 3 in the Arena 2.0.1 chess GUI.

For long analysis or play, say averaging 3 or more minutes per move, is 1024 MB of hash-table memory OK? Is it too much? Does it slow Rybka down?

Thanks in advance,
Eran
My advice is quite simple. Test Rybka at the time control you plan to use, vary the hash size, and see what it does to the time-to-depth measurement for a given position. In general, a bigger hash is either better or no worse, but there are exceptions. Too big and you will start paging to virtual memory, which is always bad. Too big can also cause more TLB misses, which slows memory access down; that is also bad, although nowhere near as bad as paging.

Different processors have different TLB sizes, so testing is the best way. Run several positions at the time control you plan to use, try different hash sizes, and pick the one that gives you the correct move at the same depth in the shortest elapsed time.

Anything else is just a guess.
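For what it's worth, the time-to-depth measurement Bob describes can be read straight out of an engine's UCI output, since `info` lines report both `depth` and `time`. A minimal sketch of the bookkeeping (the transcript lines below are invented for illustration, not real Rybka output):

```python
import re

def time_to_depth(uci_lines, target_depth):
    """Return the engine-reported time (ms) at which the search first
    completed `target_depth`, or None if it was never reached.
    Parses standard UCI 'info' lines carrying 'depth' and 'time'."""
    for line in uci_lines:
        depth = re.search(r"\bdepth (\d+)", line)
        t = re.search(r"\btime (\d+)", line)
        if depth and t and int(depth.group(1)) >= target_depth:
            return int(t.group(1))
    return None

# Illustrative transcript (made-up numbers):
transcript = [
    "info depth 12 seldepth 20 time 1540 nodes 2100000 pv e2e4",
    "info depth 13 seldepth 22 time 3890 nodes 5400000 pv e2e4",
    "info depth 14 seldepth 24 time 9120 nodes 13000000 pv d2d4",
]
print(time_to_depth(transcript, 14))  # 9120 ms to complete depth 14
```

Run the same position at several hash sizes, feed each engine log through this, and the hash size with the smallest time to a fixed depth is the winner for that position.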
ernest
Posts: 2041
Joined: Wed Mar 08, 2006 8:30 pm

Re: 1024 MB hashtable memory OK on 2 GB physical RAM computer?

Post by ernest »

bob wrote:...the time to depth measurement for a given position.
Hi Bob,
Just for clarity, does this mean

1*: setting the level to "Fixed depth = n" and clicking "Move Now": at the end of the depth-n search, analysis stops and reports a time

2*: or running Infinite Analysis and noting the time at which the first PV at depth n appears

(actually I know 2* is no good, but why?)
bob
Posts: 20943
Joined: Mon Feb 27, 2006 7:30 pm
Location: Birmingham, AL

Re: 1024 MB hashtable memory OK on 2 GB physical RAM computer?

Post by bob »

ernest wrote:
bob wrote:...the time to depth measurement for a given position.
Hi Bob,
Just for clarity, does this mean

1*: setting the level to "Fixed depth = n" and clicking "Move Now": at the end of the depth-n search, analysis stops and reports a time

2*: or running Infinite Analysis and noting the time at which the first PV at depth n appears

(actually I know 2* is no good, but why?)
I was thinking of using the following idea.

Suppose you want to play your matches at a time control where you average 60 seconds of computing per move.

Run the test telling the program to take 120 seconds per move, and save the output. Repeat this with different hash sizes. Then look at the output for each run and note what was displayed at around the 60-second mark. Pick the hash size that reaches the deepest depth by 60 seconds, or, if they are all equal, the one that displays the best move at that depth around the 60-second mark.

Small hashes will make the searches take longer, ply by ply. Increasing the hash will slowly speed up the search time for each ply until you see no further improvement (as you go beyond optimal), and eventually you will see a speed drop-off as you run into paging or TLB overloading if you keep increasing the hash size...
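Bob's procedure can be sketched as a small comparison over the saved logs: record (time, depth) pairs from each run, then pick the hash size whose search was deepest at the chosen mark, breaking ties by which reached that depth sooner. The data below is made up for illustration, not from real Rybka runs:

```python
def best_hash_at(snapshots, mark_ms):
    """snapshots: {hash_mb: [(time_ms, depth), ...]} from identical
    searches run well past mark_ms. Returns the hash size whose search
    reached the greatest depth by mark_ms; ties go to the size that
    completed that depth sooner."""
    def score(hash_mb):
        done = [(d, t) for (t, d) in snapshots[hash_mb] if t <= mark_ms]
        if not done:
            return (0, 0)      # never finished an iteration by the mark
        d, t = max(done)       # deepest iteration completed by the mark
        return (d, -t)         # deeper is better; earlier breaks ties
    return max(snapshots, key=score)

# Invented logs: 1024 MB reaches the same depth as 512 MB, but later
# (e.g. TLB pressure starting to bite on this machine).
results = {
    256:  [(30000, 16), (58000, 17)],
    512:  [(25000, 16), (42000, 17), (55000, 18)],
    1024: [(27000, 16), (45000, 17), (59000, 18)],
}
print(best_hash_at(results, 60000))  # 512
```

With several test positions, the hash size that wins (or ties for the win) most often at the 60-second mark is the one to use for the real matches.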