Steve Maughan wrote:
Thanks everyone - some great input.
Here's what I'm thinking of trying first. I'm going to try something like Steven Edwards' suggestion. I'll hash the material and then use the "n" smallest bits to look up the hash value in a table and compare the key. This table will be small. If I have 20 endgames I'd like to recognize (40 possible valid keys), I imagine I'll be able to fit it in a 64-entry table. The table entry will contain a pointer to the function which will evaluate the endgame.
Thanks again - some good discussion!
Steve

If you have 20 endgames you will do just fine with a 32-entry table - just use linear probing: if the ending's slot is already taken, go to the next one. But I really don't see any reason not to use a bigger table - 32, 64 or even 256 entries is small potatoes, not enough to have any noticeable impact on caching behavior.
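For concreteness, here is a minimal sketch of that scheme in C. Everything in it is illustrative rather than taken from any particular engine - the struct Position type, the table size, and the function names are all assumptions: hash the material, index the table with the low bits of the key, store a function pointer per ending, and linearly probe past occupied slots.

```c
#include <stddef.h>
#include <stdint.h>

struct Position;                              /* hypothetical engine type */
typedef int (*EndgameEval)(const struct Position *);

#define EG_BITS 6                             /* 64-entry table */
#define EG_SIZE (1u << EG_BITS)

static struct {
    uint64_t    key;                          /* full material key, 0 = empty slot */
    EndgameEval eval;                         /* specialized evaluator for this ending */
} eg_table[EG_SIZE];

/* Insert with linear probing: if the slot is taken, step to the next one. */
static void eg_add(uint64_t key, EndgameEval eval)
{
    unsigned i = key & (EG_SIZE - 1);         /* the "n" smallest bits */
    while (eg_table[i].key != 0)
        i = (i + 1) & (EG_SIZE - 1);
    eg_table[i].key  = key;
    eg_table[i].eval = eval;
}

/* Probe: walk the same chain until the key matches or a slot is empty. */
static EndgameEval eg_probe(uint64_t key)
{
    unsigned i = key & (EG_SIZE - 1);
    while (eg_table[i].key != 0) {
        if (eg_table[i].key == key)
            return eg_table[i].eval;
        i = (i + 1) & (EG_SIZE - 1);
    }
    return NULL;                              /* not a recognized ending */
}
```

In this sketch you would call eg_add() at startup once per recognized key (typically twice per endgame, one key per color), and the evaluation function would call eg_probe() with the position's material key, falling back to the general evaluator on NULL.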
You might also consider using a "perfect hash" table - if you know in advance how many entries will be in the table, just make sure they hash without collision. This can even be computed dynamically: have a table of random Zobrist keys, and if the resulting table is not perfect, reinitialize it with a different set of keys, repeating until there are no collisions. You will want the table to be large enough that this is not an exercise in futility - for example, a 32-entry table will be very difficult to fill without collisions on 20 entries. You could even size the table dynamically if you are determined to keep it small (although I think this is a waste of time): try to build a perfect hash, and if that fails after N attempts, double the table size and try again. Like I say, probably overkill, but on the other hand there is no real downside except perhaps a slow startup.
But you could have a procedure like this for the initial sizing of the table and key selection that is then "hard coded" into the source - with the only downside being that you have to redo it when you add endings. A sketch of that construction follows.
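Here the helper names are assumptions, not any real API - reroll_material_keys() and ending_key() stand in for whatever your engine uses to generate random material keys and to enumerate the endings it wants to recognize:

```c
#include <stdint.h>
#include <stdlib.h>

#define MAX_TRIES 1000

/* Assumed helpers: reroll_material_keys() fills the Zobrist-style
   material keys with fresh random values, and ending_key(i) returns
   the material key of the i-th ending we want to recognize. */
extern void     reroll_material_keys(void);
extern uint64_t ending_key(int i);

/* Search for a collision-free ("perfect") assignment of n endings.
   Returns the table size that worked; the caller then rebuilds its
   endgame table at that size with the current keys. */
static size_t find_perfect_size(int n, size_t start_size)
{
    size_t size = start_size;             /* e.g. 64, comfortably above n = 20 */
    for (;;) {
        for (int attempt = 0; attempt < MAX_TRIES; attempt++) {
            char *used = calloc(size, 1);
            int ok = 1;
            reroll_material_keys();
            for (int i = 0; i < n && ok; i++) {
                size_t slot = ending_key(i) & (size - 1);
                if (used[slot])
                    ok = 0;               /* collision: reroll the keys */
                else
                    used[slot] = 1;
            }
            free(used);
            if (ok)
                return size;              /* perfect hash found */
        }
        size *= 2;                        /* too tight at this size: double and retry */
    }
}
```

The payoff is that the probe becomes a single indexed load plus one key comparison, with no probing loop - and as noted above, you can run this once, hard-code the winning keys and table size, and only redo it when you add endings.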