Jouni wrote: ↑Sun Jan 03, 2021 10:40 pm
Houdini 1.5a x64
(c) 2010-11 Robert Houdart
info string POPCNT available
info string 128 MB Large Page Hash
Stockfish got them in 2020. BTW, does Komodo have large pages?
No, Komodo does not support large pages. It is trivial to add, but it requires users to act as administrators to set up the pages, and after a while memory gets too fragmented for them to be of much use. But if enough people ask for it, we can add it.
Mark
There is no need to run the engine as administrator. You just need to grant SeLockMemoryPrivilege to the user once.
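To illustrate what that means for an engine (a minimal, untested sketch, not Komodo's or Stockfish's actual code): once the user has been granted the "Lock pages in memory" right (SeLockMemoryPrivilege) via secpol.msc, the engine only has to enable that privilege in its own token and pass MEM_LARGE_PAGES to VirtualAlloc. No administrator rights are needed at run time.

#include <windows.h>
#include <cstddef>

// Turn on SeLockMemoryPrivilege for this process. The user must already
// hold the "Lock pages in memory" right (secpol.msc); this only enables it.
static bool enable_lock_memory_privilege() {
    HANDLE token;
    if (!OpenProcessToken(GetCurrentProcess(),
                          TOKEN_ADJUST_PRIVILEGES | TOKEN_QUERY, &token))
        return false;

    TOKEN_PRIVILEGES tp{};
    tp.PrivilegeCount = 1;
    tp.Privileges[0].Attributes = SE_PRIVILEGE_ENABLED;
    if (!LookupPrivilegeValue(nullptr, SE_LOCK_MEMORY_NAME,
                              &tp.Privileges[0].Luid)) {
        CloseHandle(token);
        return false;
    }

    // AdjustTokenPrivileges "succeeds" even when nothing was enabled,
    // so the real result is in GetLastError().
    AdjustTokenPrivileges(token, FALSE, &tp, 0, nullptr, nullptr);
    bool enabled = (GetLastError() == ERROR_SUCCESS);
    CloseHandle(token);
    return enabled;
}

// Allocate 'bytes' of hash backed by large pages (usually 2 MB on x64).
// Returns nullptr if large pages are unavailable; free with VirtualFree.
static void* alloc_large_page_hash(std::size_t bytes) {
    std::size_t page = GetLargePageMinimum();   // 0 means no large-page support
    if (page == 0 || !enable_lock_memory_privilege())
        return nullptr;
    bytes = (bytes + page - 1) & ~(page - 1);   // size must be a multiple of it
    return VirtualAlloc(nullptr, bytes,
                        MEM_RESERVE | MEM_COMMIT | MEM_LARGE_PAGES,
                        PAGE_READWRITE);
}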
Jouni wrote: ↑Wed Jan 06, 2021 4:44 pm
I haven't noticed any fragmentation after some weeks of large-page use.
I see it all the time.
If I want to use 100GB hash with large pages, I usually have to reboot.
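The failure mode is the allocation itself: MEM_LARGE_PAGES needs physically contiguous 2 MB chunks, and after the machine has been up for a while there may not be enough of them left in a row for a 100GB table. As a rough, untested sketch (alloc_large_page_hash being the hypothetical helper from the sketch above), the usual engine-side mitigation is to try large pages first and fall back to ordinary pages when the call fails:

#include <cstddef>
#include <cstdio>
#include <cstdlib>

// Hypothetical helper from the earlier sketch (large-page VirtualAlloc).
void* alloc_large_page_hash(std::size_t bytes);

// Try large pages first; if physical memory is too fragmented the
// allocation fails and we fall back to ordinary 4 KB pages, so the
// engine still gets its hash. The caller must remember which path was
// taken (VirtualFree vs. std::free) when releasing the table.
void* alloc_hash(std::size_t bytes) {
    if (void* p = alloc_large_page_hash(bytes)) {
        std::printf("info string Hash allocated with large pages\n");
        return p;
    }
    std::printf("info string Large pages unavailable, using normal pages\n");
    return std::malloc(bytes);
}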
Taking ideas is not a vice, it is a virtue. We have another word for this. It is called learning.
But sharing ideas is an even greater virtue. We have another word for this. It is called teaching.
Dann,
You may want to give this a try: https://www.wisecleaner.com/wise-memory-optimizer.html
On my 64 GB machine the optimization takes less than a minute, and it certainly beats rebooting.
I have been using it since ca. 2016 for exactly this purpose.
Peter,
Thanks, that will be a big help.
I have a zillion services on my machine, and booting takes a while.
Taking ideas is not a vice, it is a virtue. We have another word for this. It is called learning.
But sharing ideas is an even greater virtue. We have another word for this. It is called teaching.