I’ve had some requests to update Dark Horse to the new larger HalfKA architecture. I haven’t done any testing beyond making sure the net is the strongest of the training run. Same data as DH 0.3 (less than 1b positions).
https://www.patreon.com/posts/dark-horse-0-3-52384228
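For illustration only, here is a minimal python-chess sketch of how such an external net file can be loaded into a Stockfish-derived UCI engine for a quick sanity check. The engine path, the net filename and the "EvalFile" option name are assumptions (option names can differ between builds); check your engine's option list with the uci command first.
Code: Select all
# Minimal sketch (not from the announcement): load a custom NNUE net into a
# Stockfish-derived UCI engine via python-chess and run a quick sanity check.
# ENGINE_PATH, NET_PATH and the "EvalFile" option name are assumptions.
import chess
import chess.engine

ENGINE_PATH = "./stockfish"          # hypothetical engine binary
NET_PATH = "dark_horse_0.3xl.bin"    # net file from the download

engine = chess.engine.SimpleEngine.popen_uci(ENGINE_PATH)
engine.configure({"EvalFile": NET_PATH})  # point the engine at the external net

board = chess.Board()                # starting position
info = engine.analyse(board, chess.engine.Limit(time=5))
print("score:", info["score"], "depth:", info.get("depth"))

engine.quit()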
Dark Horse Update
-
- Posts: 1632
- Joined: Tue Aug 21, 2018 7:52 pm
- Full name: Dietrich Kappe
Dark Horse Update
Fat Titz by Stockfish, the engine with the bodaciously big net. Remember: size matters. If you want to learn more about this engine just google for "Fat Titz".
-
- Posts: 1439
- Joined: Sat Oct 27, 2018 12:58 am
- Location: Germany
- Full name: N.N.
Re: Dark Horse Update
Thank you Dietrich! I am a fan of White Rose. Would it be possible to produce White Rose as a large network?
-
- Posts: 476
- Joined: Sun Mar 17, 2019 12:00 pm
- Full name: Henk Drost
Re: Dark Horse Update
No mention of the license this time?
-
- Posts: 1632
- Joined: Tue Aug 21, 2018 7:52 pm
- Full name: Dietrich Kappe
Re: Dark Horse Update
That one relied on the old training software rather than the PyTorch one.
Fat Titz by Stockfish, the engine with the bodaciously big net. Remember: size matters. If you want to learn more about this engine just google for "Fat Titz".
-
- Posts: 1632
- Joined: Tue Aug 21, 2018 7:52 pm
- Full name: Dietrich Kappe
Re: Dark Horse Update
Thanks for reminding me. Updated.
Fat Titz by Stockfish, the engine with the bodaciously big net. Remember: size matters. If you want to learn more about this engine just google for "Fat Titz".
-
- Posts: 3358
- Joined: Sat Feb 16, 2008 7:38 am
- Full name: Peter Martan
Re: Dark Horse Update
Thanks a lot, Dietrich, I was impatiently looking forward to this.
dkappe wrote: ↑Fri Jun 11, 2021 6:18 pm
I’ve had some requests to update Dark Horse to the new larger HalfKA architecture. I haven’t done any testing beyond making sure the net is the strongest of the training run. Same data as DH 0.3 (less than 1b positions).
https://www.patreon.com/posts/dark-horse-0-3-52384228

Peter.
-
- Posts: 3358
- Joined: Sat Feb 16, 2008 7:38 am
- Full name: Peter Martan
Re: Dark Horse Update
Here's ShashChess 1.7 with Dark Horse 0.3xl.bin, option Tal and MultiPV=4 on Vincent's short set of 114 out of the Hard Talkchess suite.
dkappe wrote: ↑Fri Jun 11, 2021 6:18 pm
I’ve had some requests to update Dark Horse to the new larger HalfKA architecture. I haven’t done any testing beyond making sure the net is the strongest of the training run. Same data as DH 0.3 (less than 1b positions).
https://www.patreon.com/posts/dark-horse-0-3-52384228
http://talkchess.com/forum3/viewtopic.p ... 39#p884039
1 min./position is my personal SMP TC for the 12x3GHz Xeon CPU, which is only capable of SSE4.1 popcnt compiles, 8G hash.
Code: Select all
Solved so far: 81 of 114 ; 52:51m
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20
-------------------------------------------------------------------------------------
0 | 3 15 - - 39 0 1 9 24 - 3 - - - - 8 28 4 2 2
20 | 2 13 24 15 1 - 0 1 36 0 - 17 2 6 41 0 3 - - 2
40 | 3 0 - - 20 49 1 0 0 27 - - - 60 54 10 1 3 10 2
60 | 0 8 14 0 60 2 33 4 60 0 - 12 0 3 26 5 2 0 0 13
80 | - - - 2 - - 4 3 11 24 60 - - - - 46 52 38 0 1
100 | 3 - - 0 4 - - 9 1 - 42 58 - -
81/114 still isn't one of the very best results in my list, but it's better than the one with the default SF net was. ShashChess had its best runs with the smaller nets in my earlier trials, probably due to my not quite up-to-date hardware. As far as I remember (I always store best runs only), 17.1 had at least 10 fewer solutions with the default big net than at single primary variation.
Maybe Vincent could give the new Dark Horse NNUE a try with Honey and/or ShashChess for his sheet at single-core conditions too, regards
Peter.
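For readers who want to script this kind of fixed-time suite run themselves, here is a minimal python-chess sketch. It is not the tool used for the results above; the engine path, the EPD file name and the option names ("EvalFile", "Hash", "Threads") are assumptions, and it only checks the move the engine settles on after the full time budget, so it won't reproduce the per-position solution times shown in the table.
Code: Select all
# Sketch of a fixed-time suite run (1 min/position, MultiPV=4) with
# python-chess. Engine path, EPD file and option names are assumptions;
# this is not the tool used for the results above.
import chess
import chess.engine

def run_suite(engine_path, epd_path, options, movetime=60.0, multipv=4):
    """Return (solved, total) for an EPD suite with 'bm' (best move) operations."""
    solved = total = 0
    engine = chess.engine.SimpleEngine.popen_uci(engine_path)
    try:
        engine.configure(options)
        with open(epd_path) as f:
            for line in f:
                line = line.strip()
                if not line:
                    continue
                board, ops = chess.Board.from_epd(line)  # parses position + "bm"
                if "bm" not in ops:
                    continue
                total += 1
                infos = engine.analyse(board, chess.engine.Limit(time=movetime),
                                       multipv=multipv)
                if infos[0]["pv"][0] in ops["bm"]:       # engine's top choice
                    solved += 1
    finally:
        engine.quit()
    return solved, total

if __name__ == "__main__":
    solved, total = run_suite("./shashchess", "hard_talkchess_114.epd",
                              {"EvalFile": "dark_horse_0.3xl.bin",
                               "Hash": 8192, "Threads": 12})
    print(f"Solved: {solved} of {total}")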
-
- Posts: 3358
- Joined: Sat Feb 16, 2008 7:38 am
- Full name: Peter Martan
Re: Dark Horse Update
And here is the direct comparison with the same version, settings, hardware and TC, the only difference being the new green SF dev NNUE (nn-8e47cf062333.nnue):
peter wrote: ↑Sun Jun 13, 2021 10:08 pm
Here's ShashChess 1.7 with Dark Horse 0.3xl.bin, option Tal and MultiPV=4 on Vincent's short set of 114 out of the Hard Talkchess suite.
http://talkchess.com/forum3/viewtopic.p ... 39#p884039
1 min./position is my personal SMP TC for the 12x3GHz Xeon CPU, which is only capable of SSE4.1 popcnt compiles, 8G hash.
Code: Select all
Solved so far: 81 of 114 ; 52:51m
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20
-------------------------------------------------------------------------------------
0 | 3 15 - - 39 0 1 9 24 - 3 - - - - 8 28 4 2 2
20 | 2 13 24 15 1 - 0 1 36 0 - 17 2 6 41 0 3 - - 2
40 | 3 0 - - 20 49 1 0 0 27 - - - 60 54 10 1 3 10 2
60 | 0 8 14 0 60 2 33 4 60 0 - 12 0 3 26 5 2 0 0 13
80 | - - - 2 - - 4 3 11 24 60 - - - - 46 52 38 0 1
100 | 3 - - 0 4 - - 9 1 - 42 58 - -
81/114 still isn't one of the very best results in my list, but it's better than the one with the default SF net was
Code: Select all
Solved so far: 76 of 114 ; 54:11m
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20
-------------------------------------------------------------------------------------
0 | 1 4 - - - 44 4 - - - - - 21 - - 0 0 9 1 1
20 | - - 42 5 2 - 0 14 33 1 - 15 1 - - 8 0 - - 2
40 | 2 0 13 57 5 - 0 0 0 1 24 - - 15 - 2 0 0 4 1
60 | 0 8 4 0 - 13 - 3 60 0 - 14 0 3 24 4 7 0 0 -
80 | - 7 - 0 - - 6 4 - 4 - 44 60 - 54 3 21 24 54 2
100 | 3 31 50 0 11 - 53 7 - 0 5 - - -
Peter.
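The comparison above keeps engine, settings and TC fixed and changes only the net. In script form that could look like the following sketch, which assumes the hypothetical run_suite() from the earlier sketch was saved as suite_run.py; the engine path and file names are again assumptions.
Code: Select all
# A/B comparison sketch: identical engine, settings and TC, only EvalFile
# differs between runs. Assumes the run_suite() sketch above was saved as
# suite_run.py; paths and file names are assumptions.
from suite_run import run_suite

for net in ("dark_horse_0.3xl.bin", "nn-8e47cf062333.nnue"):
    solved, total = run_suite("./shashchess", "hard_talkchess_114.epd",
                              {"EvalFile": net, "Hash": 8192, "Threads": 12})
    print(f"{net}: {solved}/{total}")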
-
- Posts: 1632
- Joined: Tue Aug 21, 2018 7:52 pm
- Full name: Dietrich Kappe
Re: Dark Horse Update
I've been notified that someone in the SF community is planning on training the Dark Horse XL net with SF data. That's perfectly fine as long as you don't distribute this derivative work. See the license.
Code: Select all
License - Attribution-NoDerivatives 4.0 International (CC BY-ND 4.0) https://creativecommons.org/licenses/by-nd/4.0/legalcode
Fat Titz by Stockfish, the engine with the bodaciously big net. Remember: size matters. If you want to learn more about this engine just google for "Fat Titz".