Minic version 2

Discussion of anything and everything relating to chess playing software and machines.

Moderators: hgm, Rebel, chrisw

User avatar
xr_a_y
Posts: 1871
Joined: Sat Nov 25, 2017 2:28 pm
Location: France

Re: Minic version 2

Post by xr_a_y »

xr_a_y wrote: Sat Aug 08, 2020 1:18 pm Minic 2.47 is released. This is quite a special version, as NNUE (from Stockfish) has been integrated inside.
This release has NNUE technology enabled (via #define WITH_NNUE inside definition.hpp), but it should be considered "MinicNNUE" only if you specify a network file using the NNUEFile GUI option.

MinicNNUE (using nn-97f742aaefcd.nnue) seems to be about +200 Elo versus the standard Minic evaluation.
But as already stated in many places, MinicNNUE won't be the "official" Minic version, because it does not represent my own work at all.
Minic is of course already vastly inspired by other engines, but I try and test everything myself, and those who have followed Minic's development know how long that took... here, however, integrating NNUE was more or less just a copy/paste.
So please do not "categorize" MinicNNUE together with standard Minic (I'm thinking of CCRL, FGRL, and CEGT testing here).

Anyone is of course free to use this Minic version with any network file they want (but please call it MinicNNUE, not Minic, as soon as a net is used). I'd also be glad if someone tried to train a specific net for it...

This experiment showed that a lot of Elo can be gained in Minic by working on its evaluation, and even more by working on search! :D

Using NNUE inside Minic was quite easy (as discussed on the TCEC Discord channel), mainly because the NNUE code itself is very clear; the people who integrated it into Stockfish did a very good job.

Finally, I'd like to make a point about the release of the NNUE technology. I think many engine devs will try (and succeed) to use it in the coming days or weeks, and it would feel more natural to me if NNUE were hosted in a dedicated repository that many engine authors could work with. I think this would benefit everyone in the community, including Stockfish and maybe even shogi engine devs. Maybe some will try to write other architectures, for instance. That would at least be better than a hundred different copy/pasted versions of the code that are hard to track and merge.
Windows users have probably had trouble with this 2.47 release; this is why 2.48 was released just a few hours later, with better portability (for Android and Windows).
dkappe
Posts: 1631
Joined: Tue Aug 21, 2018 7:52 pm
Full name: Dietrich Kappe

Re: Minic version 2

Post by dkappe »

xr_a_y wrote: Sat Aug 08, 2020 1:18 pm

Anyone is of course free to use this Minic version with any network file they want (but please call it MinicNNUE, not Minic, as soon as a net is used). I'd also be glad if someone tried to train a specific net for it...
What did you have in mind for the net?

In the meantime I plugged Night Nurse 0.1 into it. Nice.

[pgn]
[Event "?"]
[Site "?"]
[Date "2020.08.08"]
[Round "1"]
[White "Minic2"]
[Black "MinicNNUE-NiNu-0.1"]
[Result "0-1"]
[ECO "B21"]
[GameDuration "00:03:58"]
[GameEndTime "2020-08-08T15:16:14.505 CDT"]
[GameStartTime "2020-08-08T15:12:16.214 CDT"]
[Opening "Sicilian"]
[PlyCount "148"]
[TimeControl "60+1"]
[Variation "Grand Prix attack"]

1. e4 {book} c5 {book} 2. f4 {book} d5 {book} 3. exd5 {book} Nf6 {book}
4. Bb5+ {+0.54/17 3.1s} Nbd7 {+0.09/17 2.8s} 5. c4 {+0.43/17 3.4s}
a6 {-0.13/18 3.2s} 6. Bxd7+ {+0.22/18 3.9s} Qxd7 {-0.13/17 4.0s}
7. Nf3 {+0.32/19 3.6s} e6 {-0.12/17 2.5s} 8. Ne5 {+0.37/19 3.8s}
Qd6 {-0.13/17 2.7s} 9. Nc3 {+0.65/18 2.2s} exd5 {+0.07/17 3.5s}
10. cxd5 {+0.27/16 2.2s} Be7 {+0.13/17 3.6s} 11. d4 {+0.36/17 3.1s}
O-O {-0.01/18 3.5s} 12. dxc5 {+0.70/17 3.9s} Qxc5 {+0.36/17 3.0s}
13. Qd3 {+0.15/17 3.3s} b5 {+0.43/17 3.1s} 14. Be3 {+0.07/18 1.8s}
Qd6 {+0.48/18 3.1s} 15. Rd1 {+0.24/17 3.0s} Bb7 {+0.24/17 3.0s}
16. Qd4 {-0.01/17 1.9s} b4 {+0.70/16 3.2s} 17. Ne4 {+0.25/19 2.6s}
Nxe4 {+0.36/20 2.8s} 18. Qxe4 {+0.12/20 2.3s} Rad8 {+0.56/18 1.8s}
19. Nc4 {-0.02/20 2.4s} Qd7 {+0.46/18 2.7s} 20. Nb6 {+0.21/22 2.6s}
Bh4+ {+0.54/18 2.0s} 21. g3 {+0.25/22 1.6s} Qh3 {+0.43/18 2.6s}
22. Kf2 {+0.50/17 1.5s} Rfe8 {+0.68/17 2.5s} 23. Qxb4 {+0.44/18 2.4s}
Bf6 {+0.37/15 2.5s} 24. b3 {+0.72/16 1.7s} h5 {+0.49/14 1.5s}
25. Qc4 {+0.63/19 2.0s} h4 {+0.58/16 2.6s} 26. Qf1 {+0.56/17 2.0s}
Qf5 {+0.55/18 2.2s} 27. Qe2 {+0.52/16 1.6s} Rd6 {+0.55/16 1.2s}
28. g4 {-0.21/18 2.1s} Qe4 {+0.43/19 2.1s} 29. Qd3 {-0.22/18 1.6s}
Qb4 {+0.82/19 2.0s} 30. a3 {-0.38/18 1.5s} Rxe3 {+0.77/19 1.2s}
31. Qxe3 {-0.37/23 2.2s} Qxb6 {+0.68/20 1.6s} 32. Qxb6 {-0.33/22 2.1s}
Rxb6 {+0.68/17 0.21s} 33. b4 {-0.18/22 1.3s} Rd6 {+0.64/18 1.2s}
34. Rd2 {-0.25/22 2.0s} g5 {+1.17/18 1.2s} 35. Rhd1 {-0.59/17 1.6s}
gxf4 {+1.49/17 1.7s} 36. Kf3 {-0.83/18 1.3s} Be5 {+1.50/17 1.4s}
37. h3 {-0.89/16 1.7s} Rd8 {+1.65/17 1.5s} 38. a4 {-1.30/17 2.1s}
Rc8 {+1.89/17 1.3s} 39. b5 {-1.44/18 1.4s} axb5 {+2.20/19 1.6s}
40. axb5 {-1.89/19 1.9s} Bd6 {+2.12/19 1.8s} 41. Ke4 {-1.57/20 1.8s}
Re8+ {+2.30/18 1.3s} 42. Kd4 {-1.77/20 1.8s} Re5 {+2.10/19 1.7s}
43. Kc4 {-1.94/19 1.0s} Kg7 {+2.29/19 1.8s} 44. Ra1 {-1.95/19 1.2s}
Re4+ {+2.32/17 1.1s} 45. Rd4 {-2.18/20 1.3s} Re2 {+2.45/19 1.6s}
46. Rf1 {-2.24/19 1.1s} Rc2+ {+2.90/17 1.2s} 47. Kb3 {-2.68/21 1.7s}
Rc5 {+3.43/18 1.0s} 48. Ka4 {-2.92/20 1.0s} Rc3 {+3.43/19 1.7s}
49. Ka5 {-2.78/20 1.7s} f3 {+3.75/18 1.6s} 50. Rdd1 {-2.97/19 1.6s}
Bc5 {+3.59/17 1.7s} 51. b6 {-3.07/19 0.98s} Kf6 {+4.24/17 1.9s}
52. Rc1 {-3.23/20 1.2s} Rxc1 {+5.74/18 1.2s} 53. Rxc1 {-4.41/16 1.4s}
Bd6 {+6.23/18 0.92s} 54. Rc2 {-5.74/18 1.8s} Ke5 {+7.22/18 1.2s}
55. Rf2 {-5.77/17 0.13s} Kf4 {+9.28/16 0.20s} 56. Rc2 {-9.31/19 1.7s}
Kg3 {+9.82/16 0.21s} 57. Rc3 {-13.98/18 1.4s} Kg2 {+13.42/19 1.1s}
58. Rc6 {-16.77/20 1.6s} f2 {+19.41/20 1.5s} 59. Rxd6 {-20.05/19 1.1s}
f1=Q {+21.82/17 0.12s} 60. Kb4 {-22.76/25 1.4s} Qf4+ {+22.74/16 0.23s}
61. Kc5 {-26.57/25 0.55s} Qe3+ {+23.02/17 0.13s} 62. Kb4 {-32.41/26 1.4s}
Kxh3 {+23.06/19 0.15s} 63. g5 {-36.50/24 1.3s} Qxg5 {+33.31/25 1.2s}
64. Kb5 {-33.64/26 0.099s} Qe5 {+38.01/26 1.4s} 65. Rd8 {-33.64/24 0.10s}
Qe2+ {+40.81/16 0.17s} 66. Kc5 {-37.11/29 1.1s} Qe7+ {+46.80/20 0.18s}
67. Rd6 {-42.47/31 1.4s} Kg4 {+53.00/16 0.16s} 68. Kd4 {-99.86/18 0.70s}
Qxd6 {+99.87/14 0.18s} 69. Kd3 {-99.88/19 0.97s} Qxd5+ {+99.89/16 0.32s}
70. Kc3 {-99.90/17 0.13s} h3 {+99.91/21 1.8s} 71. Kb4 {-99.92/24 1.0s}
h2 {+99.93/22 1.6s} 72. Kc3 {-99.94/39 0.78s} h1=Q {+99.95/35 1.1s}
73. Kb4 {-99.96/121 0.31s} Qhd1 {+99.97/121 0.26s} 74. Ka3 {-99.98/121 0.006s}
Q1b3# {+99.99/121 0.009s, Black mates} 0-1

[/pgn]
Fat Titz by Stockfish, the engine with the bodaciously big net. Remember: size matters. If you want to learn more about this engine just google for "Fat Titz".
User avatar
xr_a_y
Posts: 1871
Joined: Sat Nov 25, 2017 2:28 pm
Location: France

Re: Minic version 2

Post by xr_a_y »

dkappe wrote: Sat Aug 08, 2020 10:24 pm
xr_a_y wrote: Sat Aug 08, 2020 1:18 pm

Anyone is of course free to use this Minic version with any network file they want (but please call it MinicNNUE, not Minic, as soon as a net is used). I'd also be glad if someone tried to train a specific net for it...
What did you have in mind for the net?
In the meantime I plugged Night Nurse 0.1 into it. Nice.
Thanks for testing it. I'm a total noob at cooking neural networks :oops:, so I had nothing special in mind. It would be fun to have "personality" networks in various styles.
dkappe
Posts: 1631
Joined: Tue Aug 21, 2018 7:52 pm
Full name: Dietrich Kappe

Re: Minic version 2

Post by dkappe »

xr_a_y wrote: Sat Aug 08, 2020 11:13 pm
dkappe wrote: Sat Aug 08, 2020 10:24 pm
xr_a_y wrote: Sat Aug 08, 2020 1:18 pm

Anyone is of course free to use this Minic version with any network file they want (but please call it MinicNNUE, not Minic, as soon as a net is used). I'd also be glad if someone tried to train a specific net for it...
What did you have in mind for the net?
In the meantime I plugged Night Nurse 0.1 into it. Nice.
Thanks for testing it. I'm a total noob at cooking neural networks :oops:, so I had nothing special in mind. It would be fun to have "personality" networks in various styles.
How about a net based on 30m Minic2 positions at d8 for starters? One can always improve from there.
Fat Titz by Stockfish, the engine with the bodaciously big net. Remember: size matters. If you want to learn more about this engine just google for "Fat Titz".
dkappe
Posts: 1631
Joined: Tue Aug 21, 2018 7:52 pm
Full name: Dietrich Kappe

Re: Minic version 2

Post by dkappe »

One thing that would be handy is a UCI EvalScale factor. Although it doesn’t look like the regular eval and the NNUE eval have a linear relationship, it would be nice to be able to multiply the raw evals by some factor to nudge them into a range that the search expects.
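
For illustration, the idea is something of this shape (a sketch with made-up names, not actual engine code): expose a spin option and apply it to the raw network output.

Code: Select all


// hypothetical option storage; assumes the engine already parses "setoption"
// option name EvalScale type spin default 100 min 10 max 400
static int evalScalePercent = 100; // 100 = leave the NNUE eval unchanged

int scaledNNUEEval(int rawNNUEEval) {
    // multiply the raw network output by the user-chosen factor
    return rawNNUEEval * evalScalePercent / 100;
}
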
Fat Titz by Stockfish, the engine with the bodaciously big net. Remember: size matters. If you want to learn more about this engine just google for "Fat Titz".
dkappe
Posts: 1631
Joined: Tue Aug 21, 2018 7:52 pm
Full name: Dietrich Kappe

Re: Minic version 2

Post by dkappe »

Final score (60+1, Noomen 3, colors reversed, 1 CPU, 6-man EGTB) was

Code: Select all


Score of Minic2 vs MinicNNUE-NiNu-0.1: 13 - 52 - 35  [0.305] 100
Elo difference: -143.07 +/- 57.34

Fat Titz by Stockfish, the engine with the bodaciously big net. Remember: size matters. If you want to learn more about this engine just google for "Fat Titz".
User avatar
xr_a_y
Posts: 1871
Joined: Sat Nov 25, 2017 2:28 pm
Location: France

Re: Minic version 2

Post by xr_a_y »

dkappe wrote: Sat Aug 08, 2020 11:56 pm One thing that would be handy is a UCI EvalScale factor. Although it doesn’t look like the regular eval and the NNUE eval have a linear relationship, it would be nice to be able to multiply the raw evals by some factor to nudge them into a range that the search expects.
I'm not sure I get your point, but I've already tried to merge the classic evaluation and the NNUE evaluation (using game phase, linear combination, remaining material, ...), without any success.
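
By "linear combination" I mean blends of this shape (a sketch with hypothetical helper names, not the exact code that was tried):

Code: Select all


// assumed helpers: gamePhase() in [0,256] (256 = opening),
// classicEval() and nnueEval() returning centipawns for the side to move
int blendedEval(const Position & p) {
    const int phase = gamePhase(p);
    const int eStd  = classicEval(p);
    const int eNN   = nnueEval(p);
    // trust the hand-crafted eval more in the opening,
    // the network more as material comes off
    return (phase * eStd + (256 - phase) * eNN) / 256;
}
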
brianr
Posts: 536
Joined: Thu Mar 09, 2006 3:01 pm

Re: Minic version 2

Post by brianr »

I suspect the factor would be to move the NN evals into the same approximate range as the traditional eval, since many of the search pruning approaches use a tentative eval score, which would not be optimal if the NN eval were out of whack (technical term).
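
To make that concrete, here is a generic reverse-futility-style condition (an illustration, not Minic's actual search). The margin is a constant in centipawns, tuned against the traditional eval; if the NN eval runs, say, 1.7x "hotter", score gaps grow while the margin does not, so the search prunes more aggressively than it was ever tuned to.

Code: Select all


// margin tuned against the traditional eval's centipawn scale
const int futilityMarginPerDepth = 90;

// prune if the static eval is comfortably above beta;
// a rescaled eval silently changes how often this fires
bool canFutilityPrune(int staticEval, int beta, int depth) {
    return staticEval - futilityMarginPerDepth * depth >= beta;
}
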

Perhaps a better option would be to have two search pruning parameter sets, one traditional and another tuned for the NN eval.

However, my hunch is that the hand-crafted evals will soon be left behind and the focus will be on creating nets with more chess-specific info (getting far away from the "zero" approach, of course). I think this is one reason (in addition to being highly optimized) why the current shallow SF-NNUE nets do so well despite being vastly smaller than the Leela nets. Likely something in between will turn out best.
User avatar
xr_a_y
Posts: 1871
Joined: Sat Nov 25, 2017 2:28 pm
Location: France

Re: Minic version 2

Post by xr_a_y »

dkappe wrote: Sat Aug 08, 2020 11:56 pm One thing that would be handy is a UCI EvalScale factor. Although it doesn’t look like the regular eval and the NNUE eval have a linear relationship, it would be nice to be able to multiply the raw evals by some factor to nudge them into a range that the search expects.
OK, I now compute (and apply) a scaling factor when using NNUE. For now 5000 random positions (with a fixed seed) are used, but this is cheap and it can of course be more. I skip positions where EvalStd * EvalNNUE < 0 or |EvalNNUE| > QueenValue.

The factor is quite small in fact: roughly EvalStd ≈ EvalNNUE * 0.58.
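
In code the calibration is roughly this (a simplified sketch; the helper names are invented, and the least-squares fit shown is just one way to obtain the factor):

Code: Select all


#include <cmath>
#include <vector>

// assumed to exist elsewhere: Position, classicEval(), nnueEval(), QueenValue
double computeNNUEScale(const std::vector<Position> & sample) {
    // sample: e.g. 5000 random positions generated with a fixed seed
    double num = 0, den = 0;
    for (const Position & p : sample) {
        const double eStd = classicEval(p);
        const double eNN  = nnueEval(p);
        if (eStd * eNN < 0) continue;             // evals disagree in sign
        if (std::abs(eNN) > QueenValue) continue; // extreme score, skip
        num += eStd * eNN;                        // least-squares fit of
        den += eNN * eNN;                         // EvalStd ~ k * EvalNNUE
    }
    return den > 0 ? num / den : 1.0;             // ~0.58 here
}
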

I'll run a test to see whether applying it gains any strength.

It won't be fun to have to retune search all over again ...
User avatar
CMCanavessi
Posts: 1142
Joined: Thu Dec 28, 2017 4:06 pm
Location: Argentina

Re: Minic version 2

Post by CMCanavessi »

dkappe wrote: Sun Aug 09, 2020 6:50 am Final score (60+1, Noomen 3, colors reversed, 1 CPU, 6-man EGTB) was

Code: Select all


Score of Minic2 vs MinicNNUE-NiNu-0.1: 13 - 52 - 35  [0.305] 100
Elo difference: -143.07 +/- 57.34

Wow....
Follow my tournament and some Leela gauntlets live at http://twitch.tv/ccls