Is this SF NN almost like 20 MB book?

Discussion of anything and everything relating to chess playing software and machines.

Moderators: hgm, Rebel, chrisw

Jouni
Posts: 3279
Joined: Wed Mar 08, 2006 8:15 pm

Is this SF NN almost like 20 MB book?

Post by Jouni »

I made a short 100-game bullet-level test: SF dev with the Fritz tournament book against SF NNUE with no book at all. NNUE scored 59%, i.e. +63 Elo. Sometimes the NN played 18 theory moves at bullet level - stunning! There is a UCI parameter BookMoves with a default value of 16. Is that a hint?
Jouni
Nay Lin Tun
Posts: 708
Joined: Mon Jan 16, 2012 6:34 am

Re: Is this SF NN almost like 20 MB book?

Post by Nay Lin Tun »

People should start learning how NN works?

Otherwise your question will be laughed at, as if you were asking something like, "Is the earth flat?"
Gian-Carlo Pascutto
Posts: 1243
Joined: Sat Dec 13, 2008 7:00 pm

Re: Is this SF NN almost like 20 MB book?

Post by Gian-Carlo Pascutto »

Nay Lin Tun wrote: Tue Aug 04, 2020 7:37 pm People should start learning how NN works?
If you give an NN a ton of opening positions to learn, it will definitely learn to remember them.

But for NNUE it is more complicated. The output does not contain move recommendations, just the evaluation of the position. I don't think the current learning process tries to match move outcomes, just game outcomes. So there would be no opportunity for such learning. It could remember which book positions are 'bad' and consequently play towards the 'good' ones, though.
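The point above can be made concrete with a toy sketch (this is illustrative Python, not Stockfish code): an eval-only net like NNUE has no policy output, so any "book knowledge" it absorbed can only surface indirectly, through the evaluations of successor positions during search.

```python
def pick_move(children, evaluate):
    """One-ply negamax illustration: choose the move whose resulting
    position the eval net scores best from our point of view. If
    training taught the net that certain book positions are bad, search
    steers away from them, even though the net never outputs a move
    recommendation.

    children: dict mapping move -> resulting position (any hashable key)
    evaluate: eval function, score from the side to move's perspective
    """
    return max(children, key=lambda move: -evaluate(children[move]))

# Toy eval table standing in for the net: position "b" is bad for the
# side to move there, i.e. good for us if we can move into it.
toy_eval = {"a": 0.3, "b": -0.8, "c": 0.1}.get
best = pick_move({"e4": "a", "d4": "b", "c4": "c"}, toy_eval)
# best == "d4": the move leading to the position the opponent dislikes
```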

The whole book or "no books" discussion is silly. If one engine can use 100M of data files, then so should the others. Doesn't matter what is contained in them.
MikeB
Posts: 4889
Joined: Thu Mar 09, 2006 6:34 am
Location: Pen Argyl, Pennsylvania

Re: Is this SF NN almost like 20 MB book?

Post by MikeB »

Jouni wrote: Tue Aug 04, 2020 7:26 pm I made a short 100-game bullet-level test: SF dev with the Fritz tournament book against SF NNUE with no book at all. NNUE scored 59%, i.e. +63 Elo. Sometimes the NN played 18 theory moves at bullet level - stunning! There is a UCI parameter BookMoves with a default value of 16. Is that a hint?
Not at all. It is driven by AI and statistics. It may seem like an opening book at times, but that is not the science behind it. It is more similar to pattern recognition at a very high level, combined with a search function, so certain patterns that might be ignored or pruned away by the classical Stockfish engine are no longer pruned if they look interesting based on statistics from prior games. No different from how baseball teams now position their fielders.

Edit: It is interesting how well the NN plays openings with NO BOOK! Scary - in a good way, of course.
Dann Corbit
Posts: 12537
Joined: Wed Mar 08, 2006 8:57 pm
Location: Redmond, WA USA

Re: Is this SF NN almost like 20 MB book?

Post by Dann Corbit »

I don't think the question "Is this SF NN almost like 20 MB book?" was meant literally.
Jouni was simply noticing that SF NNUE plays openings very well.

IOW, "Look, it plays the openings so well, we can throw away the books." is what I think he was saying.

LC0 also plays openings very well. I suspect that NN approaches work very well for the initial, quiet part of the game.
Taking ideas is not a vice, it is a virtue. We have another word for this. It is called learning.
But sharing ideas is an even greater virtue. We have another word for this. It is called teaching.
corres
Posts: 3657
Joined: Wed Nov 18, 2015 11:41 am
Location: hungary

Re: Is this SF NN almost like 20 MB book?

Post by corres »

Gian-Carlo Pascutto wrote: Tue Aug 04, 2020 7:51 pm ...
If one engine can use 100M of data files, then so should the others. Doesn't matter what is contained in them.
??
Gian-Carlo Pascutto
Posts: 1243
Joined: Sat Dec 13, 2008 7:00 pm

Re: Is this SF NN almost like 20 MB book?

Post by Gian-Carlo Pascutto »

corres wrote: Tue Aug 04, 2020 8:02 pm
Gian-Carlo Pascutto wrote: Tue Aug 04, 2020 7:51 pm ...
If one engine can use 100M of data files, then so should the others. Doesn't matter what is contained in them.
??
Some tournaments disallow "books" but allow "neural networks", even if this distinction does not exist in reality, because you can train a neural network to remember openings.
corres
Posts: 3657
Joined: Wed Nov 18, 2015 11:41 am
Location: hungary

Re: Is this SF NN almost like 20 MB book?

Post by corres »

Gian-Carlo Pascutto wrote: Tue Aug 04, 2020 8:03 pm ...
Some tournaments disallow "books" but allow "neural networks", even if this distinction does not exist in reality, because you can train a neural network to remember openings.
But most chess engines cannot "read" a neural net, so they need a common opening book.
Alayan
Posts: 550
Joined: Tue Nov 19, 2019 8:48 pm
Full name: Alayan Feh

Re: Is this SF NN almost like 20 MB book?

Post by Alayan »

It's theoretically possible to train an NN to learn and match a classical opening book, then have a hybrid engine that uses the NN move/eval as long as the NN claims the position is in book, then switches to something else.

This is just adding a layer of obfuscation and work (retraining the NN once the book gets a significant enough update) to produce the same end result.

Except this obfuscation scheme wouldn't be banned in tournaments that ban classical book.
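The obfuscation scheme described above can be sketched as follows (all names here are hypothetical, not from any real engine): a net trained to reproduce a book reports both a move and an in-book confidence, and the engine trusts it only while the position is claimed to be in book.

```python
def choose_move(position, book_net, search, in_book_threshold=0.9):
    """Hybrid driver: NN acts as an obfuscated opening book.

    book_net: callable returning (move, in_book_probability) - a net
              trained to memorize the book's moves and coverage.
    search:   fallback, the engine's normal search.
    """
    move, in_book_prob = book_net(position)
    if in_book_prob >= in_book_threshold:
        return move          # effectively a book hit, served by the NN
    return search(position)  # out of book: normal search takes over

# Toy stand-ins to show the control flow:
net = lambda pos: ("e4", 0.95) if pos == "startpos" else (None, 0.0)
search = lambda pos: "Nf3"
# choose_move("startpos", net, search)   -> "e4"  (NN "book" hit)
# choose_move("middlegame", net, search) -> "Nf3" (fallback search)
```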

Banning books ensures the focus is on search and evaluation rather than on a book war, and that works well for classical engines. But when mixing in NN engines that are trained to use move suggestions (e.g. the so-called "policy head" of Leela), the distinction gets blurry.
dkappe
Posts: 1631
Joined: Tue Aug 21, 2018 7:52 pm
Full name: Dietrich Kappe

Re: Is this SF NN almost like 20 MB book?

Post by dkappe »

Oh boy. Most NNUE nets are initially trained without regard to game outcome, often at depth 8. Most of the positions they see are 18 to 19 plies into the game or later. They are in essence an approximation of an engine eval at d8.

You can look at my Toga and Night Nurse (based on Bad Gyal) nets, compare them to each other and to the many Stockfish nets, and then explain to me how they are memorizing openings.
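The training setup described above can be sketched like this (details assumed, not taken from any specific trainer): each position is labeled with the score of a shallow search, so the net regresses toward "engine eval at d8" and there is no move target anywhere in the data.

```python
def make_training_pair(position, shallow_search, depth=8):
    """Label a position with a shallow-search score. Note there is no
    move target: the net learns to imitate an evaluation, not to
    memorize which move was played."""
    target_eval = shallow_search(position, depth)
    return (position, target_eval)

def mse_loss(predicted, target):
    """Typical regression loss against the shallow-search score."""
    return (predicted - target) ** 2

# Toy stand-in for a depth-8 search returning a fixed score:
pair = make_training_pair("some_position", lambda pos, d: 0.25)
```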
Fat Titz by Stockfish, the engine with the bodaciously big net. Remember: size matters. If you want to learn more about this engine just google for "Fat Titz".