NN Eval functions/Embedding Leela/NNUE prebuilt models into Engine

Discussion of chess software programming and technical issues.

Moderators: hgm, Rebel, chrisw

David Carteau
Posts: 127
Joined: Sat May 24, 2014 9:09 am
Location: France
Full name: David Carteau

Re: NN Eval functions/Embedding Leela/NNUE prebuilt models into Engine

Post by David Carteau »

chesskobra wrote: Sat Sep 28, 2024 12:39 pm Sorry if I am asking a question that was answered in this or another similar thread. But it was suggested that there are engines that use a very simple architecture like an array of 768 bits for input position and a small hidden layer of 16 neurons. Here viewtopic.php?p=957361#p957361 it was explained by lithander in some detail.

What are some engines that implement a simple network like that? I would like to look at some small engines, preferably for CPU, with easy to understand C or C++ code. I am not interested in creating my own engine. But I would like to create a simple network, plug it into an existing engine and see what happens. Are there any scripts that I can use for data generation and training such a network?
Hi chesskobra!

My engine Orion uses such a simple NNUE network (768x32x32x2), and I provided training + inference code in the Cerebrum library, which can be freely used in your own experiments!

See this post, my website and my GitHub repo if you are interested! I hope it will help you :)
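For readers wondering what a 768-feature input looks like in practice, here is a minimal, hypothetical sketch of the usual encoding (12 piece types × 64 squares, one feature per piece on a square). The indexing convention below is an assumption for illustration, not necessarily the exact one Orion/Cerebrum uses:

```python
# Hypothetical sketch of a 768-feature NNUE-style input encoding
# (12 piece types x 64 squares). Illustrative only; the indexing
# convention is an assumption, not the actual Cerebrum/Orion code.

def feature_index(piece_type, is_white, square):
    """piece_type: 0=pawn .. 5=king; square: 0..63 (a1 = 0)."""
    color_offset = 0 if is_white else 6
    return (piece_type + color_offset) * 64 + square

def encode(pieces):
    """pieces: iterable of (piece_type, is_white, square) tuples."""
    x = [0.0] * 768
    for piece_type, is_white, square in pieces:
        x[feature_index(piece_type, is_white, square)] = 1.0
    return x

# A position's input vector is extremely sparse: the starting
# position activates only 32 of the 768 features.
white_pawn_on_e2 = (0, True, 12)  # e2 = square 12 when a1 = 0
x = encode([white_pawn_on_e2])
assert sum(x) == 1.0
```

The sparsity is what makes the NNUE trick work: only the few active features contribute to the first layer, so it can be updated incrementally as pieces move.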
chesskobra
Posts: 254
Joined: Thu Jul 21, 2022 12:30 am
Full name: Chesskobra

Re: NN Eval functions/Embedding Leela/NNUE prebuilt models into Engine

Post by chesskobra »

@ David Carteau Thanks a lot. I will take a look at it in detail. Is this Windows only?
David Carteau
Posts: 127
Joined: Sat May 24, 2014 9:09 am
Location: France
Full name: David Carteau

Re: NN Eval functions/Embedding Leela/NNUE prebuilt models into Engine

Post by David Carteau »

chesskobra wrote: Sun Sep 29, 2024 5:49 pm @ David Carteau Thanks a lot. I will take a look at it in detail. Is this Windows only?
The engine itself, Orion, yes (note that it works well on other platforms using Wine). But the training and inference code of the Cerebrum library can be used on any platform (as far as training is concerned, you'll just have to adapt the .bat files, which basically call Python scripts, to your preferred shell environment)!
chesskobra
Posts: 254
Joined: Thu Jul 21, 2022 12:30 am
Full name: Chesskobra

Re: NN Eval functions/Embedding Leela/NNUE prebuilt models into Engine

Post by chesskobra »

Thanks. I looked at a .bat file, and it is only a listing of Python commands, so I think it will work without a problem. Can almost any NNUE engine be compiled with any NNUE network such as yours? Or does it have to be an engine that specifically supports a 768x32x32->2 network?
David Carteau
Posts: 127
Joined: Sat May 24, 2014 9:09 am
Location: France
Full name: David Carteau

Re: NN Eval functions/Embedding Leela/NNUE prebuilt models into Engine

Post by David Carteau »

chesskobra wrote: Mon Sep 30, 2024 10:27 am Thanks. I looked at a .bat file, and it is only a listing of Python commands, so I think it will work without a problem. Can almost any NNUE engine be compiled with any NNUE network such as yours? Or does it have to be an engine that specifically supports a 768x32x32->2 network?
Using the library requires at least that the target engine uses the same piece representation (see the inference code, cerebrum.c lines 30-65).

The easiest way to understand the inference part is to have a look at the provided "Cerebrum engine" if you are familiar with Python. It's a basic example of how NNUE works when used in an engine (direct access).
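For a feel of the inference side before reading cerebrum.c, here is a hypothetical pure-Python sketch of a small fully connected forward pass with clipped-ReLU activations, in the spirit of a 768x32x32x2 network. The real library's integer quantization and incremental accumulator updates are omitted, and the weights below are dummies:

```python
# Hypothetical sketch of an NNUE-style forward pass with
# clipped-ReLU activations. Dummy weights; the real Cerebrum
# inference uses quantized integers and incremental accumulator
# updates, omitted here for clarity.

def clipped_relu(v):
    # Activations are clamped to [0, 1], as is typical for NNUE.
    return [min(max(x, 0.0), 1.0) for x in v]

def linear(weights, biases, inputs):
    # weights: list of rows, one row of input weights per output neuron
    return [b + sum(w * x for w, x in zip(row, inputs))
            for row, b in zip(weights, biases)]

def forward(layers, x):
    """layers: list of (weights, biases) pairs; last layer stays linear."""
    for weights, biases in layers[:-1]:
        x = clipped_relu(linear(weights, biases, x))
    weights, biases = layers[-1]
    return linear(weights, biases, x)

# Tiny toy network, 4 -> 3 -> 1 (a real one would be 768 -> 32 -> ...):
layers = [
    ([[0.5, -0.2, 0.1, 0.0],
      [0.1,  0.3, 0.0, 0.2],
      [0.0,  0.0, 0.4, 0.1]], [0.0, 0.1, -0.1]),
    ([[1.0, -1.0, 0.5]], [0.0]),
]
score = forward(layers, [1.0, 0.0, 1.0, 0.0])
```

The final linear output plays the role of the evaluation score; engines typically scale it to centipawns.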
JVMerlino
Posts: 1376
Joined: Wed Mar 08, 2006 10:15 pm
Location: San Francisco, California

Re: NN Eval functions/Embedding Leela/NNUE prebuilt models into Engine

Post by JVMerlino »

David Carteau wrote: Sun Sep 29, 2024 5:36 pm Hi chesskobra!

My engine Orion uses such a simple NNUE network (768x32x32x2), and I provided training + inference code in the Cerebrum library, which can be freely used in your own experiments!

See this post, my website and my GitHub repo if you are interested! I hope it will help you :)
David,

Thank you so much for making this for the community. I've successfully gone through all of the training, and it took about 36 hours. About 32 hours of that was CPU time, and the rest was me trying to install and figure out Python, which I had absolutely no prior experience with. :)

I've tried including your inference code in my engine (Myrddin) and I just now realized that you use Pawn=0, Knight=1, etc... whereas I use King=0, Queen=1, etc. Rather than making a lot of changes in my engine, after looking through your Python code, am I correct that if I just change lines 118 and 119 in train.py from...
whites = ["P", "N", "B", "R", "Q", "K"]
blacks = ["p", "n", "b", "r", "q", "k"]
to...
whites = ["K", "Q", "R", "B", "N", "P"]
blacks = ["k", "q", "r", "b", "n", "p"]
...that it will work for my engine?

Also, I was very concerned that a few of the steps had hung or crashed, given that there is very little feedback to show progress, particularly during the shuffle/split process. Since I used a PGN file containing about 1.9M games, the resulting "positions.txt" file contained over 170M positions. Apparently it took over 24 hours just for that process. If you intend to enhance your code, perhaps some indication of progress would be helpful. :)

Thank you so much again! :D
jm
David Carteau
Posts: 127
Joined: Sat May 24, 2014 9:09 am
Location: France
Full name: David Carteau

Re: NN Eval functions/Embedding Leela/NNUE prebuilt models into Engine

Post by David Carteau »

JVMerlino wrote: Thu Oct 03, 2024 1:09 am I've tried including your inference code in my engine (Myrddin) and I just now realized that you use Pawn=0, Knight=1, etc... whereas I use King=0, Queen=1, etc. Rather than making a lot of changes in my engine, after looking through your Python code, am I correct that if I just change lines 118 and 119 in train.py from...
whites = ["P", "N", "B", "R", "Q", "K"]
blacks = ["p", "n", "b", "r", "q", "k"]
to...
whites = ["K", "Q", "R", "B", "N", "P"]
blacks = ["k", "q", "r", "b", "n", "p"]
...that it will work for my engine?
Yes, it should work. Make sure you delete the "data/" folder before starting a new training session (this folder should contain a bunch of .pickle files).

You can also try to replace "piece_type" with something like "5 - piece_type" in the inference code with the neural network you already trained (but this is a bit more risky :)).

When you are done with one solution or the other, I recommend comparing the results of your engine with those of the provided "Cerebrum engine", just to be sure that everything is ok ;)
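Since the two piece orderings involved here are exact reverses of each other, the "5 - piece_type" remap can be sanity-checked in a few lines. This is a hypothetical sketch assuming exactly these two orderings and no other encoding differences between the engines:

```python
# Hypothetical sketch of the piece-type remap discussed above:
# a Myrddin-style ordering (King=0, Queen=1, ..., Pawn=5) versus a
# Cerebrum-style ordering (Pawn=0, Knight=1, ..., King=5). Because
# the orderings are exact reverses, 5 - piece_type converts in
# either direction.

MYRDDIN = ["K", "Q", "R", "B", "N", "P"]   # King=0 .. Pawn=5
CEREBRUM = ["P", "N", "B", "R", "Q", "K"]  # Pawn=0 .. King=5

def to_cerebrum(piece_type):
    return 5 - piece_type

# Every Myrddin index must map to the same piece letter in the
# Cerebrum ordering.
for i, letter in enumerate(MYRDDIN):
    assert CEREBRUM[to_cerebrum(i)] == letter
```

If the orderings were not exact reverses (e.g. only two pieces swapped), a lookup table would be needed instead of the arithmetic remap.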
JVMerlino wrote: Thu Oct 03, 2024 1:09 am Also, I was very concerned that a few of the steps had hung or crashed, given that there is very little feedback to show progress, particularly during the shuffle/split process. Since I used a PGN file containing about 1.9M games, the resulting "positions.txt" file contained over 170M positions. Apparently it took over 24 hours just for that process. If you intend to enhance your code, perhaps some indication of progress would be helpful. :)
Yes, you're right, there's definitely something to improve here. I'll work on it!
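As an illustration of the kind of feedback being asked for, a generic sketch (not the actual Cerebrum code) of periodic progress reporting while streaming a large file might look like this:

```python
# Generic sketch of periodic progress reporting while processing
# a large text source line by line. Not taken from the Cerebrum
# scripts; function and parameter names are illustrative.

import sys

def process_lines(lines, handle, report_every=1_000_000):
    """Apply handle() to each line, printing progress periodically."""
    count = 0
    for line in lines:
        handle(line)
        count += 1
        if count % report_every == 0:
            # Progress goes to stderr so stdout stays clean for data.
            print(f"{count:,} lines processed...", file=sys.stderr)
    return count

# Example usage with an in-memory iterable standing in for a file:
seen = []
total = process_lines(["pos1", "pos2", "pos3"], seen.append,
                      report_every=2)
```

For a 170M-position file, a report every million lines would give a couple of hundred updates over the run, which is usually enough to tell a slow step from a hung one.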
MichaelL
Posts: 4
Joined: Sun Sep 22, 2024 9:51 pm
Full name: Michael Lewis

Re: NN Eval functions/Embedding Leela/NNUE prebuilt models into Engine

Post by MichaelL »

David Carteau wrote: Sun Sep 29, 2024 5:36 pm Hi chesskobra!

My engine Orion uses such a simple NNUE network (768x32x32x2), and I provided training + inference code in the Cerebrum library, which can be freely used in your own experiments!

See this post, my website and my GitHub repo if you are interested! I hope it will help you :)
Thanks for posting those links. They look like exactly what I was after. I implemented a basic handcrafted evaluation in order to start development of search; I seem to have basic negamax/quiescence done, and I'm now adding what appears to be the standard set of improvements (transposition tables, etc.). My plan is to finish that, implement the UCI interface (which may take some time), and then learn from the basics how NNUE-style neural nets work. Those links are really, really helpful.
JVMerlino
Posts: 1376
Joined: Wed Mar 08, 2006 10:15 pm
Location: San Francisco, California

Re: NN Eval functions/Embedding Leela/NNUE prebuilt models into Engine

Post by JVMerlino »

MichaelL wrote: Fri Oct 04, 2024 11:43 pm
David Carteau wrote: Sun Sep 29, 2024 5:36 pm Hi chesskobra!

My engine Orion uses such a simple NNUE network (768x32x32x2), and I provided training + inference code in the Cerebrum library, which can be freely used in your own experiments!

See this post, my website and my GitHub repo if you are interested! I hope it will help you :)
Thanks for posting those links. They look like exactly what I was after. I implemented a basic handcrafted evaluation in order to start development of search; I seem to have basic negamax/quiescence done, and I'm now adding what appears to be the standard set of improvements (transposition tables, etc.). My plan is to finish that, implement the UCI interface (which may take some time), and then learn from the basics how NNUE-style neural nets work. Those links are really, really helpful.
They are indeed, and David has also been very helpful! In about three days' work (most of which was CPU time, but the rest was me fumbling around while working in Python for the first time, not being a very good programmer, and not always following David's instructions correctly), I got his NN code implemented in my engine and created my own network. Self-play shows an increase of about 105 Elo. Definitely worth the effort!
JacquesRW
Posts: 103
Joined: Sat Jul 30, 2022 12:12 pm
Full name: Jamie Whiting

Re: NN Eval functions/Embedding Leela/NNUE prebuilt models into Engine

Post by JacquesRW »

This thread has partly inspired me to make a long-overdue improvement to some aspects of bullet's documentation:
https://github.com/jw1912/bullet/blob/m ... -basics.md

There is a pretty brief overview of NNUE there, but more important, I think, is this section:
https://github.com/jw1912/bullet/blob/m ... nner-traps

It's been a steadily growing sentiment among engine devs (on Discord) that most of the popular resources for learning NNUE (e.g. nnue.md) are outdated and can be extremely misleading w.r.t. what is actually a good place to start and what a reasonable progression of features/compute investment is. It is rather common to see beginners asking about HalfKP in the SF Discord server, for example.

You can see that Orion falls into one of the classic beginner traps of immediately trying to use more than one hidden layer. The author may have non-Elo-driven reasons to do this, but I don't think it is particularly helpful to include it in what looks like a tutorial for a basic NNUE.