NN Eval functions/Embedding Leela/NNUE prebuilt models into Engine

Discussion of chess software programming and technical issues.

Moderators: hgm, chrisw, Rebel

David Carteau
Posts: 130
Joined: Sat May 24, 2014 9:09 am
Location: France
Full name: David Carteau

Re: NN Eval functions/Embedding Leela/NNUE prebuilt models into Engine

Post by David Carteau »

JacquesRW wrote: Sun Oct 06, 2024 9:24 pm You can see Orion falls into one of the classic beginner traps of immediately trying to use more than one hidden layer. The author may have non-elo driven reasons to do this, but I don't think it is particularly helpful to include it in what looks like a tutorial for a basic NNUE.
Hi Jamie,

I'm not sure that "Orion falls into one of the classic beginner traps", as I was one of the early adopters of NNUE back in 2020 (see this post), and the current NN architecture is just derived from those early experiments.

I'm fascinated by the "compression" of chess knowledge that these NNs allow, and of course I quickly tested smaller networks (as simple as 768x64x1). With v1.0, my goal was to see whether it was possible to use "weak" and "dirty" labels (i.e. only game results) to train a "decent" network in terms of playing strength. Keeping a second hidden layer gave me (slightly) better results, so I've kept that scheme :)
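For readers following along, here is a minimal sketch of the two shapes being compared: a single hidden layer (768x64x1) versus an extra hidden layer as in Orion's 768x32x32x2 scheme. This is an illustrative NumPy forward pass with random weights, not Orion's or Cerebrum's actual code; the 768-feature one-hot encoding (2 colours x 6 piece types x 64 squares) and the clipped-ReLU activation are common NNUE conventions assumed here.

```python
import numpy as np

def crelu(x):
    # Clipped ReLU, the activation commonly used in NNUE networks
    return np.clip(x, 0.0, 1.0)

rng = np.random.default_rng(0)

# 768 binary input features: 2 colours x 6 piece types x 64 squares
features = np.zeros(768)
features[rng.choice(768, size=32, replace=False)] = 1.0  # a fake position

def make(sizes):
    # Random (weight, bias) pairs for each layer; real nets are trained
    return [(rng.standard_normal((o, i)) * 0.01, np.zeros(o))
            for i, o in zip(sizes, sizes[1:])]

def forward(x, layers):
    # Plain feed-forward pass; a real NNUE engine updates the first
    # layer's accumulator incrementally instead of recomputing it.
    for i, (W, b) in enumerate(layers):
        x = W @ x + b
        if i < len(layers) - 1:
            x = crelu(x)
    return x

small = make([768, 64, 1])      # single hidden layer
orion = make([768, 32, 32, 2])  # second hidden layer, as in the 768x32x32x2 scheme

print(forward(features, small).shape)
print(forward(features, orion).shape)
```

The extra layer adds only a 32x32 matrix on top of the dominant 768x32 input layer, which is why its runtime cost is modest even when its Elo gain is small.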

But you're right: if you have good labels for training, it's possible to build smaller and more efficient networks (see the current trend with small language models)!

David
Download the Orion chess engine --- Train your NNUE with the Cerebrum library --- Contribute to the Nostradamus experiment !
David Carteau

Re: NN Eval functions/Embedding Leela/NNUE prebuilt models into Engine

Post by David Carteau »

MichaelL wrote: Fri Oct 04, 2024 11:43 pm
David Carteau wrote: Sun Sep 29, 2024 5:36 pm Hi chesskobra !

My engine Orion uses such a simple NNUE network (768x32x32x2), and I provided training + inference code in the Cerebrum library, which can be freely used in your own experiments !

See this post, my website and my Github repo if you are interested ! I hope it will help you :)
Thanks for posting those links. They look like exactly what I was after. I implemented a basic handcrafted evaluation in order to start development of the search; I seem to have the basic negamax/quiescence working, and I'm now adding what appear to be the standard set of improvements (transposition tables, etc.). My plan is to finish that, implement the UCI interface (which may take some time), and then learn from the basics how NNUE-style neural nets work. Those links are really, really helpful.
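The search structure described above (negamax with alpha-beta pruning, falling into a quiescence search at the leaves) can be sketched roughly as follows. The `Position` class is a hypothetical stand-in for a real move generator, not any engine mentioned in this thread; only the control flow is the point.

```python
# Rough sketch of negamax with alpha-beta and quiescence search.
INF = 10**9

class Position:
    # Hypothetical minimal board interface (a real engine supplies these)
    def legal_moves(self): return []   # all legal moves
    def captures(self): return []      # capturing moves only
    def make(self, move): return self  # position after the move
    def evaluate(self): return 0       # static eval, from side to move's view

def quiescence(pos, alpha, beta):
    # Stand-pat: assume the side to move can at least keep the static score,
    # then search only captures to avoid evaluating unstable positions.
    stand_pat = pos.evaluate()
    if stand_pat >= beta:
        return beta
    alpha = max(alpha, stand_pat)
    for move in pos.captures():
        score = -quiescence(pos.make(move), -beta, -alpha)
        if score >= beta:
            return beta
        alpha = max(alpha, score)
    return alpha

def negamax(pos, depth, alpha, beta):
    if depth == 0:
        return quiescence(pos, alpha, beta)
    best = -INF
    for move in pos.legal_moves():
        score = -negamax(pos.make(move), depth - 1, -beta, -alpha)
        best = max(best, score)
        alpha = max(alpha, score)
        if alpha >= beta:
            break  # beta cutoff
    if best == -INF:
        return pos.evaluate()  # no legal moves: a real engine detects mate/stalemate here
    return best
```

A transposition table slots naturally into `negamax`: probe it before the move loop and store the result (with depth and bound type) after, so repeated positions reached by different move orders are not searched twice.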
Thank you and good luck with your project !