Search found 421 matches

by Rémi Coulom
Fri Aug 14, 2020 5:52 pm
Forum: Computer Chess Club: Programming and Technical Discussions
Topic: Don't understand NNUE
Replies: 14
Views: 5020

Re: Don't understand NNUE

The Chess Programming Wiki has an excellent description:
https://www.chessprogramming.org/Stockf ... _Structure
by Rémi Coulom
Thu Aug 13, 2020 6:53 pm
Forum: Computer Chess Club: Programming and Technical Discussions
Topic: Neural Networks weights type
Replies: 2
Views: 1192

Re: Neural Networks weights type

8-bit precision is often accurate enough, and faster than floating point. The tensor cores of the most recent NVIDIA GPUs can do 4-bit calculation (in addition to 8-bit integer and 16-bit float). The next generation will also allow sparsity, which is another big potential for performance improvemen...
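
As a rough illustration of the kind of integer arithmetic involved (a minimal sketch, not the scheme of any particular engine; quantize_int8 and dot_int8 are made-up names), symmetric int8 quantization of float weights can look like this:

#include <algorithm>
#include <cmath>
#include <cstdint>
#include <vector>

// Minimal sketch of symmetric per-tensor int8 quantization: weights are
// mapped to [-127, 127] with a single scale factor, and the dot product is
// accumulated in 32-bit integers.
struct QuantizedLayer {
    std::vector<int8_t> weights;
    float scale;  // float_weight ~= int8_weight * scale
};

QuantizedLayer quantize_int8(const std::vector<float>& w) {
    float max_abs = 0.0f;
    for (float x : w) max_abs = std::max(max_abs, std::fabs(x));
    QuantizedLayer q;
    q.scale = max_abs > 0.0f ? max_abs / 127.0f : 1.0f;
    q.weights.reserve(w.size());
    for (float x : w)
        q.weights.push_back(static_cast<int8_t>(std::lround(x / q.scale)));
    return q;
}

// Integer dot product; the result is rescaled to float once at the end.
float dot_int8(const QuantizedLayer& q, const std::vector<int8_t>& input,
               float input_scale) {
    int32_t acc = 0;
    for (size_t i = 0; i < input.size(); ++i)
        acc += static_cast<int32_t>(q.weights[i]) * input[i];
    return acc * q.scale * input_scale;
}

A single per-tensor scale keeps the sketch short; a real implementation would typically choose scales per layer and fold the rescaling into the following layer.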
by Rémi Coulom
Wed Nov 20, 2019 9:45 pm
Forum: Computer Chess Club: Programming and Technical Discussions
Topic: UCI Win/Draw/Loss reporting
Replies: 65
Views: 12462

Re: UCI Win/Draw/Loss reporting

You can use the logit of the probability of winning, and multiply it by a constant so that it looks like centipawns. I do this for my program, and it works very well. I use this, which is the same thing, right? https://www.chessprogramming.org/Pawn_Advantage,_Win_Percentage,_and_Elo It's convenient ...
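
A minimal sketch of that conversion, following the formula on the linked Pawn_Advantage,_Win_Percentage,_and_Elo page (the 400 constant comes from expressing that page's pawn-unit formula in centipawns; the function names are illustrative):

#include <cmath>

// The wiki formula relates score and win probability by
//   p = 1 / (1 + 10^(-cp/400)),
// so the inverse (the logit, scaled) is
//   cp = 400 * log10(p / (1 - p)).
double win_prob_to_centipawns(double p) {
    return 400.0 * std::log10(p / (1.0 - p));
}

double centipawns_to_win_prob(double cp) {
    return 1.0 / (1.0 + std::pow(10.0, -cp / 400.0));
}

With these definitions, a reported win probability of 0.75 maps to roughly +191 centipawns.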
by Rémi Coulom
Thu Oct 31, 2019 10:33 pm
Forum: Computer Chess Club: Programming and Technical Discussions
Topic: UCI Win/Draw/Loss reporting
Replies: 65
Views: 12462

Re: UCI Win/Draw/Loss reporting

Especially because engines like LC0 score only in terms of probability: without WDL support, no meaningful eval display is possible. Printing the score in centipawns and expecting the user to know it is meant as a win expectation is a bad hack anyway. The other way around fails also be...
by Rémi Coulom
Mon Oct 07, 2019 12:15 pm
Forum: Computer Chess Club: Programming and Technical Discussions
Topic: A book on machine learning
Replies: 7
Views: 2966

Re: A book on machine learning

I took a quick look at the content, and it seems there is extremely little machine learning in this book, except for a decision tree in Chapter 2. Mostly genetic algorithms and population-based methods. No neural network. But what they describe may be fun programming experiments.
by Rémi Coulom
Thu Oct 03, 2019 12:43 pm
Forum: Computer Chess Club: Programming and Technical Discussions
Topic: trying to understand mcts
Replies: 13
Views: 5129

Re: trying to understand mcts

Zeta v097 and v098 were an attempt to make use of thousands of parallel GPU threads via a parallel Best-First-MiniMax search and classic evaluation. Very interesting, thanks. When using large neural networks, evaluation is so slow that data transfers between CPU and GPU have very little cost. To ...
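
A rough back-of-envelope calculation of that point, using purely illustrative figures (an assumed few-kilobyte position encoding, roughly PCIe 3.0 x16 bandwidth, and an assumed millisecond-scale evaluation for a large network):

#include <cstdio>

// Illustrative assumptions only, not measurements.
int main() {
    const double position_bytes = 8.0 * 1024.0;  // assumed encoding size
    const double pcie_bytes_per_s = 16e9;        // roughly PCIe 3.0 x16
    const double nn_eval_seconds = 1e-3;         // assumed large-net eval time

    double transfer_seconds = position_bytes / pcie_bytes_per_s;
    std::printf("transfer: %.3f us, eval: %.0f us, ratio: %.4f%%\n",
                transfer_seconds * 1e6, nn_eval_seconds * 1e6,
                100.0 * transfer_seconds / nn_eval_seconds);
    // Under these assumptions the transfer is well below 0.1% of the
    // evaluation time, so the copy across the bus is essentially free.
}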
by Rémi Coulom
Thu Oct 03, 2019 9:56 am
Forum: Computer Chess Club: Programming and Technical Discussions
Topic: trying to understand mcts
Replies: 13
Views: 5129

Re: trying to understand mcts

I tried to reuse the current tree for the next move, but in Zeta v097/v098 I had to copy the tree back and forth between CPU and GPU, and the memory got quickly filled, so I kept it as a disabled option. But in theory it should give you a boost. You store the tree in the GPU? I am very surprised. My intui...
by Rémi Coulom
Tue Sep 10, 2019 7:24 pm
Forum: Computer Chess Club: Programming and Technical Discussions
Topic: Search-based opening book
Replies: 17
Views: 7244

Re: Search-based opening book

Hi, I have also been generating an opening book automatically. My approach is simply to use MCTS to grow the book, using a long search to evaluate the leaves, and negamax backup. It is a bit similar to the drop-out expansion approach of the Othello book algorithms, but I like the MCTS approach bette...
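
A highly simplified sketch of one way such a book could be grown (illustrative only, not the poster's actual implementation; BookNode, engine_eval and legal_moves are placeholder names, and the engine search and move generation are stubs): descend to a leaf with a UCT-like rule, evaluate new children with a long engine search, and back the best score up with negamax.

#include <algorithm>
#include <cmath>
#include <limits>
#include <memory>
#include <string>
#include <vector>

// Book tree node: `eval` is a negamax-backed score from the point of view of
// the side to move at this node, `move` is the move that led here.
struct BookNode {
    std::string move;
    double eval = 0.0;
    int visits = 0;
    std::vector<std::unique_ptr<BookNode>> children;
};

// Placeholder: a real book builder would run a long, fixed-time engine search
// of the position reached by `line` and return a side-to-move score.
double engine_eval(const std::vector<std::string>& /*line*/) { return 0.0; }

// Placeholder: a real book builder would generate the legal moves of the
// position reached by `line`.
std::vector<std::string> legal_moves(const std::vector<std::string>& /*line*/) {
    return {};
}

// UCT-like selection: prefer children with good (negated) scores, plus an
// exploration bonus for rarely visited ones.
BookNode* select_child(BookNode& node, double exploration) {
    BookNode* best = nullptr;
    double best_score = -std::numeric_limits<double>::infinity();
    for (auto& child : node.children) {
        double uct = -child->eval +
                     exploration * std::sqrt(std::log(node.visits + 1.0) /
                                             (child->visits + 1.0));
        if (uct > best_score) { best_score = uct; best = child.get(); }
    }
    return best;
}

void grow_book(BookNode& root, int iterations, double exploration = 0.3) {
    for (int i = 0; i < iterations; ++i) {
        // 1. Descend from the root to a leaf.
        std::vector<BookNode*> path{&root};
        std::vector<std::string> line;
        while (!path.back()->children.empty()) {
            BookNode* next = select_child(*path.back(), exploration);
            line.push_back(next->move);
            path.push_back(next);
        }
        // 2. Expand the leaf; evaluate each new child with the engine.
        for (const std::string& m : legal_moves(line)) {
            auto child = std::make_unique<BookNode>();
            child->move = m;
            line.push_back(m);
            child->eval = engine_eval(line);
            child->visits = 1;
            line.pop_back();
            path.back()->children.push_back(std::move(child));
        }
        // 3. Negamax backup: each node on the path takes the best negated
        //    child score (not an average, as plain MCTS would).
        for (auto it = path.rbegin(); it != path.rend(); ++it) {
            BookNode* node = *it;
            node->visits += 1;
            if (node->children.empty()) continue;
            double best = -std::numeric_limits<double>::infinity();
            for (auto& c : node->children) best = std::max(best, -c->eval);
            node->eval = best;
        }
    }
}

The negamax backup, taking the best negated child score rather than an average, is what distinguishes this from plain MCTS averaging and matches the description above.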