
Re: Final Release of Ethereal, V12.75

Posted: Thu Nov 12, 2020 9:15 pm
by AndrewGrant
Madeleine Birchfield wrote:
Thu Nov 12, 2020 9:07 pm
MikeB wrote:
Thu Nov 12, 2020 8:49 pm
Madeleine Birchfield wrote:
Thu Nov 12, 2020 8:07 pm
It looks like Andrew Grant might not be done with Ethereal yet:

https://github.com/AndyGrant/EtherealDe ... cffe1887ae
https://github.com/AndyGrant/EtherealDe ... cf421653a2
Once it gets in your blood, it's hard to get out. "Chess programmers never die, they simply fade away."
A few weeks ago, when everybody else was posting about Ethereal's final release, version 12.75, Andrew Grant announced that he was writing his own neural network trainer, and then Halogen 7 came on the scene with an NNUE network trained on his trainer. So I thought, well, Andrew Grant probably doesn't want to get rid of his hard work on Ethereal's handcrafted evaluation function after trumpeting how much it meant to him, so that is the reason this was the final release of Ethereal, and his future would be working on Halogen with Kieren Pearson. But then earlier today he said that he wasn't working on Halogen; he was just testing his trainer with Halogen, which raises the question: why did he write the trainer in the first place? Turns out that in the dev version of Ethereal, he replaced whatever previous code he was using for endgames with an NNUE net, presumably trained using his trainer.
Maybe you never see Ethereal 12.76.
Maybe I make a post titled: "Ethereal 13.00, the second most original NNUE on the scene"
Either way, the NN code stays secret! Stop the Steal! (of NNUE)

Re: Final Release of Ethereal, V12.75

Posted: Thu Nov 12, 2020 10:38 pm
by matejst
I think that building NNs for specialized parts of the evaluation, and combining them with more classical methods, is a very good idea. In my view, that would be something that makes Andrew's work truly unique and different, and it could be the way to the next step in engine development. Anyway, all these critiques finally made me try Ethereal.

Re: Final Release of Ethereal, V12.75

Posted: Thu Nov 12, 2020 10:45 pm
by MikeB
matejst wrote:
Thu Nov 12, 2020 10:38 pm
I think that building NNs for specialized parts of the evaluation, and combining them with more classical methods, is a very good idea. In my view, that would be something that makes Andrew's work truly unique and different, and it could be the way to the next step in engine development. Anyway, all these critiques finally made me try Ethereal.
Andrew has always been one who plows his own path. That is a compliment.

Re: Final Release of Ethereal, V12.75

Posted: Thu Nov 12, 2020 11:00 pm
by matejst
Mike,

I felt that he was evolving in a known paradigm, chasing SF -- similar approach, similar methodology, and, to be honest, I thought it was in part a waste of talent (learning aside). Now, he is on his own path, and with his energy I am sure results could come fast. Anyway, I hope he will continue exploring and improving Ethereal. I guess the eventual aim is to understand and control the knowledge encapsulated in nets, and to make better use of it.

Re: Final Release of Ethereal, V12.75

Posted: Thu Nov 12, 2020 11:33 pm
by AndrewGrant
matejst wrote:
Thu Nov 12, 2020 11:00 pm
Mike,

I felt that he was evolving in a known paradigm, chasing SF -- similar approach, similar methodology, and, to be honest, I thought it was in part a waste of talent (learning aside). Now, he is on his own path, and with his energy I am sure results could come fast. Anyway, I hope he will continue exploring and improving Ethereal. I guess the eventual aim is to understand and control the knowledge encapsulated in nets, and to make better use of it.
As of now, in the shadows, I'm trying to replicate SF's NNUE. I need to prove that I can come up with a paradigm that works, on an architecture that is known to work, before changing the game. In many ways I am chasing Stockfish -- but at least I am always doing it fresh. Maybe I end up with the same NN architecture -- but it's trained on new code, different data collection methods, different optimization algorithms, ....

It's hard to create nuances if you just _start_ with someone else's work verbatim. When you start from the ground up, you are faced with design decisions that are already decided in other code bases.

Re: Final Release of Ethereal, V12.75

Posted: Mon Nov 16, 2020 1:09 pm
by Kieren Pearson
Madeleine Birchfield wrote:
Thu Nov 12, 2020 9:07 pm
MikeB wrote:
Thu Nov 12, 2020 8:49 pm
Madeleine Birchfield wrote:
Thu Nov 12, 2020 8:07 pm
It looks like Andrew Grant might not be done with Ethereal yet:

https://github.com/AndyGrant/EtherealDe ... cffe1887ae
https://github.com/AndyGrant/EtherealDe ... cf421653a2
Once it gets in your blood, it's hard to get out. "Chess programmers never die, they simply fade away."
A few weeks ago, when everybody else was posting about Ethereal's final release, version 12.75, Andrew Grant announced that he was writing his own neural network trainer, and then Halogen 7 came on the scene with an NNUE network trained on his trainer. So I thought, well, Andrew Grant probably doesn't want to get rid of his hard work on Ethereal's handcrafted evaluation function after trumpeting how much it meant to him, so that is the reason this was the final release of Ethereal, and his future would be working on Halogen with Kieren Pearson. But then earlier today he said that he wasn't working on Halogen; he was just testing his trainer with Halogen, which raises the question: why did he write the trainer in the first place? Turns out that in the dev version of Ethereal, he replaced whatever previous code he was using for endgames with an NNUE net, presumably trained using his trainer.
You can see what Andrew's done on Halogen in my GitHub repo. Now that he isn't working on Ethereal, Andrew's current interest is his NN trainer, and he may very well have a trainer that is better than the one SF uses to train its NNUE nets. He sometimes trains Halogen networks to test the trainer's progress. As far as I know he's not really interested in contributing to Halogen directly; it's more of a side effect.

Re: Final Release of Ethereal, V12.75

Posted: Mon Nov 16, 2020 2:20 pm
by Madeleine Birchfield
Kieren Pearson wrote:
Mon Nov 16, 2020 1:09 pm
You can see what Andrew's done on Halogen in my GitHub repo. Now that he isn't working on Ethereal, Andrew's current interest is his NN trainer, and he may very well have a trainer that is better than the one SF uses to train its NNUE nets. He sometimes trains Halogen networks to test the trainer's progress. As far as I know he's not really interested in contributing to Halogen directly; it's more of a side effect.
Andrew seems to have a nntrainer_x256 github repository he hasn't made public yet.

Re: Final Release of Ethereal, V12.75

Posted: Mon Nov 16, 2020 8:37 pm
by AndrewGrant
Madeleine Birchfield wrote:
Mon Nov 16, 2020 2:20 pm
Kieren Pearson wrote:
Mon Nov 16, 2020 1:09 pm
You can see what Andrew's done on Halogen in my GitHub repo. Now that he isn't working on Ethereal, Andrew's current interest is his NN trainer, and he may very well have a trainer that is better than the one SF uses to train its NNUE nets. He sometimes trains Halogen networks to test the trainer's progress. As far as I know he's not really interested in contributing to Halogen directly; it's more of a side effect.
Andrew seems to have a nntrainer_x256 github repository he hasn't made public yet.
That was just a branch to PR the new network. The training code remains in a private repo. A teaser, I guess....

Code: Select all

static const Layer ARCHITECTURE[] = {
    {43210, 256, &activate_relu, &backprop_relu },
    {  512,  32, &activate_relu, &backprop_relu },
    {   32,  32, &activate_relu, &backprop_relu },
    {   32,   1, &activate_null, &backprop_null },
};

// Choose a Loss, LossProp, and NN Architecture

#define LOSS_FUNC     l2_one_neuron_loss
#define LOSSPROP_FUNC l2_one_neuron_lossprop
#define NN_TYPE       HALFKP_RELFACTOR

Re: Final Release of Ethereal, V12.75

Posted: Tue Nov 17, 2020 2:39 am
by Kieren Pearson
Madeleine Birchfield wrote:
Mon Nov 16, 2020 2:20 pm
Kieren Pearson wrote:
Mon Nov 16, 2020 1:09 pm
You can see what Andrew's done on Halogen in my GitHub repo. Now that he isn't working on Ethereal, Andrew's current interest is his NN trainer, and he may very well have a trainer that is better than the one SF uses to train its NNUE nets. He sometimes trains Halogen networks to test the trainer's progress. As far as I know he's not really interested in contributing to Halogen directly; it's more of a side effect.
Andrew seems to have a nntrainer_x256 github repository he hasn't made public yet.
Andrew has a private NN trainer repository; it's not just for the x256 net he made for Halogen the other day. It's actually quite flexible and could be used to train an NNUE or really any architecture of network. Only he and a handful of others (myself included) have access.

Re: Final Release of Ethereal, V12.75

Posted: Tue Dec 08, 2020 6:04 am
by Madeleine Birchfield
Madeleine Birchfield wrote:
Thu Nov 12, 2020 9:07 pm
Turns out that in the dev version of Ethereal, he replaced whatever previous code he was using for endgames with an NNUE net, presumably trained using his trainer.
Looks like the NNUE version of Ethereal is headed off to TCEC Season 20.

https://github.com/AndyGrant/EtherealDev/tree/tcec_nnue
https://github.com/AndyGrant/EtherealDe ... fc608baf9b