
Re: Fat Fritz 2

Posted: Fri Feb 12, 2021 11:36 pm
by kinderchocolate

Re: Fat Fritz 2

Posted: Sat Feb 13, 2021 2:53 am
by Collingwood
glennsamuel32 wrote: Fri Feb 12, 2021 7:51 pm I'm very glad some have already offered their services to fight this.
Who?

Re: Fat Fritz 2

Posted: Sat Feb 13, 2021 4:37 am
by Angle
So much ado about a silly commercial clone that's even weaker than Stockfish dev. Just spit on it, it's not worth such a long discussion. The only interesting thing here is the prospects for strengthening the networks of FF2 format.

Re: Fat Fritz 2

Posted: Sat Feb 13, 2021 5:25 am
by dkappe
Angle wrote: Sat Feb 13, 2021 4:37 am So much ado about a silly commercial clone that's even weaker than Stockfish dev. Just spit on it, it's not worth such a long discussion. The only interesting thing here is the prospects for strengthening the networks of FF2 format.
So much hate that people miss important details. The net isn’t just a different size, it’s based on mcts/nn data. Since I’m the only other person to have trained a net on that kind of data, I can promise you that it trains differently than HCE or NNUE data and yields different results. 512x16 isn’t the optimal size for ab data, in my experience. Also, mcts/nn trained nets don’t do well with RL.

Re: Fat Fritz 2

Posted: Sat Feb 13, 2021 5:45 am
by Angle
dkappe wrote: Sat Feb 13, 2021 5:25 am So much hate that people miss important details. The net isn’t just a different size, it’s based on mcts/nn data. Since I’m the only other person to have trained a net on that kind of data, I can promise you that it trains differently than HCE or NNUE data and yields different results. 512x16 isn’t the optimal size for ab data, in my experience. Also, mcts/nn trained nets don’t do well with RL.
Probably, you missed the last part of my message:
Angle wrote: Sat Feb 13, 2021 4:37 am The only interesting thing here is the prospects for strengthening the networks of FF2 format.
So I also believe that the new net is interesting and promising. But the only honest way to develop such nets is to train them within the SF project as a contribution to SF development (or just to train a net as an end in itself), not to try to make money from it by pretending that the engine FF2 is original and is a "new No 1".

Re: Fat Fritz 2

Posted: Sat Feb 13, 2021 6:21 am
by Angle
dkappe wrote: Sat Feb 13, 2021 5:25 am So much hate that people miss important details. The net isn’t just a different size, it’s based on mcts/nn data. Since I’m the only other person to have trained a net on that kind of data, I can promise you that it trains differently than HCE or NNUE data and yields different results. 512x16 isn’t the optimal size for ab data, in my experience. Also, mcts/nn trained nets don’t do well with RL.
In other words I want to say: if they implement the new net into an engine which is actually Stockfish, then it should be called "Stockfish New NN-X" rather than "Fat Fritz 2" (and hence it should be free and open source).

Re: Fat Fritz 2

Posted: Sat Feb 13, 2021 8:22 am
by Dann Corbit
Albert Silver is a chess expert. Perhaps not a programming expert (but neither is Larry who works with Komodo).
That does not diminish their efforts.
I guess that the programming was done by the guy who owns the repository.

There is something novel here (the double sized network).

I think a lot of people who know about chess engine development will be rather put off by the FF2 affair.
Personally, since the source and net have been published, I am somewhat neutral about it.

There has been a request to move this thread to the engine origins forum.
I think technically it should move, but there is a flaming interest in it, so I think the other moderators should also be involved in the decision.

Lots of other people are doing stuff like this (hijacking LC0 and SF and making forks with various degrees of novelty).
I think that if they obey the law then it is OK and people will decide what they want to do about it personally.

Re: Fat Fritz 2

Posted: Sat Feb 13, 2021 9:19 am
by Ozymandias
Angle wrote: Sat Feb 13, 2021 4:37 am The only interesting thing here is the prospects for strengthening the networks of FF2 format.
This. Since it involves SF and mcts/NN training, the way to go is for the Lc0 and SF groups to start some sort of collaboration. Maybe then we'd really get the "best of both worlds".

Re: Fat Fritz 2

Posted: Sat Feb 13, 2021 10:35 am
by jshriver
gaard wrote: Wed Feb 10, 2021 2:43 am No. If I were to find a bug in FF2, I should be able to rebuild it with all else being equal, minus the bug, for example.
You can; I cloned the code base when this was posted and was able to compile it under Linux. Actually, now that I think about it, you have a point: even if you can compile it, it's useless unless you have the weights file, and since that's not a file you can simply copy over from your paid version, it's useless.

I bought Fritz 17, so it was fun to compile FF1 and just use the weights file from it. They really should separate it out (licensing issues aside).

Re: Fat Fritz 2

Posted: Sat Feb 13, 2021 10:53 am
by jshriver
This reminds me of the days before GPLv3 was born, so there may be precedent. When Linux started gaining serious ground and businesses started using it (under GPLv2), there was a lot of buzz, and lawsuits over companies not releasing source code, or releasing it but not releasing all of it.

My point: under GPLv2, distributing a binary that combined open-source and proprietary code was a violation. The only way around that was to put your code in a library and dynamically link it; static or hard linking was a violation.

From my understanding, that was a BIG reason why GPLv3 was even born: to address this issue. People wanted nvidia drivers, and to make it easier for companies to contribute without risking their assets.

On the one hand, this applies to code, not data, so perhaps I'm comparing apples to oranges. If there were a commercial EGTB and someone embedded the 3-4 men tables in the engine binary, that sounds illegal. This is kind of the reverse.

My $0.02: personally I think it's a money grab; they should have released it as part of a bundle like FF1. I know some people mentioned recovering the cost of the hardware used to make the net, but I can't imagine it being that much. I found a post by the author of the NN talking about making a Google Colab notebook to test training the NN, making it publicly available and asking for volunteers. That was only 4 months ago.

https://www.kaggle.com/questions-and-answers/193306

Food for thought.