Fat Fritz 2
Moderators: hgm, Rebel, chrisw
-
- Posts: 89
- Joined: Sat Nov 09, 2019 3:24 pm
- Full name: .
Re: Fat Fritz 2
glennsamuel32 wrote: ↑Fri Feb 12, 2021 7:51 pm I'm very glad some have already offered their services to fight this.
Who?
-
- Posts: 319
- Joined: Sat Oct 31, 2020 1:04 am
- Full name: Aleksey Glebov
Re: Fat Fritz 2
So much ado about a silly commercial clone that's even weaker than Stockfish dev. Just spit on it, it's not worth such a long discussion. The only interesting thing here is the prospects for strengthening the networks of FF2 format.
Incredibly fast systems miscount incredibly fast.
-
- Posts: 1631
- Joined: Tue Aug 21, 2018 7:52 pm
- Full name: Dietrich Kappe
Re: Fat Fritz 2
So much hate that people miss important details. The net isn’t just a different size, it’s based on mcts/nn data. Since I’m the only other person to have trained a net on that kind of data, I can promise you that it trains differently than HCE or NNUE data and yields different results. 512x16 isn’t the optimal size for ab data, in my experience. Also, mcts/nn trained nets don’t do well with RL.
Fat Titz by Stockfish, the engine with the bodaciously big net. Remember: size matters. If you want to learn more about this engine just google for "Fat Titz".
-
- Posts: 319
- Joined: Sat Oct 31, 2020 1:04 am
- Full name: Aleksey Glebov
Re: Fat Fritz 2
dkappe wrote: ↑Sat Feb 13, 2021 5:25 am So much hate that people miss important details. The net isn’t just a different size, it’s based on mcts/nn data. Since I’m the only other person to have trained a net on that kind of data, I can promise you that it trains differently than HCE or NNUE data and yields different results. 512x16 isn’t the optimal size for ab data, in my experience. Also, mcts/nn trained nets don’t do well with RL.
Probably you missed the last part of my message. So I also believe that the new net is interesting and promising. But the only honest way to develop such nets is to train them within the SF project as a contribution to SF development (or just to train a net as an end in itself), not to try to make money from it by pretending that the engine FF2 is original and a "new No 1".
Incredibly fast systems miscount incredibly fast.
-
- Posts: 319
- Joined: Sat Oct 31, 2020 1:04 am
- Full name: Aleksey Glebov
Re: Fat Fritz 2
dkappe wrote: ↑Sat Feb 13, 2021 5:25 am So much hate that people miss important details. The net isn’t just a different size, it’s based on mcts/nn data. Since I’m the only other person to have trained a net on that kind of data, I can promise you that it trains differently than HCE or NNUE data and yields different results. 512x16 isn’t the optimal size for ab data, in my experience. Also, mcts/nn trained nets don’t do well with RL.
In other words, I want to say: if they implement the new net in an engine which is actually Stockfish, then it should be called "Stockfish New NN-X" rather than "Fat Fritz 2" (and hence it should be free and open source).
Incredibly fast systems miscount incredibly fast.
-
- Posts: 12541
- Joined: Wed Mar 08, 2006 8:57 pm
- Location: Redmond, WA USA
Re: Fat Fritz 2
Albert Silver is a chess expert. Perhaps not a programming expert (but neither is Larry, who works with Komodo).
That does not diminish their efforts.
I guess that the programming was done by the guy who owns the repository.
There is something novel here (the double sized network).
I think a lot of people who know about chess engine development will be rather put off by the FF2 affair.
Personally, since the source and net have been published, I am somewhat neutral about it.
There has been a request to move this thread to the engine origins forum.
I think technically it should move, but there is a flaming interest in it, so I think the other moderators should also be involved in the decision.
Lots of other people are doing stuff like this (hijacking LC0 and SF and making forks with various degrees of novelty).
I think that if they obey the law then it is OK and people will decide what they want to do about it personally.
Taking ideas is not a vice, it is a virtue. We have another word for this. It is called learning.
But sharing ideas is an even greater virtue. We have another word for this. It is called teaching.
-
- Posts: 1535
- Joined: Sun Oct 25, 2009 2:30 am
Re: Fat Fritz 2
Exactly this. Since it involves SF and mcts/NN training, the way to go is for the Lc0 and SF groups to start some sort of collaboration. Maybe then we'd really get the "best of both worlds".
-
- Posts: 1342
- Joined: Wed Mar 08, 2006 9:41 pm
- Location: Morgantown, WV, USA
Re: Fat Fritz 2
You can; I cloned the code base when this was posted and was able to compile it under Linux. Actually, now that I think about it, you have a point: even if you can compile it, it's useless unless you have the weights file, and since that's not a simple file you can copy over from your paid version, it's still useless.
I bought Fritz 17, so it was fun to compile FF1 and just use the weights file from it. They really should separate it out (licensing issues aside).
-
- Posts: 1342
- Joined: Wed Mar 08, 2006 9:41 pm
- Location: Morgantown, WV, USA
Re: Fat Fritz 2
This reminds me of the days before GPL3 was born, so there may be precedent. When Linux started gaining serious ground and businesses started using it (GPLv2), there was a lot of buzz and there were lawsuits over companies not releasing source code, or releasing only part of it.
My point: under GPLv2, distributing a binary that combined open source and proprietary code was a violation. The only way around that was to put your code in a library and dynamically link it; static or hard linking was a violation.
From my understanding, that was a BIG reason GPLv3 was even born: to address this issue. People wanted nvidia drivers, and companies wanted an easier way to contribute without risking their assets.
On the one hand, this applies to code, not data, so perhaps I'm comparing apples to oranges. If there were a commercial EGTB and someone embedded the 3-4 men tables in the engine binary, that sounds illegal. This is kind of the reverse.
My $0.02: personally, I think it's a money grab; they should have released it as part of a bundle like FF1. I know some people mentioned recovering the cost of the hardware used to make the net, but I can't imagine it being that much. I found a post by the author of the NN talking about making a Google Colab notebook to test training the NN, making it publicly available, and asking for volunteers. That was only 4 months ago.
https://www.kaggle.com/questions-and-answers/193306
Food for thought.