I don't think so. Just a few changes in the source and some analysis work. As for the size, no matter how many positions you train on, the size will be the same. We have no proof that they analysed "billions" as they claim.
Modern Times wrote: ↑Tue Feb 09, 2021 4:21 pm I suspect a lot of money was spent on renting hardware to create that double-size network. They have to recover the cost somehow.
Fat Fritz 2
Moderators: hgm, Rebel, chrisw
-
- Posts: 91
- Joined: Thu May 14, 2020 3:34 pm
- Full name: A. B. Gursu
Re: Fat Fritz 2
-
- Posts: 1631
- Joined: Tue Aug 21, 2018 7:52 pm
- Full name: Dietrich Kappe
Re: Fat Fritz 2
I think this is unwarranted skepticism. I've generated ~1b positions at depth 8 with Toga II (an old alpha-beta engine) in a few months using an old Mac mini, and ~1b positions at 800 nodes with a GTX 1060 using Bad Gyal in about the same period of time. Both of those nets play different chess than SF nets according to the various similarity testers.
abgursu wrote: ↑Tue Feb 09, 2021 6:38 pm I don't think so. Just a few changes in the source and some analysis work. As for the size, no matter how many positions you train on, the size will be the same. We have no proof that they analysed "billions" as they claim.
Modern Times wrote: ↑Tue Feb 09, 2021 4:21 pm I suspect a lot of money was spent on renting hardware to create that double-size network. They have to recover the cost somehow.
That's all in my private capacity as a chess engine and net tinkerer. Based on actual experience, I'm confident that a net like FF1 could yield a strong and distinct NNUE with its data. If you spent some real money rather than just using old cast-off machines like me, why not?
Fat Titz by Stockfish, the engine with the bodaciously big net. Remember: size matters. If you want to learn more about this engine just google for "Fat Titz".
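The "similarity testers" mentioned above compare two engines' preferred moves over a fixed set of test positions and report the fraction that agree. A minimal sketch of the idea (the move lists below are made-up stand-ins, not real engine output):

```python
# Toy illustration of how engine "similarity testers" work: run two engines
# over the same set of test positions, record each engine's chosen move,
# and report the fraction of positions where the choices agree.

def similarity(moves_a, moves_b):
    """Fraction of positions where both engines picked the same move."""
    if len(moves_a) != len(moves_b):
        raise ValueError("move lists must cover the same positions")
    matches = sum(a == b for a, b in zip(moves_a, moves_b))
    return matches / len(moves_a)

# Hypothetical best moves from two engines over five test positions
# (UCI notation); in a real tester these come from searching each position.
stockfish_moves = ["e2e4", "g1f3", "d2d4", "f1c4", "e1g1"]
other_net_moves = ["e2e4", "g1f3", "c2c4", "f1b5", "e1g1"]

print(f"similarity: {similarity(stockfish_moves, other_net_moves):.0%}")
```

Real testers run each engine at a fixed node or time budget over thousands of positions; a low agreement percentage is taken as evidence the nets evaluate differently.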
-
- Posts: 1754
- Joined: Tue Apr 19, 2016 6:08 am
- Location: U.S.A
- Full name: Andrew Grant
Re: Fat Fritz 2
A Stockfish clone, likely worse, and likely nothing special in its training. If you want to spend money in the computer chess world, buy a copy of Komodo, so that you are at least contributing to the continued progress of computer chess, not to the pockets of someone who has made no effort or advancement in the field aside from moving up the tax brackets. Perhaps something worth buying will appear soon...
Last edited by AndrewGrant on Tue Feb 09, 2021 8:02 pm, edited 1 time in total.
#WeAreAllDraude #JusticeForDraude #RememberDraude #LeptirBigUltra
"Those who can't do, clone instead" - Eduard ( A real life friend, not this forum's Eduard )
"Those who can't do, clone instead" - Eduard ( A real life friend, not this forum's Eduard )
-
- Posts: 91
- Joined: Thu May 14, 2020 3:34 pm
- Full name: A. B. Gursu
Re: Fat Fritz 2
Oh, it is not about possibility. I meant that there was no need to spend money developing FF2. They didn't write a new engine or anything; they just modified Stockfish and trained a new NNUE. If we have no proof, everything is suspicion. They could train a strong NNUE with a few thousand or a few million positions and advertise it as billions. It makes no difference to people who are familiar with NN engines, but ordinary chess players will think "Whoa, billions? I must buy it. It must be very strong." just because they assume more training positions always give a better result. Millions and thousands are not as impressive as billions.
dkappe wrote: ↑Tue Feb 09, 2021 7:00 pm I think this is unwarranted skepticism. I've generated ~1b positions at depth 8 with Toga II (an old alpha-beta engine) in a few months using an old Mac mini, and ~1b positions at 800 nodes with a GTX 1060 using Bad Gyal in about the same period of time. Both of those nets play different chess than SF nets according to the various similarity testers.
abgursu wrote: ↑Tue Feb 09, 2021 6:38 pm I don't think so. Just a few changes in the source and some analysis work. As for the size, no matter how many positions you train on, the size will be the same. We have no proof that they analysed "billions" as they claim.
Modern Times wrote: ↑Tue Feb 09, 2021 4:21 pm I suspect a lot of money was spent on renting hardware to create that double-size network. They have to recover the cost somehow.
That's all in my private capacity as a chess engine and net tinkerer. Based on actual experience, I'm confident that a net like FF1 could yield a strong and distinct NNUE with its data. If you spent some real money rather than just using old cast-off machines like me, why not?
-
- Posts: 1242
- Joined: Sat Jul 05, 2014 7:54 am
- Location: Southwest USA
Re: Fat Fritz 2
An excellent question....nothing evident on the Chessbase site......."Fat Fritz 2.0 - The new number 1"
https://en.chessbase.com/post/fat-fritz ... w-number-1 99.90 Euros Folks...
Meanwhile these "engines" keep getting larger and larger instead of stronger and smaller....Wow, 40-60 MB for a SF chess engine...what an amazing development....Ceres networks at 800 MB....what's next?
-
- Posts: 1631
- Joined: Tue Aug 21, 2018 7:52 pm
- Full name: Dietrich Kappe
Re: Fat Fritz 2
Again, based on extensive experience training non-SF NNUE nets — Night Nurse, Harmon, Toga, Frosty, Dark Horse — you will get nowhere with less than 500m positions. Ideally you want ~1b positions. With an AB engine, you'll need to do reinforcement learning, so closer to 3b+ positions.
abgursu wrote: ↑Tue Feb 09, 2021 7:53 pm Oh, it is not about possibility. I meant that there was no need to spend money developing FF2. They didn't write a new engine or anything; they just modified Stockfish and trained a new NNUE. If we have no proof, everything is suspicion. They could train a strong NNUE with a few thousand or a few million positions and advertise it as billions. It makes no difference to people who are familiar with NN engines, but ordinary chess players will think "Whoa, billions? I must buy it. It must be very strong." just because they assume more training positions always give a better result. Millions and thousands are not as impressive as billions.
Perhaps you, like me, have done extensive NNUE training from a variety of data sources? I'm curious to hear about your experiences.
Fat Titz by Stockfish, the engine with the bodaciously big net. Remember: size matters. If you want to learn more about this engine just google for "Fat Titz".
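For a sense of scale, a quick back-of-the-envelope calculation shows the sustained generation throughput those position counts imply (assuming "a few months" means roughly 90 days of continuous running; the figures are illustrative only):

```python
# Rough throughput implied by the training-data volumes discussed above:
# positions per second needed to reach 500m, 1b, or 3b positions in ~90 days.

SECONDS_PER_DAY = 86_400
days = 90  # "a few months" of continuous generation, as an assumption

for label, total in [("500m", 500_000_000),
                     ("1b", 1_000_000_000),
                     ("3b", 3_000_000_000)]:
    per_second = total / (days * SECONDS_PER_DAY)
    print(f"{label}: ~{per_second:,.0f} positions/second sustained")
```

Roughly 130 positions per second sustained for the 1b case, which is plausible for shallow fixed-depth searches on modest hardware, consistent with the Mac mini anecdote earlier in the thread.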
-
- Posts: 550
- Joined: Tue Nov 19, 2019 8:48 pm
- Full name: Alayan Feh
Re: Fat Fritz 2
A useless clone, taking advantage of the work of unpaid volunteers who didn't put a price tag on advancing computer chess, and trying to get money from people who don't know better.
Legal but shady.
The scheme only works with deceptive marketing (using SF-dev for their clone, comparing performance to SF12...). It's not an outright scam, but it's not much better.
-
- Posts: 1631
- Joined: Tue Aug 21, 2018 7:52 pm
- Full name: Dietrich Kappe
Re: Fat Fritz 2
I'm sorry you feel my work is a scam.
Alayan wrote: ↑Tue Feb 09, 2021 8:02 pm A useless clone, taking advantage of the work of unpaid volunteers who didn't put a price tag on advancing computer chess, and trying to get money from people who don't know better.
Legal but shady.
The scheme only works with deceptive marketing (using SF-dev for their clone, comparing performance to SF12...). It's not an outright scam, but it's not much better.
Fat Titz by Stockfish, the engine with the bodaciously big net. Remember: size matters. If you want to learn more about this engine just google for "Fat Titz".
-
- Posts: 91
- Joined: Thu May 14, 2020 3:34 pm
- Full name: A. B. Gursu
Re: Fat Fritz 2
I made some attempts at an easier chess-odds net, but nothing as exhaustive as yours. But I realized that with a small amount of accurate data we can do much better than billions of random positions. NNs are complete black boxes. A single bad eval or position can corrupt otherwise perfect data.
dkappe wrote: ↑Tue Feb 09, 2021 7:59 pm Again, based on extensive experience training non-SF NNUE nets — Night Nurse, Harmon, Toga, Frosty, Dark Horse — you will get nowhere with less than 500m positions. Ideally you want ~1b positions. With an AB engine, you'll need to do reinforcement learning, so closer to 3b+ positions.
abgursu wrote: ↑Tue Feb 09, 2021 7:53 pm Oh, it is not about possibility. I meant that there was no need to spend money developing FF2. They didn't write a new engine or anything; they just modified Stockfish and trained a new NNUE. If we have no proof, everything is suspicion. They could train a strong NNUE with a few thousand or a few million positions and advertise it as billions. It makes no difference to people who are familiar with NN engines, but ordinary chess players will think "Whoa, billions? I must buy it. It must be very strong." just because they assume more training positions always give a better result. Millions and thousands are not as impressive as billions.
Perhaps you, like me, have done extensive NNUE training from a variety of data sources? I'm curious to hear about your experiences.
-
- Posts: 91
- Joined: Thu May 14, 2020 3:34 pm
- Full name: A. B. Gursu
Re: Fat Fritz 2
If you're not the one who made FF2, then there is no reason for you to take offense, right?
dkappe wrote: ↑Tue Feb 09, 2021 8:09 pm I'm sorry you feel my work is a scam.
Alayan wrote: ↑Tue Feb 09, 2021 8:02 pm A useless clone, taking advantage of the work of unpaid volunteers who didn't put a price tag on advancing computer chess, and trying to get money from people who don't know better.
Legal but shady.
The scheme only works with deceptive marketing (using SF-dev for their clone, comparing performance to SF12...). It's not an outright scam, but it's not much better.