Fat Fritz 2


abgursu
Posts: 91
Joined: Thu May 14, 2020 3:34 pm
Full name: A. B. Gursu

Re: Fat Fritz 2

Post by abgursu »

Modern Times wrote: Tue Feb 09, 2021 4:21 pm
Werewolf wrote: Tue Feb 09, 2021 3:49 pm Oh goodness. Do you actually have to buy it for 99 euros?
I suspect a lot of money was spent on renting hardware to create that double-size network. They have to recover the cost somehow.
I don't think so. It's just a few changes to the source and some analysis work. As for the size: no matter how many positions you train on, the network size stays the same. We have no proof that they analysed "billions" of positions as they claim.
dkappe
Posts: 1631
Joined: Tue Aug 21, 2018 7:52 pm
Full name: Dietrich Kappe

Re: Fat Fritz 2

Post by dkappe »

abgursu wrote: Tue Feb 09, 2021 6:38 pm
Modern Times wrote: Tue Feb 09, 2021 4:21 pm
Werewolf wrote: Tue Feb 09, 2021 3:49 pm Oh goodness. Do you actually have to buy it for 99 euros?
I suspect a lot of money was spent on renting hardware to create that double-size network. They have to recover the cost somehow.
I don't think so. It's just a few changes to the source and some analysis work. As for the size: no matter how many positions you train on, the network size stays the same. We have no proof that they analysed "billions" of positions as they claim.
I think this is unwarranted skepticism. I've generated ~1b positions at depth 8 with Toga II (an old letterbox engine) in a few months using an old Mac mini, and ~1b positions at 800 nodes with a GTX 1060 using Bad Gyal in about the same period of time. Both of those nets play different chess from SF nets according to the various similarity testers.

That's all in my private capacity as a chess engine and net tinkerer. Based on actual experience, I'm confident that a net like FF1 would yield a strong and distinct NNUE from its data. If you spent some real money, rather than just using old cast-off machines as I do, why not?
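
For readers who want to see what that kind of fixed-depth generation involves, the following is a minimal sketch using python-chess and a generic UCI engine; the engine path, depth limit, and output format are illustrative assumptions rather than the actual FF1/FF2 pipeline.

    import chess
    import chess.engine

    ENGINE = "./some_uci_engine"   # placeholder path; any UCI engine works
    DEPTH = 8                      # "depth 8" as in the post above

    def generate(out_path: str, games: int = 10) -> None:
        """Self-play at a fixed depth, writing FEN;score lines as training labels."""
        with chess.engine.SimpleEngine.popen_uci(ENGINE) as engine, \
                open(out_path, "w") as out:
            for _ in range(games):
                board = chess.Board()
                while not board.is_game_over(claim_draw=True):
                    info = engine.analyse(board, chess.engine.Limit(depth=DEPTH))
                    pv = info.get("pv")
                    if not pv:                       # engine reported no principal variation
                        break
                    score = info["score"].pov(board.turn).score(mate_score=32000)
                    out.write(f"{board.fen()};{score}\n")
                    board.push(pv[0])                # play the searched best move

    generate("positions.txt")

A real run would use randomized openings, a compact binary format, and roughly tens of millions of games rather than ten, which is where the months of machine time come from.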
Fat Titz by Stockfish, the engine with the bodaciously big net. Remember: size matters. If you want to learn more about this engine just google for "Fat Titz".
AndrewGrant
Posts: 1754
Joined: Tue Apr 19, 2016 6:08 am
Location: U.S.A
Full name: Andrew Grant

Re: Fat Fritz 2

Post by AndrewGrant »

A Stockfish clone, likely worse, and likely nothing special in its training. If you want to spend money in the computer chess world, buy a copy of Komodo, so that you are at least contributing to the continued progression of computer chess rather than to the pockets of someone whose only advancement in the field has been moving up the tax brackets. Perhaps something worth buying will appear soon...
Last edited by AndrewGrant on Tue Feb 09, 2021 8:02 pm, edited 1 time in total.
#WeAreAllDraude #JusticeForDraude #RememberDraude #LeptirBigUltra
"Those who can't do, clone instead" - Eduard ( A real life friend, not this forum's Eduard )
abgursu
Posts: 91
Joined: Thu May 14, 2020 3:34 pm
Full name: A. B. Gursu

Re: Fat Fritz 2

Post by abgursu »

dkappe wrote: Tue Feb 09, 2021 7:00 pm
abgursu wrote: Tue Feb 09, 2021 6:38 pm
Modern Times wrote: Tue Feb 09, 2021 4:21 pm
Werewolf wrote: Tue Feb 09, 2021 3:49 pm Oh goodness. Do you actually have to buy it for 99 euros?
I suspect a lot of money was spent on renting hardware to create that double-size network. They have to recover the cost somehow.
I don't think so. It's just a few changes to the source and some analysis work. As for the size: no matter how many positions you train on, the network size stays the same. We have no proof that they analysed "billions" of positions as they claim.
I think this is unwarranted skepticism. I've generated ~1b positions at depth 8 with Toga II (an old letterbox engine) in a few months using an old Mac mini, and ~1b positions at 800 nodes with a GTX 1060 using Bad Gyal in about the same period of time. Both of those nets play different chess from SF nets according to the various similarity testers.

That's all in my private capacity as a chess engine and net tinkerer. Based on actual experience, I'm confident that a net like FF1 would yield a strong and distinct NNUE from its data. If you spent some real money, rather than just using old cast-off machines as I do, why not?
Oh, it's not about possibility. I meant that there was no need to spend money to develop FF2. They didn't write a new engine or anything; they just changed Stockfish and trained a new NNUE. Without proof, everything is open to suspicion. They could train a strong NNUE with a few thousand or a few million positions and advertise it as billions. It makes no difference to people who are familiar with NN engines, but ordinary chess players will think, "Whoa, billions? I must buy it, it must be very strong," just because they assume that more training positions always mean better results. Millions and thousands are not as impressive as billions.
supersharp77
Posts: 1242
Joined: Sat Jul 05, 2014 7:54 am
Location: Southwest USA

Re: Fat Fritz 2

Post by supersharp77 »

amchess wrote: Tue Feb 09, 2021 5:17 pm what's the github repository link?
An excellent question... nothing evident on the ChessBase site... "Fat Fritz 2.0 - The new number 1"

https://en.chessbase.com/post/fat-fritz ... w-number-1 99.90 Euros Folks...

Meanwhile these "engines" keep getting larger and larger instead of stronger and smaller... Wow, 40-60 MB for an SF chess engine... what an amazing development... Ceres at 800 MB... what's next? :) :wink:
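
As a rough sanity check on those sizes: in a HalfKP-style NNUE, the file is dominated by the int16 feature-transformer weights, so doubling the first hidden layer roughly doubles the file. The widths below (256 for a Stockfish 12-era net, 512 for a "double-size" net) are commonly cited approximations, not figures taken from the FF2 release itself.

    # Back-of-envelope NNUE file size for a HalfKP-style network.
    # 41024 = 64 king squares x 641 piece-square features per perspective (approximate).
    FEATURES = 41024

    def approx_size_mb(hidden1: int) -> float:
        ft_bytes = FEATURES * hidden1 * 2                  # int16 feature-transformer weights
        tail_bytes = 2 * hidden1 * 32 + 32 * 32 + 32       # small tail layers, roughly 1 byte each
        return (ft_bytes + tail_bytes) / 1e6

    print(approx_size_mb(256))   # ~21 MB, about the size of a standard SF 12-era net
    print(approx_size_mb(512))   # ~42 MB, consistent with the "40-60 MB" range above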
dkappe
Posts: 1631
Joined: Tue Aug 21, 2018 7:52 pm
Full name: Dietrich Kappe

Re: Fat Fritz 2

Post by dkappe »

abgursu wrote: Tue Feb 09, 2021 7:53 pm Oh, it's not about possibility. I meant that there was no need to spend money to develop FF2. They didn't write a new engine or anything; they just changed Stockfish and trained a new NNUE. Without proof, everything is open to suspicion. They could train a strong NNUE with a few thousand or a few million positions and advertise it as billions. It makes no difference to people who are familiar with NN engines, but ordinary chess players will think, "Whoa, billions? I must buy it, it must be very strong," just because they assume that more training positions always mean better results. Millions and thousands are not as impressive as billions.
Again, based on extensive experience training non-SF NNUE nets — Night Nurse, Harmon, Toga, Frosty, Dark Horse — you will get nowhere with less than 500m positions. Ideally you want ~1b positions. With an AB engine you'll need to do reinforcement learning, so closer to 3b+ positions.

Perhaps you, like me, have done extensive NNUE training from a variety of data sources? I'm curious to hear about your experiences.
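
To put those counts in perspective, a quick bit of arithmetic (the 90-day window is an assumed figure for illustration, not a measurement from the post above):

    # Sustained labelling rate needed to hit the position counts discussed above.
    def rate_per_second(positions: int, days: int) -> float:
        return positions / (days * 86_400)

    print(rate_per_second(1_000_000_000, 90))   # ~129 positions/s, around the clock
    print(rate_per_second(3_000_000_000, 90))   # ~386 positions/s for the 3b+ AB case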
Fat Titz by Stockfish, the engine with the bodaciously big net. Remember: size matters. If you want to learn more about this engine just google for "Fat Titz".
Alayan
Posts: 550
Joined: Tue Nov 19, 2019 8:48 pm
Full name: Alayan Feh

Re: Fat Fritz 2

Post by Alayan »

A useless clone, taking advantage of the work of unpaid volunteers who never put a price tag on advancing computer chess, and trying to get money from people who don't know better.

Legal, but shady.

The scheme only works with deceptive marketing (using SF-dev for their clone, comparing performance to SF 12...). It's not an outright scam, but it's not much better.
dkappe
Posts: 1631
Joined: Tue Aug 21, 2018 7:52 pm
Full name: Dietrich Kappe

Re: Fat Fritz 2

Post by dkappe »

Alayan wrote: Tue Feb 09, 2021 8:02 pm A useless clone, taking advantage of the work of unpaid volunteers who never put a price tag on advancing computer chess, and trying to get money from people who don't know better.

Legal, but shady.

The scheme only works with deceptive marketing (using SF-dev for their clone, comparing performance to SF 12...). It's not an outright scam, but it's not much better.
I’m sorry you feel my work is a scam.
Fat Titz by Stockfish, the engine with the bodaciously big net. Remember: size matters. If you want to learn more about this engine just google for "Fat Titz".
abgursu
Posts: 91
Joined: Thu May 14, 2020 3:34 pm
Full name: A. B. Gursu

Re: Fat Fritz 2

Post by abgursu »

dkappe wrote: Tue Feb 09, 2021 7:59 pm
abgursu wrote: Tue Feb 09, 2021 7:53 pm Oh, it's not about possibility. I meant that there was no need to spend money to develop FF2. They didn't write a new engine or anything; they just changed Stockfish and trained a new NNUE. Without proof, everything is open to suspicion. They could train a strong NNUE with a few thousand or a few million positions and advertise it as billions. It makes no difference to people who are familiar with NN engines, but ordinary chess players will think, "Whoa, billions? I must buy it, it must be very strong," just because they assume that more training positions always mean better results. Millions and thousands are not as impressive as billions.
Again, based on extensive experience training non-SF NNUE nets — Night Nurse, Harmon, Toga, Frosty, Dark Horse — you will get nowhere with less than 500m positions. Ideally you want ~1b positions. With an AB engine you'll need to do reinforcement learning, so closer to 3b+ positions.

Perhaps you, like me, have done extensive NNUE training from a variety of data sources? I'm curious to hear about your experiences.
I made some attempts for an easier project (chess odds), but nothing as exhaustive as yours. Still, I found that with a small amount of accurate data you can do much better than with billions of random positions. NNs are total black boxes: a single bad eval or a single bad position can keep the whole dataset from being ideal.
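
For what "small but clean" can mean in practice, here is a hypothetical filter of the sort often applied when preparing NNUE training data, dropping in-check positions, positions whose best move is a capture, and lopsided scores; the thresholds and record format are assumptions, not anyone's actual pipeline.

    import chess

    MAX_ABS_CP = 1500   # assumed cutoff: discard nearly-decided positions

    def keep(fen: str, score_cp: int, best_move_uci: str) -> bool:
        """Return True if a (position, score, best move) record looks quiet enough to train on."""
        board = chess.Board(fen)
        move = chess.Move.from_uci(best_move_uci)
        if board.is_check():            # tactical positions add label noise
            return False
        if board.is_capture(move):      # best move is a capture: not a quiet eval
            return False
        return abs(score_cp) <= MAX_ABS_CP

    print(keep(chess.STARTING_FEN, 20, "e2e4"))   # True: quiet, balanced position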
abgursu
Posts: 91
Joined: Thu May 14, 2020 3:34 pm
Full name: A. B. Gursu

Re: Fat Fritz 2

Post by abgursu »

dkappe wrote: Tue Feb 09, 2021 8:09 pm
Alayan wrote: Tue Feb 09, 2021 8:02 pm A useless clone, taking advantage of the work of unpaid volunteers who never put a price tag on advancing computer chess, and trying to get money from people who don't know better.

Legal, but shady.

The scheme only works with deceptive marketing (using SF-dev for their clone, comparing performance to SF 12...). It's not an outright scam, but it's not much better.
I’m sorry you feel my work is a scam.
If you're not the one who made FF2, then there's no reason for you to take it personally, right?