Ethereal Tuning - Data Dump

Discussion of chess software programming and technical issues.

Moderators: hgm, Rebel, chrisw

AndrewGrant
Posts: 1750
Joined: Tue Apr 19, 2016 6:08 am
Location: U.S.A
Full name: Andrew Grant

Ethereal Tuning - Data Dump

Post by AndrewGrant »

Tuning paper has been pushed to GitHub (still a draft, but I'm not finishing it)
https://github.com/AndyGrant/Ethereal/b ... Tuning.pdf

3x Sets of ~10M positions of <fen> <result>




1x Set of ~12.5M positions from FRC of <fen> <result>


Code to extract positions from PGNs for building books


Not sure how long any of these links will stay alive. About a dozen authors have used parts of the datasets I've just shared and found massive gains along the way. I'm confident that everyone can gain Elo from them. Even Stockfish could gain Elo if my paper were implemented and the corresponding data generated.
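
For anyone loading the dumps, here is a minimal parsing sketch in Python. It assumes each line is a FEN followed by a single numeric result token, possibly wrapped in square brackets as noted later in the thread; the filename is a placeholder.

Code: Select all

# Sketch: load "<fen> <result>" lines (assumed layout; result token may be bracketed, e.g. "[0.5]").
def load_positions(path):
    positions = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            fen, result = line.rsplit(maxsplit=1)        # result is the last whitespace-separated token
            positions.append((fen, float(result.strip("[]"))))
    return positions

# Hypothetical usage:
# data = load_positions("ethereal_set1.epd")
# print(len(data), data[0])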
#WeAreAllDraude #JusticeForDraude #RememberDraude #LeptirBigUltra
"Those who can't do, clone instead" - Eduard ( A real life friend, not this forum's Eduard )
brianr
Posts: 536
Joined: Thu Mar 09, 2006 3:01 pm

Re: Ethereal Tuning - Data Dump

Post by brianr »

Thank you for sharing your work yet again.
Joost Buijs
Posts: 1563
Joined: Thu Jul 16, 2009 10:47 am
Location: Almere, The Netherlands

Re: Ethereal Tuning - Data Dump

Post by Joost Buijs »

Thanks for sharing this!
Joerg Oster
Posts: 937
Joined: Fri Mar 10, 2006 4:29 pm
Location: Germany

Re: Ethereal Tuning - Data Dump

Post by Joerg Oster »

Thank you!
Jörg Oster
jdart
Posts: 4366
Joined: Fri Mar 10, 2006 5:23 am
Location: http://www.arasanchess.org

Re: Ethereal Tuning - Data Dump

Post by jdart »

Might want to put these into a torrent, or onto something like Prontoshare (https://www.prontoshare.com).
chrisw
Posts: 4313
Joined: Tue Apr 03, 2012 4:28 pm

Re: Ethereal Tuning - Data Dump

Post by chrisw »

AndrewGrant wrote: Sat Oct 10, 2020 11:36 am Tuning paper has been pushed to GitHub (still a draft, but I'm not finishing it)
https://github.com/AndyGrant/Ethereal/b ... Tuning.pdf

3x Sets of ~10M positions of <fen> <result>




1x Set of ~12.5M positions from FRC of <fen> <result>


Code to extract positions from PGNs for building books


Not sure how long any of these links will stay alive. About a dozen authors have used parts of the datasets I've just shared and found massive gains along the way. I'm confident that everyone can gain Elo from them. Even Stockfish could gain Elo if my paper were implemented and the corresponding data generated.
I did a very quick and dirty re-tune using this set (minus the FRC) with the game result as the target (as opposed to my long-standing prior set, CCRL blitz + standard, 31M positions, using a 50:50 blend of fast SF11 eval and game result) and got +80 Elo. It looks like game quality matters.
However, I went on the nightmare server, played a few games against Rofchade, which I think is almost certainly NNUE, and got completely trashed. I watched the games carefully with my chess-player eye, and in each case it was very clear that the trashing was a positional crush: NNUE just gradually and methodically improves its position (despite a far lower nps), no flashy tactics, just very strong positional play from the evaluation function. Asphyxiation, consistently, and reflected in the PV score.
This is going to be very difficult to compete with, but I'm still working on it.
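
To make the target swap concrete, here is a Texel-style sketch of the two objectives (pure game result vs. a 50:50 blend of a reference eval and the result). K, lam, and the function names are placeholders, not the actual tuning code.

Code: Select all

# Sketch of the two tuning targets described above (all names and K are assumed placeholders).
# lam = 0.0  -> pure game-result target
# lam = 0.5  -> 50:50 blend of a fast reference eval and the game result
K = 1.0 / 400.0                      # assumed sigmoid scaling constant, tuned per engine

def sigmoid(cp):
    # map a centipawn score to an expected score in [0, 1]
    return 1.0 / (1.0 + 10.0 ** (-K * cp))

def target(result, ref_eval_cp, lam):
    return lam * sigmoid(ref_eval_cp) + (1.0 - lam) * result

def mse(params, data, lam, evaluate):
    # data: iterable of (fen, result, ref_eval_cp); evaluate(params, fen) -> centipawns
    total, n = 0.0, 0
    for fen, result, ref_eval_cp in data:
        total += (target(result, ref_eval_cp, lam) - sigmoid(evaluate(params, fen))) ** 2
        n += 1
    return total / n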
Joost Buijs
Posts: 1563
Joined: Thu Jul 16, 2009 10:47 am
Location: Almere, The Netherlands

Re: Ethereal Tuning - Data Dump

Post by Joost Buijs »

chrisw wrote: Fri Oct 16, 2020 9:49 am
AndrewGrant wrote: Sat Oct 10, 2020 11:36 am Tuning paper has been pushed to GitHub (still a draft, but I'm not finishing it)
https://github.com/AndyGrant/Ethereal/b ... Tuning.pdf

3x Sets of ~10M positions of <fen> <result>




1x Set of ~12.5M positions from FRC of <fen> <result>


Code to extract positions from PGNs for building books


Not sure how long any of these links will stay alive. About a dozen authors have used parts of the datasets I've just shared and found massive gains along the way. I'm confident that everyone can gain Elo from them. Even Stockfish could gain Elo if my paper were implemented and the corresponding data generated.
I did a very quick and dirty re-tune using this set (minus the FRC) with the game result as the target (as opposed to my long-standing prior set, CCRL blitz + standard, 31M positions, using a 50:50 blend of fast SF11 eval and game result) and got +80 Elo. It looks like game quality matters.
However, I went on the nightmare server, played a few games against Rofchade, which I think is almost certainly NNUE, and got completely trashed. I watched the games carefully with my chess-player eye, and in each case it was very clear that the trashing was a positional crush: NNUE just gradually and methodically improves its position (despite a far lower nps), no flashy tactics, just very strong positional play from the evaluation function. Asphyxiation, consistently, and reflected in the PV score.
This is going to be very difficult to compete with, but I'm still working on it.
I haven't tried to optimize my engine with this set yet; I first have to make a small change to my FEN reader so it reads the results within square brackets. When I feel like it, I will give it a try before tomorrow evening's tournament.

If you were playing against Rofchade on his 64-core 3990X, it's no shame that you got trashed; Rofchade on a Raspberry Pi is already difficult enough.
D Sceviour
Posts: 570
Joined: Mon Jul 20, 2015 5:06 pm

Re: Ethereal Tuning - Data Dump

Post by D Sceviour »

Joost Buijs wrote: Fri Oct 16, 2020 1:11 pm
chrisw wrote: Fri Oct 16, 2020 9:49 am
AndrewGrant wrote: Sat Oct 10, 2020 11:36 am Tuning paper has been pushed to GitHub (still a draft, but I'm not finishing it)
https://github.com/AndyGrant/Ethereal/b ... Tuning.pdf

3x Sets of ~10M positions of <fen> <result>




1x Set of ~12.5M positions from FRC of <fen> <result>


Code to extract positions from PGNs for building books


Not sure how long any of these links will stay alive. About a dozen authors have used parts of the datasets I've just shared and found massive gains along the way. I'm confident that everyone can gain Elo from them. Even Stockfish could gain Elo if my paper were implemented and the corresponding data generated.
I did a very quick and dirty re-tune using this set (minus the FRC) with the game result as the target (as opposed to my long-standing prior set, CCRL blitz + standard, 31M positions, using a 50:50 blend of fast SF11 eval and game result) and got +80 Elo. It looks like game quality matters.
However, I went on the nightmare server, played a few games against Rofchade, which I think is almost certainly NNUE, and got completely trashed. I watched the games carefully with my chess-player eye, and in each case it was very clear that the trashing was a positional crush: NNUE just gradually and methodically improves its position (despite a far lower nps), no flashy tactics, just very strong positional play from the evaluation function. Asphyxiation, consistently, and reflected in the PV score.
This is going to be very difficult to compete with, but I'm still working on it.
I haven't tried to optimize my engine with this set yet; I first have to make a small change to my FEN reader so it reads the results within square brackets. When I feel like it, I will give it a try before tomorrow evening's tournament.

If you were playing against Rofchade on his 64-core 3990X, it's no shame that you got trashed; Rofchade on a Raspberry Pi is already difficult enough.
I ran a test and got no useful results; that is, the tuning was unable to produce any improvement over Schooner's master PSTs.

Code: Select all

   1 Schooner2.25-sse                2       8    3000     50%     60%
   2 Schooner-E12_2                  2       8    3000     50%     61%
   3 Schooner1_125_end_2nd           1       8    3000     50%     60%
   4 Schooner-E12_1                 -4       8    3000     49%     59%
The test included a comparison against my new 1-million-position filtered EPD training set, which I might post. The Ethereal E12 sets above contained 80,000 duplicate positions.
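
For anyone checking their own copies, here is a quick duplicate count over the FEN part of each line (a sketch; the filename is hypothetical):

Code: Select all

# Sketch: count duplicate positions in a "<fen> <result>" file (filename is hypothetical).
from collections import Counter

def count_duplicates(path):
    counts = Counter()
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line:
                counts[line.rsplit(maxsplit=1)[0]] += 1   # key on the FEN, drop the result token
    return sum(n - 1 for n in counts.values() if n > 1)

# print(count_duplicates("ethereal_E12_set1.epd"))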
chrisw
Posts: 4313
Joined: Tue Apr 03, 2012 4:28 pm

Re: Ethereal Tuning - Data Dump

Post by chrisw »

Joost Buijs wrote: Fri Oct 16, 2020 1:11 pm
chrisw wrote: Fri Oct 16, 2020 9:49 am
AndrewGrant wrote: Sat Oct 10, 2020 11:36 am Tuning paper has been pushed to GitHub (still a draft, but I'm not finishing it)
https://github.com/AndyGrant/Ethereal/b ... Tuning.pdf

3x Sets of ~10M positions of <fen> <result>




1x Set of ~12.5M positions from FRC of <fen> <result>


Code to extract positions from PGNs for building books


Not sure how long any of these links will stay alive. About a dozen authors have used parts of the datasets I've just shared and found massive gains along the way. I'm confident that everyone can gain Elo from them. Even Stockfish could gain Elo if my paper were implemented and the corresponding data generated.
I did a very quick and dirty re-tune using this set (minus the FRC) with the game result as the target (as opposed to my long-standing prior set, CCRL blitz + standard, 31M positions, using a 50:50 blend of fast SF11 eval and game result) and got +80 Elo. It looks like game quality matters.
However, I went on the nightmare server, played a few games against Rofchade, which I think is almost certainly NNUE, and got completely trashed. I watched the games carefully with my chess-player eye, and in each case it was very clear that the trashing was a positional crush: NNUE just gradually and methodically improves its position (despite a far lower nps), no flashy tactics, just very strong positional play from the evaluation function. Asphyxiation, consistently, and reflected in the PV score.
This is going to be very difficult to compete with, but I'm still working on it.
I haven't tried to optimize my engine with this set yet; I first have to make a small change to my FEN reader so it reads the results within square brackets. When I feel like it, I will give it a try before tomorrow evening's tournament.

If you were playing against Rofchade on his 64-core 3990X, it's no shame that you got trashed; Rofchade on a Raspberry Pi is already difficult enough.
It was marked as rofchade, not rpirofchade, so I guess it was the full-on 64-core version. What was significant wasn't really the hardware it ran on but the absolute dominance of the eval function: they were evaluation crushes, not search ones. Mobility and positional pressure, not fireworks, which I guess figures; those small nets are not going to pack much knowledge for difficult king-safety/king-attack tactics, but it looks like the knowledge packing/tuning for standard structural features is more than enough.