Is any competent one here?? Correct the RYBKA libels!

Discussion of chess software programming and technical issues.

Moderators: bob, hgm, Harvey Williamson

sedicla
Posts: 147
Joined: Fri Jan 07, 2011 11:51 pm
Location: USA

Re: Is any competent one here?? Correct the RYBKA libels!

Post by sedicla » Thu Mar 17, 2011 2:50 pm

my opinion

[image]

Don't waste your energy...

bob
Posts: 20916
Joined: Mon Feb 27, 2006 6:30 pm
Location: Birmingham, AL

Re: Is any competent one here?? Correct the RYBKA libels!

Post by bob » Thu Mar 17, 2011 3:21 pm

Romy wrote:
bob wrote:divert attention from the actual examination that is in progress.
Contaminated examination due to connections of participants?
What is the probability of writing two different 100-line blocks of C code, independently by two different programmers, and have them produce the same optimized assembly language (and by inference, identical semantics)? Close enough to zero to call it so.
Respectfully, that is nonsense.

It depends wholly and solely on the brief to the programmers. If one is writing a program to evaluate mobility and the other to count beans, then the probability is equivalent to zero. But if both are writing a null search (extended type 2a), the probability is hugely incremented.
Yes, _HUGELY_ incremented. From maybe .00000000000001% to .0000000001%. Still nearly zero. Your statements are completely uninformed. Don't pull this "I have talked to computer science department" nonsense either. I am in a computer science department. One that is actually reputable.


"Theoretically possible". Practically impossible.
Stored for use.
Once you dig deep enough we won't have to deal with you any longer.
Is censorship and banning already necessary?
Usually some preliminaries, like your losing the argument?

No idea what you are talking about. We have certainly not lost _this_ argument.


Why "within one day"?
Because you could hand compile, given a month. But I agree. I give you a week.
If you give me the 5 compiled versions, and enough time, I'll take the challenge by myself, and if I am successful, will you go away for all time?
Again, you are not concentrating!
The 5 versions P,Q... are SOURCE, not COMPILED!

The sources will look and smell different. Even very different. Maybe the ones which look most different will compile to identical objects!
For source code, this is not hard at all. I don't care if they "look and smell different." I grade student assignments every week. And I specifically look for exactly this. In fact, there are programming tools that will compare source programs. Several universities have developed them and far more use the tools. Of course, you don't know about that, correct? Comparing semantic equivalence between source programs is an automated process nowadays. And you didn't know that either, correct? In fact, you actually know _very_ little, it seems.



The point was that three of them will not only produce identical output results when compiled with any compliant compiler, but also, if juiced by a special compiler, will compile to identical object codes. Your job will be to find which 3 of the 5.
Since you said "source" it is not any work at all, just run it through (say) the software plagiarism detection software from Stanford, or other places. You are _really_ showing just how little you know about this subject.


And I need a panel of 3, because 1 in 10 makes a fluke possible.
The asm expresses the semantics of the C code. It makes copying obvious to the casual observer, once it is laid out.
Pardon, but you are greatly underestimating compiler sophistication. The asm may be auto-optimised to the degree of unrecognisability. If SMP is involved, more so.
No asm is "unrecognizable". I have absolutely no idea where you are getting your information, but you might consider finding an alternate source. The current source is hopelessly out of touch.

In the day of Cray and HiTech it was different: a compiler was just a little more than an assembler. But RYBKA is of 2005-6, not 1616 or 1986.
Bullshit, again. The fortran compiler from Cray was just as good at optimizing as any compiler around today, and actually better because it had to take advantage of the vector hardware that we don't have today. Boy, are you out of touch with reality... badly out of touch...

However, what is _really_ stupid about your comments is that the compiler is meaningless when you give me source code to compare. What does the optimized compiler output have to do with figuring out which of the 3 programs do the same thing? I am now almost certain that not only do you not know anything about computer science, you don't know anything about programming either... No big surprise.

bob
Posts: 20916
Joined: Mon Feb 27, 2006 6:30 pm
Location: Birmingham, AL

Re: Is any competent one here?? Correct the RYBKA libels!

Post by bob » Thu Mar 17, 2011 3:29 pm

wgarvin wrote:
Romy wrote:
bob wrote:The asm expresses the semantics of the C code. It makes copying obvious to the casual observer, once it is laid out.
Pardon, but you are greatly underestimating compiler sophistication. The asm may be auto-optimised to the degree of unrecognisability. If SMP is involved, more so.

In the day of Cray and HiTech it was different, a compiler was just a little more than an assembler. But RYBKA is of 2005-6, not 1616 or 1986.
All modern optimizing compilers are quite "sophisticated" (although the ones used to compile Rybka back in 2005-6 were not as good as they are today), but this "auto-optimised to the degree of unrecognisability" is nonsense.

Anyone who understands how compiler optimizers work and knows how to read assembly, should be able to compare a short segment of source code with a short listing of assembly instructions and draw a conclusion about whether they do the same calculation or not.

The effects of most compiler optimizations are relatively simple to understand, even if the implementation of the compiler is quite complicated and difficult. Every optimizing compiler folds constants, does CSE, strength reduction, inlining, and loop optimizations (peeling, unrolling, hoisting invariants, etc.) Modern ones use SSA form and do more aggressive things like partial-redundancy elimination, pointer alias analysis. It all sounds complex until you realize that the compiled program still has to compute the same results that you asked it to compute in your source code. It can move the computations around a bit, and do them in a smarter way, but generally it still has to do them.

If you write some short programs and compile them and look at the instructions the compiler actually produces, you'll get a good feel for what the compiler actually can and can't do to your code. And anyone can learn what those optimizations do without having to learn how they actually do it.

"constant folding": it replaces things like (1 + 3 + 5) with the (9) at compile time. Most compilers do this even in debug builds because it makes the compilation process faster.
"CSE": Common subexpression elimination. If it can figure out that you asked it to do the same computation twice, it will just compute it once and use the result in both places.
"strength reduction": things like (a * 4) get replaced with (a << 2) if that generates faster code. div by constant gets converted into mul by constant, etc.
"loop-invariant code motion": it finds calculations inside the loop body that would just produce the same result every time, and hoists them above the loop so they only have to be done once.
"loop induction": it can replace the variable(s) or address expression(s) that change by a fixed amount on each iteration of a loop (such as a counter, or some array being accessed inside the loop) with some other expression which is cheaper for it to compute. If you write for (int i=0; i<size; i++) and then in the body you index an array of 8-byte structures, it might use (i*8) instead. It might even rewrite the loop as for (t = -(size*8); t != 0; t += 8) so it can take advantage of the super-cheap t != 0 test, etc.
"partial redundancy elimination": if a calculation is made in a basic block which is common to more than one possible code path, and then the result is used on one of these paths but not on all of them, it might decide to move the computation so that it's only performed on the paths where it's needed.

Anyway, the point is, compiler output is only surprising if you don't know anything about optimizing compilers. Or, I guess, if you expect it to optimize something that seems obvious to you but it fails to do so...
Wylie, I got sucked into his nonsense until his last post. Notice he is talking about giving someone 5 different _source_ programs. And we have to figure out which 3 have semantic equivalence. No compilers needed. No optimization required. I think he is perhaps one of an infinite number of monkeys in a room where the nonsense he typed at least makes grammatical sense, even though it is computer science nonsense.

Who cares how a compiler optimizes if we are comparing two source programs? Of course, an easy test would be to compile all 5 and just diff the executables. If, as he claims, 3 will produce the same executable, that would take all of 5 minutes. But I would not even go that far; I'd look at the source first. Any good CS faculty member could look at 5 programs and detect similarities that justify a deeper look. We do this every day...

bob
Posts: 20916
Joined: Mon Feb 27, 2006 6:30 pm
Location: Birmingham, AL

Re: Is any competent one here?? Correct the RYBKA libels!

Post by bob » Thu Mar 17, 2011 3:33 pm

Romy wrote:
bob wrote:
Romy wrote:It is a demonstrable fact that compilation with the best compilers is a many-to-one process. Not a one-to-one process.
What a computer scientist will support is that yes, going from asm to C is a 1-to-many mapping.
Thank you for this admission.

It did not come out earlier among the learned commentators. Better they see it now, when they have access to brake function and gear, than after the irreversibility line is crossed.

I have stated that _many_ times. And I have explained that one does not need to go from asm back to C to compare an asm to an original C program. One can examine the semantics of the C source, and the semantics of the asm source, and compare them. There is no many to one or one to many there. When both are reduced to the same least common denominator, namely semantics, you end up with just "one" semantic description if the two programs are equivalent. Please go away.

But going from a C to ASM is not.
Well, with a given compiler and given settings, it is not. Else, you are wrong.
That's the flaw in your ointment.
It is flaw or fly in someone else's balm, but it is irrelevant to mine.
The asm expresses the semantics of the C code.
Aha.
Are you unclear about the meanings of syntax and grammar, with application to C compilation process?
Are you unfamiliar with the concept that a compiler recognizes the syntax, determines the semantics, and produces an object file that expresses those semantics? The compiler might produce many different object files depending on optimization settings you choose. But the semantics _must_ be identical each time, else the compiler is broken and the program won't do what you want.

Again, please go away.

bob
Posts: 20916
Joined: Mon Feb 27, 2006 6:30 pm
Location: Birmingham, AL

Re: Is any competent one here?? Correct the RYBKA libels!

Post by bob » Thu Mar 17, 2011 3:35 pm

Romy wrote:
Romy wrote:Is any competent one here?
For some reason, I must recall the first sentence.
It appears you are addressing that to yourself. The answer, therefore, is "no".

User avatar
JVMerlino
Posts: 1042
Joined: Wed Mar 08, 2006 9:15 pm
Location: San Francisco, California

Re: Is any competent one here?? Correct the RYBKA libels!

Post by JVMerlino » Thu Mar 17, 2011 7:31 pm

<sigh>

A troll that responds to himself -- the worst kind.... :roll:

jm

User avatar
geots
Posts: 4790
Joined: Fri Mar 10, 2006 11:42 pm

Re: Is any competent one here?? Correct the RYBKA libels!

Post by geots » Thu Mar 17, 2011 9:33 pm

wgarvin wrote:Don't feed the trolls...

I don't know if (ahem) Carol is right or wrong. But I would much rather be him, who has the nerve to come on this forum and tell you what he believes, than be all the people so far who know nothing about nothing, only saying it has to be true if Hyatt says it is. So if you don't agree with Bob and others, you are a troll. Much better than being a follow-the-crowd simpleton.

bob
Posts: 20916
Joined: Mon Feb 27, 2006 6:30 pm
Location: Birmingham, AL

Re: Is any competent one here?? Correct the RYBKA libels!

Post by bob » Thu Mar 17, 2011 10:34 pm

geots wrote:
wgarvin wrote:Don't feed the trolls...

I don't know if (ahem) Carol is right or wrong. But I would much rather be him, who has the nerve to come on this forum and tell you what he believes, than be all the people so far who know nothing about nothing, only saying it has to be true if Hyatt says it is. So if you don't agree with Bob and others, you are a troll. Much better than being a follow-the-crowd simpleton.
I had a calculus teacher that told me "if you integrate 2xdx, you get x^2, and if you differentiate x^2 you get 2xdx." Everyone believed him, and the book proved why this is so. Was "following the crowd" wrong in light of such supporting evidence???

User avatar
Romy
Posts: 72
Joined: Thu Mar 10, 2011 9:39 pm
Location: Bucharest (Romania)

Re: Is any competent one here?? Correct the RYBKA libels!

Post by Romy » Thu Mar 17, 2011 10:57 pm

geots wrote:I don't know if (ahem) Carol is right or wrong. But I would much rather be him, who has the nerve to come on this forum and tell you what he believes, than be all the people so far who know nothing about nothing, only saying it has to be true if Hyatt says it is. So if you don't agree with Bob and others, you are a troll. Much better than being a follow-the-crowd simpleton.
You are in parts correct.

I am being extra gentle to Mr Hyatt out of respect, like Mr Kasparov responded to Mr Fischer.

Within the space of 5-10 minutes he wrote both of the following in this thread--
bob wrote:All modern optimizing compilers are quite "sophisticated" (although the ones used to compile Rybka back in 2005-6 were not as good as they are today)
and
bob wrote:Bullshit, again. The fortran compiler from Cray was just as good at optimizing as any compiler around today, and actually better
So first he says that "modern" (in context, 2010-11) compilers are notably superior even to ones from 2005-6.

Then, in almost his next breath, he asserts that the compiler from the days of Cray Blitz (1986, or maybe even 1616, since to Mr Hyatt even Shakespeare needed to be exonerated from cloning/copying of the later Crafty) is superior to the 2010-11 compilers.

See the contradiction?

Fortran is a red herring (hardly anyone uses Fortran today), so do not let him pretend something, like that the C compilers progressed but Fortran did not, to explain the above faux pas.

To illustrate ten more of his extrasense, I have not enough space in the margin. But I give one more.

In my challenge of the P,Q,R,S,T sources (he mistakes them at first for exe or asm or pseudocode), he says he can quickly tell which 3 are functionally identical by using a compiler. BUT IN MY STATEMENT OF OFFER I CLEARLY STATE NO COMPILER/COMPUTER WOULD BE AVAILABLE! The purpose was to show that apparently very different source can be compiled, by an error-free compiler with unchanged settings, to identical executables.

Enough.

But he has helped hundreds of students and programmers, including Mr Rajlich when he was a novitiate, so I give him much respect for the past, and encouragement to be more careful today.

To my central point he has no refutation. Decompile produces A source and not THE source.

In 10 places he writes me (ME!!) explanation of SEMANTICS. It is amusing. I know all. But no offense I take.

Point is, the semantics of the decompiled source is NOT identical to the Fruit semantics (as he implies), except in the triviata or the inevitabilities (if there is only one way to do something small well, it must be done that way).

The semantics are sometimes in very close correspondence (a Fruit fragment to a RYBKA Beta fragment), but that says nothing for the allegation of copying, when combined with the inherent irreversibility of the decompile process.

I give an example. Only 1 baby in 1000 dies unexplained at home during the first 6 months of life. One woman, 3 out of her 4 babies died like that. Does that make her (or someone close) an almost certain murderer? When you understand that it does not, at all, you understand the power of my previous paragraph. I do not expect to find the meta-level understanding here.

Instead, the dogs will bark, the baboons pack-chase, and Mr Hyatt attack. All are performing their predestined function from nature. God is great!
I am in a computer science department
I am a computer science department. But not Alabaman.

User avatar
Romy
Posts: 72
Joined: Thu Mar 10, 2011 9:39 pm
Location: Bucharest (Romania)

Re: Is any competent one here?? Correct the RYBKA libels!

Post by Romy » Thu Mar 17, 2011 11:00 pm

bob wrote:I had a calculus teacher that told me "if you integrate 2xdx, you get x^2, and if you differentiate x^2 you get 2xdx." Everyone believed him, and the book proved why this is so. Was "following the crowd" wrong in light of such supporting evidence???
Please do not write such nonsenses.

If your calculus teacher taught this, it explains much.

I suggest you misunderstood the teaching.

If you integrate 2xdx, you get not x^2 but x^2+C, where C is a constant of integration, unknown unless limits are supplied.
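The point being made, written out: every antiderivative of 2x differs by a constant, and differentiation destroys that constant, which is why integration is a one-to-many mapping while differentiation of each member gives back the same 2x.

```latex
\int 2x\,dx = x^2 + C, \qquad
\frac{d}{dx}\bigl(x^2 + C\bigr) = 2x \quad \text{for any constant } C.
```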

If you think the book proved him right in saying it was x^2, then the book should be burned.

Please be careful when writing exclusive nonsense.

Locked