#define X (MinGW)

Discussion of chess software programming and technical issues.

Moderators: hgm, Rebel, chrisw

syzygy
Posts: 5563
Joined: Tue Feb 28, 2012 11:56 pm

Re: #define X (MinGW)

Post by syzygy »

Evert wrote:
syzygy wrote:
Evert wrote:
hgm wrote: Does there exist an option to switch off this rogue behavior w.r.t. the X macro?
Sounds like a compiler bug, but anyway try something like

Code: Select all

#define X _X_rename
#include "standardheaders"
#undef X
#define X ... // Your definition
at the top of your code.
But that will not replace "#define X bla" that supposedly is present somewhere in the included headers with "#define _X_rename bla".
Yes, that's true. I think I made a leap in logic because the compiler "not allowing to #undef the symbol" suggested to me that it wasn't actually a macro. Otherwise #undef X after including the headers should work.
But if Oliver is right that there is no macro X but a function X, it would seem to work.

But judging from what little information HGM gives, it seems to be neither a macro X nor a function X, but a problem with what HGM's macro X expands to.

Or there is just a bad line ending somewhere that messes up compilation on Windows.
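
If the headers really did #define X, the #undef route mentioned above would look something like this (a sketch, reusing the placeholder header name from Evert's snippet, with 36 as an arbitrary example value):

Code: Select all

#include "standardheaders"   /* hypothetically #defines X itself    */
#undef  X                    /* discard the header's definition     */
#define X 36                 /* and use our own from here on        */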
User avatar
hgm
Posts: 27793
Joined: Fri Mar 10, 2006 10:06 am
Location: Amsterdam
Full name: H G Muller

Re: #define X (MinGW)

Post by hgm »

When I used the #define X in isolation, the error did not occur, and neither did it when I deleted the part of the program after it. So I selectively deleted the remainder piece by piece to see which part of the program was needed to evoke the error. It turns out that a later #include of <windows.h> is what triggers the error. So the following code:

Code: Select all

#define X 36    /* infinite range (B, R, Q)       */
#define Y 37    /* jumping slide (BG, RG, VG, GG) */
#define Z (Y+3) /* orthogonal GG jumping slides   */
#define J (~0)  /* jump to second square (Ph, Kn) */
#define E (~36) /* X or J (FE, LH)                */
#define L (~1)  /* 1 or J (Ln, SE, HF)            */
#define T (~Y)  /* Tetrarch (skip-slide + igui)   */
#define N (-1)  /* moves as Knight                */
#define F (-2)  /* shogi Knight                   */

#ifdef WIN32 
#    include <windows.h>
#endif
produces:

Code: Select all

C:\cygwin\home\hachu>gcc -O2 -s test2.c -o ten2.exe
test2.c:9:12: error: expected identifier or '(' before '-' token
 #define N (-1)  /* moves as Knight                */
            ^
test2.c:2:11: error: expected identifier or '(' before numeric constant
 #define X 36    /* infinite range (B, R, Q)       */
           ^
test2.c:3:11: error: expected identifier or '(' before numeric constant
 #define Y 37    /* jumping slide (BG, RG, VG, GG) */
           ^
I don't understand how a later #include can retroactively spoil a #define that occurs in the source before it; I thought that preprocessor definitions only take effect downstream.

From the error messages it seems the compiler thinks that a formal-parameter list must follow the macro names X, Y and N.

When I move the #include <windows.h> to before the #defines, the error disappears! :shock:
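
In other words, with the include hoisted above the defines the snippet is accepted:

Code: Select all

#ifdef WIN32
#    include <windows.h>   /* system headers first...          */
#endif

#define X 36               /* ...then the macros, as above     */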
BeyondCritics
Posts: 396
Joined: Sat May 05, 2012 2:48 pm
Full name: Oliver Roese

Re: #define X (MinGW)

Post by BeyondCritics »

hgm wrote: I don't understand how a later #include can retroactively spoil a #define that occurs in the source before it; I thought that preprocessor definitions only take effect downstream.
The first gcc error message is provably wrong.
gcc has always been infamous for its terrible error messages. You could try compiling with clang or Visual Studio instead.
hgm wrote: When I move the #include <windows.h> to before the #defines, the error disappears! :shock:
Well, this is your bug. You have to include system headers first, whatever the reason may be.
petero2
Posts: 688
Joined: Mon Apr 19, 2010 7:07 pm
Location: Sweden
Full name: Peter Osterlund

Re: #define X (MinGW)

Post by petero2 »

hgm wrote:When I used the #define X in isolation, the error did not occur, and neither did it when I deleted the part of the program after it. So I selectively deleted the remainder piece by piece to see which part of the program was needed to evoke the error. It turns out that a later #include of <windows.h> is what triggers the error. So the following code:

Code: Select all

#define X 36    /* infinite range (B, R, Q)       */
#define Y 37    /* jumping slide (BG, RG, VG, GG) */
#define Z (Y+3) /* orthogonal GG jumping slides   */
#define J (~0)  /* jump to second square (Ph, Kn) */
#define E (~36) /* X or J (FE, LH)                */
#define L (~1)  /* 1 or J (Ln, SE, HF)            */
#define T (~Y)  /* Tetrarch (skip-slide + igui)   */
#define N (-1)  /* moves as Knight                */
#define F (-2)  /* shogi Knight                   */

#ifdef WIN32 
#    include <windows.h>
#endif
produces:

Code: Select all

C:\cygwin\home\hachu>gcc -O2 -s test2.c -o ten2.exe
test2.c:9:12: error: expected identifier or '(' before '-' token
 #define N (-1)  /* moves as Knight                */
            ^
test2.c:2:11: error: expected identifier or '(' before numeric constant
 #define X 36    /* infinite range (B, R, Q)       */
           ^
test2.c:3:11: error: expected identifier or '(' before numeric constant
 #define Y 37    /* jumping slide (BG, RG, VG, GG) */
           ^
I don't understand how a later #include can retroactively spoil a #define that occurs in the source before it; I thought that preprocessor definitions only take effect downstream.
They do take effect only downstream. The error message from gcc is referring to a useless position in the code though. The file ia32intrin.h is included from windows.h. Lines 261-267 look like this:

Code: Select all

/* Write flags register */
extern __inline void
__attribute__((__gnu_inline__, __always_inline__, __artificial__))
__writeeflags (unsigned long long X)
{
  __builtin_ia32_writeeflags_u64 (X);
}
When X is defined to be 36, the preprocessed code is invalid. However, it seems that the compiler detects that the error occurs at a code position that was changed by macro expansion, and then decides to point the error message at the position where X was defined instead of the position where X was used.

To figure out what was going on I ran the code through the gcc preprocessor like this:

Code: Select all

x86_64-w64-mingw32-gcc -O2 -Wall -E test.c >test.e
Then I searched for 36 in test.e and found:

Code: Select all

extern __inline void
__attribute__((__gnu_inline__, __always_inline__, __artificial__))
__writeeflags (unsigned long long 
# 264 "/usr/lib/gcc/x86_64-w64-mingw32/6.3.0/include/ia32intrin.h"
                                 36
# 264 "/usr/lib/gcc/x86_64-w64-mingw32/6.3.0/include/ia32intrin.h" 3 4
                                  )
{
  __builtin_ia32_writeeflags_u64 (
# 266 "/usr/lib/gcc/x86_64-w64-mingw32/6.3.0/include/ia32intrin.h"
                                 36
# 266 "/usr/lib/gcc/x86_64-w64-mingw32/6.3.0/include/ia32intrin.h" 3 4
                                  );
}
# 28 "/usr/lib/gcc/x86_64-w64-mingw32/6.3.0/include/x86intrin.h" 2 3 4
syzygy
Posts: 5563
Joined: Tue Feb 28, 2012 11:56 pm

Re: #define X (MinGW)

Post by syzygy »

So this is a bug in gcc's ia32intrin.h header file.

It can be reproduced with regular gcc on Linux:

Code: Select all

#define X 36

#include <x86intrin.h>

Code: Select all

$ gcc test.c
test.c:1:11: error: expected ‘;’, ‘,’ or ‘)’ before numeric constant
 #define X 36
           ^
User avatar
Evert
Posts: 2929
Joined: Sat Jan 22, 2011 12:42 am
Location: NL

Re: #define X (MinGW)

Post by Evert »

syzygy wrote:So this is a bug in gcc's ia32intrin.h header file.

It can be reproduced with regular gcc on Linux:

Code: Select all

#define X 36

#include <x86intrin.h>

Code: Select all

$ gcc test.c
test.c:1:11: error: expected ‘;’, ‘,’ or ‘)’ before numeric constant
 #define X 36
           ^
Is that really a bug though? Any other name they pick for the argument can be mangled the same way.
Of course there are a few things that conspire to make this a problem: using such a generic name as "X" for the function argument, making it upper-case rather than lower-case, #defining something as generic as X, and including system headers after #defining your own stuff (nominally bad practice, but it really shouldn't break this easily).

Anyway, problem solved and identified. :)
syzygy
Posts: 5563
Joined: Tue Feb 28, 2012 11:56 pm

Re: #define X (MinGW)

Post by syzygy »

syzygy wrote:So this is a bug in gcc's ia32intrin.h header file.
And if this is a mirror of the official gcc, which it seems to be, then it has recently been fixed:

Code: Select all

/* Write flags register */
extern __inline void
__attribute__((__gnu_inline__, __always_inline__, __artificial__))
__writeeflags (unsigned long long __X)
{
  __builtin_ia32_writeeflags_u64 (__X);
}
Compare with the gcc-6.3.0 version of the file:

Code: Select all

/* Write flags register */
extern __inline void
__attribute__((__gnu_inline__, __always_inline__, __artificial__))
__writeeflags (unsigned long long X)
{
  __builtin_ia32_writeeflags_u64 (X);
}
syzygy
Posts: 5563
Joined: Tue Feb 28, 2012 11:56 pm

Re: #define X (MinGW)

Post by syzygy »

Evert wrote:Is that really a bug though? Any other name they pick for the argument can be mangled the same way.
If they use __X instead of X (as they do now), then it goes wrong only if someone uses __X in his program. The C and C++ standards tell you not to use identifiers starting with a double underscore.
Dann Corbit
Posts: 12540
Joined: Wed Mar 08, 2006 8:57 pm
Location: Redmond, WA USA

Re: #define X (MinGW)

Post by Dann Corbit »

syzygy wrote:
Evert wrote:Is that really a bug though? Any other name they pick for the argument can be mangled the same way.
If they use __X instead of X (as they do now), then it goes wrong only if someone uses __X in his program. The C and C++ standards tell you not to use identifiers starting with a double underscore.
Ronald is right.

It should be safe to use X as a macro (though I would probably not be brave enough to do it without #undef). MinGW GCC has invaded the user name space, IMO.
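
Put together, the cautious version of that looks something like this (a sketch, not taken from any engine's actual code):

Code: Select all

#ifdef WIN32
#    include <windows.h>   /* system headers first                */
#endif

#undef  X                  /* in case a header #defined X as well */
#define X 36               /* now the name is safely ours         */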
Taking ideas is not a vice, it is a virtue. We have another word for this. It is called learning.
But sharing ideas is an even greater virtue. We have another word for this. It is called teaching.
User avatar
hgm
Posts: 27793
Joined: Fri Mar 10, 2006 10:06 am
Location: Amsterdam
Full name: H G Muller

Re: #define X (MinGW)

Post by hgm »

OK, thanks everyone. That clears up the mystery. It is bad to use identifiers that could be macros as formal parameters in function definitions in a header file, but I guess this can be excused by the directive that header files should be #included before any #defines. (Although this is a strange directive, as it is quite common to define macros through a compiler flag, like -DX, and then what??)
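
For the -DX case, one way out would be the push_macro / pop_macro pragma pair, which gcc supports for MSVC compatibility: park the command-line definition while the headers are read and restore it afterwards. A sketch:

Code: Select all

/* built with e.g.  gcc -DX=36 ...  */
#pragma push_macro("X")   /* save the -D definition       */
#undef X                  /* hide it from the headers     */
#ifdef WIN32
#    include <windows.h>
#endif
#pragma pop_macro("X")    /* and bring it back afterwards */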

But that it does not give the error message on the line where it occurs is truly appalling.