#define X (MinGW)

Discussion of chess software programming and technical issues.

Moderators: hgm, Rebel, chrisw

hgm
Posts: 27788
Joined: Fri Mar 10, 2006 10:06 am
Location: Amsterdam
Full name: H G Muller

#define X (MinGW)

Post by hgm »

I have a C program that contains #define X (and Y and N). It compiles fine under gcc on Linux (where I developed it). Because I want to make a Windows binary I tried to compile it under MinGW. To my dismay I get tons of error messages, for every line where the X occurs, that there is no opening parenthesis after it. Apparently MinGW has a hard-coded definition of the macro X that it doesn't allow me to redefine or undefine. Now this is pretty annoying, because the code is splattered with occurrences of X and Y, and global substitution isn't an option, because X and Y also frequently occur inside many other variable and macro names.

Does there exist an option to switch off this rogue behavior w.r.t. the X macro?
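A minimal probe (just a sketch; the macro body below is a placeholder for the real definition) dropped in right before the own #define would at least confirm whether X is already a macro at that point:

Code: Select all

#ifdef X
#error "X is already a macro here (built in, or picked up from an included header)"
#endif
#define X (myOwnDefinition)   /* placeholder for the real definition */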
mar
Posts: 2554
Joined: Fri Nov 26, 2010 2:00 pm
Location: Czech Republic
Full name: Martin Sedlak

Re: #define X (MinGW)

Post by mar »

Hmm, that seems like pretty weird behavior that could break many programs.
I wonder if this code would compile fine in your MinGW?

Code: Select all

int X(int x) { return 0; }   /* quick test: is X still usable as an ordinary identifier? */
Maybe you could try to just preprocess the file to see what it translates to?
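For example (a sketch; the file name is made up), the preprocessed output can be written to a file and inspected at a line that uses X:

Code: Select all

gcc -E engine.c -o engine.i
Searching engine.i for one of the offending lines then shows exactly what X was turned into, if anything.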
elpapa
Posts: 211
Joined: Sun Jan 18, 2009 11:27 pm
Location: Sweden
Full name: Patrik Karlsson

Re: #define X (MinGW)

Post by elpapa »

hgm wrote:global substitution isn't an option, because X and Y also frequently occur inside many other variable and macro names.
With a regex you can search for '\bX\b', which skips any X that is part of a longer identifier. Most editors also have a 'match whole word only' option, which does the same thing.
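If a global rename ever became unavoidable, the same word-boundary trick works on the command line, e.g. with GNU sed (a sketch; BOARD_X is just a made-up replacement name):

Code: Select all

sed -i 's/\bX\b/BOARD_X/g' *.c *.h
Note that this also touches comments and string literals, so the resulting diff should be reviewed.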
Evert
Posts: 2929
Joined: Sat Jan 22, 2011 12:42 am
Location: NL

Re: #define X (MinGW)

Post by Evert »

hgm wrote: Does there exist an option to switch off this rogue behavior w.r.t. the X macro?
Sounds like a compiler bug, but anyway try something like

Code: Select all

#define X _X_rename
#include "standardheaders"
#undef X
#define X ... // Your definition
at the top of your code.
syzygy
Posts: 5557
Joined: Tue Feb 28, 2012 11:56 pm

Re: #define X (MinGW)

Post by syzygy »

Evert wrote:
hgm wrote: Does there exist an option to switch off this rogue behavior w.r.t. the X macro?
Sounds like a compiler bug, but anyway try something like

Code: Select all

#define X _X_rename
#include "standardheaders"
#undef X
#define X ... // Your definition
at the top of your code.
But that will not replace the "#define X bla" that is supposedly present somewhere in the included headers with "#define _X_rename bla".
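Concretely (a sketch; the function-like definition stands in for whatever the header really contains): the name being defined by a #define directive is never itself macro-expanded, so the renaming define has no effect on it.

Code: Select all

#define X _X_rename
#define X(a) ((a) + 1)   /* stands in for the header's definition; this X is NOT
                            replaced by _X_rename, it simply redefines X (gcc
                            warns about the redefinition)                        */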
BeyondCritics
Posts: 396
Joined: Sat May 05, 2012 2:48 pm
Full name: Oliver Roese

Re: #define X (MinGW)

Post by BeyondCritics »

When I meet an annoying problem, I don't start frantically hacking my source base to get it out of the way. Instead I let it go, and the next day, when I am fresh, I analyze the problem very carefully.
You say the compiler does not allow you to #undef the symbol X.
The C standard (see http://en.cppreference.com/w/c/preprocessor/replace) says that an #undef of a non-existing macro is simply ignored.
Therefore I assume that you don't have a macro X, but a function with this name included from somewhere.
The very first step would be to find out where this name comes from, and why. Any decent IDE will assist you with that.
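One cheap way to let the compiler answer that question (a sketch; the second definition is deliberately bogus) is to redefine X on purpose right after the includes. If X really is a macro coming from a header, gcc warns that "X" is redefined and adds a note with the file and line of the previous definition; if it stays silent, X is not a macro at all.

Code: Select all

#include "all_the_usual_headers.h"   /* placeholder for the real includes  */
#define X deliberately_bogus         /* only here to provoke the diagnostic */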
Evert
Posts: 2929
Joined: Sat Jan 22, 2011 12:42 am
Location: NL

Re: #define X (MinGW)

Post by Evert »

syzygy wrote:
Evert wrote:
hgm wrote: Does there exist an option to switch off this rogue behavior w.r.t. the X macro?
Sounds like a compiler bug, but anyway try something like

Code: Select all

#define X _X_rename
#include "standardheaders"
#undef X
#define X ... // Your definition
at the top of your code.
But that will not replace "#define X bla" that supposedly is present somewhere in the included headers with "#define _X_rename bla".
Yes, that's true. I think I made a leap in logic because the compiler "not allowing him to #undef the symbol" suggested to me that it wasn't actually a macro. Otherwise an #undef X after including the headers should work.
Dann Corbit
Posts: 12537
Joined: Wed Mar 08, 2006 8:57 pm
Location: Redmond, WA USA

Re: #define X (MinGW)

Post by Dann Corbit »

Make the smallest possible sample that reproduces the problem.
Post that here (code complete to reproduce).
I guess that there is a simple explanation.
Taking ideas is not a vice, it is a virtue. We have another word for this. It is called learning.
But sharing ideas is an even greater virtue. We have another word for this. It is called teaching.
Sven
Posts: 4052
Joined: Thu May 15, 2008 9:57 pm
Location: Berlin, Germany
Full name: Sven Schüle

Re: #define X (MinGW)

Post by Sven »

Dann Corbit wrote:Make the smallest possible sample that reproduces the problem.
Post that here (code complete to reproduce).
I guess that there is a simple explanation.
For me the simplest explanation would be that he defines the macro X somewhere in a header file of his own, but afterwards (i.e., after including that own header file) includes a standard header that, somewhere in its include chain, redefines X (as a macro taking parameters, but that detail does not matter here). That way the problem could not be solved simply by doing an #undef somewhere.

The only solution I see without renaming the own X macro would be to ensure that the standard header causing the problem (or the set of standard headers one of which causes the problem) always gets included *before* the own X macro is defined, and then to do an #undef *between* that #include and the own definition. Then the foreign X macro can't win, thanks to the include guard of the standard header file.
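In code, the ordering described above would look roughly like this (a sketch; the header name and the macro body are placeholders):

Code: Select all

#include "the_problematic_standard_header.h"  /* include the foreign definition first */
#undef X                                      /* then discard it                      */
#define X (myOwnDefinition)                   /* and only now define the own macro    */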
Dann Corbit
Posts: 12537
Joined: Wed Mar 08, 2006 8:57 pm
Location: Redmond, WA USA

Re: #define X (MinGW)

Post by Dann Corbit »

Sven Schüle wrote:
Dann Corbit wrote:Make the smallest possible sample that reproduces the problem.
Post that here (code complete to reproduce).
I guess that there is a simple explanation.
For me the simplest explanation would be that he defines the macro X somewhere in a header file of his own, but afterwards (i.e., after including that own header file) includes a standard header that, somewhere in its include chain, redefines X (as a macro taking parameters, but that detail does not matter here). That way the problem could not be solved simply by doing an #undef somewhere.

The only solution I see without renaming the own X macro would be to ensure that the standard header causing the problem (or the set of standard headers one of which causes the problem) always gets included *before* the own X macro is defined, and then to do an #undef *between* that #include and the own definition. Then the foreign X macro can't win, thanks to the include guard of the standard header file.
The compiler can show you what the preprocessor makes of all the macros.
For instance, with VC++ it is:

Code: Select all

cl /EP <filename>
and for gcc it is:

Code: Select all

gcc -E <filename>
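If the goal is to see the macro definitions themselves rather than the fully expanded source, gcc can also dump every #define that is in effect (a sketch; the grep merely filters for the name in question):

Code: Select all

gcc -E -dM <filename> | grep -w X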
Taking ideas is not a vice, it is a virtue. We have another word for this. It is called learning.
But sharing ideas is an even greater virtue. We have another word for this. It is called teaching.