
#define gridSz 16 versus const int gridSz = 16; ?

Started by simulator · 9 comments, last by simulator 22 years, 10 months ago
Hi - Having recently discovered #defines, I'm wondering why anyone would use constant variables instead. What do you use? Am I missing something? Is there any difference, speed-wise etc.? I would imagine #defines are better because they become part of the compiled program's memory, versus constant variables that have to be accessed separately? Thanx

Edited by - simulator on August 2, 2001 6:26:01 PM
Edited by - simulator on August 2, 2001 6:30:07 PM
Defines get substituted in before your program compiles.

You can even do things like this:

#define printg printf

Although I don't know why you'd want to!
Hi & thanx,

Yes, I know how defines work...
Perhaps constant variables would be those defined at run time, e.g.:

const long lPitch;

(i.e., where lPitch can be determined only once the program is running) (not sure this is even permitted... must try it)

thanx

--- yes, it works - so that must be why?

But still, even for plain, known-ahead-of-time constants - what do most people use: #define?

Edited by - simulator on August 2, 2001 6:42:59 PM

Edited by - simulator on August 2, 2001 6:49:10 PM
Constant values are placed inline in the same way that #defines are, so there's no speed difference. I use const for constants, and #define for other things, but both are quite acceptable. Do what looks nice to you.
Not to forget about type safety either...

YAP-YFIO

~deadlinegrunt

Here's the difference:
Constants are just like normal variables, with the exception that they cannot be changed.

Defines are simply "substitutions" in code. Take an example:

#define NUMBER 23
...
array[NUMBER] = 113;

is literally exactly the same as

array[23] = 113;

However, defines are not type-checked. Therefore, a number of goofy problems may get right past the compiler and linker (not positive about that, but pretty damned sure). Take an example:

#define NUMBER "AKLSJFD"

array[NUMBER] = 113;

Obviously, "AKLSJFD" isn't a valid array index, but since it's defined in a macro, it might get by the compiler. (Again, not dead sure about this.)

To summarize: either one is fine, but just remember that #defines are less safe than consts.

-Normie

"But time flows like a river... and history repeats."
...
So...what about beaver dams?
I am a devout follower of the "Lazy Programmer's Doctrine" (tm)... and I'm damned proud of it, too!
-----
"I came, I saw, I started making games." ... If you'll excuse me, I must resume my search for my long lost lobotomy stitches.
Hmm- OK good points, thank you
Normie - you're actually mistaken there - #defines are, as you said, substitutions. So:

#define NUMBER "AKLSJFD"
array[NUMBER] = 113;

would become:

array["AKLSJFD"] = 113;

...which raises a compiler error. However:

#define INCREMENT(a, b) a += b

If you pass a char* and a float to that macro, you'll get an error which doesn't seem to make sense (illegal use of floating point). And that's the only problem I've ever seen coming from lack of type safety. It's not actually as bad as some people make out.
#defines are also nasty in that if you do something like

#define IncCheck(a,b) (a++ < b++) ? a++ : b++

and then
IncCheck(i,j)
both i & j will be incremented at least once,
because the direct substitution expands to
(i++ < j++) ? i++ : j++

The const keyword was introduced to allow the compiler to type-check arguments (can't do that with macros), and the inline keyword was introduced to fix the expansion problem that macros suffer. The only time I'd recommend using macros is conditional code compilation, i.e.
#if defined _DEBUG
printf("This is a debug version");
#endif

Brad
brad_beveridge: If you wrote that as an inline function, using the same code, you'd get the exact same logic error. I think what you meant was more like this:

#define Check(a, b) ((a > b) ? a : b)
Check(a++, b++);

Which would of course cause an error. But show me someone who would write something like that and not see the problem within a few minutes, and I'll show you someone who needs to learn the basics of C.

There are still advantages that macros have over inline functions. In some cases, it's just that it requires less code than the inline function, and in some cases, nothing can go wrong:

#define CREATEWINDOW(Title) CreateWindowEx(0, "randomclassname", Title, WS_VISIBLE, 0, 0, 640, 480, NULL, NULL, GetModuleHandle(NULL), NULL)

And although the following isn't the best example, there are some things that macros can do that inline functions cannot:

// Declare and clear a DirectX structure:
#define DECLARE_STRUCT(type, var) type var; ZeroMemory(&var, sizeof(var)); var.dwSize = sizeof(var)


In summary, use macros where appropriate, and use inline functions where they're appropriate. A macro should use each parameter exactly once (to avoid the Check(a++, b++) problem), and should be commented to describe the parameters if they are not obvious.

This topic is closed to new replies.
