Reputation: 5754
Suppose I have the following preprocessor definition
#define MYNUMBER 10f;
I would then like to use it in my code as follows:
float someResult = MYNUMBER * 3;
When I do this, Xcode treats the * as a unary dereference operator rather than multiplication and reports an error. What is the correct way to define such a constant and use it in a multiplicative expression?
Upvotes: 0
Views: 228
Reputation: 57168
You shouldn't have a semicolon after your #define. The preprocessor thinks MYNUMBER is "10f;", so your expression expands to float someResult = 10f; * 3; and the compiler sees the stray * 3; as a dereference.
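For a quick check, here is a minimal C sketch of the corrected definition. Note that 10f on its own is also not a valid C floating-point literal; the constant needs a decimal point, e.g. 10.0f:

#include <stdio.h>

/* No trailing semicolon, and a decimal point so the f suffix is legal */
#define MYNUMBER 10.0f

int main(void) {
    float someResult = MYNUMBER * 3; /* expands to 10.0f * 3 */
    printf("%f\n", someResult);      /* prints 30.000000 */
    return 0;
}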
Upvotes: 8