Reputation: 6198
I'm working with OpenGL shaders, which prefer C strings to NSStrings, and I've run into the following oddity:
The relevant facts are that
1) shaderSource is defined as
NSString* shaderSource
2) the signature of glShaderSource is
void glShaderSource(GLuint shader, GLsizei count, const GLchar **string, const GLint *length)
This works:
GLint len = (GLint)[shaderSource length];
const GLchar *cstr = [shaderSource UTF8String];
glShaderSource(shader, 1, &cstr, &len);
This does not work:
glShaderSource(shader, 1, &[shaderSource UTF8String], &[shaderSource length]);
I am clearly not understanding something about Objective-C here, and I'd like to rectify that problem, so if you're so inclined, do pray tell, WTF?
Upvotes: 1
Views: 1621
Reputation: 84328
You seem to be using &[shaderSource UTF8String] to get an extra level of indirection, so that the types line up, since glShaderSource wants a pointer to a pointer. But you can't just put & in front of anything to get a pointer to it. &x means "return the address of x"; for that to make sense, x must be an lvalue (something with a storage location), not an arbitrary expression. You can't write &[shaderSource UTF8String] for the same reason you can't write &(1 + t): each is just an expression, and its value has no address of its own. You need to store the value in a variable first; only then can you pass along the address of the place where you stored it.
This might be confusing if you are used to C structs, since if you have a struct foo with a member bar, you can get the address of the member by writing &foo.bar. But Objective-C objects are not structs (well, technically they are opaque structs). You get property values from them by sending messages, which is basically the same as a function call. So if foo is an object and bar is a property, &foo.bar (which is really shorthand for &[foo bar]) is an invalid expression.
Upvotes: 5