Christopher Theriault

Reputation: 131

How to get gcc to compile 16-bit unicode strings

So I'm trying to compile this project: https://github.com/dmitrystu/libusb_stm32 with Segger Embedded Studio, which uses gcc. The build is choking on this error:

pasting formed 'u"Open source USB stack for STM32"', an invalid preprocessing token

which is caused by this line:

static const struct usb_string_descriptor manuf_desc_en = USB_STRING_DESC("Open source USB stack for STM32");

So USB_STRING_DESC is a macro:

#define USB_STRING_DESC(s)         {.bLength = sizeof(CAT(u,s)),.bDescriptorType = USB_DTYPE_STRING,.wString = {CAT(u,s)}}

And CAT is the token-pasting macro #define CAT(x,y) x##y. The intent is evidently to turn an 8-bit char string literal into a 16-bit Unicode string literal, but the compiler doesn't like it. Is there some #include or compiler setting I'm missing here? Clearly the author of this code expects it to work, so there must be some fault in my setup.
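
For reference, here's a stripped-down sketch of what the macro expansion produces, outside of the library (the names here are my own, not from libusb_stm32):

#include <uchar.h>   /* char16_t lives here in C11 */

#define CAT(x, y)  x##y

/* Pastes the u prefix onto a narrow string literal, producing a
   char16_t (UTF-16) string literal: U16("abc") -> u"abc". */
#define U16(s)     CAT(u, s)

/* u"..." is only a valid token when the compiler accepts C11
   string literal prefixes, e.g. with -std=c11 or -std=gnu11. */
static const char16_t manuf[] = U16("Open source USB stack for STM32");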

Also, I'm not clear on how the sizeof() operation is supposed to work here. As I understand it, there is no way to get the length of a string at compile time, so that operation will always return the size of a pointer.
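
Here's a small standalone test of what I'm asking about (the sizes in the comments assume a 2-byte char16_t and 8-byte pointers):

#include <stdio.h>
#include <uchar.h>

int main(void)
{
    const char16_t *p = u"STM32";

    /* Applied to the literal itself, sizeof sees an array of six
       char16_t elements (five characters plus the terminating NUL),
       so this prints 12 and is evaluated at compile time. */
    printf("%zu\n", sizeof(u"STM32"));

    /* Applied to a pointer, sizeof only reports the pointer size,
       typically 8 on a 64-bit host. */
    printf("%zu\n", sizeof p);

    return 0;
}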

Upvotes: 0

Views: 318

Answers (1)

Christopher Theriault

Reputation: 131

In response to Keith's question, the gcc version is 4.2.1. Poking around the compiler settings, I found the default option was the C99 standard; when I changed it to C11, everything compiled just fine. Thanks!
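
If it helps anyone hitting the same thing, a compile-time guard like this (just a sketch of my own, not part of the library) makes the requirement explicit instead of failing with the cryptic pasting error:

/* __STDC_VERSION__ is 201112L or greater once the compiler is in
   C11 mode; anything older cannot form u"" string literals. */
#if !defined(__STDC_VERSION__) || __STDC_VERSION__ < 201112L
#error "Build with -std=c11 (or gnu11) so u\"\" string literals are available"
#endif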

Upvotes: 1
