Reputation: 24067
The ef_vi_alloc_from_pd function from this C code accepts an enum as its last argument:
int ef_vi_alloc_from_pd(ef_vi* vi, ef_driver_handle vi_dh,
struct ef_pd* pd, ef_driver_handle pd_dh,
int evq_capacity, int rxq_capacity, int txq_capacity,
ef_vi* evq_opt, ef_driver_handle evq_dh,
enum ef_vi_flags flags)
In this C example we define the flags as unsigned vi_flags and it works:
unsigned vi_flags;
vi_flags = EF_VI_FLAGS_DEFAULT;
if( cfg_timestamping )
vi_flags |= EF_VI_RX_TIMESTAMPS;
TRY(ef_vi_alloc_from_pd(&res->vi, res->dh, &res->pd, res->dh,
-1, -1, 0, NULL, -1, vi_flags));
But in C++ it doesn't work; I get a compile error when calling ef_vi_alloc_from_pd: error: invalid conversion from ‘unsigned int’ to ‘ef_vi_flags’ [-fpermissive]
I've tried to define vi_flags as the enum type:
enum ef_vi_flags vi_flags;
vi_flags = EF_VI_FLAGS_DEFAULT;
vi_flags |= EF_VI_RX_TIMESTAMPS;
But this doesn't compile either; on the "|=" line I get: error: invalid conversion from ‘int’ to ‘ef_vi_flags’ [-fpermissive]
How can I use ef_vi_alloc_from_pd from C++? How should I define vi_flags, and how should I call ef_vi_alloc_from_pd?
Upvotes: 0
Views: 273
Reputation: 218278
You may do:
ef_vi_flags vi_flags = ef_vi_flags(EF_VI_FLAGS_DEFAULT | EF_VI_RX_TIMESTAMPS);
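If you want to keep the C-style |= usage from the question, another option is to overload the bitwise operators for the enum. This is a minimal sketch, assuming EF_VI_FLAGS_DEFAULT and EF_VI_RX_TIMESTAMPS are enumerators of ef_vi_flags and that the ef_vi header declaring the enum is available (the <etherfabric/vi.h> include path is an assumption):

#include <etherfabric/vi.h>   // assumed header that declares enum ef_vi_flags

// Combine two flag values: widen to unsigned, OR them, convert back to the enum.
inline ef_vi_flags operator|(ef_vi_flags a, ef_vi_flags b)
{
    return static_cast<ef_vi_flags>(static_cast<unsigned>(a) |
                                    static_cast<unsigned>(b));
}

// Allow the "vi_flags |= ..." style from the original C code.
inline ef_vi_flags& operator|=(ef_vi_flags& a, ef_vi_flags b)
{
    a = a | b;
    return a;
}

With these overloads in place, the original code compiles unchanged in C++:

ef_vi_flags vi_flags = EF_VI_FLAGS_DEFAULT;
if( cfg_timestamping )
    vi_flags |= EF_VI_RX_TIMESTAMPS;
TRY(ef_vi_alloc_from_pd(&res->vi, res->dh, &res->pd, res->dh,
                        -1, -1, 0, NULL, -1, vi_flags));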
Upvotes: 2