Reputation: 333
I'm trying to create a typemap from a C++ struct to a PyLong.
For example, I have the following struct that represents a 128-bit number, and I would like to access it in the Python interface as a single unsigned Python long.
struct my_128 {
    u_int64_t raw[2];
};
How can I create such a typemap?
Upvotes: 2
Views: 1025
Reputation: 177526
Absent full error checking, these typemaps work:
%typemap(in) struct my_128 {
    PyObject* temp;
    PyObject* shift;
    if (!PyLong_Check($input) && !PyInt_Check($input)) {
        PyErr_SetString(PyExc_TypeError, "Must be int or long type");
        return NULL;
    }
    $1.raw[0] = PyInt_AsUnsignedLongLongMask($input); // low 64 bits
    shift = PyInt_FromLong(64);
    temp = PyNumber_Rshift($input, shift);
    $1.raw[1] = PyInt_AsUnsignedLongLongMask(temp);   // high 64 bits
    Py_DECREF(temp);
    Py_DECREF(shift);
}
%typemap(out) struct my_128 {
    PyObject* low;
    PyObject* high;
    PyObject* shift;
    PyObject* intermediate;
    low = PyLong_FromUnsignedLongLong($1.raw[0]);
    high = PyLong_FromUnsignedLongLong($1.raw[1]);
    shift = PyInt_FromLong(64);
    intermediate = PyNumber_Lshift(high, shift);
    $result = PyNumber_Add(low, intermediate);
    Py_DECREF(low);
    Py_DECREF(high);
    Py_DECREF(intermediate);
    Py_DECREF(shift);
}
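For reference, the same split/recombine arithmetic modeled in plain Python (illustration only; the function names are hypothetical, not part of the generated module):

```python
MASK64 = (1 << 64) - 1  # mask selecting the low 64 bits

def to_raw(n):
    # "in" typemap: raw[0] gets the low 64 bits (mask), raw[1] the high 64 (right shift)
    return [n & MASK64, n >> 64]

def from_raw(raw):
    # "out" typemap: shift the high word up 64 bits and add the low word
    return (raw[1] << 64) + raw[0]

n = (0xdeadbeef << 64) | 0xfeedface
assert to_raw(n) == [0xfeedface, 0xdeadbeef]
assert from_raw(to_raw(n)) == n
```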
Upvotes: 2
Reputation: 88711
I have something that works, but it's not pretty. The problem is that the only way to build a PyLong from C that's wider than a long long seems to be from a string!
So for a typemap that takes your my_128 and exposes it as a PyLong when it's returned from a function you can do:
%typemap(out) my_128 {
    std::ostringstream s;
    s << "0x"
      << std::setfill('0') << std::setw(8) << std::hex << $1.raw[0]
      << std::setfill('0') << std::setw(8) << std::hex << $1.raw[1];
    char *c = strdupa(s.str().c_str()); // Avoids a const cast without ever leaking
    $result = PyLong_FromString(c, 0, 0);
}
This prints the value in hex to a stringstream and then constructs a PyLong from the resulting string.
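In plain Python that formatting step amounts to the following (hypothetical helper; note that setw(8) pads each word to only 8 hex digits, which happens to match the 32-bit test values used below — the two words concatenate to the value the test session prints):

```python
def format_raw(raw):
    # "0x" + raw[0] + raw[1], each zero-padded to (at least) 8 hex digits,
    # then parsed back as one integer -- mirroring PyLong_FromString(c, 0, 0)
    return int("0x%08x%08x" % (raw[0], raw[1]), 0)

assert format_raw([0xdeadbeef, 0xdeadbeef]) == 0xdeadbeefdeadbeef
```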
The corresponding typemap going the other way is even uglier. We need to take our PyLong and persuade Python to convert it to a suitable string by calling the built-in hex() on it. Then we need to massage that into something we can read from two stringstreams (otherwise the first one steals all the input). This ends up looking like:
%typemap(in) my_128 {
    PyObject *moduleName = PyString_FromString((char*)"__builtin__");
    assert(moduleName);
    PyObject *module = PyImport_Import(moduleName);
    assert(module);
    PyObject *hex = PyObject_GetAttrString(module, (char*)"hex");
    assert(hex);
    PyObject *args = PyTuple_Pack(1, $input);
    assert(args);
    PyObject *result = PyObject_CallObject(hex, args);
    assert(result);
    std::string str(PyString_AsString(result));
    if (str.find("0x") != std::string::npos) {
        str = str.substr(2);
    }
    if (str.find("L") != std::string::npos) {
        str = str.substr(0, str.size() - 1);
    }
    assert(str.size());
    if (str.size() > 16) {
        PyErr_SetString(PyExc_ValueError, "Expected at most a 128-bit int");
        return NULL;
    }
    std::istringstream s1(str.substr(0, 8));
    if (!(s1 >> std::hex >> $1.raw[0])) {
        $1.raw[0] = 0;
    }
    std::istringstream s2(str.substr(8, 16));
    if (!(s2 >> std::hex >> $1.raw[1])) {
        $1.raw[1] = 0;
    }
    // TODO: check that these really worked!
}
This could use a bunch more error handling still.
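The string massaging above, modeled in plain Python (a hypothetical helper, not part of the interface; under Python 2, hex() on a long yields e.g. '0x...L', which is why the '0x' prefix and trailing 'L' are stripped — note also that the 16-hex-digit limit as written actually rejects anything above 64 bits, not 128):

```python
def parse_hex(n):
    s = hex(n).rstrip('L')   # Python 2's hex() appends 'L' to longs
    if s.startswith('0x'):
        s = s[2:]
    if len(s) > 16:
        raise ValueError("Expected at most a 128-bit int")
    # substr(0, 8) / substr(8, 16) in the typemap; empty chunk reads as 0
    return int(s[0:8] or '0', 16), int(s[8:24] or '0', 16)

assert parse_hex(0xdeadbeefdeadbeef) == (0xdeadbeef, 0xdeadbeef)
```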
I tested these with:
%module test
%{
#include <sstream>
#include <iomanip>
#include <string.h>
#include <iostream> //for testing
%}
// Typemaps go here
%inline {
    struct my_128 {
        u_int64_t raw[2];
    };

    my_128 create() {
        const my_128 r = {0xdeadbeef, 0xdeadbeef};
        return r;
    }

    void display(my_128 in) {
        std::cout << std::setfill('0') << std::setw(8) << std::hex << in.raw[0]
                  << std::setfill('0') << std::setw(8) << std::hex << in.raw[1]
                  << std::endl;
    }
}
Which from a quick initial test gives:
Python 2.7.1+ (r271:86832, Apr 11 2011, 18:05:24)
[GCC 4.5.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import test
>>> test.create()
16045690984833335023L
>>> test.display(16045690984833335023L)
deadbeefdeadbeef
>>> test.display(test.create())
deadbeefdeadbeef
>>> test.display(test.create()+1)
deadbeefdeadbef0
>>> test.display(test.create()+100)
deadbeefdeadbf53
>>> test.display(test.create()+10000000)
deadbeefdf46556f
>>> test.display(test.create()+100000000000000000000)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ValueError: Expected at most a 128-bit int
>>> test.display(test.create()+100000000000000000)
e01104683c37beef
>>> test.display(test.create()+10000000000000)
deadc8082d205eef
>>>
Although it doesn't yet handle "shorter" integers properly because of the substr() calls.
Upvotes: 0