Reputation: 13
Below is my JavaScript code:
a = "hello";
a_ascii = [];
for (var i = 0; i < a.length; i++) {
    a_ascii.push(a[i].charCodeAt(0));
}
a_typedArray = new Float32Array(a_ascii.length);
for (let i = 0; i < a_ascii.length; i++) {
    a_typedArray[i] = a_ascii[i];
}
a_buffer = Module._malloc(a_typedArray.length * a_typedArray.BYTES_PER_ELEMENT);
Module.HEAPF32.set(a_typedArray, a_buffer >> 2);
var result = Module.ccall(
    "myFunction",        // name of C function
    null,                // return type
    [Number, Number],    // argument types
    [a_buffer, a.length] // arguments
);
And below is the C code:
#include <stdio.h>
#include <emscripten.h>

extern "C"
{
    void EMSCRIPTEN_KEEPALIVE myFunction(int *a, int s)
    {
        printf("MyFunction Called\n");
        for (int i = 0; i < s; i++) {
            printf("%d ", a[i]);
        }
        printf("\n%d\n", s);
    }
}
The output of the C code is:
1120927744 1120534528 1121452032 1121452032 1121845248
5
though it should have been:
104 101 108 108 111
5
Please let me know what's wrong with the code.
I took reference from: link
Upvotes: 1
Views: 1666
Reputation: 75062
You are using a Float32Array in JavaScript while the C function takes an int*, so the same bytes are written as floats but read back as integers.

You should do either one of:

- change Float32Array to Int32Array (and HEAPF32 to HEAP32) in your JavaScript code, or
- change int* to float* and "%d " to "%.0f " in your C code.

Upvotes: 2
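A sketch of the first option, runnable without Emscripten for the typed-array part (the Module/HEAP32 lines are commented out because they assume an Emscripten build; note that ccall expects the type names as strings like "number", not the Number constructor). It also reproduces the question's garbage output by reinterpreting the Float32 bytes as 32-bit integers:

```javascript
const a = "hello";

// Build an Int32Array so the bytes match the C signature `int *a`.
const a_typedArray = new Int32Array([...a].map(c => c.charCodeAt(0)));
console.log(Array.from(a_typedArray).join(" ")); // 104 101 108 108 111

// With Emscripten, copy into the 32-bit *integer* heap view instead of HEAPF32:
//   const a_buffer = Module._malloc(a_typedArray.length * a_typedArray.BYTES_PER_ELEMENT);
//   Module.HEAP32.set(a_typedArray, a_buffer >> 2);
//   Module.ccall("myFunction", null, ["number", "number"], [a_buffer, a.length]);
//   Module._free(a_buffer);

// The original output is exactly what you get when Float32 bytes
// are read back as 32-bit ints:
const asFloats = new Float32Array([...a].map(c => c.charCodeAt(0)));
const reinterpreted = new Int32Array(asFloats.buffer); // same bytes, new type
console.log(Array.from(reinterpreted).join(" "));
// 1120927744 1120534528 1121452032 1121452032 1121845248
```

The second console.log shows that the numbers in the question are not random: each is the IEEE-754 bit pattern of the corresponding char code stored as a 32-bit float.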