Reputation: 367
I'm developing a React Native application that uses TensorFlow.js. I'm using cameraWithTensors from the tfjs-react-native package, which gives me images as tensors of type int32. I want to convert that tensor data into base64 so I can render it in the app.
I found a solution to convert base64 data into a tensor, shown below, but I cannot do the opposite. The tfjs docs describe the decodeJpeg method, but there is no equivalent for the reverse direction. I tried many solutions but none of them worked.
// FileSystem comes from expo-file-system, decodeJpeg from @tensorflow/tfjs-react-native
import * as FileSystem from 'expo-file-system';
import * as tf from '@tensorflow/tfjs';
import { decodeJpeg } from '@tensorflow/tfjs-react-native';

const URItoTensor = async URI => {
  // Read the file as a base64 string
  const imgB64 = await FileSystem.readAsStringAsync(URI, {
    encoding: FileSystem.EncodingType.Base64,
  });
  // Decode the base64 string into raw JPEG bytes
  const imgBuffer = tf.util.encodeString(imgB64, 'base64').buffer;
  const uint8 = new Uint8Array(imgBuffer);
  // Decode the JPEG bytes into an int32 image tensor
  const tensor = decodeJpeg(uint8);
  return tensor;
};
Upvotes: 2
Views: 1687
Reputation: 18381
You can use tf.browser.toPixels to draw the tensor onto a canvas and then use canvas.toDataURL() to get the base64 encoding:
const canvas = document.createElement('canvas');
// tensor.shape is [height, width, channels]
canvas.height = tensor.shape[0];
canvas.width = tensor.shape[1];
await tf.browser.toPixels(tensor, canvas);
canvas.toDataURL(); // will return the base64 encoding
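For example, wrapped into a small helper (tensorToBase64 is just an illustrative name, not part of tfjs), the resulting data URL can be assigned directly to an image element:

// Browser-only sketch; tensorToBase64 is a hypothetical helper name.
const tensorToBase64 = async tensor => {
  const canvas = document.createElement('canvas');
  canvas.height = tensor.shape[0]; // shape is [height, width, channels]
  canvas.width = tensor.shape[1];
  await tf.browser.toPixels(tensor, canvas);
  return canvas.toDataURL();       // e.g. "data:image/png;base64,..."
};

// Usage:
// document.querySelector('img').src = await tensorToBase64(imageTensor);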
The above will work in the browser but not in React Native. React Native has binaryToBase64 to convert from a typed array:
const bytes = tensor.dataSync(); // can also use the async .data
const encoded = binaryToBase64(bytes); // base64 string
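A minimal sketch of this in a React Native project, assuming binaryToBase64 is imported from React Native's internal utilities path and tensorToRawBase64 is just an illustrative name:

// React Native sketch; binaryToBase64 lives under RN's internal Libraries path.
import binaryToBase64 from 'react-native/Libraries/Utilities/binaryToBase64';

const tensorToRawBase64 = async tensor => {
  const bytes = await tensor.data(); // TypedArray holding the tensor's values
  return binaryToBase64(bytes);      // base64 string of those raw bytes
};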
Update
encodeString is not the method to use for converting a base64 string to a tensor whose underlying representation is an image; encodeString only encodes a string as a typed array. Here is the way to load an image into a tensor. There is no need to go through a base64 string at all.
// fetch and decodeJpeg come from @tensorflow/tfjs-react-native
import { fetch, decodeJpeg } from '@tensorflow/tfjs-react-native';

// Load an image as a Uint8Array
const response = await fetch('path/of/image', {}, { isBinary: true });
const imageDataArrayBuffer = await response.arrayBuffer();
const imageData = new Uint8Array(imageDataArrayBuffer);
// Decode image data to a tensor
const tensor = decodeJpeg(imageData);
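Wrapped into a reusable helper (imageToTensor is just an illustrative name), the same flow might look like this:

// Sketch only: same flow as above, as a helper function.
const imageToTensor = async uri => {
  const response = await fetch(uri, {}, { isBinary: true });
  const buffer = await response.arrayBuffer();
  return decodeJpeg(new Uint8Array(buffer)); // int32 tensor of shape [height, width, 3]
};

// Usage:
// const imageTensor = await imageToTensor('path/of/image');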
Upvotes: 4