Reputation: 498
I have a WebSocket server implemented in Java. When a client connects I want to send an image over this connection for the client to use in a canvas element. I have come up with the following server code:
public void onOpen(Connection connection) {
    try {
        BufferedImage image = ImageIO.read(new File("image.jpg"));
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ImageIO.write(image, "jpg", baos);
        byte[] byteArray = baos.toByteArray();
        connection.sendMessage(byteArray, 0, byteArray.length);
    } catch (Exception e) {
        System.out.println("Error: " + e.getMessage());
    }
}
The client-side JavaScript looks like this:
onmessage : function(m) {
    if (m.data) {
        if (m.data instanceof Blob) {
            var blob = m.data;
            var bytes = new Uint8Array(blob);
            var image = context.createImageData(canvas.width, canvas.height);
            for (var i = 0; i < bytes.length; i++) {
                image.data[i] = bytes[i];
            }
        }
    }
}
The connection works and the data is sent (blob.size has the correct value), but the image is not drawn onto the canvas. Firefox gives me the error message "TypeError: Value could not be converted to any of: HTMLImageElement, HTMLCanvasElement, HTMLVideoElement.".
I am aware that using WebSockets is not the best way to send an image to the client. After sending the image, the WebSocket is only used to send text messages.
What do I need to change for the image to be sent and applied to the canvas?
Resources used:
how to convert image to byte array in java?
Receive Blob in WebSocket and render as image in Canvas
Upvotes: 3
Views: 9023
Reputation: 520
Try converting the image to base64 before sending it. On the client you can then use the received string directly as an image source, for example:
function drawImage(imgString) {
    var canvas = document.getElementById("canvas");
    var ctx = canvas.getContext("2d");
    var image = new Image();
    image.src = imgString;
    image.onload = function() {
        ctx.drawImage(image, 0, 0);
    };
}
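If the server sends the base64 string as a text frame, m.data arrives in the browser as a plain string rather than a Blob, so the onmessage handler can simply call drawImage(m.data).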
Here's a link on how to convert the image to base64 in Java
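As a rough sketch of the server side (assuming the Connection class from the question also offers a sendMessage(String) overload for text frames, and that Java 8's java.util.Base64 is available):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Base64;

public void onOpen(Connection connection) {
    try {
        // Read the raw file bytes; no need to decode and re-encode via ImageIO
        byte[] bytes = Files.readAllBytes(Paths.get("image.jpg"));
        // Build a data URL that can be assigned directly to image.src on the client
        String dataUrl = "data:image/jpeg;base64," + Base64.getEncoder().encodeToString(bytes);
        // Send as a text frame, so m.data arrives as a string instead of a Blob
        connection.sendMessage(dataUrl);
    } catch (IOException e) {
        System.out.println("Error: " + e.getMessage());
    }
}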
Upvotes: 2