Reputation: 1054
I am following these two code examples: one for sending an image from Android and the other for attaching the received image to a canvas:
For sending an image from Android using a WebRTC data channel
For receiving an image on the web and attaching it to a canvas using a WebRTC data channel
https://io2014codelabs.appspot.com/static/codelabs/webrtc-file-sharing/#7
The idea is that I want to continuously send images of the screen from Android to the web, so that it looks like the Android screen is being shared and every change on the Android screen shows up on the canvas on the web.
Code on Android
This is the code to capture the Android screen.
public void startProjection() {
    startActivityForResult(projectionManager.createScreenCaptureIntent(), SCREEN_REQUEST_CODE);
}
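Here projectionManager is a MediaProjectionManager; since its initialization is not shown above, a minimal setup sketch (the SCREEN_REQUEST_CODE value is an arbitrary assumption):

private MediaProjectionManager projectionManager;
private static final int SCREEN_REQUEST_CODE = 100; // arbitrary request code

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // MediaProjectionManager hands out the screen-capture consent intent
    projectionManager = (MediaProjectionManager)
            getSystemService(Context.MEDIA_PROJECTION_SERVICE);
}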
This is the code to extract images from the captured screen:
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    switch (requestCode) {
        case SCREEN_REQUEST_CODE:
            mediaProjection = projectionManager.getMediaProjection(resultCode, data);
            if (mediaProjection != null) {
                projectionStarted = true;
                // Initialize the media projection
                DisplayMetrics metrics = getResources().getDisplayMetrics();
                int density = metrics.densityDpi;
                int flags = DisplayManager.VIRTUAL_DISPLAY_FLAG_OWN_CONTENT_ONLY
                        | DisplayManager.VIRTUAL_DISPLAY_FLAG_PUBLIC;
                Display display = getWindowManager().getDefaultDisplay();
                Point size = new Point();
                display.getSize(size);
                projectionDisplayWidth = size.x;
                projectionDisplayHeight = size.y;
                imageReader = ImageReader.newInstance(projectionDisplayWidth, projectionDisplayHeight,
                        PixelFormat.RGBA_8888, 2);
                mediaProjection.createVirtualDisplay("screencap",
                        projectionDisplayWidth, projectionDisplayHeight, density,
                        flags, imageReader.getSurface(), null, handler);
                imageReader.setOnImageAvailableListener(new ImageAvailableListener(), handler);
            }
            break;
    }
}
Here is the ImageAvailableListener class:
private class ImageAvailableListener implements ImageReader.OnImageAvailableListener {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image image = null;
        FileOutputStream fos = null;
        Bitmap bitmap = null;
        ByteArrayOutputStream stream = null;
        try {
            image = imageReader.acquireLatestImage();
            if (image != null) {
                Image.Plane[] planes = image.getPlanes();
                ByteBuffer buffer = planes[0].getBuffer();
                int pixelStride = planes[0].getPixelStride();
                int rowStride = planes[0].getRowStride();
                int rowPadding = rowStride - pixelStride * projectionDisplayWidth;
                // create bitmap
                bitmap = Bitmap.createBitmap(projectionDisplayWidth + rowPadding / pixelStride,
                        projectionDisplayHeight, Bitmap.Config.ARGB_8888);
                bitmap.copyPixelsFromBuffer(buffer);
                stream = new ByteArrayOutputStream();
                bitmap.compress(Bitmap.CompressFormat.JPEG, 5, stream);
                ByteBuffer byteBuffer = ByteBuffer.wrap(stream.toByteArray());
                DataChannel.Buffer buf = new DataChannel.Buffer(byteBuffer, true);
                Log.w("CONFERENCE_SCREEN", "Image size less than chunk size condition");
                client.sendDataChannelMessage(buf);
                imagesProduced++;
                Log.w("CONFERENCE_SCREEN", "captured image: " + imagesProduced);
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            if (fos != null) {
                try {
                    fos.close();
                } catch (IOException ioe) {
                    ioe.printStackTrace();
                }
            }
            if (stream != null) {
                try {
                    stream.close();
                } catch (IOException ioe) {
                    ioe.printStackTrace();
                }
            }
            if (bitmap != null) {
                bitmap.recycle();
            }
            if (image != null) {
                image.close();
            }
        }
    }
}
Code on Web
Creating Canvas:
var canvas = document.createElement('canvas');
canvas.classList.add('incomingPhoto');
screenAndroidImage.insertBefore(canvas, screenAndroidImage.firstChild); // screenAndroidImage is a div
I run the following code whenever an image is sent from Android:
if (data.data.byteLength || typeof data.data !== 'string') {
    var context = canvas.getContext('2d');
    var img = context.createImageData(300, 150);
    img.data.set(data.data);
    context.putImageData(img, 0, 0);
    trace("Image chunk received");
}
I can see the image data being received as ArrayBuffer{} in the web console, but nothing is rendered on the canvas.
Upvotes: 2
Views: 2600
Reputation: 1054
I found the mistake and the correction. First of all, the ImageAvailableListener class needs to be changed to handle the case where the image size exceeds the byte limit of the WebRTC data channel. If the image is larger than that limit, we break it into smaller byte chunks, as sketched after the code below.
private class ImageAvailableListener implements ImageReader.OnImageAvailableListener {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image image = null;
        FileOutputStream fos = null;
        Bitmap bitmap = null;
        ByteArrayOutputStream stream = null;
        try {
            image = imageReader.acquireLatestImage();
            if (image != null) {
                Image.Plane[] planes = image.getPlanes();
                ByteBuffer buffer = planes[0].getBuffer();
                int pixelStride = planes[0].getPixelStride();
                int rowStride = planes[0].getRowStride();
                int rowPadding = rowStride - pixelStride * projectionDisplayWidth;
                // create bitmap
                bitmap = Bitmap.createBitmap(projectionDisplayWidth + rowPadding / pixelStride,
                        projectionDisplayHeight, Bitmap.Config.ARGB_8888);
                bitmap.copyPixelsFromBuffer(buffer);
                stream = new ByteArrayOutputStream();
                bitmap.compress(Bitmap.CompressFormat.JPEG, 5, stream);
                if (stream.toByteArray().length < 16000) {
                    ByteBuffer byteBuffer = ByteBuffer.wrap(stream.toByteArray());
                    DataChannel.Buffer buf = new DataChannel.Buffer(byteBuffer, true);
                    Log.w("CONFERENCE_SCREEN", "Image size less than chunk size condition");
                    client.sendDataChannelMessage(buf);
                    client.sendDataChannelMessage(new DataChannel.Buffer(Utility.toByteBuffer("\n"), false));
                } else {
                    // todo break files in pieces here
                    ByteBuffer byteBuffer = ByteBuffer.wrap(stream.toByteArray());
                    DataChannel.Buffer buf = new DataChannel.Buffer(byteBuffer, true);
                    client.sendDataChannelMessage(buf);
                    client.sendDataChannelMessage(new DataChannel.Buffer(Utility.toByteBuffer("\n"), false));
                    // skylinkConnection.sendData(currentRemotePeerId, stream.toByteArray());
                    Log.w("CONFERENCE_SCREEN", "sending screen data to peer :");
                }
                imagesProduced++;
                Log.w("CONFERENCE_SCREEN", "captured image: " + imagesProduced);
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            if (fos != null) {
                try {
                    fos.close();
                } catch (IOException ioe) {
                    ioe.printStackTrace();
                }
            }
            if (stream != null) {
                try {
                    stream.close();
                } catch (IOException ioe) {
                    ioe.printStackTrace();
                }
            }
            if (bitmap != null) {
                bitmap.recycle();
            }
            if (image != null) {
                image.close();
            }
        }
    }
}
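Note that the else branch above still sends the whole buffer (see the todo). A minimal chunking sketch, assuming the same client.sendDataChannelMessage and Utility.toByteBuffer APIs, a hypothetical sendImageInChunks helper, and a 16000-byte chunk size; the total size is announced first as a text message so the web side (below) can allocate its buffer:

private static final int CHUNK_SIZE = 16000; // assumed data channel payload limit

private void sendImageInChunks(byte[] bytes) {
    // Announce the total byte count as a text frame; the web listener
    // below parses this string with parseInt() to allocate its buffer.
    client.sendDataChannelMessage(new DataChannel.Buffer(
            Utility.toByteBuffer(String.valueOf(bytes.length)), false));
    // Send the JPEG bytes in binary frames of at most CHUNK_SIZE bytes.
    for (int offset = 0; offset < bytes.length; offset += CHUNK_SIZE) {
        int length = Math.min(CHUNK_SIZE, bytes.length - offset);
        ByteBuffer chunk = ByteBuffer.wrap(bytes, offset, length);
        client.sendDataChannelMessage(new DataChannel.Buffer(chunk, true));
    }
}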
Code on Web
The following variables should be declared outside the function that listens for incoming bytes on the data channel:
var buf;
var chunks = [];
var count;
The body of the function that listens on the data channel:
if (typeof data.data === 'string') {
    buf = new Uint8ClampedArray(parseInt(data.data));
    count = 0;
    chunks = [];
    console.log('Expecting a total of ' + buf.byteLength + ' bytes');
    return;
}
var imgdata = new Uint8ClampedArray(data.data);
console.log('image chunk');
buf.set(imgdata, count);
chunks[count] = data.data;
count += imgdata.byteLength;
if (count === buf.byteLength) {
    // we're done: all data chunks have been received
    //renderPhoto(buf);
    var builder = new Blob(chunks, buf.type);
    console.log('full image received');
    screenViewer.src = URL.createObjectURL(builder);
}
Here screenViewer is an HTML image element.
Upvotes: 0
Reputation: 71
It seems that SkylinkJS does not support binary transfers at the moment. A possible solution is to encode the bytes into a Base64 string and send it as a P2P message to the web end; there, convert the Base64 string back into an image and draw it on the canvas, as sketched below.
For the Android SDK API docs, see MessagesListener and sendP2PMessage; for the Web SDK API docs, see incomingMessage.
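A minimal sketch of the Android side, assuming the sendP2PMessage API mentioned above accepts the remote peer id and a string payload (skylinkConnection and currentRemotePeerId as in the snippets above; Base64 is android.util.Base64):

// Hypothetical helper: encode the compressed JPEG frame as Base64 and
// send it as a P2P string message instead of a binary data channel frame.
private void sendFrameAsBase64(byte[] jpegBytes) {
    String encoded = Base64.encodeToString(jpegBytes, Base64.NO_WRAP);
    skylinkConnection.sendP2PMessage(currentRemotePeerId, encoded);
}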
Upvotes: 1