Reputation: 263
I'm working on an Android app that grabs an image and sends it to a PC client for display; both the Android app and the PC application use OpenCV. The image that I want to send over is a color image (grabbed in the RGBA format).
First I grab an image in the Java app using:
InputImage = inputFrame.rgba();
Next I take the Mat image variable and convert the input image to a byte array using the following native (JNI) function:
JNIEXPORT jbyteArray JNICALL Java_com_example_communicationmoduleTCPIP_communicationmoduleTCPIP_ConvertImageToByteArray(
JNIEnv* Env, jobject,
jlong addrInputImage){
Mat& OutputImg = *(Mat*) addrInputImage;
jbyteArray Array;
//__android_log_write(ANDROID_LOG_ERROR, "Tag", "==== 1 ");
// Init java byte array
Array = Env->NewByteArray(1536000);
//__android_log_write(ANDROID_LOG_ERROR, "Tag", "==== 2 ");
// Set byte array region with the size of the SendData CommStruct.
// Now we can send the data back.
Env->SetByteArrayRegion(Array, 0, 1536000, (jbyte*)OutputImg.data);
//__android_log_write(ANDROID_LOG_ERROR, "Tag", "==== 3 ");
return Array;
}
Next I send the byte array (containing the image data) over TCP with the following function:
// Send buffer, the method can be used by both client and server objects.
public void SendByteBuffer(byte[] ByteBuffer){
try {
// Get socket output stream
OutputStream OutputBuffer = ClientSocket.getOutputStream();
//Write byte data to outputstream
OutputBuffer.write(ByteBuffer);
}
catch (IOException e) {
Log.e("TCPIPCommunicator: ", "Client: Failed to send", e);
e.printStackTrace();
}
}
On the PC side (in C++) I receive the buffer with a Boost.Asio read function:
int CommunicationModuleTCPIPServer::RecieveBuffer(char Buffer[], int Size){
boost::asio::read(ServerSocket, boost::asio::buffer(TempBuffer, 1536000));
//boost::asio::read(ServerSocket, boost::asio::buffer((char*)InputImage, Size));
//cout <<"Temp buffer: " << TempBuffer << endl;
int ptr=0;
for (int i = 0; i < InputImage.rows; i++) {
for (int j = 0; j < InputImage.cols; j++) {
InputImage.at<cv::Vec4b>(i,j) = cv::Vec4b(TempBuffer[ptr+ 0],TempBuffer[ptr+1],TempBuffer[ptr+2],TempBuffer[ptr+3]);
ptr=ptr+3;
}
}
return COMMSUCCES;
}
And then I display the variable with OpenCV's imshow function. The problem is that I don't get an image in the window on the PC side. I think the conversion is going wrong somewhere, but I don't see where. Does anybody have an idea? All suggestions and feedback are welcome!
The code that I have so far is as follows:
On the PC server side I run a small program that receives the image and tries to show it with imshow; it calls the RecieveImageBuffer function. Below is the main function.
int ImgReciever(){
cout << "Setting up monitor server with ip: 192.168.2.11:5000" << endl;
CommunicationModuleTCPIPServer TICM("192.168.2.11", 5000);
//CommunicationModuleTCPIPServer TICM("192.168.1.103", 5000);
VisionModule VM;
TICM.RunServer();
char ACK[10];
ACK[0] = '@';
while(1){
cout << "Recieving data...." << endl;
TICM.RecieveImageBuffer(TICM.TempBuffer, 384000);
cout << "Data recieved, going to display image" << endl;
imshow("Testwindow", TICM.InputImage);
Sleep(1000);
TICM.SendBuffer(ACK, 1);
}
return 0;
}
The RecieveImageBuffer function (I tried switching the row and column for-loops, but then the program crashes). The read function always receives 34800 bytes in one read. TempBuffer is declared as:
uchar TempBuffer[384000];
int CommunicationModuleTCPIPServer::RecieveImageBuffer(uchar Buffer[], int Size){
Mat Temp;
int bytecount = 0;
int bytecountTotal = 0;
while(bytecountTotal < Size){
bytecount = boost::asio::read(ServerSocket, boost::asio::buffer(Buffer, Size));
cout << "Recieved chunck size: " << bytecount << endl;
bytecountTotal = bytecountTotal + bytecount;
bytecount = 0;
}
cout << "Recieved in total: " << bytecountTotal << endl; // note: << not + (string + int is pointer arithmetic)
int ptr=0;
for (int i = 0; i < InputImage.rows; i++) {
for (int j = 0; j < InputImage.cols; j++) {
//InputImage.at<cv::Vec4b>(i,j) = cv::Vec4b(TempBuffer[ptr+ 0],TempBuffer[ptr+1],TempBuffer[ptr+2],TempBuffer[ptr+3]);
//ptr=ptr+3;
ptr++;
InputImage.at<uchar>(i,j) = TempBuffer[ptr];
}
}
return COMMSUCCES;
}
On the Android side I send the image. First I grab and convert the grayscale image; the buffer size matches that of the PC buffer.
OutputImage = inputFrame.gray();
long Size = (OutputImage.total() * OutputImage.channels());
CMTCP.bufferByte = new byte[(int) Size];
CMTCP.bufferByte = ConvertMatToByteArray(OutputImage.getNativeObjAddr(), Size);
The ConvertMatToByteArray function looks as follows:
JNIEXPORT jbyteArray JNICALL Java_com_example_opencv1_MainActivity_ConvertMatToByteArray(
JNIEnv* Env, jobject, jlong addrInputImage, jlong Size){
Mat& OutputImg = *(Mat*) addrInputImage;
jbyteArray Array;
// Init java byte array
Array = Env->NewByteArray(Size);
// Set byte array region with the size of the SendData CommStruct.
// Now we can send the data back.
Env->SetByteArrayRegion(Array, 0, Size, (jbyte*)OutputImg.data);
return Array;
}
The TCP client runs in a thread that sends an image buffer and then waits for an ACK before the next send.
The thread function looks as follows:
while(true){
if(SendFrame){
//System.out.println("Converting image");
// Convert Buffer
//ImgDataSend = ConvertImageToByteArray(InputImage.getNativeObjAddr(), 10, 20, 30);
//System.out.println("Image converted");
// Send new image buffer
System.out.println("Sending frame data");
SendByteBuffer(bufferByte);
SendFrame = false;
System.out.println("Image data send");
// Wait for ACK
while(NoAck){
RecieveBuffer();
//System.out.println("Recieved: " +RecieveString);
if("@".equals(RecieveString)){ // String equality, not == (which compares references)
System.out.println("GOT ACK");
bufferByte = new byte[384000];
NoAck = false;
RecieveString = "#";
}
}
}
The SendByteBuffer function looks as follows:
// Send buffer, the method can be used by both client and server objects.
public void SendByteBuffer(byte[] ByteBuffer){
try {
// Get socket output stream
OutputStream OutputBuffer = ClientSocket.getOutputStream();
//Write byte data to outputstream
OutputBuffer.write(ByteBuffer); // write the parameter, not the field
//OutputBuffer.flush();
}
catch (IOException e) {
Log.e("TCPIPCommunicator: ", "Client: Failed to send", e);
e.printStackTrace();
}
}
I see the send and ACK log messages, but the window on the PC side that displays the image stays gray and does not respond (it says so in the title bar).
Upvotes: 2
Views: 1146
Reputation: 731
I presume you've verified that the comms exchange works OK and there are no little/big-endian issues between the platforms, i.e. send a small byte string and check it's preserved intact on the other side.
Also, check that read returns all the data - you might need to call it until all the data has been received, especially for large byte buffers.
inputFrame.rgba() returns a 4-channel image - I think you need ptr=ptr+4;
Upvotes: 1