Reputation: 856
I have to get the live video stream from a DJI Phantom 3 camera into my C++ application, in order to do Computer Vision processing with OpenCV.
First I tried sending the raw H264 data through a UDP socket, inside this callback:
mReceivedVideoDataCallBack = new CameraReceivedVideoDataCallback() {
    @Override
    public void onResult(byte[] videoBuffer, int size) {
        // Here I call a method from a class I created that sends the buffer over UDP
        if (gravar_trigger) controleVideo.enviarFrame(videoBuffer, size);
        // Keep feeding DJI's codec manager so the in-app preview keeps working
        if (mCodecManager != null) mCodecManager.sendDataToDecoder(videoBuffer, size);
    }
};
That communication works well. However, I haven't been able to decode that UDP H264 data in my C++ desktop application. I tested with the FFmpeg library, but couldn't manage to fill an AVPacket with my UDP data in order to decode it using avcodec_send_packet and avcodec_receive_frame. I also had problems with the AVCodecContext, since my UDP communication isn't a stream like RTSP, where the decoder could get information about its source. Therefore, I had to change how I was trying to solve the problem.
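For reference, decoding raw Annex-B H264 without a container seems to be possible if the bytes are first run through FFmpeg's parser, which finds the NAL boundaries and lets the decoder pick up SPS/PPS from the bitstream itself. A minimal sketch, assuming a hypothetical recv_udp() helper that reads one datagram into a buffer:

// Minimal sketch: decode raw Annex-B H264 arriving over UDP.
extern "C" {
#include <libavcodec/avcodec.h>
}
#include <cstdint>

// Hypothetical helper: reads one UDP datagram, returns its size in bytes.
int recv_udp(uint8_t* buf, int cap);

void decode_loop() {
    const AVCodec* codec = avcodec_find_decoder(AV_CODEC_ID_H264);
    AVCodecContext* ctx = avcodec_alloc_context3(codec);
    AVCodecParserContext* parser = av_parser_init(AV_CODEC_ID_H264);
    avcodec_open2(ctx, codec, nullptr);

    AVPacket* pkt = av_packet_alloc();
    AVFrame* frame = av_frame_alloc();
    uint8_t buf[65536];

    for (;;) {
        int n = recv_udp(buf, sizeof(buf));
        uint8_t* data = buf;
        while (n > 0) {
            // The parser buffers bytes until it has a complete packet,
            // so filling the AVPacket is handled for us here.
            int used = av_parser_parse2(parser, ctx, &pkt->data, &pkt->size,
                                        data, n, AV_NOPTS_VALUE, AV_NOPTS_VALUE, 0);
            data += used;
            n -= used;
            if (pkt->size == 0) continue;
            if (avcodec_send_packet(ctx, pkt) == 0) {
                while (avcodec_receive_frame(ctx, frame) == 0) {
                    // frame->data[0..2] hold the YUV planes, ready for OpenCV.
                }
            }
        }
    }
}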
Then I found libstreaming, which can be used to stream the Android camera to a Wowza server, creating something like an RTSP stream connection where the data could easily be obtained in my final C++ application using OpenCV's VideoCapture (see the sketch below). However, libstreaming uses its own surfaceView. In other words, I would have to link the libstreaming surfaceView with the DJI Drone's videoSurface. I'm really new to Android, so I don't have any clue how to do that.
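On the desktop side, that part would indeed be simple; a minimal sketch, assuming a hypothetical RTSP URL published by Wowza:

// Minimal sketch: read the re-published RTSP stream with OpenCV.
#include <opencv2/opencv.hpp>

int main() {
    // Placeholder URL: whatever Wowza actually exposes goes here.
    cv::VideoCapture cap("rtsp://wowza-host:1935/live/drone");
    if (!cap.isOpened()) return 1;
    cv::Mat frame;
    while (cap.read(frame)) {
        cv::imshow("drone", frame);      // frame is a BGR cv::Mat
        if (cv::waitKey(1) == 27) break; // Esc quits
    }
    return 0;
}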
To sum up, is that the correct approach? Does someone have a better idea? Thanks in advance.
Upvotes: 1
Views: 1934
Reputation: 856
After a long time, I finally developed a system that can stream the DJI drone camera correctly:
https://github.com/raullalves/DJI-Drone-Camera-Streaming
Upvotes: 1
Reputation: 311
I'm going to wager a couple of things. Well, mostly one thing: one typically needs to create fragmented video packets before sending them. The IDR frames of H264 are too large for UDP streaming.
Given a solid comms link between endpoints, you can add a method that converts a single, potentially large input packet into one or more small output packets.
Packets larger than perhaps 1000 bytes need to be broken into several H264 NALUs of type 28 (FU-A). Packets that are small and share the same timestamp can be aggregated into a STAP-A, type 24. Typically you can find in-band SPS/PPS in a STAP-A.
Once you have a packetizer for IDRs and large slices, write your depacketizer on the receiver, and then you should get cleanly decoded pictures.
Refer to the H264 RTP payload spec (RFC 6184) for how to make the type 28s; a sketch of the fragment layout follows.
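To make that concrete, here is a minimal sketch of the FU-A payload layout in C++, which mirrors what the Java sender and the C++ receiver both need to agree on; it covers only the payload bytes (no RTP header, sequence numbers, or timestamps) and uses the ~1000-byte figure above as the fragment size:

// Minimal sketch: fragment one NAL unit (start code already stripped)
// into FU-A (type 28) payloads per RFC 6184.
#include <cstdint>
#include <vector>

std::vector<std::vector<uint8_t>> fragment_fua(const uint8_t* nal, size_t len,
                                               size_t mtu = 1000) {
    std::vector<std::vector<uint8_t>> out;
    if (len <= mtu) {                      // small NAL: send as-is
        out.emplace_back(nal, nal + len);  // (or aggregate into a STAP-A)
        return out;
    }
    uint8_t indicator = (nal[0] & 0xE0) | 28; // keep F+NRI bits, type = 28
    uint8_t type = nal[0] & 0x1F;             // original NAL type
    const uint8_t* p = nal + 1;               // payload after the NAL header
    size_t left = len - 1;
    bool first = true;
    while (left > 0) {
        size_t chunk = left < mtu ? left : mtu;
        std::vector<uint8_t> pkt;
        pkt.push_back(indicator);
        uint8_t fu_header = type;
        if (first) fu_header |= 0x80;         // S bit: first fragment
        if (chunk == left) fu_header |= 0x40; // E bit: last fragment
        pkt.push_back(fu_header);
        pkt.insert(pkt.end(), p, p + chunk);
        out.push_back(std::move(pkt));
        p += chunk; left -= chunk; first = false;
    }
    return out;
}

The depacketizer reverses this: concatenate the fragment payloads behind a single NAL header byte rebuilt as (indicator & 0xE0) | (fu_header & 0x1F).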
Upvotes: 1