Yogesh Kulkarni

Reputation: 887

Unable to send video as RTP stream from Android

I am trying to make an Android application which will send the camera output to a server as an RTP stream, but it is not working as expected. I am doing the following steps:

  1. In the Activity class, implemented the SurfaceTextureListener interface and in onCreate() created a TextureView and added the listener.

  2. In public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height), created and initialized the Camera and a MediaCodec instance to encode the camera output to H.264. Also added a PreviewCallback for the camera as follows (the encode() step is sketched after this list) -

    mCamera.setPreviewCallback(new Camera.PreviewCallback() {
    
        @Override
        public void onPreviewFrame(byte[] data, Camera camera) {
            // Here the encode method encodes the frame using MediaCodec and sends it to a LocalSocket.
            encode(data);
        }
    });
    
  3. Another AsyncTask then reads from this LocalSocket and sends the data to a DatagramSocket, adding an RTP header to each packet.

  4. I am testing this code in VLC by giving it an SDP file (a minimal example is shown further below), but VLC does not play any video. If I open a UDP stream in VLC (udp://@:5001), then in Media Information VLC shows some data under "Read at Media" and "Input Bitrate", which means my app is sending some data to that UDP port. I also tried saving the video on the Android device, and my app saves a proper video from the same MediaCodec and Camera code.
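
For context, the encode() step referenced above is essentially the standard pre-API-21 MediaCodec input/output buffer loop. This is a simplified sketch rather than my exact code; mMediaCodec and mOutputStream (the LocalSocket's output stream) are placeholder names:

    private void encode(byte[] data) {
        ByteBuffer[] inputBuffers = mMediaCodec.getInputBuffers();
        ByteBuffer[] outputBuffers = mMediaCodec.getOutputBuffers();

        // Feed the raw preview frame to the encoder.
        int inIndex = mMediaCodec.dequeueInputBuffer(10000);
        if (inIndex >= 0) {
            inputBuffers[inIndex].clear();
            inputBuffers[inIndex].put(data);
            mMediaCodec.queueInputBuffer(inIndex, 0, data.length,
                    System.nanoTime() / 1000, 0);
        }

        // Drain all available encoded output and write it to the LocalSocket.
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int outIndex = mMediaCodec.dequeueOutputBuffer(info, 0);
        while (outIndex >= 0) {
            byte[] outData = new byte[info.size];
            outputBuffers[outIndex].get(outData);
            outputBuffers[outIndex].clear();
            try {
                mOutputStream.write(outData);
            } catch (IOException e) {
                e.printStackTrace();
            }
            mMediaCodec.releaseOutputBuffer(outIndex, false);
            outIndex = mMediaCodec.dequeueOutputBuffer(info, 0);
        }
    }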

RTP Header and packet formation code

    int Version; // 2 bits
    int Padding; // 1 bit
    int Extension; // 1 bit
    int CC; // 4 bits
    int Marker; // 1 bit
    int PayloadType=96; // 7 bits
    int Ssrc; // 32 bits
    Version = 2;
    Padding = 0;
    Extension = 0;
    CC = 0;
    Marker = 0;
    Ssrc = 0;
    byte[] header = new byte[ 12 ];
    long timeStamp = System.currentTimeMillis();
    mSeq = ++mSeq + 1;
    header[0] = (byte)(Version << 6);
    header[0] = (byte)(header[0] | Padding << 5);
    header[0] = (byte)(header[0] | Extension << 4);
    header[0] = (byte)(header[0] | CC);
    header[1] = (byte)(header[1] | Marker << 7);
    header[1] = (byte)(header[1] | PayloadType);
    header[2] = (byte)(mSeq >> 8);
    header[3] = (byte)(mSeq & 0xFF);
    header[4] = (byte)(timeStamp >> 24);
    header[5] = (byte)(timeStamp >> 16);
    header[6] = (byte)(timeStamp >> 8);
    header[7] = (byte)(timeStamp & 0xFF);
    header[8] = (byte)(Ssrc >> 24);
    header[9] = (byte)(Ssrc >> 16);
    header[10] = (byte)(Ssrc >> 8);
    header[11] = (byte)(Ssrc & 0xFF);
    mBuffers = new byte[1400];
    System.arraycopy(header, 0, mBuffers, 0, header.length);
    System.arraycopy(buf, 0, mBuffers, 12, buf.length);
    DatagramPacket out = new DatagramPacket(mBuffers, mBuffers.length, hostAddress, 5001);
    socket.send(out);

I tried to fix my code by removing the first 4 bytes of each packet, since someone on Stack Overflow said that for AVC we need to remove the first 4 bytes. I also double-checked my RTP header, but no luck...
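
For completeness, the SDP file I give to VLC is along these lines (a minimal sketch; the addresses are placeholders, while port 5001 and payload type 96 match the code above):

    v=0
    o=- 0 0 IN IP4 127.0.0.1
    s=Android camera stream
    c=IN IP4 0.0.0.0
    t=0 0
    m=video 5001 RTP/AVP 96
    a=rtpmap:96 H264/90000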

Any idea why my code is failing to send video as RTP?

Upvotes: 2

Views: 1597

Answers (2)

Robert Rowntree

Reputation: 6289

There may be another approach besides taking on the packetization and the networking RFCs covering the RTP protocol yourself.

Find a lib that does it for you.

This project is built on Netty (should be OK on Android).

I mention it because I looked at it some time ago for doing SDP/RTP on Android in a SIP/VoIP context and found it to be serviceable/workable.

If you burn out at the raw packetization level (I would not want to be testing that with Wireshark et al. over adb), you could look over his ./src/test/**/session folder, I think, to get an idea of how his tests run. You should be able to find the RTP-level stuff pretty easily, and AFAIK the packetization and RFC handling is good.

In general, I believe you would extend some kind of "Session" that just wraps/hooks your video channels/streams, where his examples may be doing voice/RTP packetizing.

Upvotes: 1

mstorsjo

Reputation: 13317

You can't just add the RTP header; you also need to reformat the encoded buffers to fit into one or more fixed-length RTP packets (aka "packetize") according to the H.264 RTP payload format. See RFC 6184 for the full specification.

If the H.264 packets are short enough to fit in the 1400-byte packet size, then yes, it's enough to just remove the first 4 bytes (assuming that the first 4 bytes are 0, 0, 0, 1). If the output buffer from the encoder contains more than one NAL unit (if there are more occurrences of the sequence [0,] 0, 0, 1 in the buffer), then you would need to either send each NAL unit in a separate packet (see the sketch below), or use one of the more elaborate packetization schemes; see the RFC for more details on this.
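
To illustrate, splitting the encoder output on Annex B start codes could look something like the following sketch (untested; findNalUnits is just an illustrative name, and you'd need the java.util imports):

    // Return each NAL unit in buf, with the start codes stripped.
    private static List<byte[]> findNalUnits(byte[] buf, int length) {
        List<byte[]> nalUnits = new ArrayList<>();
        int start = -1; // first byte of the current NAL unit, -1 before the first start code
        int i = 0;
        while (i + 2 < length) {
            int codeLen = 0;
            if (buf[i] == 0 && buf[i + 1] == 0) {
                if (buf[i + 2] == 1) {
                    codeLen = 3; // 00 00 01
                } else if (i + 3 < length && buf[i + 2] == 0 && buf[i + 3] == 1) {
                    codeLen = 4; // 00 00 00 01
                }
            }
            if (codeLen > 0) {
                if (start >= 0) {
                    nalUnits.add(Arrays.copyOfRange(buf, start, i)); // previous NAL unit
                }
                i += codeLen;
                start = i;
            } else {
                i++;
            }
        }
        if (start >= 0) {
            nalUnits.add(Arrays.copyOfRange(buf, start, length)); // last NAL unit
        }
        return nalUnits;
    }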

Secondly, you're currently sending the full 1400-byte packet even if the actual encoded payload is shorter. I'm not sure how many problems this causes, or if it can pass unnoticed, but you really should only send as many bytes as you actually filled. (That is, instead of mBuffers.length, use 12 + buf.length.)
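
That is, something along these lines, reusing the variable names from your question (nal here would be one NAL unit from the splitting step above):

    // Send exactly 12 header bytes plus the payload, not the whole 1400-byte buffer.
    byte[] packet = new byte[12 + nal.length];
    System.arraycopy(header, 0, packet, 0, 12);
    System.arraycopy(nal, 0, packet, 12, nal.length);
    socket.send(new DatagramPacket(packet, packet.length, hostAddress, 5001));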

Upvotes: 1
