Reputation: 281
Hi everybody, I hope you can help me out with this.
The problem: I have an RTP stream which I'm multicasting on my private network (WiFi). I would like to use a number of Android tablets to display the stream. The number of tablets cannot be restricted and the quality should not degrade as the number of clients grows. This explains why I need multicasting rather than unicasts.
The approach: Theoretically, by creating an RTSP or HTTP stream on the server side I should be able to serve the video to my clients. However, my understanding is that the server would take a performance hit when too many clients connect at the same time, which I need to avoid. Ideally I would like all clients to simply be listening on the very same multicast; that way the number of clients would have no impact on server performance. [NOTE: The IP is local and the TTL is set to 0/1, so there is no danger of clogging anything other than my own network with the multicast packets.]
The implementation: To implement the approach above I thought I would write a multicast client on Android that receives the RTP packets and stitches the stream back together. I tried this with a JPEG payload and it works quite well. The problem with JPEG, however, is that the BitmapFactory.decodeByteArray call to decode each frame is very expensive (almost 100 ms!), which limits the frame rate considerably. The load on the network is also quite high, since JPEG is not an efficient video streaming format.
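Roughly, my receiver looks like the following sketch (the group address and port are placeholders, and sequence-number reordering plus the RFC 2435 JPEG payload header are left out for brevity):

```java
import java.net.DatagramPacket;
import java.net.InetAddress;
import java.net.MulticastSocket;

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

public class RtpJpegReceiver implements Runnable {

    private static final String GROUP = "239.0.0.1"; // placeholder multicast group
    private static final int PORT = 5004;            // placeholder port
    private static final int RTP_HEADER_LEN = 12;    // fixed RTP header, no CSRCs/extensions

    private volatile boolean running = true;

    @Override
    public void run() {
        MulticastSocket socket = null;
        try {
            // Note: on Android a WifiManager.MulticastLock is usually needed
            // to receive multicast packets over WiFi.
            socket = new MulticastSocket(PORT);
            socket.joinGroup(InetAddress.getByName(GROUP));

            byte[] buf = new byte[64 * 1024];
            while (running) {
                DatagramPacket packet = new DatagramPacket(buf, buf.length);
                socket.receive(packet); // blocks until the next RTP packet arrives

                // Strip the fixed 12-byte RTP header. A complete receiver would
                // also parse the JPEG payload header and reassemble fragmented
                // frames until the RTP marker bit is set.
                int offset = RTP_HEADER_LEN;
                int length = packet.getLength() - offset;

                // This is the expensive call (~100 ms per frame on my tablets).
                Bitmap frame = BitmapFactory.decodeByteArray(buf, offset, length);
                if (frame != null) {
                    // hand the frame over to the UI thread / an ImageView here
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            if (socket != null) {
                socket.close();
            }
        }
    }

    public void stop() {
        running = false;
    }
}
```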
What I would like to do is to do for video what I already did for pictures, i.e. stitch together the payload stream (e.g. MPEG-4) from the RTP packets and feed it to "something". Initially I thought VideoView would work with a raw input stream, but I was wrong: VideoView seems to work only with an RTSP or HTTP URL (correct?).
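As far as I can tell, the public API really only takes a path or a URI, roughly like this (the RTSP URL is just a placeholder for illustration):

```java
import android.app.Activity;
import android.net.Uri;
import android.os.Bundle;
import android.widget.VideoView;

public class PlayerActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // VideoView (a thin wrapper around MediaPlayer) only accepts its data
        // source as a path or URI, e.g. an RTSP or HTTP URL; there is no way
        // to push raw RTP payload bytes into it from my own socket.
        VideoView videoView = new VideoView(this);
        setContentView(videoView);
        videoView.setVideoURI(Uri.parse("rtsp://192.168.0.2:8554/live")); // placeholder URL
        videoView.start();
    }
}
```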
Solution? Now, what are my options? I'd like to avoid setting up an RTSP server from the raw RTP stream and serving all tablets from it, for the reasons above. I did look around for two days and checked all the solutions proposed on SO and on the net, but nothing seemed to apply to my problem (an RTSP URL or a unicast was the solution in most cases, but I don't think I can use either), so I thought it was finally time to ask this question.
Any help is very appreciated!
cheers
Upvotes: 3
Views: 2782
Reputation: 21
OK, I checked: I used the overloaded BitmapFactory.decodeByteArray that takes a BitmapFactory.Options, with the inBitmap field set to a mutable Bitmap so it gets reused across calls. There may have been something else I had to do for the Bitmap itself; I probably made it static at the very least. There may have been some other flags to set as well, but you should definitely have enough to go on now.
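Something along these lines (just a sketch from memory, names are illustrative; note that inBitmap needs API 11+, and before Android 4.4 the reused bitmap has to match the decoded frame size exactly):

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

public class FrameDecoder {

    private final BitmapFactory.Options options = new BitmapFactory.Options();
    private Bitmap reusable; // pixel buffer reused across frames

    public FrameDecoder() {
        options.inMutable = true; // the bitmap we decode into must be mutable
    }

    /** Decodes one JPEG frame, reusing the previous bitmap's memory when possible. */
    public Bitmap decode(byte[] jpeg, int offset, int length) {
        if (reusable != null) {
            options.inBitmap = reusable; // decode into the existing allocation
        }
        reusable = BitmapFactory.decodeByteArray(jpeg, offset, length, options);
        return reusable;
    }
}
```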
Upvotes: 1
Reputation: 21
After reading your post again, I picked up on something I missed the first time. I used BitmapFactory.decodeByteArray for MJPEG over HTTP from an Axis camera multicast. The call can be done in a few ms. The problem is that it normally wants to allocate a new Bitmap on every call. There is a way to make the Bitmap persist, and that will get the times way down. I just can't remember the call offhand and my normal dev computer is currently being destroyed... err, 'upgraded' by our IT, so I can't tell you off the top of my head, but you should find it if you search a bit. I was able to get 30 fps on a Xoom and a Galaxy Tab 10.1, among others, with no problem.
Mark Boettcher [email protected]
Upvotes: 1
Reputation: 21
We had a problem trying to play MJPEG streaming over RTSP on Android. The multicast video server we had was not able to send MJPEG over HTTP, and we did not want to use H.264 over RTSP because of latency. The application was an ROV sending live video back to a Droid for display. Just to save you a lot of trouble, if I understand the problem correctly, you simply cannot do it with anything in the Android SDK, like MediaPlayer, etc. In the end we got it working by paying a guy to do some custom code using MPlayer, ffmpeg and Live555. Hope this helps.
Upvotes: 0