user5431162

What is the major role of Streaming Media Server?

I am new to live streaming of data. I have been exploring on the web how to live stream video. I am an iOS developer and I want to develop an app that streams video.

I am clear about the fundamentals of live video streaming. I came to know that I will need a streaming media server which will feed the stream to the viewers. I also came to know that the viewer has to have a player which decodes the data and synchronizes the audio/video streams.

Now, Wowza is a kind of streaming media server which is recommended. But I have the following questions:

(1) Why a media server? Why can't we have our own media server? What does a media server actually do that makes its role necessary?

(2) In my app, I will have to integrate a library for encoding and feeding to a streaming server like Wowza. But how would the data be fed to the streaming server?

(3) How will my server communicate with a streaming server like Wowza ?

(4) How will Wowza feed the stream to the receiving side, i.e. a user with an iPhone who needs to see the live stream?

(5) What should be at the receiving side? What will decode the stream and play it via AVPlayer?

I need to develop a streaming app with good quality, so I'd like to first understand the flow of data and then start.

It would be great if someone gives a graphical representation of the data flow.

Thanks a lot in advance!

Upvotes: 1

Views: 648

Answers (1)

jabal

Reputation: 12347

Let me quickly add my understanding to your questions:

1a. Why a media server?

You could write your own software for distributing the stream data to all the players as well. But in that case you would need to implement various transport protocols, and you would end up implementing a fairly big piece of software: your home-grown media server.

1b. What does a media server actually do that makes its role necessary?

One way to see the role of the media server is this: it receives the live stream from a stream source and handles the distribution of that stream to potentially very many players. This usually involves taking the data out of the source transport protocol and repackaging it into one or more other container formats or transport protocols that the clients favour. Optionally the media server can change the way the video or the audio is encoded (transcoding), or produce streams at different resolutions and qualities and provide the players with the list of available qualities in the form of a manifest file (e.g. an m3u8 or SMIL file), so they can do so-called adaptive streaming.
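To make adaptive streaming concrete, here is a sketch of what a master m3u8 manifest listing three quality levels could look like (all bandwidths, resolutions and paths below are made-up examples):

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
medium/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
high/playlist.m3u8
```

The player picks one of the variant playlists based on measured network throughput and can switch between them mid-stream.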

Another typical use-case of media servers is serving non-live video files to players from disk, as well as recording live streams, and so on. If you look at the feature list of popular media servers, you'll see that they really do many things, so practically this is something you probably want to get out of the box rather than implement yourself.

2. In my app, I will have to integrate a library for encoding and feeding to a streaming server like Wowza. But how would the data be fed to the streaming server?

You need to encode the video and audio with particular codecs (such as H.264 for video and AAC for audio), then you need to choose a suitable container format to put these streams into (e.g. MPEG-TS), and then choose a transport protocol to push the stream to the server (e.g. RTMP). Best if you google for tutorials to see how this looks in code.
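As an illustration, with a tool like ffmpeg the whole pipeline (encode as H.264/AAC, package, push over RTMP) can be expressed in one command; the server hostname, application name and stream name below are placeholders for whatever your Wowza setup uses. Note that RTMP carries the streams in an FLV container:

```shell
# Read input in real time (-re), encode video as H.264 and audio as AAC,
# wrap in FLV (the container RTMP expects) and publish to the media server.
# Hostname, application ("live") and stream name ("myStream") are placeholders.
ffmpeg -re -i input.mp4 \
  -c:v libx264 -c:a aac \
  -f flv rtmp://wowza.example.com:1935/live/myStream
```

In an iOS app you would do the equivalent with an encoding/broadcasting library rather than shelling out to ffmpeg.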

3. How will my server communicate with a streaming server like Wowza?

The contract is basically the transport protocol; one example is using the RTMP protocol to connect to Wowza and publish the stream to it. These protocols cover all the technical details.

4. How will Wowza feed the stream to the receiving side, i.e. a user with an iPhone who needs to see the live stream?

The player software will initiate the communication with Wowza. This is again protocol-dependent, but in case you are using HLS, the player will use the HTTP protocol to find out the URLs of the consecutive video chunks that it will progressively download and display to the user.
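As an illustration, the media playlist that an HLS player repeatedly fetches over HTTP could look like the sketch below (segment names and durations are hypothetical); for a live stream, the server keeps appending new segments and the player keeps downloading them in order:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:142
#EXTINF:10.0,
segment142.ts
#EXTINF:10.0,
segment143.ts
#EXTINF:10.0,
segment144.ts
```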

5. What should be at the receiving side? What will decode the stream and play it via AVPlayer?

It's not clear whether the app you're developing is the broadcaster side or the player side. But generally, on the player side you need to find a library that is able to pull the stream from the media server with the protocol/transport/codec you are using. I am not familiar with this part on iOS; I only have experience with players embedded in websites.
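On iOS specifically, if the server exposes the stream as HLS, AVPlayer can consume it directly, since HLS support is built into AVFoundation; no third-party decoding library is needed for that case. A minimal sketch (the playlist URL is a placeholder for whatever your Wowza instance publishes):

```swift
import AVFoundation

// Placeholder URL: Wowza typically exposes an HLS endpoint ending in .m3u8.
let url = URL(string: "https://wowza.example.com/live/myStream/playlist.m3u8")!

// AVPlayer handles fetching the playlist, downloading the segments,
// decoding, and audio/video synchronization internally.
let player = AVPlayer(url: url)

// Attach an AVPlayerLayer to a view's layer to render the video, then play.
let playerLayer = AVPlayerLayer(player: player)
player.play()
```

For other playback protocols (e.g. RTMP) you would need a third-party player library.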

I am not going to draw this, but imagine 3 boxes connected with arrows, and that's the data flow: from the encoder to the streaming server and finally to the player. That's it I guess.. :-)

Upvotes: 3
