Reputation: 1699
I'm looking into streaming audio and video at the same time between a provider and a consumer, and I am wondering what the best/common solutions are for balancing the two streams in terms of CPU and bandwidth.
This is for a proof of concept, just to get the idea behind things, so I am not looking for libraries that implement any of this. Instead I'm more interested in the algorithms and design concepts/patterns used to manage bandwidth and CPU across the two streams (audio and video).
Generally speaking, what is the common approach? Any good primers out there?
Cheers
Upvotes: 1
Views: 214
Reputation: 71
I don't know if this will fully answer your question, but there are adaptive streaming technologies such as HLS and MPEG-DASH. These adapt the quality of the delivered streams based on the available bandwidth, and players can take CPU/decoding performance into account as well.
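The core idea in those players is usually quite simple: measure recent download throughput, keep a safety margin, and then pick the highest published quality level that fits the remaining budget. Below is a minimal sketch of that throughput-based selection, not tied to any specific library; the bitrate ladders, the harmonic-mean smoothing, and the 80% safety margin are all illustrative assumptions.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Rendition:
    name: str
    bitrate_bps: int  # average encoded bitrate of this quality level

# Hypothetical bitrate ladders, similar to what a packager would publish in a manifest.
VIDEO_LADDER: List[Rendition] = [
    Rendition("240p", 400_000),
    Rendition("480p", 1_000_000),
    Rendition("720p", 2_500_000),
    Rendition("1080p", 5_000_000),
]
AUDIO_LADDER: List[Rendition] = [
    Rendition("audio-64k", 64_000),
    Rendition("audio-128k", 128_000),
]

SAFETY_MARGIN = 0.8  # only commit ~80% of measured bandwidth, to absorb jitter

def estimate_bandwidth(recent_throughputs_bps: List[float]) -> float:
    """Smooth recent per-segment throughput samples; the harmonic mean is a
    common choice because it is dominated by the slow samples."""
    if not recent_throughputs_bps:
        return 0.0
    return len(recent_throughputs_bps) / sum(1.0 / t for t in recent_throughputs_bps)

def pick_rendition(ladder: List[Rendition], budget_bps: float) -> Rendition:
    """Highest rendition whose bitrate fits the budget; fall back to the lowest."""
    affordable = [r for r in ladder if r.bitrate_bps <= budget_bps]
    return max(affordable, key=lambda r: r.bitrate_bps) if affordable else ladder[0]

def select_streams(recent_throughputs_bps: List[float]) -> Tuple[Rendition, Rendition]:
    budget = estimate_bandwidth(recent_throughputs_bps) * SAFETY_MARGIN
    # Audio is cheap and perceptually important, so reserve its share first,
    # then spend whatever is left on the best video rendition that fits.
    audio = pick_rendition(AUDIO_LADDER, budget)
    video = pick_rendition(VIDEO_LADDER, budget - audio.bitrate_bps)
    return audio, video

if __name__ == "__main__":
    # e.g. the last few segments downloaded at roughly 3, 2.8 and 3.2 Mbit/s
    audio, video = select_streams([3_000_000, 2_800_000, 3_200_000])
    print(f"chosen audio: {audio.name}, video: {video.name}")
```

Reserving the audio budget before the video budget reflects the usual priority in conferencing and streaming: dropping video quality is far less noticeable than choppy audio. Real players layer more on top (buffer-level targets, hysteresis to avoid oscillating between levels, dropping frames or renditions when decoding falls behind), but this is the basic shape of the algorithm.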
Upvotes: 1