Reputation: 11184
I want to play a live stream on an iPhone device with AVPlayer, and I also want to get a CVPixelBufferRef
from this stream for further use.
I followed the Apple guide for creating the player. With locally stored video files the player currently works just fine, and when I play Apple's sample stream URL - http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8
- it works fine too.
Problems appear when I want to play an rtsp:// stream like this one: rtsp://192.192.168.1:8227/TTLS/Streaming/channels/2?videoCodecType=H.264
The code is almost entirely based on the guide provided by Apple, but anyway:
Prepare asset for playing
- (void)initialSetupWithURL:(NSURL *)url
{
    NSDictionary *assetOptions = @{ AVURLAssetPreferPreciseDurationAndTimingKey : @YES,
                                    AVURLAssetReferenceRestrictionsKey : @(AVAssetReferenceRestrictionForbidNone) };
    self.urlAsset = [AVURLAsset URLAssetWithURL:url options:assetOptions];
}
Prepare player
- (void)prepareToPlay
{
    NSArray *keys = @[@"tracks"];
    __weak SPHVideoPlayer *weakSelf = self;
    [weakSelf.urlAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            [weakSelf startLoading];
        });
    }];
}
- (void)startLoading
{
    NSError *error;
    AVKeyValueStatus status = [self.urlAsset statusOfValueForKey:@"tracks" error:&error];
    if (status == AVKeyValueStatusLoaded) {
        self.assetDuration = CMTimeGetSeconds(self.urlAsset.duration);
        NSDictionary *videoOutputOptions = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) };
        self.videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:videoOutputOptions];
        self.playerItem = [AVPlayerItem playerItemWithAsset:self.urlAsset];
        [self.playerItem addObserver:self
                          forKeyPath:@"status"
                             options:NSKeyValueObservingOptionInitial
                             context:&ItemStatusContext];
        [self.playerItem addObserver:self
                          forKeyPath:@"loadedTimeRanges"
                             options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionOld
                             context:&ItemStatusContext];
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(playerItemDidReachEnd:)
                                                     name:AVPlayerItemDidPlayToEndTimeNotification
                                                   object:self.playerItem];
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(didFailedToPlayToEnd)
                                                     name:AVPlayerItemFailedToPlayToEndTimeNotification
                                                   object:nil];
        [self.playerItem addOutput:self.videoOutput];
        self.assetPlayer = [AVPlayer playerWithPlayerItem:self.playerItem];
        [self addPeriodicalObserver];
        NSLog(@"Player created");
    } else {
        NSLog(@"The asset's tracks were not loaded:\n%@", error.localizedDescription);
    }
}
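For reference, once the AVPlayerItemVideoOutput is attached I pull CVPixelBufferRefs from it roughly like this (a sketch driven by a CADisplayLink; the callback method name is illustrative):

```objectivec
// Called from a CADisplayLink on each vsync; the method name is illustrative.
- (void)displayLinkDidFire:(CADisplayLink *)link
{
    // Convert the host clock to the item's timeline.
    CMTime itemTime = [self.videoOutput itemTimeForHostTime:CACurrentMediaTime()];
    if ([self.videoOutput hasNewPixelBufferForItemTime:itemTime]) {
        CVPixelBufferRef pixelBuffer = [self.videoOutput copyPixelBufferForItemTime:itemTime
                                                                 itemTimeForDisplay:NULL];
        if (pixelBuffer) {
            // ... use the pixel buffer (e.g. hand it to OpenGL/Metal) ...
            CVBufferRelease(pixelBuffer); // copy... returns a +1 reference
        }
    }
}
```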
The problem appears on this line:
AVKeyValueStatus status = [self.urlAsset statusOfValueForKey:@"tracks" error:&error];
With an rtsp:// URL it returns AVKeyValueStatusFailed with this error:
Error Domain=AVFoundationErrorDomain Code=-11800
"The operation could not be completed"
UserInfo=0x7fd1ea5a8a10 {NSLocalizedFailureReason=An unknown error occurred (-12936),
NSLocalizedDescription=The operation could not be completed,
NSURL=rtsp://192.168.1.168:8556/PSIA/Streaming/channels/2?videoCodecType=H.264,
NSUnderlyingError=0x7fd1ea53f830 "The operation couldn’t be completed.
(OSStatus error -12936.)"}
I also looked at related questions. Using fileURLWithPath produces an incorrect URL like rtsp://192.168.1.168:8556/PSIA/Streaming/channels/2?videoCodecType=H.264 --file:///, so I guess that approach is incorrect. I tried both [AVPlayer playerWithPlayerItem:playerItem]
and [AVPlayer playerWithURL:url]
- nothing changed. I also tried different settings for the AVAsset in initialSetupWithURL:
(see the method implementation above).
So, the question is: does AVPlayer support playing rtsp:// streams? If yes, can someone provide a sample of correct usage, and tell me what I'm doing wrong in my code? If AVPlayer does not support rtsp://, is there maybe an alternative solution?
Upvotes: 4
Views: 11664
Reputation: 1060
It's impossible to play an RTSP stream using AVPlayer. The link you shared, http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8,
is an HLS stream, which is why it works fine.
You should convert the RTSP stream to HLS on a server, or find another way. The most efficient option is to play an HLS stream on the client. However, if you have no way to get an HLS stream, you can use VLC instead. This is less efficient, but it will work.
I have an old blog post explaining VLC on iOS that you can refer to:
https://medium.com/mobil-dev/lets-make-a-video-player-app-e759c165fd58
Upvotes: 0
Reputation: 31
Basically, it is possible to segment an RTSP stream into small MP4 containers and push those containers into AVPlayer using a customized URL asset. Here is an experiment - it still needs work for smooth transitions between chunks, but the idea is here: https://github.com/MaximKomlev/RTSPAVPlayer
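As a rough sketch of the chunked idea - simplified with AVQueuePlayer instead of the custom AVAssetResourceLoaderDelegate machinery the linked project uses, and assuming the MP4 segments have already been produced server-side from the RTSP source (the URLs are placeholders):

```objectivec
// Enqueue pre-segmented MP4 chunks; AVQueuePlayer advances automatically.
// The chunk URLs are placeholders for segments cut from the RTSP source.
NSArray<NSURL *> *chunkURLs = @[ [NSURL URLWithString:@"http://example.com/chunk0.mp4"],
                                 [NSURL URLWithString:@"http://example.com/chunk1.mp4"] ];
NSMutableArray<AVPlayerItem *> *items = [NSMutableArray array];
for (NSURL *url in chunkURLs) {
    [items addObject:[AVPlayerItem playerItemWithURL:url]];
}
AVQueuePlayer *queuePlayer = [AVQueuePlayer queuePlayerWithItems:items];
[queuePlayer play];
```

The hard part is the seam: AVQueuePlayer does not guarantee gapless transitions between items, which is exactly the smooth-transition problem mentioned above.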
Upvotes: 1
Reputation: 3379
Have you tried MobileVLCKit?
It's really easy and works well! I wrote a small example here.
If you want to try it, just type pod try ONVIFCamera
in your terminal.
Here is how to do it:
var mediaPlayer = VLCMediaPlayer()

// Associate the movieView with the VLC media player
mediaPlayer.drawable = self.movieView

// URL(string:) is failable, so unwrap it (the address here is a placeholder)
let url = URL(string: "rtsp://IP_ADDRESS:PORT/params")!
let media = VLCMedia(url: url)
mediaPlayer.media = media
mediaPlayer.play()
Upvotes: 2