Reputation: 1222
How can I get averagePowerForChannel in AVPlayer, in order to make an audio visualization in my music app? I've already done the visualization part, but I'm stuck on its engine (the realtime volume per channel).

I know that with AVAudioPlayer it can be done easily using the meteringEnabled property, but for some reason AVPlayer is a must in my app. I'm actually thinking of using AVAudioPlayer alongside AVPlayer to get the desired result, but that sounds like a messy workaround. How would that affect performance and stability?

Thanks in advance.
Upvotes: 3
Views: 2620
Reputation: 1578
You will need an audio processor class in combination with AV Foundation to visualize audio samples, as well as to apply a Core Audio audio unit effect (a bandpass filter) to the audio data. You can find a sample by Apple here
Essentially you will need to add an observer to your AVPlayer's current item, like the below:
// Observe the item's "tracks" key so we know when the tracks are loaded,
// and loop playback when the item reaches its end.
let playerItem: AVPlayerItem! = videoPlayer.currentItem
playerItem.addObserver(self, forKeyPath: "tracks", options: NSKeyValueObservingOptions.New, context: nil)
NSNotificationCenter.defaultCenter().addObserverForName(AVPlayerItemDidPlayToEndTimeNotification, object: videoPlayer.currentItem, queue: NSOperationQueue.mainQueue(), usingBlock: { (notif: NSNotification) -> Void in
    self.videoPlayer.seekToTime(kCMTimeZero)
    self.videoPlayer.play()
    print("replay")
})
Then handle the KVO change in the overridden method below:
override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if object === videoPlayer.currentItem && keyPath == "tracks" {
        if let playerItem: AVPlayerItem = videoPlayer.currentItem {
            // Once the tracks are available, attach the tap processor
            // through an audio mix and start receiving samples.
            tapProcessor = MYAudioTapProcessor(AVPlayerItem: playerItem)
            playerItem.audioMix = tapProcessor.audioMix
            tapProcessor.delegate = self
        }
    }
}
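Once the tap processor is attached, it reports level values back through its delegate, and that is where the visualization gets fed. As a minimal sketch, assuming the processor exposes per-channel levels the way Apple's sample does (verify the exact selector against your copy of the sample; visualizerView and its update method are placeholders for your own view):
func audioTabProcessor(audioTabProcessor: MYAudioTapProcessor!, hasNewLeftChannelValue leftChannelValue: Float, rightChannelValue: Float) {
    // Assumption: the sample reports normalized levels; clamp/scale as
    // needed before handing them to the visualization.
    visualizerView.updateWithLeftLevel(leftChannelValue, rightLevel: rightChannelValue)
}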
Here's a link to a sample project on GitHub
Upvotes: -1
Reputation: 7560
I have had an issue with AVPlayer visualisation for about two years. In my case it involves HLS live streaming; in that case you won't get it running, as far as I know.
EDIT: This will not give you access to the averagePowerForChannel: method itself, but you will get access to the raw data, and with e.g. an FFT or an RMS calculation you can derive the desired information yourself (see the Swift sketch after the example below).
I got it working with local playback, though. You basically wait for the player's current item to have a track up and running. At that point you will need to patch an MTAudioProcessingTap into the audio mix.
The processing tap will run callbacks you specify, in which you will be able to process the raw audio data as you need.
Here is a quick example (sorry for having it in ObjC, though):
#import <AVFoundation/AVFoundation.h>
#import <MediaToolbox/MediaToolbox.h>
// Tap lifecycle callbacks; left empty here for brevity.
void init(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut) {}
void finalize(MTAudioProcessingTapRef tap) {}
void prepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames, const AudioStreamBasicDescription *processingFormat) {}
void unprepare(MTAudioProcessingTapRef tap) {}
void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames, MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut, CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut) {
    // The tap must pull the source audio itself, otherwise no samples
    // flow through. After this call, bufferListInOut holds the raw
    // samples you can analyze (RMS, FFT, ...).
    MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, NULL, numberFramesOut);
}
- (void)play {
    // player and item setup ...
    [[[self player] currentItem] addObserver:self forKeyPath:@"tracks" options:kNilOptions context:NULL];
}
//////////////////////////////////////////////////////
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"tracks"] && [[object tracks] count] > 0) {
        // Find the first audio track and hook the tap into it.
        for (AVPlayerItemTrack *itemTrack in [object tracks]) {
            AVAssetTrack *track = [itemTrack assetTrack];
            if ([[track mediaType] isEqualToString:AVMediaTypeAudio]) {
                [self addAudioProcessingTap:track];
                break;
            }
        }
    }
}
- (void)addAudioProcessingTap:(AVAssetTrack *)track {
    MTAudioProcessingTapRef tap;
    MTAudioProcessingTapCallbacks callbacks;
    callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
    callbacks.clientInfo = (__bridge void *)(self);
    callbacks.init = init;
    callbacks.prepare = prepare;
    callbacks.process = process;
    callbacks.unprepare = unprepare;
    callbacks.finalize = finalize;

    OSStatus err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks, kMTAudioProcessingTapCreationFlag_PostEffects, &tap);
    if (err) {
        NSLog(@"error: %@", [NSError errorWithDomain:NSOSStatusErrorDomain code:err userInfo:nil]);
        return;
    }

    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
    [inputParams setAudioTapProcessor:tap];
    [audioMix setInputParameters:@[inputParams]];
    [[[self player] currentItem] setAudioMix:audioMix];
    CFRelease(tap); // the audio mix retains the tap
}
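To approximate what averagePowerForChannel: would report, you can take the RMS of the samples the process callback hands you and convert it to decibels. Here is a minimal sketch in Swift (averagePower is just an illustrative helper name; it assumes non-interleaved Float32 samples, so check the AudioStreamBasicDescription passed to the prepare callback for your actual format):
import Foundation

// Derive an averagePowerForChannel-style value (dBFS) from one
// channel's raw samples. Assumes Float32, non-interleaved data.
func averagePower(samples: [Float]) -> Float {
    guard !samples.isEmpty else { return -160.0 }
    let meanSquare = samples.reduce(Float(0)) { $0 + $1 * $1 } / Float(samples.count)
    let rms = sqrtf(meanSquare)
    return rms > 0 ? 20.0 * log10f(rms) : -160.0 // -160 dB as a silence floor
}
Note that the tap callbacks run on an audio thread, so compute the value there but dispatch any update to your visualizer onto the main queue.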
There is some ongoing discussion on my question from over two years ago, so make sure to check it out as well.
Upvotes: 2