miketucker

Reputation: 307

Is it possible to read metadata using HTTP Live Streaming in the iPhone SDK?

When playing a live stream using the HTTP Live Streaming method, is it possible to read the current metadata (e.g. title and artist)? This is for an iPhone radio app.

Upvotes: 12

Views: 7475

Answers (4)

Bogdan

Reputation: 131

Swift solution. This is a sample of a simple streaming audio player. You can read the metadata in the AVPlayerItemMetadataOutputPushDelegate delegate method.

import UIKit
import AVFoundation

class PlayerViewController: UIViewController {
    var player = AVPlayer()

    override func viewDidLoad() {
        super.viewDidLoad()
        configurePlayer()
        player.play()
    }

    private func configurePlayer() {
        guard let url = URL(string: "Your stream URL") else { return }
        let asset = AVAsset(url: url)
        let playerItem = AVPlayerItem(asset: asset)
        let metadataOutput = AVPlayerItemMetadataOutput(identifiers: nil)
        metadataOutput.setDelegate(self, queue: DispatchQueue.main)
        playerItem.add(metadataOutput)
        player = AVPlayer(playerItem: playerItem)
    }
}

extension PlayerViewController: AVPlayerItemMetadataOutputPushDelegate {
    func metadataOutput(_ output: AVPlayerItemMetadataOutput, didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup], from track: AVPlayerItemTrack?) {
        // The first item of the first group carries the metadata value (e.g. the stream title).
        guard let item = groups.first?.items.first,
              let value = item.value(forKeyPath: "value") else { return }
        print(value)
    }
}

Upvotes: 3

Pablo Ruan

Reputation: 1771

In Swift 2.0, to get metadata from a music stream, add an observer for the player item's timedMetadata:

PlayerItem.addObserver(self, forKeyPath: "timedMetadata", options: NSKeyValueObservingOptions.New, context: nil)

Then add this method:

// Note: MPNowPlayingInfoCenter and MPMediaItemArtwork require `import MediaPlayer`.
override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {

    // Update the song name
    if keyPath == "timedMetadata" {
        if let meta = PlayerItem.timedMetadata {
            print("New metadata \(meta)")
            for metadata in meta {
                if let nomemusica = metadata.valueForKey("value") as? String {
                    LB_NomeMusica.text = nomemusica
                    if NSClassFromString("MPNowPlayingInfoCenter") != nil {
                        let image: UIImage = UIImage(named: "logo.gif")!
                        let albumArt = MPMediaItemArtwork(image: image)
                        let songInfo: [String: AnyObject] = [
                            MPMediaItemPropertyTitle: nomemusica,
                            MPMediaItemPropertyArtist: "Ao Vivo", // "Ao Vivo" = "Live"
                            MPMediaItemPropertyArtwork: albumArt
                        ]
                        MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo = songInfo
                    }
                }
            }
        }
    }
}

Upvotes: 5

m8labs

Reputation: 3721

Not sure whether this question is still relevant to its author, but maybe it will help someone. After two days of pain I found out that it's actually quite simple. Here is the code that works for me:

AVPlayerItem* playerItem = [AVPlayerItem playerItemWithURL:[NSURL URLWithString:<here your http stream url>]];

[playerItem addObserver:self forKeyPath:@"timedMetadata" options:NSKeyValueObservingOptionNew context:nil];

AVPlayer* player = [[AVPlayer playerWithPlayerItem:playerItem] retain];
[player play];

and then:

- (void) observeValueForKeyPath:(NSString*)keyPath ofObject:(id)object
                        change:(NSDictionary*)change context:(void*)context {

   if ([keyPath isEqualToString:@"timedMetadata"])
   {
      AVPlayerItem* playerItem = object;

      for (AVMetadataItem* metadata in playerItem.timedMetadata)
      {
         NSLog(@"\nkey: %@\nkeySpace: %@\ncommonKey: %@\nvalue: %@", [metadata.key description], metadata.keySpace, metadata.commonKey, metadata.stringValue);
      }
   }
}

That's it. I don't know why Apple didn't put a sample like this in the AVPlayerItem docs for accessing the "title" of the stream, which is a key feature for real-world streaming audio. The "AV Foundation Framework Reference" mentions "timedMetadata" nowhere near where it's needed. And Matt's sample does not work with all streams (but AVPlayer does).

Upvotes: 21

Moshe

Reputation: 58087

It is, but it's not easy. Matt Gallagher has a nice post on his blog about streaming audio. To quote him on the subject (rough sketches of both approaches follow the quote):

The easiest source of metadata comes from the HTTP headers. Inside the handleReadFromStream:eventType: method, use CFReadStreamCopyProperty to copy the kCFStreamPropertyHTTPResponseHeader property from the CFReadStreamRef, then you can use CFHTTPMessageCopyAllHeaderFields to copy the header fields out of the response. For many streaming audio servers, the stream name is one of these fields.

The considerably harder source of metadata are the ID3 tags. ID3v1 is always at the end of the file (so is useless when streaming). ID3v2 is located at the start so may be more accessible.

I've never read the ID3 tags but I suspect that if you cache the first few hundred kilobytes of the file somewhere as it loads, open that cache with AudioFileOpenWithCallbacks and then read the kAudioFilePropertyID3Tag with AudioFileGetProperty you may be able to read the ID3 data (if it exists). Like I said though: I've never actually done this so I don't know for certain that it would work.
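
As a rough illustration of the quoted header approach (this code is not from the blog post): a minimal sketch assuming you are inside handleReadFromStream:eventType: with the CFReadStreamRef parameter available as aStream; the "icy-name" key is a common SHOUTcast/Icecast convention and is an assumption here, not something the post guarantees.

CFHTTPMessageRef response =
    (CFHTTPMessageRef)CFReadStreamCopyProperty(aStream, kCFStreamPropertyHTTPResponseHeader);
if (response)
{
    CFDictionaryRef headers = CFHTTPMessageCopyAllHeaderFields(response);
    if (headers)
    {
        // Many SHOUTcast/Icecast servers expose the stream name in "icy-name" (assumption).
        NSString *streamName = [(NSDictionary *)headers objectForKey:@"icy-name"];
        NSLog(@"Stream name: %@", streamName);
        CFRelease(headers);
    }
    CFRelease(response);
}

Both calls follow the Copy rule, so remember to CFRelease the returned objects as above.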
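
And a sketch of the ID3 idea from the last paragraph, with the same caveat the author gives (untested). Here cachedData is a hypothetical NSData holding the first few hundred kilobytes of the stream, and kAudioFileMP3Type is only a hint:

#import <Foundation/Foundation.h>
#import <AudioToolbox/AudioToolbox.h>
#include <string.h>

// Read callback: serves bytes out of the in-memory cache.
static OSStatus CacheReadProc(void *inClientData, SInt64 inPosition,
                              UInt32 requestCount, void *buffer, UInt32 *actualCount)
{
    NSData *cache = (NSData *)inClientData;
    if (inPosition >= (SInt64)[cache length]) { *actualCount = 0; return noErr; }
    UInt32 available = (UInt32)([cache length] - inPosition);
    *actualCount = (requestCount < available) ? requestCount : available;
    memcpy(buffer, (const char *)[cache bytes] + inPosition, *actualCount);
    return noErr;
}

// Size callback: reports the cache length as the "file" size.
static SInt64 CacheGetSizeProc(void *inClientData)
{
    return (SInt64)[(NSData *)inClientData length];
}

void LogID3Tag(NSData *cachedData)
{
    AudioFileID fileID = NULL;
    OSStatus err = AudioFileOpenWithCallbacks(cachedData, CacheReadProc, NULL,
                                              CacheGetSizeProc, NULL,
                                              kAudioFileMP3Type, &fileID);
    if (err != noErr) return;

    UInt32 size = 0;
    if (AudioFileGetPropertyInfo(fileID, kAudioFilePropertyID3Tag, &size, NULL) == noErr && size > 0)
    {
        NSMutableData *tag = [NSMutableData dataWithLength:size];
        if (AudioFileGetProperty(fileID, kAudioFilePropertyID3Tag, &size, [tag mutableBytes]) == noErr)
        {
            // This gives you the raw ID3 block; parsing the frames is still up to you.
            NSLog(@"Read %u bytes of ID3 data", (unsigned int)size);
        }
    }
    AudioFileClose(fileID);
}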

Upvotes: 0
