Reputation: 763
I cannot play sound with AVAudioPCMBuffer (though I can play with AVAudioFile). I get this error:
ERROR: AVAudioBuffer.mm:169: -[AVAudioPCMBuffer initWithPCMFormat:frameCapacity:]: required condition is false: isCommonFormat
Here is my code below; I'd really appreciate your help.
import UIKit
import AVFoundation

class ViewController: UIViewController {

    let audioEngine: AVAudioEngine = AVAudioEngine()
    let audioFilePlayer: AVAudioPlayerNode = AVAudioPlayerNode()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.

        audioEngine.attachNode(audioFilePlayer)

        let filePath: String = NSBundle.mainBundle().pathForResource("test", ofType: "mp3")!
        let fileURL: NSURL = NSURL(fileURLWithPath: filePath)!
        let audioFile = AVAudioFile(forReading: fileURL, error: nil)

        let audioFormat = audioFile.fileFormat
        let audioFrameCount = UInt32(audioFile.length)
        let audioFileBuffer = AVAudioPCMBuffer(PCMFormat: audioFormat, frameCapacity: audioFrameCount)

        var mainMixer = audioEngine.mainMixerNode
        audioEngine.connect(audioFilePlayer, to: mainMixer, format: audioFileBuffer.format)
        audioFilePlayer.scheduleBuffer(audioFileBuffer, atTime: nil, options: nil, completionHandler: nil)

        var engineError: NSError?
        audioEngine.startAndReturnError(&engineError)
        audioFilePlayer.play()
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }
}
Upvotes: 14
Views: 12019
Reputation: 99
Instead of calling audioFile.fileFormat, you should use audioFile.processingFormat as the parameter for the AVAudioPCMBuffer constructor.
let buffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat,
                              frameCapacity: bufferCapacity)
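For context, here is a minimal sketch of the whole load path under that fix; the file name "test.mp3" and the loadBuffer helper name are just illustrative assumptions, not part of the answer above:

import AVFoundation

// Sketch only: load a bundled file into a PCM buffer.
func loadBuffer() throws -> AVAudioPCMBuffer? {
    guard let fileURL = Bundle.main.url(forResource: "test", withExtension: "mp3") else { return nil }
    let audioFile = try AVAudioFile(forReading: fileURL)

    // processingFormat is a standard PCM format, so this initializer passes the isCommonFormat check
    guard let buffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat,
                                        frameCapacity: AVAudioFrameCount(audioFile.length)) else { return nil }
    try audioFile.read(into: buffer)
    return buffer
}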
Upvotes: 0
Reputation: 4037
Updated @Bick's code to Swift 5.3.

The logic is easy to follow: first, create an empty AVAudioPCMBuffer and fill it with the audio data; secondly, connect the nodes and use the data to play.
import UIKit
import AVFoundation

class ViewControllerX: UIViewController {

    var audioEngine = AVAudioEngine()
    var audioFilePlayer = AVAudioPlayerNode()

    override func viewDidLoad() {
        super.viewDidLoad()

        // prepare the data
        guard let filePath = Bundle.main.path(forResource: "test", ofType: "mp3") else { return }
        print("\(filePath)")
        let fileURL = URL(fileURLWithPath: filePath)

        do {
            let audioFile = try AVAudioFile(forReading: fileURL)

            let audioFormat = audioFile.processingFormat
            let audioFrameCount = UInt32(audioFile.length)

            guard let audioFileBuffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: audioFrameCount) else { return }
            try audioFile.read(into: audioFileBuffer)

            // connect the nodes, and use the data to play
            let mainMixer = audioEngine.mainMixerNode
            audioEngine.attach(audioFilePlayer)
            audioEngine.connect(audioFilePlayer, to: mainMixer, format: audioFileBuffer.format)

            try audioEngine.start()
            audioFilePlayer.play()
            audioFilePlayer.scheduleBuffer(audioFileBuffer, completionHandler: nil)
        } catch {
            print(error)
        }
    }
}
Upvotes: 1
Reputation: 35
When using an AVAudioPCMBuffer, you'll get strange errors if you try to use a pcmFormat that isn't mixer.outputFormat(forBus: 0).

It will not accept mono-channel formats, it will complain about mismatches between the mixer's output format and your format even if you've described them identically, and it won't produce errors that explain exactly what the problem is.
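A hedged sketch of that workaround, building both the buffer and the connection from the mixer's own output format (the node names and the 4096-frame capacity are arbitrary choices, not from this answer):

import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)

// Ask the mixer for the format it actually runs at, and reuse it everywhere
let mixerFormat = engine.mainMixerNode.outputFormat(forBus: 0)
let buffer = AVAudioPCMBuffer(pcmFormat: mixerFormat, frameCapacity: 4096)

// Connecting with the same format avoids the silent format-mismatch failures described above
engine.connect(player, to: engine.mainMixerNode, format: mixerFormat)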
Upvotes: 4
Reputation: 41
The problem is you're setting the format of your PCM buffer to a non-PCM format. Therefore, you need to create your AVAudioPCMBuffer with the AVAudioFile's processingFormat.
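Against the question's code that is roughly a one-line change (shown here in modern Swift spelling):

// fileFormat is the on-disk (possibly compressed) format; processingFormat is the decoded PCM format
let audioFileBuffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat,
                                       frameCapacity: UInt32(audioFile.length))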
Upvotes: 4
Reputation: 763
Just let me share: this worked somehow, though I don't fully understand why.
import UIKit
import AVFoundation

class ViewController: UIViewController {

    var audioEngine: AVAudioEngine = AVAudioEngine()
    var audioFilePlayer: AVAudioPlayerNode = AVAudioPlayerNode()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.

        let filePath: String = NSBundle.mainBundle().pathForResource("test", ofType: "mp3")!
        println("\(filePath)")
        let fileURL: NSURL = NSURL(fileURLWithPath: filePath)!
        let audioFile = AVAudioFile(forReading: fileURL, error: nil)

        let audioFormat = audioFile.processingFormat
        let audioFrameCount = UInt32(audioFile.length)
        let audioFileBuffer = AVAudioPCMBuffer(PCMFormat: audioFormat, frameCapacity: audioFrameCount)
        audioFile.readIntoBuffer(audioFileBuffer, error: nil)

        var mainMixer = audioEngine.mainMixerNode
        audioEngine.attachNode(audioFilePlayer)
        audioEngine.connect(audioFilePlayer, to: mainMixer, format: audioFileBuffer.format)
        audioEngine.startAndReturnError(nil)
        audioFilePlayer.play()
        audioFilePlayer.scheduleBuffer(audioFileBuffer, atTime: nil, options: nil, completionHandler: nil)
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }
}
Upvotes: 8