jeroen

Reputation: 502

Offline rendering with The Amazing Audio Engine

This question is cross-posted on The Amazing Audio Engine forum.

Hi everyone, I am new to The Amazing Audio Engine and to iOS development, and I have been trying to figure out how to get the BPM of a track.

So far I have found two articles on offline rendering on the forum:

  1. http://forum.theamazingaudioengine.com/discussion/comment/1743/#Comment_1743
  2. http://forum.theamazingaudioengine.com/discussion/comment/649#Comment_649

As far as I know, the AEAudioControllerRenderMainOutput function is only implemented correctly in this fork.
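
For reference, the declaration I am calling looks roughly like this in the fork's header (I may be paraphrasing the parameter names, so treat this as my reading of it rather than the official API):

// Renders the next inNumberFrames of the main output into ioBuffer instead of
// sending them to the hardware (as far as I understand it).
void AEAudioControllerRenderMainOutput(AEAudioController *audioController,
                                       AudioTimeStamp     inTimeStamp,
                                       UInt32             inNumberFrames,
                                       AudioBufferList   *ioBuffer);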

I am trying to do offline rendering to process a track and then use the algorithm described here (JavaScript) and implemented here.

So far I am building against this fork, and I am using Swift (I am part of the Make School Summer Academy at the moment, which teaches Swift).


When playing a track, this code works for me (no offline rendering here):

audioController = AEAudioController(audioDescription: AEAudioController.nonInterleavedFloatStereoAudioDescription())

let file = NSBundle.mainBundle().URLForResource("track", withExtension: "m4a")
let channel: AnyObject! = AEAudioFilePlayer.audioFilePlayerWithURL(file, audioController: audioController, error: nil)

let receiver = AEBlockAudioReceiver { (source, time, frames, audioBufferList) -> Void in
    let leftSamples = UnsafeMutablePointer<Float>(audioBufferList[0].mBuffers.mData)
    // Advance the buffer sizeof(float) * 512
    let rightSamples = UnsafeMutablePointer<Float>(audioBufferList[0].mBuffers.mData) + 512

    println("leftSamples: \(leftSamples) rightSamples: \(rightSamples)")
}

audioController.addChannels([channel])
audioController.addOutputReceiver(receiver)

audioController.start()

Trying offline rendering

This is the code I am trying to run while using this fork:

audioController = AEAudioController(audioDescription: AEAudioController.nonInterleaved16BitStereoAudioDescription())

let file = NSBundle.mainBundle().URLForResource("track", withExtension: "mp3")
let channel: AnyObject! = AEAudioFilePlayer.audioFilePlayerWithURL(file, audioController: audioController, error: nil)

audioController.addChannels([channel])
audioController.start(nil)
audioController.stop()

var t = AudioTimeStamp()
let bufferLength: UInt32 = 4096
var buffer = AEAllocateAndInitAudioBufferList(audioController.audioDescription, Int32(bufferLength))
AEAudioControllerRenderMainOutput(audioController, t, bufferLength, buffer)

var renderDuration: NSTimeInterval = channel.duration
var sampleRate: Float64 = audioController.audioDescription.mSampleRate
var lengthInFrames: UInt32 = UInt32(renderDuration * sampleRate)
var songBuffer: [Float64]

t.mFlags = UInt32(kAudioTimeStampSampleTimeValid)
var frequencyAnalyzer = FrequencyAnalyzer()

println("renderDuration \(renderDuration)")

var outIsOpen = Boolean()

AUGraphClose(audioController.audioGraph)

AUGraphIsOpen(audioController.audioGraph, &outIsOpen)

println("AUGraphIsOpen: \(outIsOpen)")

for (var i: UInt32 = 0; i < lengthInFrames; i += bufferLength) {
    AEAudioControllerRenderMainOutput(audioController, t, bufferLength, buffer);
    t.mSampleTime += Float64(bufferLength)

    println(t.mSampleTime)
    let leftSamples = UnsafeMutablePointer<Int16>(buffer[0].mBuffers.mData)
    let rightSamples = UnsafeMutablePointer<Int16>(buffer[0].mBuffers.mData) + 512
    println("leftSamples: \(leftSamples.memory) rightSamples: \(rightSamples.memory)")
}

AEFreeAudioBufferList(buffer)

AUGraphOpen(audioController.audioGraph)
audioController.start(nil)
audioController.stop()

Offline rendering is not working for me at the moment. The second example fails with a mix of errors I don't understand.

A very common one is inside the channelAudioProducer function on this line:

// Tell mixer/mixer's converter unit to render into audio
status = AudioUnitRender(group->converterUnit ? group->converterUnit : group->mixerAudioUnit,
                         arg->ioActionFlags, &arg->originalTimeStamp, 0, *frames, audio);

It crashes there with EXC_BAD_ACCESS (code=EXC_I386_GPFLT). Of the various errors I see, this one comes up most often.

I am sorry, I am a total noob in this field and there is some of this I don't really understand. Should I use nonInterleaved16BitStereoAudioDescription or nonInterleavedFloatStereoAudioDescription, and how does that choice affect what ends up in mData?
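
To make my question concrete, here is a small sketch in C of how I currently understand the two formats (the function name and the assumptions in the comments are mine, so please correct me if they are wrong):

#include <AudioToolbox/AudioToolbox.h>
#include <stdbool.h>
#include <stdio.h>

// My assumption: with either non-interleaved stereo description the
// AudioBufferList has mNumberBuffers == 2, one buffer per channel, so the
// right channel lives in mBuffers[1] rather than at mData + 512.
// Only the sample type changes between the two descriptions.
static void inspectNonInterleavedStereo(const AudioBufferList *bufferList,
                                        UInt32 frames, bool sixteenBit) {
    if (frames == 0) return;
    if (sixteenBit) {
        // nonInterleaved16BitStereoAudioDescription: SInt16 samples
        const SInt16 *left  = (const SInt16 *)bufferList->mBuffers[0].mData;
        const SInt16 *right = (const SInt16 *)bufferList->mBuffers[1].mData;
        printf("left[0] = %d, right[0] = %d\n", left[0], right[0]);
    } else {
        // nonInterleavedFloatStereoAudioDescription: float samples
        const float *left  = (const float *)bufferList->mBuffers[0].mData;
        const float *right = (const float *)bufferList->mBuffers[1].mData;
        printf("left[0] = %f, right[0] = %f\n", left[0], right[0]);
    }
}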

I would love some help with this since I'm kind of lost at the moment. When you answer, please explain things as fully as you can; I am new to this stuff.

NOTE: Posting code in Objective-C is fine if you don't know Swift.

Upvotes: 1

Views: 633

Answers (0)
