NicoC

Reputation: 65

How to write an array of samples into a 24 bits audio file with AVAudioBuffer?

I'm having trouble writing 24-bit WAV files with AVAudioEngine in Swift. For my use case, the input is an array of Float, and I have the audio format of the input file (retrieved with AVAudioFile).

So, I need to convert my input Float array into values that can be written to the buffer, and I also need to find the right channel to write my data to. My code works with 16-bit and 32-bit files, but I don't know how to handle 24-bit files... Here it is:

  //Static func to write audiofile
  fileprivate func writeAudioFile(to outputURL : URL,
                                  withFormat format : AVAudioFormat,
                                  fromSamples music : [Float] )
  {
    var outputFormatSettings = format.settings
    guard let bufferFormat = AVAudioFormat(settings: outputFormatSettings) else{
      return
    }

    var audioFile : AVAudioFile?
    do{
      audioFile = try AVAudioFile(forWriting: outputURL,
                                  settings: outputFormatSettings,
                                  commonFormat: format.commonFormat,
                                  interleaved: true)
    } catch let error as NSError {
      print("error:", error.localizedDescription)
    }

    let frameCount = music.count / Int(format.channelCount)
    let outputBuffer = AVAudioPCMBuffer(pcmFormat: bufferFormat,
                                        frameCapacity: AVAudioFrameCount(frameCount))
    //We write the data in the right channel
    guard let bitDepth = (outputFormatSettings["AVLinearPCMBitDepthKey"] as? Int) else {
      return
    }
    switch bitDepth {
    case 16:
      for i in 0..<music.count {
        var floatValue = music[i]
        if(floatValue > 1){
          floatValue = 1
        }
        if(floatValue < -1){
          floatValue = -1
        }
        let value = floatValue * Float(Int16.max)
        outputBuffer?.int16ChannelData!.pointee[i] =  Int16(value)
      }
    case 24:
      //Here I am not sure of what I do... Couldn't find the right channel!
      for i in 0..<music.count {
        outputBuffer?.floatChannelData!.pointee[i] = music[i]
      }
    case 32:
      for i in 0..<music.count {
        outputBuffer?.floatChannelData!.pointee[i] = music[i]
      }
    default:
      return
    }
    outputBuffer?.frameLength = AVAudioFrameCount(frameCount)

    do{
      try audioFile?.write(from: outputBuffer!)

    } catch let error as NSError {
      print("error:", error.localizedDescription)
      return
    }
  }

Thanks in advance if someone has an idea of how to handle this!

Upvotes: 4

Views: 1732

Answers (1)

dave234

Reputation: 4955

Representing a 24-bit int in C isn't fun, so in Swift I'm sure it's downright painful, and none of the APIs support it anyway. Your best bet is to convert to a more convenient format for processing.

AVAudioFile has two formats and an internal converter to convert between them. Its fileFormat represents the format of the file on disk, while its processingFormat represents the format of the LPCM data when the file is read from, and the format of the LPCM data it will accept when being written to.
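
For instance (the path here is just a placeholder), opening a 24-bit file for reading in a playground shows the two formats side by side:

import AVFoundation

let file = try! AVAudioFile(forReading: URL(fileURLWithPath: "/path/to/input.wav"))
print(file.fileFormat)        // the on-disk format, e.g. 2 ch, 44100 Hz, 24-bit integer, interleaved
print(file.processingFormat)  // the format you read and write, e.g. 2 ch, 44100 Hz, Float32, deinterleaved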

The typical workflow is to choose a standard processingFormat, do all of your processing in this format, and let AVAudioFile convert to and from the file format when reading from and writing to disk. All of the Audio Unit APIs accept non-interleaved formats, so I tend to use non-interleaved formats for all of my processing.
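
As a quick sketch (48000 Hz and stereo are placeholder values), the standard-format initializer gives you exactly that kind of format: native Float32, non-interleaved:

import AVFoundation

// The "standard" format is native-endian Float32, non-interleaved.
let processingFormat = AVAudioFormat(standardFormatWithSampleRate: 48000, channels: 2)!
print(processingFormat.commonFormat == .pcmFormatFloat32)  // true
print(processingFormat.isInterleaved)                      // false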

Here's an example that copies the first half of an audio file. It doesn't address your existing code, but illustrates a more common approach:

func halfCopy(src: URL, dst: URL) throws {

    let srcFile = try AVAudioFile(forReading: src) //This opens the file for reading using the standard format (deinterleaved floating point).
    let dstFile = try AVAudioFile(forWriting: dst,
                                  settings: srcFile.fileFormat.settings,
                                  commonFormat: srcFile.processingFormat.commonFormat,
                                  interleaved: srcFile.processingFormat.isInterleaved) // AVAudioFile(forReading: src) always returns a non-interleaved processing format, so this will be false

    let frameCount = AVAudioFrameCount(srcFile.length) / 2  // Copying first half of file

    guard let buffer = AVAudioPCMBuffer(pcmFormat: srcFile.processingFormat,
                                        frameCapacity: frameCount) else {
                                            fatalError("Derp")
    }

    try srcFile.read(into: buffer, frameCount: frameCount)
    try dstFile.write(from: buffer)

}
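
To tie this back to your Float array: here is a minimal sketch, assuming mono samples and a 44100 Hz sample rate (both placeholders, and write24BitFile is just an illustrative name). The file settings request 24-bit integer PCM, but the buffer handed to AVAudioFile stays in the standard Float32 processing format, so the internal converter does the 24-bit packing for you:

import AVFoundation

func write24BitFile(samples: [Float], to url: URL, sampleRate: Double = 44100) throws {
    // On-disk format: 24-bit integer linear PCM.
    let fileSettings: [String: Any] = [
        AVFormatIDKey: kAudioFormatLinearPCM,
        AVSampleRateKey: sampleRate,
        AVNumberOfChannelsKey: 1,
        AVLinearPCMBitDepthKey: 24,
        AVLinearPCMIsFloatKey: false,
        AVLinearPCMIsBigEndianKey: false,
        AVLinearPCMIsNonInterleaved: false
    ]

    // Processing format: standard deinterleaved Float32 at the same sample rate and channel count.
    guard let processingFormat = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 1),
          let buffer = AVAudioPCMBuffer(pcmFormat: processingFormat,
                                        frameCapacity: AVAudioFrameCount(samples.count)) else {
        fatalError("Could not create format or buffer")
    }

    // Mono, so all samples go into channel 0.
    for i in 0..<samples.count {
        buffer.floatChannelData![0][i] = samples[i]
    }
    buffer.frameLength = AVAudioFrameCount(samples.count)

    // AVAudioFile converts Float32 to 24-bit int on write.
    let file = try AVAudioFile(forWriting: url,
                               settings: fileSettings,
                               commonFormat: processingFormat.commonFormat,
                               interleaved: processingFormat.isInterleaved)
    try file.write(from: buffer)
}

The same idea extends to stereo: since the processing format is non-interleaved, you would fill floatChannelData![0] and floatChannelData![1] separately instead of interleaving the samples yourself.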

Upvotes: 4
