Krusel

Reputation: 65

AVAudioEngine playing and recording possible?

I am trying to use AVAudioEngine to play a button sound, but unfortunately the sound file is played only once.

The idea is that the user taps the button, a sound plays, and the recording starts. When the user taps the button again, a second sound should play, indicating that the recording session has ended.

So far the first sound plays and the recording starts, but unfortunately the second sound (the ending sound) won't play.

I have also found that when I use the same AVAudioEngine as the recording function, the sound won't play at all.

Since I am completely new to the AVFoundation framework, I am not sure what the issue is.

Thanks in advance.

    var StartSoundEngineScene1 = AVAudioEngine()
    var StartSoundNodeScene1 = AVAudioPlayerNode()

    func SetupAudio(AudioEngine: AVAudioEngine, SoundNode: AVAudioPlayerNode, FileURL: URL) {
        
        guard let AudioFile = try? AVAudioFile(forReading: FileURL) else{ return }
        let AudioSession = AVAudioSession.sharedInstance()

        AudioEngine.attach(SoundNode)
        AudioEngine.connect(SoundNode, to: AudioEngine.mainMixerNode, format: AudioFile.processingFormat)
        AudioEngine.prepare()
        
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        
        SetupAudio(AudioEngine: StartSoundEngineScene1, SoundNode: StartSoundNodeScene1, FileURL: StartRecSound)
    }


    func ButtonSound (AudioEngine: AVAudioEngine, SoundNode: AVAudioPlayerNode, FileURL: URL){
        
        try? AudioEngine.start()
        
        guard let audioFile = try? AVAudioFile(forReading: FileURL) else{ return }
        
        SoundNode.scheduleFile(audioFile, at: nil, completionHandler: nil)
        SoundNode.volume = 0.16
        SoundNode.play()
                
    }

    func StartRecording() {
        
        ButtonSound(AudioEngine: StartSoundEngineScene1, SoundNode: StartSoundNodeScene1, FileURL: StartRecSound)
        
        Timer.scheduledTimer(withTimeInterval: 0.7, repeats: false) { timer in

          if audioEngine.isRunning {
                audioEngine.stop()
                recognitionRequest?.endAudio()
            
          } else {
              print("Recording Started")
                  
              if let recognitionTask = self.recognitionTask {
                  recognitionTask.cancel()
                  self.recognitionTask = nil
              }
              
              self.recordedMessage = ""
              
              let audioSession = AVAudioSession.sharedInstance()
              do {
                try audioSession.setCategory(AVAudioSession.Category.record)
                try audioSession.setMode(AVAudioSession.Mode.measurement)
              }catch {
                  print(error)
              }
                        
              recognitionRequest = SFSpeechAudioBufferRecognitionRequest()
              
              guard let recognitionRequest = self.recognitionRequest else {
                  fatalError("Unable to create a speech audio buffer")
              }
              
              recognitionRequest.shouldReportPartialResults = true
              recognitionRequest.requiresOnDeviceRecognition = true
                  
              recognitionTask = speechRecognizer?.recognitionTask(with: recognitionRequest, resultHandler: { (result, error) in
                  
                  var isFinal = false
                  if let result = result {
                      let sentence = result.bestTranscription.formattedString
                      self.recordedMessage = sentence
                      print (self.recordedMessage)
                      isFinal = result.isFinal
                  }
                  
                  if error != nil || isFinal {
                      self.audioEngine.stop()
                      self.audioEngine.inputNode.removeTap(onBus: 0)
                      self.recognitionRequest = nil
                      self.recognitionTask = nil
                      self.RecordBtn.isEnabled = true
                  }
                  
              })
              
              let recordingFormat = audioEngine.inputNode.outputFormat(forBus: 0)
              audioEngine.inputNode.installTap(onBus: 0, bufferSize: 1024, format: recordingFormat) { (buffer, when) in
                  self.recognitionRequest?.append(buffer)
              }
              
              audioEngine.prepare()
                
              do{
                  try audioEngine.start()
              }catch {
                  print(error)
              }
          }
        }
    }
    
    func StopRecording() {
        if audioEngine.isRunning {
            audioEngine.stop()
            ButtonSound(AudioEngine: StartSoundEngineScene1, SoundNode: StartSoundNodeScene1, FileURL: StopRecSound)
            recognitionRequest?.endAudio()
            audioEngine.inputNode.removeTap(onBus: 0)
        }
    }

Upvotes: 1

Views: 902

Answers (1)

childc

Reputation: 49

You set the AVAudioSession category to record:

try audioSession.setCategory(AVAudioSession.Category.record)

If you want to play and record concurrently, you should set the category to playAndRecord.
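For example, the session setup inside StartRecording could look roughly like this (the .defaultToSpeaker option is an optional extra I'm assuming you want, so playback isn't routed to the earpiece):

    let audioSession = AVAudioSession.sharedInstance()
    do {
        // .playAndRecord allows simultaneous input (the recognition tap)
        // and output (the button sounds on the player node)
        try audioSession.setCategory(.playAndRecord,
                                     mode: .measurement,
                                     options: [.defaultToSpeaker])
        try audioSession.setActive(true, options: .notifyOthersOnDeactivation)
    } catch {
        print(error)
    }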

And... if you change the AVAudioSession while playing or recording, the AVAudioEngine's configuration changes and the engine posts an AVAudioEngineConfigurationChange notification.
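If you need to react to that (for example, to re-prepare and restart the engine), you can observe the notification; a minimal sketch, assuming you restart the engine yourself in the handler:

    NotificationCenter.default.addObserver(
        forName: .AVAudioEngineConfigurationChange,
        object: audioEngine,
        queue: .main
    ) { _ in
        // The engine stops when its configuration changes;
        // re-prepare and restart it here if needed.
        print("Engine configuration changed")
    }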

Upvotes: 1

Related Questions