Sven Ostertag

Reputation: 21

iOS Audio is very quiet (React-Native, Expo-Go)

EDIT: Fixed by setting "allowsRecordingIOS" to false, as per Gordon Childs' comment:

"You need to set allowsRecordingIOS to false when you want to playback. It sounds like that is the expo way to override audio output defaulting to the quieter receiver, instead of the speaker."


I'm relatively new to mobile-app development.

While testing a larger app on a physical iOS device through Expo-Go, I found that sounds, whether downloaded files or my own audio recordings, come out very quiet on the iPhone. They sound like they play at around half the normal volume, or less.

In the web version, on Android, in the iOS simulator, and in the iOS simulator's version of Safari loading the web version, sounds play at normal volume. But in Expo-Go, running just a little test app on a physical iPhone, they come out quiet compared to sounds from other apps on the same device, such as YouTube.

Here is the code I tested sound with:

import React, { useState, useEffect } from 'react';
import { View, TouchableOpacity, StyleSheet, Alert } from 'react-native';
import { Audio, InterruptionModeIOS, InterruptionModeAndroid } from 'expo-av';
import { FontAwesome } from '@expo/vector-icons';

interface AudioRecorderProps {
  onRecordingComplete?: (uri: string) => void;
}

export const AudioRecorder: React.FC<AudioRecorderProps> = ({ onRecordingComplete }) => {
  const [recording, setRecording] = useState<Audio.Recording | null>(null);
  const [sound, setSound] = useState<Audio.Sound | null>(null);
  const [isRecording, setIsRecording] = useState(false);
  const [isPlaying, setIsPlaying] = useState(false);
  const [recordingUri, setRecordingUri] = useState<string | null>(null);
  const [isPlayingReference, setIsPlayingReference] = useState(false);

  useEffect(() => {
    // Request permissions and configure audio
    const setupAudio = async () => {
      try {
        await Audio.requestPermissionsAsync();
        await Audio.setAudioModeAsync({
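          // NOTE (see EDIT above): this next line is what made iOS playback
          // quiet. With allowsRecordingIOS: true, output defaults to the
          // quiet receiver; setting it to false before playback fixed it.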
          allowsRecordingIOS: true,
          playsInSilentModeIOS: true,
          staysActiveInBackground: true,
          interruptionModeIOS: InterruptionModeIOS.DuckOthers,
          interruptionModeAndroid: InterruptionModeAndroid.DuckOthers,
          shouldDuckAndroid: true,
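          // Routes Android playback through the earpiece rather than the speaker.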
          playThroughEarpieceAndroid: true
        });
      } catch (error) {
        Alert.alert('Error', 'Failed to initialize audio recording');
      }
    };

    setupAudio();

    // Cleanup
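    // Note: with the empty dependency array below, `sound` and `recording`
    // here are captured from the first render (both null), so this cleanup
    // never sees instances created later.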
    return () => {
      if (sound) sound.unloadAsync();
      if (recording) recording.stopAndUnloadAsync();
    };
  }, []);

  const startRecording = async () => {
    try {
      const { recording: newRecording } = await Audio.Recording.createAsync(
        Audio.RecordingOptionsPresets.HIGH_QUALITY
      );
      setRecording(newRecording);
      setIsRecording(true);
    } catch (error) {
      Alert.alert('Error', 'Failed to start recording');
    }
  };

  const stopRecording = async () => {
    if (!recording) return;

    try {
      await recording.stopAndUnloadAsync();
      const uri = recording.getURI();
      setRecordingUri(uri || null);
      setIsRecording(false);
      if (uri && onRecordingComplete) {
        onRecordingComplete(uri);
      }
    } catch (error) {
      Alert.alert('Error', 'Failed to stop recording');
    }
  };

  const playSound = async () => {
    if (!recordingUri) return;

    try {
      if (sound) {
        await sound.unloadAsync();
      }

      const { sound: newSound } = await Audio.Sound.createAsync(
        { uri: recordingUri },
        {
          volume: 1.0,
        }
      );
      setSound(newSound);
      setIsPlaying(true);

      // Attach the status listener before starting playback, so a very
      // short clip can't finish before the listener is registered.
      newSound.setOnPlaybackStatusUpdate((status) => {
        if (status && 'didJustFinish' in status && status.didJustFinish) {
          setIsPlaying(false);
        }
      });
      await newSound.playAsync();
    } catch (error) {
      Alert.alert('Error', 'Failed to play recording');
      setIsPlaying(false);
    }
  };

  const playReferenceTone = async () => {
    try {
      const { sound: referenceSound } = await Audio.Sound.createAsync(
        require('../../assets/beep.mp3'),
        {
          volume: 1.0,
        }
      );

      setIsPlayingReference(true);

      // Same pattern as playSound: listener first, then play.
      referenceSound.setOnPlaybackStatusUpdate((status) => {
        if (status && 'didJustFinish' in status && status.didJustFinish) {
          setIsPlayingReference(false);
          referenceSound.unloadAsync();
        }
      });
      await referenceSound.playAsync();
    } catch (error) {
      Alert.alert('Error', 'Failed to play reference tone');
      setIsPlayingReference(false);
    }
  };

  return (
    <View style={styles.container}>
      <TouchableOpacity
        style={[styles.button, isRecording ? styles.recording : null]}
        onPress={isRecording ? stopRecording : startRecording}
        >
        <FontAwesome
          name={isRecording ? 'stop-circle' : 'microphone'}
          size={24}
          color="white"
        />
      </TouchableOpacity>

      {recordingUri && (
        <TouchableOpacity
          style={[styles.button, isPlaying ? styles.playing : null]}
          onPress={playSound}
          disabled={isPlaying}
          >
          <FontAwesome
            name={isPlaying ? 'pause' : 'play'}
            size={24}
            color="white"
          />
        </TouchableOpacity>
      )}

      <TouchableOpacity
        style={[styles.button, isPlayingReference ? styles.playing : styles.referenceTone]}
        onPress={playReferenceTone}
        disabled={isPlayingReference}
      >
        <FontAwesome
          name="volume-up"
          size={24}
          color="white"
        />
      </TouchableOpacity>
    </View>
  );
};

const styles = StyleSheet.create({
  container: {
    flexDirection: 'row',
    justifyContent: 'center',
    alignItems: 'center',
    gap: 20,
  },
  button: {
    width: 60,
    height: 60,
    borderRadius: 30,
    backgroundColor: '#007AFF',
    justifyContent: 'center',
    alignItems: 'center',
  },
  recording: {
    backgroundColor: '#FF3B30',
  },
  playing: {
    backgroundColor: '#34C759',
  },
  referenceTone: {
    backgroundColor: '#5856D6',
  },
});

export default function AudioTest2() {
  const handleRecordingComplete = (uri: string) => {
    console.log('Recording saved at:', uri);
  };

  return (
    <View style={{ flex: 1, justifyContent: 'center', alignItems: 'center' }}>
      <AudioRecorder onRecordingComplete={handleRecordingComplete} />
    </View>
  );
}

All this does is give you a recording button, a play button that appears once you have a recording, and a play button for a piece of sample audio (for me, just a simple beep).

I do not know where this issue comes from nor how to approach it. I don't know whether it's an iPhone thing or whether I have to configure Expo for it somehow. Please help.

Upvotes: 1

Views: 73

Answers (0)
