Reputation: 8918
What is the required API configuration/call for playing two independent WAV files overlapped? When I try to do so, I get a resource busy error. Some pointers to solve the problem would be very helpful.
The following is the error message from snd_pcm_prepare() for the second WAV file:
"Device or resource busy"
Upvotes: 6
Views: 14418
Reputation: 607
As ALSA provides a mixer device by default (dmix), you can simply use aplay, like so:
aplay song1.wav &
aplay -Dplug:dmix song2.wav
If your audio files have the same rate and format, then you don't need to use plug. It becomes:
aplay song1.wav &
aplay -Ddmix song2.wav
If however you want to program this method, there are some C++ audio programming tutorials here. These tutorials show you how to load audio files and drive different audio subsystems, such as jackd and ALSA.
This example demonstrates playback of one audio file using ALSA. It can be modified to open a second audio file like so:
Sox<short int> sox2;
res=sox2.openRead(argv[2]);
if (res<0 && res!=SOX_READ_MAXSCALE_ERROR)
    return SoxDebug().evaluateError(res);
Then modify the while loop like so:
Eigen::Array<int, Eigen::Dynamic, Eigen::Dynamic, Eigen::RowMajor> buffer, buffer2;
size_t totalWritten=0;
while (sox.read(buffer, pSize)>=0 && sox2.read(buffer2, pSize)>=0){
    if (buffer.rows()==0 || buffer2.rows()==0) // end of either file
        break;
    // as the original files were opened as short int, summing will not overflow the int buffer
    buffer+=buffer2; // sum the two waveforms together
    playBack<<buffer; // play the audio data
    totalWritten+=buffer.rows();
}
Upvotes: 2
Reputation: 2672
Following is a very simplified multi-threaded playback solution (assuming both files have the same sample format, channel count, and frequency).
Start a buffer-based decoding thread per file (this code has to be written twice: once for file1 and once for file2):
import wave
import time
import threading

periodsize = 160
f = wave.open(file1Wave, 'rb')
file1Alive = True
file1dataReady = False
file1Thread = threading.Thread(target=_playFile1)
file1Thread.daemon = True
file1Thread.start()
The file decoding thread itself (also has to be defined twice, for file1 and for file2):
def _playFile1():
    global data1, file1Alive, file1dataReady
    # Read data from the RIFF file, one period at a time
    while file1Alive:
        if file1dataReady:
            time.sleep(.001)  # wait for the funnel to consume the period
        else:
            data1 = f.readframes(periodsize)
            if not data1:
                file1Alive = False
                f.close()
            else:
                file1dataReady = True
Start the merging thread (aka the funnel) to merge the decoded data:
import alsaaudio
import audioop
import threading

sink = alsaaudio.PCM(alsaaudio.PCM_PLAYBACK, device="hw:CARD=default")
sinkformat = 2  # sample width in bytes, as expected by audioop
funnelalive = True
funnelThread = threading.Thread(target=_funnelLoop)
funnelThread.daemon = True
funnelThread.start()
The merge-and-play (aka funnel) thread:
def _funnelLoop():
    global funnelalive, file1dataReady, file2dataReady
    # Read from all inputs, merge, and play
    while funnelalive:
        # if nothing is left to play, time to self-destruct
        if not file1Alive and not file2Alive:
            funnelalive = False
            sink.close()
        else:
            if file1dataReady and file2dataReady:
                # merge one period from each decoder
                datamerged = audioop.add(data1, data2, sinkformat)
                file1dataReady = False
                file2dataReady = False
                sink.write(datamerged)
            time.sleep(.001)
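Note that the audioop module doing the merge was removed from the Python standard library in 3.13. A pure-Python stand-in for the 16-bit case (width 2; the little-endian assumption and the name add_s16 are mine, not part of the original answer) could look like:

```python
import struct

def add_s16(data1: bytes, data2: bytes) -> bytes:
    """Sum two equal-length buffers of signed 16-bit little-endian samples,
    saturating at the 16-bit limits (like audioop.add(data1, data2, 2))."""
    n = len(data1) // 2
    s1 = struct.unpack('<%dh' % n, data1)
    s2 = struct.unpack('<%dh' % n, data2)
    mixed = (max(-32768, min(32767, a + b)) for a, b in zip(s1, s2))
    return struct.pack('<%dh' % n, *mixed)
```

With this in place, datamerged = add_s16(data1, data2) replaces the audioop.add call above.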
Upvotes: 0
Reputation: 477
You can also use this configuration:
pcm.dmix_stream
{
type dmix
ipc_key 321456
ipc_key_add_uid true
slave.pcm "hw:0,0"
}
pcm.mix_stream
{
type plug
slave.pcm dmix_stream
}
Put it in ~/.asoundrc or /etc/asound.conf.
Then you can use the following commands.
For a WAV file:
aplay -D mix_stream "filename"
For a raw or PCM file:
aplay -D mix_stream -c "channels" -r "rate" -f "format" "filename"
Enter the values for channels, rate, format, and filename as per your audio file.
Upvotes: 0
Reputation: 872
You can configure ALSA's dmix plugin to allow multiple applications to share input/output devices.
An example configuration to do this is below:
pcm.dmixed {
type dmix
ipc_key 1024
ipc_key_add_uid 0
slave.pcm "hw:0,0"
}
pcm.dsnooped {
type dsnoop
ipc_key 1025
slave.pcm "hw:0,0"
}
pcm.duplex {
type asym
playback.pcm "dmixed"
capture.pcm "dsnooped"
}
# Instruct ALSA to use pcm.duplex as the default device
pcm.!default {
type plug
slave.pcm "duplex"
}
ctl.!default {
type hw
card 0
}
This does the following:
- creates a device using the dmix plugin, which allows multiple apps to share the output stream
- creates a device using the dsnoop plugin, which does the same thing for the input stream
- creates a duplex device that will support input and output using the asym plugin
- sets the duplex device as the default device
- uses hw:0 to control the default device (alsamixer and so on)

Stick this in either ~/.asoundrc or /etc/asound.conf and you should be good to go.
For more information see http://www.alsa-project.org/main/index.php/Asoundrc#Software_mixing.
Upvotes: 25
Reputation: 51842
ALSA does not provide a mixer. If you need to play multiple audio streams at the same time, you need to mix them together on your own.
The easiest way this can be accomplished is by decoding the WAV files to float samples, adding them, and clipping them when converting back to integer samples.
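A minimal sketch of this decode-sum-clip approach using only the Python standard library (the helper name mix_wavs and the assumption of 16-bit PCM inputs with matching rate and channel count are mine; summing as arbitrary-precision Python ints stands in for the float accumulation described above):

```python
import array
import wave

def mix_wavs(path1, path2, out_path):
    """Mix two 16-bit PCM WAV files (hypothetical helper; assumes both
    files share the same sample rate and channel count)."""
    with wave.open(path1, 'rb') as w1, wave.open(path2, 'rb') as w2:
        params = w1.getparams()
        s1 = array.array('h', w1.readframes(w1.getnframes()))
        s2 = array.array('h', w2.readframes(w2.getnframes()))
    n = max(len(s1), len(s2))
    s1.extend([0] * (n - len(s1)))  # zero-pad the shorter stream
    s2.extend([0] * (n - len(s2)))
    mixed = array.array('h')
    for a, b in zip(s1, s2):
        v = a + b  # Python ints cannot overflow
        mixed.append(max(-32768, min(32767, v)))  # clip to the 16-bit range
    with wave.open(out_path, 'wb') as out:
        out.setparams(params)
        out.writeframes(mixed.tobytes())
```

After mix_wavs('song1.wav', 'song2.wav', 'mixed.wav'), the result can be played with a single aplay mixed.wav, sidestepping the device-sharing problem entirely.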
Alternatively, you can try to open the default audio device (and not a hardware device like "hw:0") multiple times, once for each stream you wish to play, and hope that the dmix ALSA plugin is loaded and will provide the mixing functionality.
Upvotes: 5