Reputation: 9
In my Android app, I use AudioRecord and continuously send a PCM_16 byte array to a Node.js server.
byte[] audioBuffer = new byte[mAudioBufferSampleSize];
mAudioRecord.startRecording();
int audioRecordingState = mAudioRecord.getRecordingState();
if (audioRecordingState != AudioRecord.RECORDSTATE_RECORDING) {
    Log.e(TAG, "AudioRecord is not recording");
    return;
} else {
    Log.v(TAG, "AudioRecord has started recording...");
}
while (inRecordMode) {
    int samplesRead = mAudioRecord.read(audioBuffer, 0, mAudioBufferSampleSize);
    Log.v(TAG, "Got samples: " + samplesRead);
    if (WebSocketManager.roomSocket.isConnected()) {
        WebSocketManager.roomSocket.send(audioBuffer);
    }
}
After that, I can stream it to the web browser as an ArrayBuffer and try to convert it to a Float32Array to use as the buffer for an AudioContext instance. But I can't hear anything, or only loud noise.
function onMessage(evt) {
    var context = new AudioContext();
    var source = context.createBufferSource();
    source.connect(context.destination);
    array = new Int8Array(evt.data);
    abs = new Float32Array(evt.data);
    arrays = new Float32Array(abs.length);
    ab = context.createBuffer(1, array.length, 44100);
    ab.getChannelData(0).set(arrays);
    source.buffer = ab;
    source.start(0);
    // then do it
}
So can anyone give me some advice, please? P/S: using decodeAudioData just gives a null error.
Sorry for my poor English
Upvotes: 0
Views: 1385
Reputation: 13928
The array you're getting holds 16-bit integer samples (PCM_16 is signed, so each sample is in the range -32768 to 32767). Each sample in an AudioBuffer's channel data is a float32 with a nominal range of -1 to +1.
You need to convert the data in each sample, not just assign the value and rely on conversion.
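A minimal sketch of that conversion, assuming the WebSocket delivers the raw PCM_16 bytes as an ArrayBuffer (i.e. `binaryType = 'arraybuffer'`) and that the client machine is little-endian like the Android source, so `Int16Array` reads the samples directly (the function name is just for illustration):

```javascript
function int16ToFloat32(arrayBuffer) {
    // View the raw bytes as signed 16-bit samples.
    // Note: Int16Array uses the platform's byte order; for a
    // big-endian client you would need a DataView instead.
    var int16 = new Int16Array(arrayBuffer);
    var float32 = new Float32Array(int16.length);
    for (var i = 0; i < int16.length; i++) {
        // Scale -32768..32767 into the nominal -1..+1 range.
        float32[i] = int16[i] / 32768;
    }
    return float32;
}
```

You would then pass the result to `ab.getChannelData(0).set(...)` instead of the raw bytes.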
Upvotes: 1