Reputation: 91
I am currently examining an Android application that uses AudioRecord to record audio in 16-bit PCM format:
byte[] buffer = new byte[1600];
audioRecord.read(buffer, 0, 1600);
It stores the recorded audio into buffer. The documentation of this read function says it should only be used with 8-bit PCM. However, the application uses it with 16-bit PCM, and it seems to work without issues; another overloaded read variant that takes a byte array also notes that using 16-bit PCM with it is possible, but deprecated.
Now I am unsure whether each sample (consisting of 2 bytes) is stored in little-endian or big-endian format. The documentation section about the audio encoding says that using a ByteBuffer instead of a byte array results in native endianness (instead of the default Java big endian).
I suspect that a short is stored in big-endian format, but I cannot find evidence for this.
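To make the ambiguity concrete: the same two bytes decode to different sample values depending on which byte order you assume. A minimal sketch in plain Java (runnable off-device; PcmDecode and toShorts are illustrative names, not part of the Android API):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class PcmDecode {
    // Decode raw 16-bit PCM bytes into samples with an explicit byte order.
    public static short[] toShorts(byte[] pcm, ByteOrder order) {
        ByteBuffer bb = ByteBuffer.wrap(pcm).order(order);
        short[] out = new short[pcm.length / 2];
        for (int i = 0; i < out.length; i++) {
            out[i] = bb.getShort();
        }
        return out;
    }

    public static void main(String[] args) {
        byte[] sample = { (byte) 0x01, (byte) 0x02 }; // one 16-bit sample
        // Little endian: low byte first -> 0x0201 = 513
        System.out.println(toShorts(sample, ByteOrder.LITTLE_ENDIAN)[0]); // 513
        // Big endian: high byte first -> 0x0102 = 258
        System.out.println(toShorts(sample, ByteOrder.BIG_ENDIAN)[0]);    // 258
    }
}
```

So picking the wrong assumption does not crash anything; it silently mangles every sample.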
Upvotes: 3
Views: 1467
Reputation: 803
It's Big Endian.
Here's the needle in the haystack - https://developer.android.com/reference/android/media/AudioFormat#encoding:~:text=when%20the%20short%20is%20stored%20in%20a%20ByteBuffer%2C%20it%20is%20native%20endian%20(as%20compared%20to%20the%20default%20Java%20big%20endian)
It seems that Java defaults to storing everything in big endian.
Edit: This only applies to byte[]; according to the documentation, a ByteBuffer stores the data in native endianness instead (little endian on Android devices).
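The "default Java big endian" part of that quote is easy to check off-device, since it is a property of java.nio itself, not of Android: a fresh ByteBuffer is big endian until you explicitly switch it to the platform's native order. A quick sketch:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class OrderDemo {
    public static void main(String[] args) {
        ByteBuffer bb = ByteBuffer.allocate(2);
        // The Java spec guarantees a new buffer starts out big endian.
        System.out.println(bb.order()); // BIG_ENDIAN
        // To get what the AudioRecord docs call "native endian",
        // the order must be set explicitly:
        bb.order(ByteOrder.nativeOrder());
        System.out.println(bb.order()); // LITTLE_ENDIAN on Android devices
    }
}
```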
Upvotes: 0
Reputation: 372
I have the same situation in my recent project.
The method

public int read(byte[] audioData, int offsetInBytes, int sizeInBytes)

actually calls

public int read(byte[] audioData, int offsetInBytes, int sizeInBytes, int readMode)

with AudioRecord.READ_BLOCKING as the 4th parameter. Since using 16-bit PCM with this overload is merely deprecated, I think it's still okay to use, just not recommended.
The read method finally calls the native method native_read_in_byte_array to fill the audio buffer. Android is natively little endian, so native_read_in_byte_array stores the audio data in little-endian order in the C/C++ layer and then sends it back to the Java layer through JNI.
I did a quick test and found that a byte buffer passed through JNI keeps the order of its bytes: the Java byte[] contains the bytes in the same order as the C/C++ jbyteArray. So a short that is little endian in native code is still stored in little endian in the Java layer.
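If that conclusion is right and the byte[] comes back little endian, decoding it is a one-liner with an explicit ByteOrder. A sketch (PcmToShorts and decode are illustrative names, not part of the Android API):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.ShortBuffer;

public class PcmToShorts {
    // Convert a raw PCM byte[] (as filled by read(byte[], ...)) into samples,
    // assuming little-endian layout as argued above.
    public static short[] decode(byte[] pcm) {
        ShortBuffer sb = ByteBuffer.wrap(pcm)
                                   .order(ByteOrder.LITTLE_ENDIAN)
                                   .asShortBuffer();
        short[] out = new short[sb.remaining()];
        sb.get(out);
        return out;
    }

    public static void main(String[] args) {
        byte[] raw = { (byte) 0xE8, 0x03 }; // 0x03E8 = 1000, little endian
        System.out.println(decode(raw)[0]); // 1000
    }
}
```

Alternatively, the read(short[], int, int) overload of AudioRecord hands back the samples as shorts directly, which sidesteps the byte-order question entirely.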
That's all I can reason from my test. Hope it helps and let me know if something is wrong in there.
Upvotes: 4