Reputation: 2013
I am wondering about the relationship between a block of samples and its time equivalent. My rough idea so far:
Number of samples played per second = total filesize / duration.
So say I have a 1.02 MB file with a duration of 12 seconds, I get about 89,300 samples played per second. Is this right?
Are there other ways to compute this? For example, how can I know how much time a byte[1024] array is equivalent to?
Upvotes: 9
Views: 31286
Reputation: 212979
In addition to @BrokenGlass's very good answer, I'll just add that for uncompressed audio with a fixed sample rate, number of channels and bits per sample, the arithmetic is fairly straightforward. E.g. for "CD quality" audio we have a 44.1 kHz sample rate, 16 bits per sample, 2 channels (stereo), therefore the data rate is:
44100 * 16 * 2
= 1,411,200 bits / sec
= 176,400 bytes / sec
= 10 MB / minute (approx)
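To make that concrete, here is a minimal sketch in Java of the same arithmetic, including the byte[1024] case from the question (the class and method names are mine, purely illustrative):

public class AudioMath {
    // Bytes per second for uncompressed PCM: sampleRate * (bitsPerSample / 8) * channels
    static long bytesPerSecond(int sampleRate, int bitsPerSample, int channels) {
        return (long) sampleRate * bitsPerSample / 8 * channels;
    }

    public static void main(String[] args) {
        long rate = bytesPerSecond(44100, 16, 2);   // "CD quality"
        System.out.println(rate + " bytes/sec");    // prints 176400

        // How much time does a byte[1024] buffer hold at this rate?
        double seconds = 1024.0 / rate;
        System.out.printf("%.2f ms%n", seconds * 1000);  // ~5.80 ms
    }
}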
Upvotes: 17
Reputation: 160922
Generally speaking, for PCM samples you can divide the total length (in bytes) by the duration (in seconds) to get the number of bytes per second (for WAV files there will be some inaccuracy to account for the header). How these bytes translate into samples depends on:

1) the sample rate
2) the number of channels (mono, stereo, etc.)
3) the number of bits per sample

If you know 2) and 3) you can determine 1).
In your example, 89,300 bytes/second with stereo and 16 bits per sample (4 bytes per stereo frame) gives 89300 / 4 ≈ 22 kHz as the sample rate.
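A minimal sketch of that calculation in Java (the file size and duration are taken from the question; stereo and 16 bits per sample are assumptions, as above):

public class SampleRateEstimate {
    public static void main(String[] args) {
        long totalBytes = 1_071_600;       // roughly the 1.02 MB from the question
        double durationSec = 12.0;

        double bytesPerSecond = totalBytes / durationSec;   // ~89300

        int channels = 2;                  // assume stereo
        int bytesPerSample = 2;            // assume 16 bits per sample
        double sampleRate = bytesPerSecond / (channels * bytesPerSample);
        System.out.printf("~%.0f Hz%n", sampleRate);        // ~22325 Hz, i.e. ~22 kHz
    }
}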
Upvotes: 20