Reputation: 11
I am using a WaveFormatConversionStream to increase the sample rate of a mono-channel, 16-bit PCM WAV file containing recorded speech from 11025 to 16000 Hz. The resulting WAV file is still intelligible, but seems to have a great deal of "static" (it sounds as if it's being played through a partially blown-out speaker). Is this normal, expected behavior, or am I doing something wrong? Code follows:
using (WaveFileReader wfr = new WaveFileReader("inFile.wav"))
{
    var newFormat = new WaveFormat(16000, wfr.WaveFormat.BitsPerSample, wfr.WaveFormat.Channels);
    using (WaveFileWriter wfw = new WaveFileWriter("outFile.wav", newFormat))
    {
        using (WaveFormatConversionStream conversionStream = new WaveFormatConversionStream(newFormat, wfr))
        {
            conversionStream.Position = 0;
            byte[] buffer = new byte[1024];
            while (conversionStream.Position < conversionStream.Length)
            {
                int bytesRead = conversionStream.Read(buffer, 0, 1024);
                if (bytesRead > 0)
                {
                    wfw.Write(buffer, 0, bytesRead);
                }
                else
                {
                    break;
                }
            }
        }
    }
}
Upvotes: 1
Views: 3326
Reputation: 49482
There's nothing obvious I can see wrong with your code (although there is no need to set Position = 0). Under the hood it uses the ACM sample rate conversion that ships with Windows, which is reasonable but not brilliant. In particular, I don't think it applies any low-pass filter, which is usually recommended to reduce aliasing and other artifacts when resampling. But what you describe sounds more severe than that. You could also try making your buffer a full second of audio, i.e. 16000 * channels * 2 bytes.
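For illustration, a minimal sketch of that buffer change, reusing the newFormat, conversionStream and wfw variables from your code (the 32000-byte figure assumes the 16 kHz, 16-bit mono target format you describe):

    // One second of output audio: sample rate * channels * bytes per sample.
    // For 16000 Hz, 16-bit mono this is 16000 * 1 * 2 = 32000 bytes.
    byte[] buffer = new byte[newFormat.SampleRate * newFormat.Channels * 2];
    int bytesRead;
    while ((bytesRead = conversionStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        wfw.Write(buffer, 0, bytesRead);
    }

For PCM formats, newFormat.AverageBytesPerSecond gives the same figure if you prefer not to compute it by hand.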
Another thing, I assume the audio is 16 bit?
Upvotes: 1