user345602

Reputation: 598

Android microphone issue

Is it possible to detect (in real time) if somebody is blowing into the microphone? Thanks

Upvotes: 1

Views: 711

Answers (2)

Forme

Reputation: 301

Adding The AVFoundation Framework

In order to use the SDK’s AVAudioRecorder class, we’ll need to add the AVFoundation framework to the project.

Next, we’ll import the AVFoundation headers in our view controller’s interface file and set up an AVAudioRecorder instance variable:

Expand the MicBlow project branch in the Groups & Files panel of the project, expand the Classes folder, select MicBlowViewController.h to edit it, and update the file:

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <CoreAudio/CoreAudioTypes.h>

@interface MicBlowViewController : UIViewController {
    AVAudioRecorder *recorder;
}

@end

Taking Input From The Mic

Uncomment the boilerplate viewDidLoad method and update it as follows.

- (void)viewDidLoad {
[super viewDidLoad];

NSURL *url = [NSURL fileURLWithPath:@"/dev/null"];

NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithFloat: 44100.0],                 AVSampleRateKey,
    [NSNumber numberWithInt: kAudioFormatAppleLossless], AVFormatIDKey,
    [NSNumber numberWithInt: 1],                         AVNumberOfChannelsKey,
    [NSNumber numberWithInt: AVAudioQualityMax],         AVEncoderAudioQualityKey,
  nil];

NSError *error;

recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];

if (recorder) {
    [recorder prepareToRecord];
    recorder.meteringEnabled = YES;
    [recorder record];
} else {
    NSLog(@"%@", [error description]);
}

}

Sampling The Audio Level

We’ll use a timer to check the audio levels approximately 30 times a second. Add an NSTimer instance variable and declare its callback method in MicBlowViewController.h.

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <CoreAudio/CoreAudioTypes.h>

@interface MicBlowViewController : UIViewController {
    AVAudioRecorder *recorder;
    NSTimer *levelTimer;
}

- (void)levelTimerCallback:(NSTimer *)timer;

@end

Update the .m file’s viewDidLoad to enable the timer:

- (void)viewDidLoad {
[super viewDidLoad];

NSURL *url = [NSURL fileURLWithPath:@"/dev/null"];

NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithFloat: 44100.0],                 AVSampleRateKey,
    [NSNumber numberWithInt: kAudioFormatAppleLossless], AVFormatIDKey,
    [NSNumber numberWithInt: 1],                         AVNumberOfChannelsKey,
    [NSNumber numberWithInt: AVAudioQualityMax],         AVEncoderAudioQualityKey,
  nil];

NSError *error;

recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];

if (recorder) {
    [recorder prepareToRecord];
    recorder.meteringEnabled = YES;
    [recorder record];
    levelTimer = [NSTimer scheduledTimerWithTimeInterval: 0.03 target: self selector: @selector(levelTimerCallback:) userInfo: nil repeats: YES];
} else {
    NSLog(@"%@", [error description]);
}

}

For now, we’ll just sample the audio input level directly, with no filtering. Add the implementation of levelTimerCallback: to the .m file:

- (void)levelTimerCallback:(NSTimer *)timer {
[recorder updateMeters];
NSLog(@"Average input: %f Peak input: %f", [recorder averagePowerForChannel:0], [recorder peakPowerForChannel:0]);
}

Sending the updateMeters message refreshes the average and peak power meters. The meters use a logarithmic scale, with -160 being complete silence and zero being maximum input.

Don’t forget to invalidate and release the timer in dealloc:

- (void)dealloc {
[levelTimer invalidate];
[levelTimer release];
[recorder release];
[super dealloc];
}

Listening For A Blowing Sound

As mentioned in the overview, we’ll be using a low-pass filter to diminish high-frequency sounds’ contribution to the level. The algorithm keeps a running result that incorporates past sample input, so we’ll need an instance variable to hold it. Update the .h file:

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <CoreAudio/CoreAudioTypes.h>

@interface MicBlowViewController : UIViewController {
    AVAudioRecorder *recorder;
    NSTimer *levelTimer;
    double lowPassResults;
}

- (void)levelTimerCallback:(NSTimer *)timer;

@end

Implement the algorithm by replacing the levelTimerCallback: method with:

- (void)levelTimerCallback:(NSTimer *)timer {
[recorder updateMeters];

const double ALPHA = 0.05;
double peakPowerForChannel = pow(10, (0.05 * [recorder peakPowerForChannel:0]));
lowPassResults = ALPHA * peakPowerForChannel + (1.0 - ALPHA) * lowPassResults;  

NSLog(@"Average input: %f Peak input: %f Low pass results: %f", [recorder     averagePowerForChannel:0], [recorder peakPowerForChannel:0], lowPassResults);
 }

Watch the low pass results in the log and pick a threshold that reliably indicates blowing; for my app’s needs, 0.95 works. We’ll replace the log line with a simple conditional:

- (void)levelTimerCallback:(NSTimer *)timer {
[recorder updateMeters];

const double ALPHA = 0.05;
double peakPowerForChannel = pow(10, (0.05 * [recorder peakPowerForChannel:0]));
lowPassResults = ALPHA * peakPowerForChannel + (1.0 - ALPHA) * lowPassResults;

if (lowPassResults > 0.95)
    NSLog(@"Mic blow detected");
}


Upvotes: 0

walta

Reputation: 3466

Yes, it is.

You can use the AudioRecord class and analyze the waveform that comes back.

EDIT: Just did some research - one caveat to this. It turns out that Android doesn't handle audio processing in real time very well; you'll see a delay of roughly 100ms. If that's OK for your project (it sounds like it probably is), great, but it's something to be aware of.
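A minimal sketch of the AudioRecord approach described above might look like the following: read raw PCM from the mic, compute the RMS level of each buffer, smooth it with a simple low-pass filter (the same idea as the AVAudioRecorder answer above), and flag a blow when the smoothed level crosses a threshold. The class name BlowDetector, the THRESHOLD of 0.25, and the ALPHA smoothing factor are illustrative values you would tune by experiment; it assumes the RECORD_AUDIO permission has been granted and that listen() is called from a background thread.

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.util.Log;

public class BlowDetector {

    private static final int SAMPLE_RATE = 44100;  // Hz
    private static final double ALPHA = 0.05;      // low-pass smoothing factor (illustrative)
    private static final double THRESHOLD = 0.25;  // detection threshold (tune by experiment)

    private volatile boolean running = true;

    // Call from a background thread; requires the RECORD_AUDIO permission.
    public void listen() {
        int minBufBytes = AudioRecord.getMinBufferSize(SAMPLE_RATE,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);

        AudioRecord record = new AudioRecord(MediaRecorder.AudioSource.MIC,
                SAMPLE_RATE, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, minBufBytes * 2);

        if (record.getState() != AudioRecord.STATE_INITIALIZED) {
            Log.e("BlowDetector", "AudioRecord failed to initialize");
            return;
        }

        short[] buffer = new short[minBufBytes / 2];  // 16-bit samples
        double lowPass = 0.0;

        record.startRecording();
        while (running) {
            int read = record.read(buffer, 0, buffer.length);
            if (read <= 0) {
                continue;
            }

            // RMS amplitude of this buffer, normalised to 0..1.
            double sum = 0.0;
            for (int i = 0; i < read; i++) {
                double sample = buffer[i] / 32768.0;
                sum += sample * sample;
            }
            double rms = Math.sqrt(sum / read);

            // Low-pass filter: a sustained broadband signal (a blow) pushes the
            // running value up, while brief spikes are smoothed away.
            lowPass = ALPHA * rms + (1.0 - ALPHA) * lowPass;

            if (lowPass > THRESHOLD) {
                Log.d("BlowDetector", "Mic blow detected");
            }
        }
        record.stop();
        record.release();
    }

    public void stopListening() {
        running = false;
    }
}

The effective latency is governed by the recording buffer size: at 44.1 kHz the minimum buffer is already a few thousand samples, i.e. tens of milliseconds per read, which is consistent with the roughly 100ms delay mentioned above.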

Upvotes: 3
