Reputation: 134
I want to make an Android application that uses the camera and applies image-processing filters to the preview frames.
package alex.filter;

import java.io.IOException;

import android.content.Context;
import android.graphics.Canvas;
import android.hardware.Camera;
import android.hardware.Camera.PreviewCallback;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

class Preview extends SurfaceView implements SurfaceHolder.Callback {
    SurfaceHolder mHolder;
    public Camera camera;

    Preview(Context context) {
        super(context);
        mHolder = getHolder();
        mHolder.addCallback(this);
        mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }

    public void surfaceCreated(SurfaceHolder holder) {
        camera = Camera.open();
        try {
            camera.setPreviewDisplay(holder);
            camera.setPreviewCallback(new PreviewCallback() {
                public void onPreviewFrame(byte[] data, Camera arg1) {
                    // Zero out every byte of the frame (or apply some serious filter).
                    for (int i = 0; i < data.length; i++) {
                        data[i] = 0;
                    }
                    Preview.this.invalidate();
                }
            });
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public void surfaceDestroyed(SurfaceHolder holder) {
        camera.stopPreview();
        camera = null;
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
        Camera.Parameters parameters = camera.getParameters();
        parameters.setPreviewSize(w, h);
        camera.setParameters(parameters);
        camera.startPreview();
    }

    @Override
    public void draw(Canvas canvas) {
        super.draw(canvas);
    }
}
However, I see no changes in the emulator no matter what I do in the onPreviewFrame
method.
Upvotes: 6
Views: 11785
Reputation: 2238
Another option is to use the OpenCV framework, which has an Android port:
http://opencv.willowgarage.com/wiki/Android2.3.0
It's an NDK port of the open-source Open Computer Vision project. It takes the raw preview frames and lets you process them with OpenCV before displaying them on a SurfaceView. Because it manipulates every frame it doesn't run at quite the framerate of a directly hooked-in, hardware-optimized preview, but since so much of it is native code it does a pretty good job.
The version linked above ships with an OpenCV_Sample project that compiles into a demo app doing much of what you're looking for: it has menu options to invert the image, blur it, or run edge detection on the preview area. Even if it's not exactly what you want, there are some great samples in its source code to learn from.
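If you go the OpenCV route, the per-frame processing with the OpenCV Java bindings looks roughly like the sketch below (the Java API is a different, newer layer than the NDK sample linked above, so treat the exact class and constant names as my assumption). It runs Canny edge detection on the luminance plane of an NV21 preview frame:

// Sketch only: Canny edge detection on one NV21 preview frame via the OpenCV Java API.
// Needs org.opencv.core.{Mat, CvType}, org.opencv.imgproc.Imgproc and org.opencv.android.Utils.
public void onPreviewFrame(byte[] data, Camera camera) {
    Camera.Size size = camera.getParameters().getPreviewSize();

    // In NV21 the first width*height bytes are the Y (luminance) plane.
    Mat yuv = new Mat(size.height + size.height / 2, size.width, CvType.CV_8UC1);
    yuv.put(0, 0, data);
    Mat gray = yuv.submat(0, size.height, 0, size.width);

    Mat edges = new Mat();
    Imgproc.Canny(gray, edges, 80, 100);

    // Convert to a Bitmap and draw it onto a view of your own.
    Bitmap result = Bitmap.createBitmap(size.width, size.height, Bitmap.Config.ARGB_8888);
    Utils.matToBitmap(edges, result);
    // ... hand 'result' to whatever view you draw on ...
}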
Upvotes: 4
Reputation: 4235
Well, that's because the preview buffer you get in the callback is only a copy of the actual preview buffer, so any modifications you make to it will never show up on screen. This is mentioned in the Android SDK documentation.
I'm not sure of the best way around this, but I've been giving it some thought, and a rough sketch of one possible approach follows.
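One workaround I would try (my own sketch, not something the SDK documents) is to leave the camera's own preview alone and draw the processed frames yourself onto a second, ordinary SurfaceView added to your layout. Converting the NV21 buffer through YuvImage and a JPEG round trip is slow but keeps the sketch simple; drawHolder below is the SurfaceHolder of that second view and is assumed to be set up elsewhere.

// Sketch: process the frame and draw it onto your own view.
// Needs android.graphics.{YuvImage, ImageFormat, Rect, Bitmap, BitmapFactory, Canvas}
// and java.io.ByteArrayOutputStream. 'drawHolder' is hypothetical: the holder of a
// second, normal SurfaceView in your layout.
public void onPreviewFrame(byte[] data, Camera camera) {
    Camera.Size size = camera.getParameters().getPreviewSize();

    // NV21 -> JPEG -> Bitmap (simple but slow; fine for a first experiment).
    YuvImage yuv = new YuvImage(data, ImageFormat.NV21, size.width, size.height, null);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    yuv.compressToJpeg(new Rect(0, 0, size.width, size.height), 80, out);
    byte[] jpeg = out.toByteArray();
    Bitmap frame = BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length);

    // ... apply your filter to 'frame' here ...

    Canvas canvas = drawHolder.lockCanvas();
    if (canvas != null) {
        canvas.drawBitmap(frame, 0, 0, null);
        drawHolder.unlockCanvasAndPost(canvas);
    }
}

The JPEG round trip costs a lot per frame; a real implementation would decode the YUV data into an int[] of pixels and draw that directly, but the structure stays the same.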
UPDATE
Revisiting the SDK documentation, I found the setPreviewTexture API, which "captures frames from an image stream as an OpenGL ES texture". Once the frames arrive as an OpenGL texture, you can filter and display them with OpenGL. (Take a look at the answer posted by @Stephan on how to do this.)
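I haven't tried it end to end, but the wiring would look roughly like this (a minimal sketch assuming you already have a GLSurfaceView and call this on its GL thread; it uses android.graphics.SurfaceTexture, android.opengl.GLES20 and android.opengl.GLES11Ext):

// Minimal sketch (API 11+): feed the camera preview into an OpenGL ES texture.
// Assumed to run on the GL thread of a GLSurfaceView.Renderer you already have.
private SurfaceTexture startPreviewOnTexture(Camera camera) throws IOException {
    int[] tex = new int[1];
    GLES20.glGenTextures(1, tex, 0);
    GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);

    SurfaceTexture surfaceTexture = new SurfaceTexture(tex[0]);
    surfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
        public void onFrameAvailable(SurfaceTexture st) {
            // Request a render pass; call st.updateTexImage() inside onDrawFrame().
        }
    });

    camera.setPreviewTexture(surfaceTexture); // instead of setPreviewDisplay()
    camera.startPreview();
    return surfaceTexture;
}

The filtering itself then happens in your fragment shader, which samples the external texture (samplerExternalOES) and draws a full-screen quad.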
NOTE: setPreviewTexture is available from API level 11 onwards (see the SDK reference).
Upvotes: 1