user2005447

Reputation: 87

GLSurfaceView does not resume its Open GL thread when "onResume" is called

My problem is the following: on "old" Android devices (v2.2 and 2.3), after a rotation, my GLSurfaceViews are blank. I can see these calls in my log:

- rotation detected! -
CTestApp(10669): entering onConfigurationChanged method.
MainActivity(10669): entering onPause method.
*WEBRTC*(10669): ViEAndroidGLES20::onPause
*WEBRTC*(10669): ContextFactory::destroyContext
*WEBRTC*(10669): ViEAndroidGLES20::onPause
*WEBRTC*(10669): ContextFactory::destroyContext
MainActivity(10669): end of onPause method.
MainActivity(10669): entering onStop method.
*WEBRTC*(10669): ViEAndroidGLES20::onDetachedFromWindow
*WEBRTC*(10669): ViEAndroidGLES20::onDetachedFromWindow
MainActivity(10669): end of onStop method.
MainActivity(10669): entering onDestroy method.
MainActivity(10669): end of onDestroy method.
MainActivity(10669): entering onCreate method.
MainActivity(10669): entering onStart method.
MainActivity(10669): end of onStart method.
MainActivity(10669): entering onResume method.
*WEBRTC*(10669): ViEAndroidGLES20::onResume
*WEBRTC*(10669): ViEAndroidGLES20::onResume
MainActivity(10669): end of onResume method.
*WEBRTC*(10669): ViEAndroidGLES20::onAttachedToWindow
*WEBRTC*(10669): ViEAndroidGLES20::onAttachedToWindow

On newer Android devices, the rendering of the video streams is properly resumed after a rotation of the device.

The log of a working device is similar to the previous (not-working) log, except that these traces appear after the "onAttachedToWindow" calls:

creating OpenGL ES 2.0 context
ViEAndroidGLES20::onSurfaceCreated

In the Eclipse debugger, I noticed that the two OpenGL threads that were paused during the activity destruction are not resumed. It appears that there is a difference in the behavior of GLSurfaceView between Android 2.3 and 4.0 that causes the OpenGL thread to be resumed only on the newer version. Does anyone have a clue about this?

Here are the details of the devices I used for my tests:

Working devices:

"Bad" devices:

- HTC Desire, running Android 2.3.5
- Motorola Droid, running Android 2.2

Here is additional information on the code I used.

I have the following class, which extends GLSurfaceView:

import java.util.concurrent.locks.ReentrantLock;

import javax.microedition.khronos.egl.EGL10;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.egl.EGLContext;
import javax.microedition.khronos.egl.EGLDisplay;
import javax.microedition.khronos.opengles.GL10;

import android.app.ActivityManager;
import android.content.Context;
import android.content.pm.ConfigurationInfo;
import android.graphics.PixelFormat;
import android.opengl.GLSurfaceView;
import android.util.Log;

public class ViEAndroidGLES20 extends GLSurfaceView
    implements GLSurfaceView.Renderer {
    private static String TAG = "WEBRTC-JR";
    private static final boolean DEBUG = true;
  // True if onSurfaceCreated has been called.
  private boolean surfaceCreated = false;
  private boolean openGLCreated = false;
  // True if NativeFunctionsRegistered has been called.
  private boolean nativeFunctionsRegisted = false;
  private ReentrantLock nativeFunctionLock = new ReentrantLock();
  // Address of Native object that will do the drawing.
  private long nativeObject = 0;
  private int viewWidth = 0;
  private int viewHeight = 0;

  public static boolean UseOpenGL2(Object renderWindow) {
    return ViEAndroidGLES20.class.isInstance(renderWindow);
  }

  public ViEAndroidGLES20(Context context) {
    super(context);
        init(false, 0, 0);
    }

    public ViEAndroidGLES20(Context context, boolean translucent,
            int depth, int stencil) {
        super(context);
        init(translucent, depth, stencil);
    }

    private void init(boolean translucent, int depth, int stencil) {

        // By default, GLSurfaceView() creates a RGB_565 opaque surface.
        // If we want a translucent one, we should change the surface's
        // format here, using PixelFormat.TRANSLUCENT for GL Surfaces
        // is interpreted as any 32-bit surface with alpha by SurfaceFlinger.
        if (translucent) {
            this.getHolder().setFormat(PixelFormat.TRANSLUCENT);
        }

    // Setup the context factory for 2.0 rendering.
    // See ContextFactory class definition below
    setEGLContextFactory(new ContextFactory());

        // We need to choose an EGLConfig that matches the format of
        // our surface exactly. This is going to be done in our
        // custom config chooser. See ConfigChooser class definition
        // below.
        setEGLConfigChooser( translucent ?
                             new ConfigChooser(8, 8, 8, 8, depth, stencil) :
                             new ConfigChooser(5, 6, 5, 0, depth, stencil) );

        // Set the renderer responsible for frame rendering
     this.setRenderer(this);
     this.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
  }

    private static class ContextFactory implements GLSurfaceView.EGLContextFactory {
        private static int EGL_CONTEXT_CLIENT_VERSION = 0x3098;
        public EGLContext createContext(EGL10 egl, EGLDisplay display, EGLConfig eglConfig) {
            Log.w(TAG, "creating OpenGL ES 2.0 context");
            checkEglError("Before eglCreateContext", egl);
            int[] attrib_list = {EGL_CONTEXT_CLIENT_VERSION, 2, EGL10.EGL_NONE };
            EGLContext context = egl.eglCreateContext(display, eglConfig,
                    EGL10.EGL_NO_CONTEXT, attrib_list);
            checkEglError("After eglCreateContext", egl);
            return context;
        }

        public void destroyContext(EGL10 egl, EGLDisplay display, EGLContext context) {
            Log.d("*WEBRTC*", "ContextFactory::destroyContext");
            egl.eglDestroyContext(display, context);
        }
    }

  private static void checkEglError(String prompt, EGL10 egl) {
    int error;
    while ((error = egl.eglGetError()) != EGL10.EGL_SUCCESS) {
      Log.e("*WEBRTC*", String.format("%s: EGL error: 0x%x", prompt, error));
    }
  }

    private static class ConfigChooser implements GLSurfaceView.EGLConfigChooser {

    public ConfigChooser(int r, int g, int b, int a, int depth, int stencil) {
      mRedSize = r;
      mGreenSize = g;
      mBlueSize = b;
      mAlphaSize = a;
      mDepthSize = depth;
      mStencilSize = stencil;
    }

    // This EGL config specification is used to specify 2.0 rendering.
    // We use a minimum size of 4 bits for red/green/blue, but will
    // perform actual matching in chooseConfig() below.
    private static int EGL_OPENGL_ES2_BIT = 4;
    private static int[] s_configAttribs2 =
    {
      EGL10.EGL_RED_SIZE, 4,
      EGL10.EGL_GREEN_SIZE, 4,
      EGL10.EGL_BLUE_SIZE, 4,
      EGL10.EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
      EGL10.EGL_NONE
    };

    public EGLConfig chooseConfig(EGL10 egl, EGLDisplay display) {

      // Get the number of minimally matching EGL configurations
      int[] num_config = new int[1];
      egl.eglChooseConfig(display, s_configAttribs2, null, 0, num_config);

      int numConfigs = num_config[0];

      if (numConfigs <= 0) {
        throw new IllegalArgumentException("No configs match configSpec");
      }

      // Allocate then read the array of minimally matching EGL configs
      EGLConfig[] configs = new EGLConfig[numConfigs];
      egl.eglChooseConfig(display, s_configAttribs2, configs,
                          numConfigs, num_config);

      // Now return the "best" one
      return chooseConfig(egl, display, configs);
    }

    public EGLConfig chooseConfig(EGL10 egl, EGLDisplay display,
                                  EGLConfig[] configs) {
      for(EGLConfig config : configs) {
        int d = findConfigAttrib(egl, display, config,
                                 EGL10.EGL_DEPTH_SIZE, 0);
        int s = findConfigAttrib(egl, display, config,
                                 EGL10.EGL_STENCIL_SIZE, 0);

        // We need at least mDepthSize and mStencilSize bits
        if (d < mDepthSize || s < mStencilSize)
          continue;

        // We want an *exact* match for red/green/blue/alpha
        int r = findConfigAttrib(egl, display, config,
                                 EGL10.EGL_RED_SIZE, 0);
        int g = findConfigAttrib(egl, display, config,
                                 EGL10.EGL_GREEN_SIZE, 0);
        int b = findConfigAttrib(egl, display, config,
                                 EGL10.EGL_BLUE_SIZE, 0);
        int a = findConfigAttrib(egl, display, config,
                                 EGL10.EGL_ALPHA_SIZE, 0);

        if (r == mRedSize && g == mGreenSize && b == mBlueSize && a == mAlphaSize)
          return config;
      }
      return null;
    }

    private int findConfigAttrib(EGL10 egl, EGLDisplay display,
                                 EGLConfig config, int attribute, int defaultValue) {

      if (egl.eglGetConfigAttrib(display, config, attribute, mValue)) {
        return mValue[0];
      }
      return defaultValue;
    }

    // Subclasses can adjust these values:
    protected int mRedSize;
    protected int mGreenSize;
    protected int mBlueSize;
    protected int mAlphaSize;
    protected int mDepthSize;
    protected int mStencilSize;
    private int[] mValue = new int[1];
  }

  // IsSupported
  // Return true if this device supports OpenGL ES 2.0 rendering.
  public static boolean IsSupported(Context context) {
    ActivityManager am =
        (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
    ConfigurationInfo info = am.getDeviceConfigurationInfo();
    if(info.reqGlEsVersion >= 0x20000) {
      // Open GL ES 2.0 is supported.
      return true;
    }
    return false;
  }

  public void onDrawFrame(GL10 gl) {
    nativeFunctionLock.lock();
    if(!nativeFunctionsRegisted || !surfaceCreated) {
      nativeFunctionLock.unlock();
      return;
    }

    if(!openGLCreated) {
      if(0 != CreateOpenGLNative(nativeObject, viewWidth, viewHeight)) {
        nativeFunctionLock.unlock();
        return; // Failed to create OpenGL
      }
      openGLCreated = true; // Created OpenGL successfully
    }
    DrawNative(nativeObject); // Draw the new frame
    nativeFunctionLock.unlock();
  }

   public void onSurfaceChanged(GL10 gl, int width, int height) {

    if (DEBUG)
    {
      Log.d("*WEBRTC*", "ViEAndroidGLES20::onSurfaceChanged");
    }

    surfaceCreated = true;
    viewWidth = width;
    viewHeight = height;

    nativeFunctionLock.lock();
    if(nativeFunctionsRegisted) {
      if(CreateOpenGLNative(nativeObject,width,height) == 0)
      {
        openGLCreated = true;
      }
      else
      {
        Log.e("*WEBRTC*", "ViEAndroidGLES20::onSurfaceChanged - failed to openGlCreated!");
      }
    }
    nativeFunctionLock.unlock();
  }

   public void onSurfaceCreated(GL10 gl, EGLConfig config) {

    if (DEBUG)
    {
      Log.d("*WEBRTC*", "ViEAndroidGLES20::onSurfaceCreated");
    }
  }

  public void ReDraw() {
    if(surfaceCreated) {
      // Request the renderer to redraw using the render thread context.
      this.requestRender();
    }
  }

  private native int CreateOpenGLNative(long nativeObject,
                                        int width, int height);
  private native void DrawNative(long nativeObject);

  protected void onAttachedToWindow()
  {
      if (DEBUG)
      {
          Log.d("*WEBRTC*", "ViEAndroidGLES20::onAttachedToWindow");
      }

      super.onAttachedToWindow();
  }

  protected void onDetachedFromWindow()
  {
      if (DEBUG)
      {
          Log.d("*WEBRTC*", "ViEAndroidGLES20::onDetachedFromWindow");
      }

      super.onDetachedFromWindow();
  }

  public void onPause()
  {
      if (DEBUG)
      {
          Log.d("*WEBRTC*", "ViEAndroidGLES20::onPause");
      }

      super.onPause();
  }

  public void onResume()
  {
      if (DEBUG)
      {
          Log.d("*WEBRTC*", "ViEAndroidGLES20::onResume");
      }

      super.onResume();
  }
}
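
The two instances of this class are created once at application scope, so they survive the destruction of the activity on rotation. Roughly, the relevant part of my CTestApp application class looks like the sketch below (simplified; the member names here are only illustrative):

public class CTestApp extends Application {

    // Application-scope render views, kept here so that a re-created
    // MainActivity can fetch and re-attach them after a rotation.
    private ViEAndroidGLES20 m_RemoteVideoView = null;
    private ViEAndroidGLES20 m_RemoteVideoView2 = null;

    public ViEAndroidGLES20 GetRemoteVideoView() {
        if (m_RemoteVideoView == null) {
            m_RemoteVideoView = new ViEAndroidGLES20(this);
        }
        return m_RemoteVideoView;
    }

    public ViEAndroidGLES20 GetRemoteVideoView2() {
        if (m_RemoteVideoView2 == null) {
            m_RemoteVideoView2 = new ViEAndroidGLES20(this);
        }
        return m_RemoteVideoView2;
    }
}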

When I perform a rotation of the device, my main activity gets destroyed, but my application keeps references to the ViEAndroidGLES20 instances (class members m_RemoteView1 and m_RemoteView2). These references are fetched in the onStart() callback of the activity, as shown below:

// The activity is about to become visible.
@Override protected void onStart() {

    Log.d("MainActivity", "entering onStart method.");

    super.onStart();

    //   The application is responsible for keeping valid references to the surface views
    //   used to perform local capture and remote stream rendering.
    m_RemoteView1 = ((CTestApp)getApplication()).GetRemoteVideoView();
    m_RemoteView2 = ((CTestApp)getApplication()).GetRemoteVideoView2();

    if (m_RemoteView1 != null)
    {
        LinearLayout layout = (LinearLayout) findViewById(R.id.remoteVideoRenderLayout1);
        layout.addView(m_RemoteView1);
    }

    if (m_RemoteView2 != null)
    {
        LinearLayout layout = (LinearLayout) findViewById(R.id.remoteVideoRenderLayout2);
        layout.addView(m_RemoteView2);
    }
}
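
// Roughly, the matching cleanup when the activity stops; a sketch that is
// consistent with the onDetachedFromWindow traces in the log above. The views
// are removed from their layouts so that the next activity instance can add
// them again.
@Override protected void onStop() {

    Log.d("MainActivity", "entering onStop method.");

    if (m_RemoteView1 != null && m_RemoteView1.getParent() != null)
    {
        ((ViewGroup) m_RemoteView1.getParent()).removeView(m_RemoteView1);
    }

    if (m_RemoteView2 != null && m_RemoteView2.getParent() != null)
    {
        ((ViewGroup) m_RemoteView2.getParent()).removeView(m_RemoteView2);
    }

    super.onStop();

    Log.d("MainActivity", "end of onStop method.");
}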

// The activity has become visible, it is now resumed.
// The activity has become visible, it is now resumed.
@Override protected void onResume() {

    Log.d("MainActivity", "entering onResume method.");

    super.onResume();

    // A GLSurfaceView must be notified when the activity is paused and resumed.
    // GLSurfaceView clients are required to call onPause() when the activity
    // pauses and onResume() when the activity resumes.
    if (m_RemoteView1 != null)
    {
        ((GLSurfaceView) m_RemoteView1).onResume();
    }

    if (m_RemoteView2 != null)
    {
        ((GLSurfaceView) m_RemoteView2).onResume();
    }
}

Note that I also included the onResume() callback implementation of my main activity, to show that I call the GLSurfaceView.onResume() when the activity is resumed.
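
The matching onPause() handling follows the same pattern; here is a minimal sketch (simplified, and consistent with the MainActivity/ViEAndroidGLES20::onPause traces at the top of the log):

// The activity is about to be paused.
@Override protected void onPause() {

    Log.d("MainActivity", "entering onPause method.");

    // GLSurfaceView clients must forward the pause so that the GL thread
    // releases its EGL context.
    if (m_RemoteView1 != null)
    {
        ((GLSurfaceView) m_RemoteView1).onPause();
    }

    if (m_RemoteView2 != null)
    {
        ((GLSurfaceView) m_RemoteView2).onPause();
    }

    super.onPause();

    Log.d("MainActivity", "end of onPause method.");
}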

Upvotes: 3

Views: 4383

Answers (1)

user2005447

Reputation: 87

I finally managed to find the cause of my problem: a behavior difference in the android.opengl.GLSurfaceView class between Android 2.3 and 4.x. In the Android 4.x implementation of GLSurfaceView, the "onAttachedToWindow" callback causes the associated GLThread to be restarted.

This restart of the GLThread is missing from the Android 2.2 and 2.3 implementations. Because the OpenGL thread is never resumed, the rendering views become blank after calls to ViewGroup.removeView()/addView(), as in a rotation scenario.
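
For reference, the Android 4.x onAttachedToWindow() does roughly the following (paraphrased from the AOSP GLSurfaceView source; field names are approximate):

@Override
protected void onAttachedToWindow() {
    super.onAttachedToWindow();
    // If the view was detached earlier (e.g. removed from its old layout
    // during a rotation) and a renderer has been set, start a fresh GL thread.
    if (mDetached && (mRenderer != null)) {
        int renderMode = RENDERMODE_CONTINUOUSLY;
        if (mGLThread != null) {
            renderMode = mGLThread.getRenderMode();
        }
        mGLThread = new GLThread(mThisWeakRef);
        if (renderMode != RENDERMODE_CONTINUOUSLY) {
            mGLThread.setRenderMode(renderMode);
        }
        mGLThread.start();
    }
    mDetached = false;
}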

To correct this issue, I added to my project a class newGLSurfaceView that is a copy of the GLSurfaceView.java class from the Android 4.1 source code.
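
One way to wire this in is to have the rendering view extend the copied class instead of the framework one; only the class declaration (and the references to the nested Renderer/EGLContextFactory/EGLConfigChooser types) needs to change:

public class ViEAndroidGLES20 extends newGLSurfaceView
    implements newGLSurfaceView.Renderer {
    // Body unchanged from the original class shown in the question.
}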

Thanks,

Upvotes: 4
