freethemice

Reputation: 41

Java OpenGL (LWJGL): drawing 2D, then 3D, then 2D again clears the first 2D layer

I can draw 2D then 3D just fine, but when I draw 2D again, it clears the first 2D layer. Here is the 2D (stars) with 3D: http://postimg.org/image/8lbkbnkgv/

Here is what happens when I try to draw a 2D layer after the 3D: http://postimg.org/image/oiac8dcv3/

public void RenderLoop()
{
    glClear(GL_COLOR_BUFFER_BIT);
    glClear(GL_DEPTH_BUFFER_BIT);

    // Draw 2D (background layer)
    glLoadIdentity();
    this.init2D();
    this.Render2DBefore();

    // Draw 3D
    this.init3D();
    gluLookAt((float) this.myEyeMain.eye.posX, (float) this.myEyeMain.eye.posY, (float) this.myEyeMain.eye.posZ,
              (float) this.myEyeMain.look.posX, (float) this.myEyeMain.look.posY, (float) this.myEyeMain.look.posZ,
              0f, 1f, 0f);
    GL11.glGetFloat(GL11.GL_MODELVIEW_MATRIX, this.modelview);
    Render();

    // Draw 2D (foreground layer)
    glLoadIdentity();
    this.init2DB();
    this.Render2DAfter();

    Display.update();
}
public void init2D()
{
    glDisable(GL11.GL_DEPTH_TEST);

    GL11.glEnable(GL11.GL_TEXTURE_2D);
    GL11.glClearColor(0.0f, 0.0f, 0.0f, 0.0f);

    // enable alpha blending
    GL11.glEnable(GL11.GL_BLEND);
    GL11.glBlendFunc(GL11.GL_SRC_ALPHA, GL11.GL_ONE_MINUS_SRC_ALPHA);
    GL11.glViewport(0, 0, width, height);

    GL11.glMatrixMode(GL11.GL_MODELVIEW);
    GL11.glOrtho(0, width, height, 0, -1, 1);
    GL11.glMatrixMode(GL11.GL_PROJECTION);
    GL11.glLoadIdentity();
}
public void init2DB()
{
    glDisable(GL11.GL_DEPTH_TEST);

    GL11.glEnable(GL11.GL_TEXTURE_2D);
    GL11.glClearColor(0.0f, 0.0f, 0.0f, 0.0f);

    // enable alpha blending
    GL11.glEnable(GL11.GL_BLEND);
    GL11.glBlendFunc(GL11.GL_SRC_ALPHA, GL11.GL_ONE_MINUS_SRC_ALPHA);
    GL11.glViewport(0, 0, width, height);

    GL11.glMatrixMode(GL11.GL_MODELVIEW);
    GL11.glOrtho(0, width, height, 0, -1, 1);
    GL11.glMatrixMode(GL11.GL_PROJECTION);
    GL11.glLoadIdentity();
}
public void init3D()
{
    glDisable(GL11.GL_TEXTURE_2D);

    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LESS);
    glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);

    glMatrixMode(GL_PROJECTION);
    gluPerspective(45f, (float) width / (float) height, 1f, this.viewMax);
    GL11.glGetFloat(GL11.GL_PROJECTION_MATRIX, this.projection);
    GL11.glGetInteger(GL11.GL_VIEWPORT, this.viewport);

    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
}

Upvotes: 0

Views: 530

Answers (1)

Andon M. Coleman

Reputation: 43319

You are doing a couple of questionable things here:

  1. You are using glOrtho (...) to set up your ModelView matrix
  2. You have your Y-axis inverted in glOrtho (...)

While 1 is definitely an error that needs to be corrected, 2 is perfectly valid, but I want to mention a few things about inverting the Y-axis of your projection matrix.

When you set the point (0,0) to be the top-left corner of your window, any part of OpenGL that depends on which side (front / back) of a polygon is being drawn (e.g. face culling, vertex lighting, stenciling) will not work as expected. This projection matrix changes the handedness of your coordinate system, so you need to re-define the front face from its default (counter-clockwise) to clockwise. Often in 2D this does not matter, but if you enabled back-face culling in your 3D setup and forgot to turn it off, it could become a very serious issue.
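As a minimal sketch of that winding-order adjustment (assuming back-face culling is left enabled somewhere in your code), you could flip the front face around the 2D passes:

```java
// Sketch: with a Y-inverted ortho projection, geometry wound
// counter-clockwise in your data appears clockwise to OpenGL,
// so declare clockwise as the front face for the 2D layers.
GL11.glFrontFace(GL11.GL_CW);
// ... draw 2D geometry here ...
// Restore the default winding before the 3D pass.
GL11.glFrontFace(GL11.GL_CCW);
```

This only matters if face culling, lighting, or stencil operations are active while the 2D layers are drawn.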

To correct your init2D (...) and init2DB (...) functions, re-write the matrix setup code:

GL11.glMatrixMode   (GL11.GL_PROJECTION);
GL11.glLoadIdentity ();
GL11.glOrtho        (0, width, height, 0, -1, 1);

GL11.glMatrixMode   (GL11.GL_MODELVIEW);
GL11.glLoadIdentity ();
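Applied to your code, init2D () (and likewise init2DB ()) would then read roughly as follows, keeping the rest of your state setup unchanged:

```java
public void init2D()
{
    glDisable(GL11.GL_DEPTH_TEST);

    GL11.glEnable(GL11.GL_TEXTURE_2D);
    GL11.glClearColor(0.0f, 0.0f, 0.0f, 0.0f);

    // enable alpha blending
    GL11.glEnable(GL11.GL_BLEND);
    GL11.glBlendFunc(GL11.GL_SRC_ALPHA, GL11.GL_ONE_MINUS_SRC_ALPHA);
    GL11.glViewport(0, 0, width, height);

    // glOrtho belongs on the PROJECTION matrix; reset MODELVIEW separately.
    GL11.glMatrixMode(GL11.GL_PROJECTION);
    GL11.glLoadIdentity();
    GL11.glOrtho(0, width, height, 0, -1, 1);

    GL11.glMatrixMode(GL11.GL_MODELVIEW);
    GL11.glLoadIdentity();
}
```

Note the glLoadIdentity () calls: without them, each frame's glOrtho (...) and gluPerspective (...) calls would multiply onto whatever matrix was left over from the previous pass.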

Upvotes: 1
