foxneSs

Reputation: 2279

How to get rid of weird OpenTK render?

I am fully aware that I can load textures in OpenTK. But when I try to render a picture using only points, it shows really weird lines whenever ClientSize.Width is EXACTLY equal to the width of the picture being rendered, like this:

[Image: Weird]

And if I resize (enlarge) the window's ClientSize.Width, the lines become kind of normal and there is an actual reason for them to appear (they cover the areas that can't be rendered):

[Image: Not weird]

This seems to occur regardless of the picture I open. Can somebody explain why there are lines in the first picture?

using System;
using System.Drawing;
using OpenTK;
using OpenTK.Graphics;
using OpenTK.Graphics.OpenGL;

namespace Miracle
{
    class Program
    {
        public static void Main()
        {
            using (Window w = new Window())
                w.Run(30);
        }
    }

    class Window : GameWindow
    {
        private Color[,] pixels;
        private int width, height;
        private bool blink = true;
        private int blinkcounter;

        public Window() : base(1337, 666, GraphicsMode.Default, "Miracle", GameWindowFlags.Default) { }

        protected override void OnLoad(EventArgs e)
        {
            base.OnLoad(e);
            Bitmap pic = new Bitmap("sample.png");
            width = pic.Width;
            height = pic.Height;
            ClientSize = new Size(width, height);
            pixels = new Color[width, height];

            // Cache every pixel colour so OnRenderFrame can replay the image with points.
            for (int x = 0; x < width; x++)
                for (int y = 0; y < height; y++)
                    pixels[x, y] = pic.GetPixel(x, y);

            GL.ClearColor(Color.FromArgb(0, 255, 0));
            GL.Ortho(0, width, height, 0, -1, 1); // one projection unit per image pixel, y pointing down
        }

        protected override void OnResize(EventArgs e)
        {
            base.OnResize(e);
            Title = ClientSize.Width + "x" + ClientSize.Height + (ClientSize.Width == width && ClientSize.Height == height ? " (original)" : "");
            GL.Viewport(ClientSize);
        }

        protected override void OnUpdateFrame(FrameEventArgs e)
        {
            base.OnUpdateFrame(e);
            // Toggle the clear colour every few updates so uncovered areas blink.
            if (blinkcounter == 6)
            {
                GL.ClearColor(blink ? Color.FromArgb(255, 0, 0) : Color.FromArgb(0, 255, 0));
                blink = !blink;
                blinkcounter = 0;
            }
            blinkcounter++;
        }

        protected override void OnRenderFrame(FrameEventArgs e)
        {
            base.OnRenderFrame(e);
            GL.Clear(ClearBufferMask.ColorBufferBit);
            GL.MatrixMode(MatrixMode.Projection);
            GL.Begin(PrimitiveType.Points);

            // Emit one point per image pixel at integer coordinates.
            for (int x = 0; x < width; x++)
                for (int y = 0; y < height; y++)
                {
                    GL.Color4(pixels[x, y]);
                    GL.Vertex2(x, y);
                }

            GL.End();
            SwapBuffers();
        }
    }
}

Upvotes: 1

Views: 493

Answers (1)

Thomas

Reputation: 6196

I will describe a solution to your problem, and also why I think you are running into it in the first place.

First, why I think you're getting the problem: the rasterization of points is quirky and hardware dependent. This looks like a floating-point accuracy error in the point rasterization stage: the projected point lands right on the boundary between two pixels, and the rasterizer picks the wrong one because of floating-point limitations.
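To make "on the boundary" concrete: with your one-unit-per-pixel ortho projection, integer coordinates like GL.Vertex2(x, y) fall exactly on the shared edge between pixels, while pixel centres sit at half-integer positions. The following sketch (my illustration against the question's own loop, not the fix I recommend below) nudges each point to the pixel centre, away from the ambiguous edge:

// Sketch only: offset each point by half a pixel so it samples the pixel centre
// instead of the corner shared by neighbouring pixels.
for (int x = 0; x < width; x++)
    for (int y = 0; y < height; y++)
    {
        GL.Color4(pixels[x, y]);
        GL.Vertex2(x + 0.5f, y + 0.5f); // pixel centres are at half-integer coordinates
    }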

Solution: rasterizing an image using points is NOT recommended; that is not what point rasterization is for. Instead, rasterize a full-screen quad, give each vertex of the quad a texture coordinate, and in the fragment shader use the interpolated texture coordinate to fetch from the texture storing the image you want to render. This avoids the problem you were experiencing with GL_POINTS because it ensures every pixel is drawn to exactly once.
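Here is a minimal sketch of that idea, written against the fields already in your Window class (pic, width, height). It uses the fixed-function pipeline rather than an explicit fragment shader, since your code already uses immediate mode, and it assumes the Bitmap data is 32-bit BGRA; treat it as an outline, not a drop-in implementation:

// In OnLoad: upload the Bitmap once as a texture instead of caching Color values.
int tex = GL.GenTexture();
GL.BindTexture(TextureTarget.Texture2D, tex);
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMinFilter, (int)TextureMinFilter.Nearest);
GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMagFilter, (int)TextureMagFilter.Nearest);

var data = pic.LockBits(new Rectangle(0, 0, width, height),
                        System.Drawing.Imaging.ImageLockMode.ReadOnly,
                        System.Drawing.Imaging.PixelFormat.Format32bppArgb);
GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.Rgba, width, height, 0,
              OpenTK.Graphics.OpenGL.PixelFormat.Bgra, PixelType.UnsignedByte, data.Scan0);
pic.UnlockBits(data);

// In OnRenderFrame: draw one quad covering the whole image area, letting the GPU
// interpolate the texture coordinates and fetch each pixel from the texture.
GL.Enable(EnableCap.Texture2D);
GL.Color4(Color.White); // avoid tinting the texture
GL.Begin(PrimitiveType.Quads);
GL.TexCoord2(0, 0); GL.Vertex2(0, 0);
GL.TexCoord2(1, 0); GL.Vertex2(width, 0);
GL.TexCoord2(1, 1); GL.Vertex2(width, height);
GL.TexCoord2(0, 1); GL.Vertex2(0, height);
GL.End();

Besides getting rid of the lines, this also avoids re-reading every pixel each frame: uploading the image once and sampling it on the GPU is far cheaper than a GL.Vertex2 call per pixel.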

Upvotes: 2
