rstojano

Reputation: 159

Lower WebGL Render Resolution, and Increase FPS

I am having a problem rendering 2D sprites to a WebGL canvas. When my browser is on a lower-resolution monitor, it renders at a higher FPS than when it is on a higher-resolution monitor.

Mac Retina Display: 2880x1800 = ~20 FPS

External Monitor: 1920x1080 = ~60 FPS

WebGL renders at a higher FPS on my 1080p monitor than it does on the Mac Retina display.

What is the best practice to force WebGL to render at a lower resolution? I have tried looking for sources to help answer this but can't find anything online.

I've tried lowering the resolution as follows:

// Scale down if the physical screen height exceeds 1080 pixels
var ratio = 1;
var width = $(window).width();
var height = $(window).height();
if (window.screen.height * window.devicePixelRatio > 1080) {
   ratio = 1080 / (window.screen.height * window.devicePixelRatio);
}
width = width * ratio;
height = height * ratio;
// Resize the canvas drawing buffer (its width/height attributes)
gl.canvas.width = width;
gl.canvas.height = height;

And then setting my viewport like this:

gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight);

Update to code

var width = $(window).width();
var height = $(window).height();
// Cap the render resolution at 1080 physical pixels of height
if (window.screen.height * window.devicePixelRatio > 1080) {
   glTools.ratio = 1080 / (window.screen.height * window.devicePixelRatio);
} else {
   glTools.ratio = 1;
}
width = Math.round(width * glTools.ratio);
height = Math.round(height * glTools.ratio);
glTools.gl.canvas.width = width;
glTools.gl.canvas.height = height;
// Match the viewport to the reduced drawing-buffer size
glTools.gl.viewport(0, 0, width, height);
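
For completeness, here is a rough sketch of the same resize with the drawing-buffer size decoupled from the on-screen size, which is what the answer below recommends; the canvas.style assignments are only illustrative and were not part of my original code:

var displayWidth = $(window).width();
var displayHeight = $(window).height();
var ratio = 1;
// Cap the internal render resolution at 1080 physical pixels of height
if (window.screen.height * window.devicePixelRatio > 1080) {
   ratio = 1080 / (window.screen.height * window.devicePixelRatio);
}
var canvas = glTools.gl.canvas;
canvas.width = Math.round(displayWidth * ratio);   // drawing-buffer (render) size
canvas.height = Math.round(displayHeight * ratio);
canvas.style.width = displayWidth + "px";          // displayed (CSS) size stays full
canvas.style.height = displayHeight + "px";
glTools.gl.viewport(0, 0, canvas.width, canvas.height);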

Performance Update: Although the answer to this question is correct, I want to point out another culprit behind my low FPS in this video game. Timers in my code and non-hardware-accelerated CSS animations were blocking the main thread, which caused renders to be scheduled later. You can see that the game renders at 60 FPS at Obsidio.
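
As a rough illustration of the scheduling side, a single requestAnimationFrame loop replaces the separate timers, so update and render happen once per displayed frame instead of competing for the thread; updateGame and renderGame below are hypothetical stand-ins for the game's own functions:

var lastTime = 0;

function frame(now) {
  var dt = (now - lastTime) / 1000; // seconds since the previous frame
  lastTime = now;
  updateGame(dt);  // game logic that previously ran on its own setInterval timer
  renderGame();    // the WebGL draw calls
  requestAnimationFrame(frame);
}
requestAnimationFrame(frame);

For the CSS side, animating transform and opacity (rather than layout properties such as top or left) generally lets the browser run those animations on the compositor instead of the main thread.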

Upvotes: 1

Views: 3907

Answers (1)

zero298

Reputation: 26877

Per MDN's WebGL best practices:

Rendering to a canvas can be done at a different resolution than the style sheet will eventually force the canvas to appear at. If struggling with performance you should consider rendering to a low resolution WebGL context and using CSS to upscale its canvas to the size you intend.


WebGL will render based on your canvas' width and height attributes. A canvas that looks like this: <canvas width="256" height="256"> will render at 256x256. However, you can then scale up your drawing by styling your canvas with CSS: canvas {width: 512px;height: 512px;} will display your rendered image at an upscaled 512x512.

Run the snippet below and you'll see the output:

[Canvas internal rendering] W: 256 | H: 256
[Actual canvas size] W: 512 | H: 512

const canvas = document.querySelector("canvas"),
  ctx = canvas.getContext("webgl"),
  {
    width,
    height
  } = canvas.getBoundingClientRect(); // CSS (displayed) size

// drawingBufferWidth/Height reflect the width/height attributes, i.e. the render resolution
console.log(`[Canvas internal rendering] W: ${ctx.drawingBufferWidth} | H: ${ctx.drawingBufferHeight}`);
console.log(`[Actual canvas size] W: ${width} | H: ${height}`);
canvas {
  width: 512px;
  height: 512px;
}
<canvas width="256" height="256"></canvas>


After looking at your code, the reason your game seems to "zoom in" is that you are tying your viewport to your rendering size. Don't couple these. Your viewport settings should definitely mirror your aspect ratio, but they should be independent of your drawing size. WebGL will default the viewport to your canvas size, but since you want to upscale, try setting it to the upscaled canvas size and see if you still get a zoom. From my example above, the code would look like:

ctx.viewport(0, 0, width, height);

Where width and height come from the element's computed size.


Upscaling Example:

Here is a demo showing the differing rendering resolutions of the same shape. You will see aliasing in the first image, but the second image should be clear.

const vertSource = `
attribute vec3 a_position;
void main(void) {
    gl_Position = vec4((a_position * 2.0) - 1.0, 1.0);
}`;

const fragSource = `
void main(void) {
    gl_FragColor = vec4(1.0);
}`;

const verts = [
  0.1, 0.1, 0,
  0.9, 0.1, 0,
  0.5, 0.9, 0
];

function setupCanvas(canvas) {
  const gl = canvas.getContext("webgl");

  // The CSS (displayed) size, as opposed to the drawing-buffer size
  const {
    width,
    height
  } = canvas.getBoundingClientRect();

  gl.clearColor(0.0, 0.0, 0.0, 1.0);
  gl.enable(gl.DEPTH_TEST);
  // Render into the full drawing buffer, whatever its resolution
  gl.viewport(0.0, 0.0, gl.drawingBufferWidth, gl.drawingBufferHeight);

  console.log(`[Canvas internal rendering] W: ${gl.drawingBufferWidth} | H: ${gl.drawingBufferHeight}`);
  console.log(`[Actual canvas size] W: ${width} | H: ${height}`);

  // Minimal pipeline: one buffer, one program, one triangle
  const b = gl.createBuffer(),
    p = gl.createProgram(),
    v = gl.createShader(gl.VERTEX_SHADER),
    f = gl.createShader(gl.FRAGMENT_SHADER);

  gl.bindBuffer(gl.ARRAY_BUFFER, b);
  gl.shaderSource(v, vertSource);
  gl.compileShader(v);
  gl.shaderSource(f, fragSource);
  gl.compileShader(f);
  gl.attachShader(p, v);
  gl.attachShader(p, f);
  gl.linkProgram(p);
  gl.useProgram(p);

  const a = gl.getAttribLocation(p, "a_position");
  gl.vertexAttribPointer(a, 3, gl.FLOAT, false, 0, 0);
  gl.enableVertexAttribArray(a);

  function draw() {
    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(verts), gl.DYNAMIC_DRAW);
    gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
    gl.drawArrays(gl.TRIANGLES, 0, verts.length / 3);
  }
  draw();
}

Array.from(document.querySelectorAll("canvas"))
  .forEach(setupCanvas);
canvas {
  width: 512px;
  height: 512px;
}
<canvas width="256" height="256"></canvas>
<canvas width="512" height="512"></canvas>

Upvotes: 4
