jacknkandy

Reputation: 21

BufferGeometry indices issue

I'm trying to port some code that I wrote in openFrameworks to THREE.js. The code generates a landscape using Perlin noise. First a static index array is created, and then the vertices are laid out in a square grid, each separated by a specified distance. That way the vertex positions within the array can be shifted (up, down, left or right) as the camera moves, so the landscape can be updated and a new strip of terrain generated based on the direction of the camera movement.
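To illustrate the idea (a minimal sketch, not my actual code — generateRow is a hypothetical helper), the index array stays fixed while the position data shifts underneath it:

// Illustrative sketch: shift a size*size grid of positions one row
// "forward" and regenerate only the strip that scrolled into view.
// positions is the Float32Array backing the position attribute.
function scrollForward( positions, size, generateRow )
{
    // Move rows 1..size-1 into slots 0..size-2 (3 floats per vertex).
    positions.copyWithin( 0, size * 3 );

    // Refill the last row with newly generated terrain.
    generateRow( positions, size - 1 );
}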

For each vertex, six indices referring to the two adjacent triangles are added to the index array, since I didn't want to waste memory storing a duplicate of each vertex for every triangle.

For example:

v12  v11  v10
*----*----*
|\   |\   |
| \  | \  |
|  \ |  \ |
|   \|   \|
*----*----*
v02  v01  v00

So, for example, at vertex v00 the triangles {v00, v10, v11} and {v00, v11, v01} are added, and so on.
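To make the arithmetic concrete, here is a minimal sketch for the 3-wide grid pictured above, assuming row-major indexing (index = r * size + c) as in my code below:

var size = 3;
var i0 = 0 * size + 0;    // v00 -> 0
var i1 = 1 * size + 0;    // v10 -> 3
var i2 = 1 * size + 1;    // v11 -> 4
var i3 = 0 * size + 1;    // v01 -> 1

console.log( [ i0, i1, i2 ] );    // first triangle  {v00, v10, v11} -> [ 0, 3, 4 ]
console.log( [ i0, i2, i3 ] );    // second triangle {v00, v11, v01} -> [ 0, 4, 1 ]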

In my openFrameworks code this all works perfectly! After a lot of trouble I finally got things working in THREE.js, but I've noticed that as soon as I increase the number of vertices everything starts getting weird: the triangles seem to connect all over the place, and a large chunk of vertices gets skipped. Anything up to and including a grid size of 256*256 works fine, but as soon as I go any higher I start to see these artefacts.

I'm thinking this is probably an issue with offsetting, but I don't really understand what that means or how to implement it in my code. I've seen other people use it successfully when they define their indices in order (0, 1, 2, 3, ...) and store three individual vertices for each triangle in sequence instead of sharing them. I can't seem to get the same kind of thing to work.

Any ideas? I've got my code below just in case it helps. You can see the parts where I commented out the offsetting.

var landscape = { size: 0, chunkSize: 21845, distance: 0, geometry: null, mesh: null, vertices: null, normals: null, colors: null, indices: null,

generateVertex: function( r, c )
{
    var pos, color;

    // Set position
    pos = new THREE.Vector3();
    pos.x = this.distance * c;
    pos.z = this.distance * r;
    pos.y = -2 + 5*simplex.noise2D( 0.1*pos.x, 0.1*pos.z );

    // Set color
    color = new THREE.Color();
    color.setRGB( Math.random(), 0, 0 );

    this.vertices.setXYZ( r * this.size + c, pos.x, pos.y, pos.z );
    this.colors.setXYZ( r * this.size + c, color.r, color.g, color.b );
},

generateIndices: function( i, r, c )
{
    this.indices[ i ] = ( r * this.size ) + c;
    this.indices[ i + 1 ] = ( ( r + 1 ) * this.size ) + c;
    this.indices[ i + 2 ] = ( ( r + 1 ) * this.size ) + ( c + 1 );

    this.indices[ i + 3 ] = ( r * this.size ) + c;
    this.indices[ i + 4 ] = ( ( r + 1 ) * this.size ) + ( c + 1 );
    this.indices[ i + 5 ] = ( r * this.size ) + ( c + 1 );

    /*this.indices[ i ] =  ( ( r * this.size ) + c ) % ( 3 * this.chunkSize );
    this.indices[ i + 1 ] = ( ( ( r + 1 ) * this.size ) + c ) % ( 3 * this.chunkSize );
    this.indices[ i + 2 ] = ( ( ( r + 1 ) * this.size ) + ( c + 1 ) ) % ( 3 * this.chunkSize );

    this.indices[ i + 3 ] = ( ( r * this.size ) + c ) % ( 3 * this.chunkSize );
    this.indices[ i + 4 ] = ( ( ( r + 1 ) * this.size ) + ( c + 1 ) ) % ( 3 * this.chunkSize );
    this.indices[ i + 5 ] = ( ( r * this.size ) + ( c + 1 ) ) % ( 3 * this.chunkSize ); */   
},

generatePoint: function( x, z )
{

},

generate: function( size, distance )
{        
    var sizeSquared, i;
    sizeSquared = size * size;
    i = 0;
    this.size = size;
    this.distance = distance;

    // Create buffer geometry
    this.geometry = new THREE.BufferGeometry();

    this.indices = new Uint16Array( 6*(size-1)*(size-1) );

    this.vertices = new THREE.BufferAttribute( new Float32Array( sizeSquared * 3 ), 3 );
    this.colors = new THREE.BufferAttribute( new Float32Array( sizeSquared * 3 ), 3 );

    // Generate points
    for( var r = 0; r < size; r = r + 1 )
    {
        for( var c = 0; c < size; c = c + 1 )
        {
            this.generateVertex( r, c );

            if( (r < size - 1) && (c < size - 1) )
            {
                this.generateIndices( i, r, c );
                i = i + 6;
            }
        }
    }

    // Set geometry
    this.geometry.addAttribute( 'index', new THREE.BufferAttribute( this.indices, 1 ) );
    this.geometry.addAttribute( 'position', this.vertices );
    this.geometry.addAttribute( 'color', this.colors );        

    //
    /*this.geometry.offsets = [];

    var triangles = 2 * ( size - 1 ) * ( size - 1 );
    var offsets = triangles / this.chunkSize;

    for( var j = 0; j < offsets; j = j + 1 )
    {
        var offset =
        {
            start: j * this.chunkSize * 3,
            index: j * this.chunkSize * 3,
            count: Math.min( triangles - ( j * this.chunkSize ), this.chunkSize ) * 3
        };

        this.geometry.offsets.push( offset );
    }*/

    var material = new THREE.MeshBasicMaterial( {vertexColors: THREE.VertexColors} );
    //var material = new THREE.LineBasicMaterial({ vertexColors: THREE.VertexColors });

    this.geometry.computeBoundingSphere();

    this.mesh = new THREE.Mesh( this.geometry, material );
    scene.add( this.mesh );

}

Upvotes: 2

Views: 3416

Answers (1)

GuyRT

Reputation: 2917

WebGL is based on OpenGL ES 2.0, which does not require support for 32-bit index buffers. A 16-bit index can address at most 2^16 = 65,536 vertices, so as soon as you have more than 256 * 256 of them the index buffer can no longer address them all.

From the OpenGL ES 2.0 Standard (section 2.8 Vertex Arrays):

Indexing support with ubyte and ushort indices is supported. Support for uint indices is not required by OpenGL ES 2.0. If an implementation supports uint indices, it will export the OES_element_index_uint extension.

Assuming that's the issue, you can enable 32-bit index buffers by requesting and checking for the OES_element_index_uint extension:

var uintExt = gl.getExtension("OES_element_index_uint");
if (!uintExt) {
    alert("Sorry, this app needs 32bit indices and your device or browser doesn't appear to support them");
    return;
}

According to webglstats.com, 93.5% of machines support the extension.


You will need to change your generate function to create a 32-bit array:

this.indices = new Uint16Array( 6*(size-1)*(size-1) );

should be:

this.indices = new Uint32Array( 6*(size-1)*(size-1) );
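If you want to keep a fallback for hardware without the extension, one possible sketch (variable names are illustrative) is to pick the array type at creation time:

// Use 32-bit indices where the extension is available; otherwise fall
// back to 16-bit indices, which limits the grid to 256 * 256 vertices.
var uintExt = gl.getExtension( "OES_element_index_uint" );
var indexCount = 6 * ( size - 1 ) * ( size - 1 );

this.indices = uintExt ? new Uint32Array( indexCount )
                       : new Uint16Array( indexCount );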

I had a quick delve inside the source of three.js's renderer, and it looks like it checks the type of the index array and will pass gl.UNSIGNED_INT to gl.drawElements if you use a Uint32Array.
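In other words, the dispatch boils down to something like this (a paraphrase of the renderer logic, not the exact three.js source; indexArray stands for your index attribute's array):

// The element type passed to drawElements follows the typed-array
// class of the index buffer.
var type = ( indexArray instanceof Uint32Array ) ? gl.UNSIGNED_INT
                                                 : gl.UNSIGNED_SHORT;

gl.drawElements( gl.TRIANGLES, indexArray.length, type, 0 );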

Upvotes: 3
