Jeffers

Reputation: 183

Troubles with Marching Cubes and Texture coordinates

I'm implementing MC algorithm in OpenGL.

Everything went fine, until I reached the point with texture coordinates.

I can't figure out how to implement them!

My progress:

[image: my generated marching cubes mesh so far]

Edit: What I want to achieve is to put some textures on my generated MC triangles. As far as I understand, I need to tell OpenGL UV coordinates, but I have no idea how to calculate them.

Upvotes: 2

Views: 4310

Answers (3)

Tristan367

Reputation: 35

The first answer given is partly correct, except that you also need to check which plane is best to project from, instead of always projecting from the z plane, as in this C# Unity example:

Vector2[] getUVs(Vector3 a, Vector3 b, Vector3 c)
{
    Vector3 s1 = b - a;
    Vector3 s2 = c - a;
    Vector3 norm = Vector3.Cross(s1, s2).normalized; // the face normal (Normalize() is void in Unity, so use .normalized)

    norm.x = Mathf.Abs(norm.x);
    norm.y = Mathf.Abs(norm.y);
    norm.z = Mathf.Abs(norm.z);

    Vector2[] uvs = new Vector2[3];
    if (norm.x >= norm.z && norm.x >= norm.y) // x plane
    {
        uvs[0] = new Vector2(a.z, a.y);
        uvs[1] = new Vector2(b.z, b.y);
        uvs[2] = new Vector2(c.z, c.y);
    }
    else if (norm.z >= norm.x && norm.z >= norm.y) // z plane
    {
        uvs[0] = new Vector2(a.x, a.y);
        uvs[1] = new Vector2(b.x, b.y);
        uvs[2] = new Vector2(c.x, c.y);
    }
    else if (norm.y >= norm.x && norm.y >= norm.z) // y plane
    {
        uvs[0] = new Vector2(a.x, a.z);
        uvs[1] = new Vector2(b.x, b.z);
        uvs[2] = new Vector2(c.x, c.z);
    }

    return uvs;
}
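
A rough usage sketch, in case it helps: this assumes the marching cubes output is stored as flat vertex and index arrays where triangles do not share vertices (neighbouring triangles can pick different projection planes), and the mesh, vertices and triangles names are placeholders of mine, not from the question:

// Build one UV per (unshared) vertex by projecting each triangle separately.
Vector2[] uv = new Vector2[vertices.Length];
for (int t = 0; t < triangles.Length; t += 3)
{
    Vector2[] triUVs = getUVs(vertices[triangles[t]],
                              vertices[triangles[t + 1]],
                              vertices[triangles[t + 2]]);
    uv[triangles[t]]     = triUVs[0];
    uv[triangles[t + 1]] = triUVs[1];
    uv[triangles[t + 2]] = triUVs[2];
}
mesh.uv = uv;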

[image: marching cubes texture example]

Though it is better to do this on the GPU in a shader, especially if you are planning on having very dynamic voxels, such as an infinitely generated world that is constantly generating around the player, or a game with lots of digging and building. That way you don't have to calculate the UVs each time, and it is also less data to send to the GPU, so it is definitely faster than the CPU approach above. I modified a basic triplanar shader I found on the internet a while ago (unfortunately I wasn't able to find it again). My modified version is basically a triplanar mapping shader with no blending that samples only once per pass, so it should be pretty much as fast as a basic unlit shader, and it looks exactly the same as the image above. I did this because the normal triplanar blending doesn't look good with textures like brick walls at 45-degree angles.

Shader "Triplanar (no blending)"
{
    Properties
    {
        _DiffuseMap("Diffuse Map ", 2D) = "white" {}
        _TextureScale("Texture Scale",float) = 1
    }
    SubShader
    {
        Tags { "RenderType" = "Opaque" }
        LOD 200

        CGPROGRAM
        #pragma target 3.0
        #pragma surface surf Lambert

        sampler2D _DiffuseMap;
        float _TextureScale;

        struct Input
        {
            float3 worldPos;
            float3 worldNormal;
        };

        void surf(Input IN, inout SurfaceOutput o)
        {
            IN.worldNormal.x = abs(IN.worldNormal.x);
            IN.worldNormal.y = abs(IN.worldNormal.y);
            IN.worldNormal.z = abs(IN.worldNormal.z);

            if (IN.worldNormal.x >= IN.worldNormal.z && IN.worldNormal.x >= IN.worldNormal.y) // x plane
            {
                o.Albedo = tex2D(_DiffuseMap, IN.worldPos.zy / _TextureScale).rgb;
            }
            else if (IN.worldNormal.y >= IN.worldNormal.x && IN.worldNormal.y >= IN.worldNormal.z) // y plane
            {
                o.Albedo = tex2D(_DiffuseMap, IN.worldPos.xz / _TextureScale).rgb;
            }
            else if (IN.worldNormal.z >= IN.worldNormal.x && IN.worldNormal.z >= IN.worldNormal.y) // z plane
            {
                o.Albedo = tex2D(_DiffuseMap, IN.worldPos.xy / _TextureScale).rgb;
            }
        }
        ENDCG
    }
}

[image: single-pass triplanar shader result]

It ends up looking a lot like a cubemap, though I don't think this is technically a cubemap as we only use three faces, not six.

EDIT: I later realized that you may want the plane selection in the fragment shader like above, but for my purposes it works exactly the same and should theoretically be faster in the vertex shader:

Shader "NewUnlitShader"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        LOD 100

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            // make fog work
            #pragma multi_compile_fog


            #include "UnityCG.cginc"

            struct appdata
            {
                float4 vertex : POSITION;
                float3 normal : NORMAL;
            };

            struct v2f
            {
                float2 uv : TEXCOORD0;
                UNITY_FOG_COORDS(1)
                float4 vertex : SV_POSITION;
            };

            sampler2D _MainTex;
            float4 _MainTex_ST;

            v2f vert (appdata v)
            {
                v2f o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                
                // Note: this version projects from the object-space position and normal,
                // so the texture sticks to the object rather than to world space.
                v.normal.x = abs(v.normal.x);
                v.normal.y = abs(v.normal.y);
                v.normal.z = abs(v.normal.z);

                if (v.normal.x >= v.normal.z && v.normal.x >= v.normal.y) // x plane
                {
                    o.uv = v.vertex.zy;
                }
                else if (v.normal.y >= v.normal.x && v.normal.y >= v.normal.z) // y plane
                {
                    o.uv = v.vertex.xz;
                }
                else if (v.normal.z >= v.normal.x && v.normal.z >= v.normal.y) // z plane
                {
                    o.uv = v.vertex.xy;
                }

                UNITY_TRANSFER_FOG(o, o.vertex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                // sample the texture
                fixed4 col = tex2D(_MainTex, i.uv);
                // apply fog
                UNITY_APPLY_FOG(i.fogCoord, col);
                return col;
            }
            ENDCG
        }
    }
}

Upvotes: 1

datenwolf

Reputation: 162164

I need to tell OpenGL UV coordinates, but I have no idea how to calculate them.

You're facing a big problem there: the topology of what comes out of MC can be anything. The topology of a texture in OpenGL is either a (hyper)torus (GL_TEXTURE_1D, GL_TEXTURE_2D, GL_TEXTURE_3D) or a sphere (GL_TEXTURE_CUBE_MAP).

So inevitably you have to cut your surface into so-called maps. This is a nontrivial task, but a good strategy is to cut along regions of high curvature. See the paper

“Least Squares Conformal Maps for Automatic Texture Atlas Generation”

Bruno Lévy, Sylvain Petitjean, Nicolas Ray and Jérôme Maillot

http://alice.loria.fr/index.php/publications.html?Paper=lscm@2002

for the dirty details.
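
To give a rough idea of what "cutting the surface into maps" means in code, here is a deliberately naive C# sketch (Unity-style types to match the other answers; the NaiveAtlas/ChartByNormal names and the flat vertex/index mesh layout are my own, and this is not the LSCM method from the paper): it bins every triangle into one of six charts by the dominant axis and sign of its face normal and planar-projects it there. A proper atlas generator such as LSCM produces far fewer seams and much less distortion, but the structure of the problem is the same.

using UnityEngine;

static class NaiveAtlas
{
    // For each triangle: choose a chart (0..5) from the dominant axis and sign of
    // its face normal, then planar-project its three corners into that chart.
    // Vertices shared across chart borders would have to be split in a real mesh.
    public static void ChartByNormal(Vector3[] verts, int[] tris,
                                     out int[] chartOfTri, out Vector2[] cornerUVs)
    {
        int triCount = tris.Length / 3;
        chartOfTri = new int[triCount];
        cornerUVs = new Vector2[tris.Length];

        for (int t = 0; t < triCount; t++)
        {
            Vector3 a = verts[tris[3 * t]];
            Vector3 b = verts[tris[3 * t + 1]];
            Vector3 c = verts[tris[3 * t + 2]];
            Vector3 n = Vector3.Cross(b - a, c - a);

            // Dominant axis of the face normal decides the projection plane.
            int axis = 0;
            if (Mathf.Abs(n.y) >= Mathf.Abs(n.x) && Mathf.Abs(n.y) >= Mathf.Abs(n.z)) axis = 1;
            else if (Mathf.Abs(n.z) >= Mathf.Abs(n.x) && Mathf.Abs(n.z) >= Mathf.Abs(n.y)) axis = 2;
            chartOfTri[t] = axis * 2 + (n[axis] < 0 ? 1 : 0);

            for (int k = 0; k < 3; k++)
            {
                Vector3 p = verts[tris[3 * t + k]];
                cornerUVs[3 * t + k] = axis == 0 ? new Vector2(p.z, p.y)
                                     : axis == 1 ? new Vector2(p.x, p.z)
                                                 : new Vector2(p.x, p.y);
            }
        }
    }
}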

Upvotes: 3

Nils Pipenbrinck

Reputation: 86353

A typical texture coordinate generation algorithm for marching cubes meshes is to use environment mapping.

In short, you calculate the vertex normal at each vertex by averaging the face normals of all adjacent faces, then discard the z-coordinate of the normal and use (x/2 + 0.5, y/2 + 0.5) as the (u, v) texture coordinates.

Set up a texture with a nice white spot in the middle and some structure filling the rest of the texture, and you get the Terminator 2 silver-robot kind of look.
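
A minimal C# sketch of that idea (Unity-style types for consistency with the other answers; the flat vertex/index array layout and the SphereMapUVs name are my own assumptions):

using UnityEngine;

static class SphereMapUVs
{
    public static Vector2[] Generate(Vector3[] verts, int[] tris)
    {
        // Accumulate each face normal onto the vertices it touches, then normalize:
        // this is the "average the face normals of all adjacent faces" step.
        Vector3[] normals = new Vector3[verts.Length];
        for (int t = 0; t < tris.Length; t += 3)
        {
            Vector3 a = verts[tris[t]], b = verts[tris[t + 1]], c = verts[tris[t + 2]];
            Vector3 faceNormal = Vector3.Cross(b - a, c - a); // length weights by area
            normals[tris[t]]     += faceNormal;
            normals[tris[t + 1]] += faceNormal;
            normals[tris[t + 2]] += faceNormal;
        }

        // Discard z and remap the normal's x/y from [-1, 1] into [0, 1].
        Vector2[] uvs = new Vector2[verts.Length];
        for (int i = 0; i < verts.Length; i++)
        {
            Vector3 n = normals[i].normalized;
            uvs[i] = new Vector2(n.x * 0.5f + 0.5f, n.y * 0.5f + 0.5f);
        }
        return uvs;
    }
}

For the classic chrome look the normal is usually taken in view space so the highlight follows the camera, but the mapping itself is the same.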

Upvotes: 4
