I implemented a compute shader that fills an RWTexture3D with density values based on the y coordinate of each voxel; basically, density[id] = id.y * 0.01 - 0.5. So my densities are in the range from -0.5 to 0.5, and 0 is the isovalue for the marching cubes algorithm. This gives me a flat plane exactly in the middle of the density field.
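The fill pass is just one line per voxel; a minimal sketch of the kernel (the thread-group size and kernel name here are simplified assumptions):

RWTexture3D<float> densityTexture;      // assumed binding

[numthreads(8, 8, 8)]                   // assumed group size
void FillDensity(uint3 id : SV_DispatchThreadID)
{
    // -0.5 at y = 0, crossing the isovalue 0 at y = 50,
    // so marching cubes extracts a flat plane in the middle of the field.
    densityTexture[id] = id.y * 0.01 - 0.5;
}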
Now I'm trying to implement a terraforming algorithm. I tried the one that Sebastian Lague used in this video: https://www.youtube.com/watch?v=vTMEdHcKgM4&t=691s. The main idea of this algorithm is to add or subtract a value to the density according to the weight of the brush and the distance between the point being changed and the center of the brush.

This algorithm works, but it has one major problem. In a freshly generated world it terraforms smoothly, in a round shape. But after adding and removing terrain a few times, the density texture becomes messy. What I mean is that you might see no terrain somewhere near the top of the world because, after being changed, the values up there ended up above the isovalue for marching cubes, yet they are still much lower than the values around them. When you terraform that part of the world again, those values act as "short paths": they drop below the isovalue much faster than the surrounding values, and terraforming no longer feels smooth.

Examples are in the screenshots. First I create some structure (terrain and a slice of the density field after adding terrain the first time). Then I remove some parts of it, in the middle and then at the top; although there is no geometry there anymore, the density field is messed up (terrain after removing some parts and the messed-up density field). Then I extrude the terrain upwards one more time, and now it has curves, although I extruded it the same way as the first time (terrain added the second time). And the more messed up the density field is, the stranger the results you can get.

Here is the code for my brush:
// Don't touch voxels on the border of the texture.
if (id.x >= textureSizeX - 1 || id.y >= textureSizeY - 1 || id.z >= textureSizeZ - 1 || id.x < 1 || id.y < 1 || id.z < 1)
    return;

// Only voxels inside the spherical brush are affected.
float dstSqrdFromPoint = dot(terraformPoint - id, terraformPoint - id);
float dstFromPoint = sqrt(dstSqrdFromPoint);
if (dstSqrdFromPoint > brushSize * brushSize)
    return;

// Falloff in [0, 1] based on the distance from the brush center.
float intensity = computeIntensity(dstFromPoint);
densityTexture[id] += intensity * weight * deltaTime;
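For context, this snippet is the body of a compute kernel; the surrounding declarations look roughly like this (group size and constant-buffer layout are simplified, not the exact code):

RWTexture3D<float> densityTexture;      // read-write density field

cbuffer BrushParams                     // names match the snippet above; layout is assumed
{
    float3 terraformPoint;              // brush center in voxel coordinates
    float brushSize;                    // brush radius in voxels
    float weight;                       // brush strength
    float deltaTime;
    uint textureSizeX;
    uint textureSizeY;
    uint textureSizeZ;
};

[numthreads(8, 8, 8)]
void TerraformBrush(uint3 id : SV_DispatchThreadID)
{
    // ... brush body from the snippet above ...
}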
I have also tried another way of terraforming. I found the Cubiquity engine (https://github.com/DavidWilliams81/cubiquity) and tried to recreate its approach: compute the normal vector at a point inside the brush, then copy the density value from the point offset along that normal. It handled a messed-up density field better, but it also had some troubles.

The code of the brush is below. computeIntensity just returns a value between 0 and 1, according to the distance from the brush center; getInterpolatedValue returns an interpolated density value in case samplePoint does not fall exactly on a voxel corner (rough sketches of both are below the brush code).
// Don't touch voxels on the border of the texture (the neighbor reads below need a 1-voxel margin).
if (id.x >= textureSizeX - 1 || id.y >= textureSizeY - 1 || id.z >= textureSizeZ - 1 || id.x < 1 || id.y < 1 || id.z < 1)
    return;

float dstSqrdFromPoint = dot(terraformPoint - id, terraformPoint - id);
float dstFromPoint = sqrt(dstSqrdFromPoint);
if (dstSqrdFromPoint > brushSize * brushSize)
    return;

float intensity = computeIntensity(dstFromPoint);

// The differences between neighboring densities give the direction of the surface normal.
float vox1nx = densityTexture[int3(id.x - 1, id.y, id.z)];
float vox1px = densityTexture[int3(id.x + 1, id.y, id.z)];
float vox1ny = densityTexture[int3(id.x, id.y - 1, id.z)];
float vox1py = densityTexture[int3(id.x, id.y + 1, id.z)];
float vox1nz = densityTexture[int3(id.x, id.y, id.z - 1)];
float vox1pz = densityTexture[int3(id.x, id.y, id.z + 1)];
float3 normal = float3(vox1nx - vox1px, vox1ny - vox1py, vox1nz - vox1pz);
if (sqrt(dot(normal, normal)) > 0.001)
{
    normal = normalize(normal);
}

// Step along the normal, scaled by the brush intensity, and copy the density found there.
normal *= intensity * weight;
float3 samplePoint = id - normal;
float sampledValue = getInterpolatedValue(samplePoint);
densityTexture[id] = sampledValue;
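For reference, the two helpers look roughly like this (simplified sketches, not the exact implementations; the falloff shape in computeIntensity is just an example):

float computeIntensity(float dstFromPoint)
{
    // 1 at the brush center, falling off to 0 at the brush radius.
    return 1.0 - smoothstep(0.0, brushSize, dstFromPoint);
}

float getInterpolatedValue(float3 samplePoint)
{
    // Trilinear interpolation between the eight voxels surrounding samplePoint.
    int3 p0 = int3(floor(samplePoint));
    float3 t = samplePoint - p0;

    float c000 = densityTexture[p0 + int3(0, 0, 0)];
    float c100 = densityTexture[p0 + int3(1, 0, 0)];
    float c010 = densityTexture[p0 + int3(0, 1, 0)];
    float c110 = densityTexture[p0 + int3(1, 1, 0)];
    float c001 = densityTexture[p0 + int3(0, 0, 1)];
    float c101 = densityTexture[p0 + int3(1, 0, 1)];
    float c011 = densityTexture[p0 + int3(0, 1, 1)];
    float c111 = densityTexture[p0 + int3(1, 1, 1)];

    float c00 = lerp(c000, c100, t.x);
    float c10 = lerp(c010, c110, t.x);
    float c01 = lerp(c001, c101, t.x);
    float c11 = lerp(c011, c111, t.x);
    return lerp(lerp(c00, c10, t.y), lerp(c01, c11, t.y), t.z);
}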