Yann

Reputation: 988

Rendering a world-space line in DirectX 11

I am rendering a spline in DirectX 11, but I'm having an issue where it appears to be stuck in screen space, and I can't convince it to be in world space.

Stubborn spline

The spline is initially defined as a std::vector&lt;DirectX::XMFLOAT3&gt; of control points, which is then expanded into a vector of the same type holding the actual points along the spline, called linePoints. The vertex, index, and constant buffers are created in this function:

void Spline::createBuffers(ID3D11Device* device)
{
    Vertex vertices[100];
    for (int i = 0; i < 100; i++)
    {
        Vertex v;
        v.position = linePoints.at(i);
        v.colour = XMFLOAT4(0, 0, 0, 1.0);
        vertices[i] = v;
    }

    D3D11_BUFFER_DESC bd;
    ZeroMemory(&bd, sizeof(D3D11_BUFFER_DESC));
    bd.Usage = D3D11_USAGE_DEFAULT;
    bd.ByteWidth = sizeof(Vertex) * linePoints.size();
    bd.BindFlags = D3D11_BIND_VERTEX_BUFFER;
    bd.CPUAccessFlags = 0;

    D3D11_SUBRESOURCE_DATA InitData;
    ZeroMemory(&InitData, sizeof(InitData));
    InitData.pSysMem = vertices;

    device->CreateBuffer(&bd, &InitData, &vertexBuffer);

    ZeroMemory(&bd, sizeof(D3D11_BUFFER_DESC));
    bd.Usage = D3D11_USAGE_DEFAULT;
    bd.CPUAccessFlags = 0;
    bd.ByteWidth = sizeof(WORD) * linePoints.size();
    bd.BindFlags = D3D11_BIND_INDEX_BUFFER;

    WORD indices[200];
    int count = 0;
    for (WORD i = 0; i < 100; i++)
    {
        indices[count] = i;
        indices[count + 1] = i + 1;
        count += 2;
    }

    ZeroMemory(&InitData, sizeof(InitData));
    InitData.pSysMem = indices;
    device->CreateBuffer(&bd, &InitData, &indexBuffer);

    ZeroMemory(&bd, sizeof(bd));
    bd.Usage = D3D11_USAGE_DEFAULT;
    bd.ByteWidth = sizeof(LineCBuffer);
    bd.BindFlags = D3D11_BIND_CONSTANT_BUFFER;
    bd.CPUAccessFlags = 0;
    device->CreateBuffer(&bd, nullptr, &constBuffer);
}

The draw function is:

void Spline::Draw(ID3D11PixelShader* pShader, ID3D11VertexShader* vShader, Camera& cam)
{
    context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_LINESTRIP);

    context->VSSetShader(vShader, nullptr, 0);
    context->PSSetShader(pShader, nullptr, 0);

    LineCBuffer lCB;
    lCB.world = XMMatrixIdentity();
    lCB.view = XMMatrixTranspose(cam.getView());
    lCB.projection = XMMatrixTranspose(cam.getProjection());

    context->UpdateSubresource(constBuffer, 0, nullptr, &lCB, 0, 0);

    context->VSSetConstantBuffers(0, 1, &constBuffer);

    UINT stride = sizeof(Vertex);
    UINT offset = 0;

    context->IASetVertexBuffers(0, 1, &vertexBuffer, &stride, &offset);
    context->IASetIndexBuffer(indexBuffer, DXGI_FORMAT_R16_UINT, 0);

    context->DrawIndexed(100, 0, 0);

    context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
}

And the entire shader file is:

cbuffer lineCBuffer : register(b0)
{
    matrix World;
    matrix View;
    matrix Projection;
};

struct VS_IN
{
    float3 position : POSITION;
    float4 colour : COLOUR;
};

struct VS_OUT
{
    float4 pos : SV_POSITION;
    float4 colour : COLOUR;
};

VS_OUT lineVertexShader(float3 position : POSITION, float4 colour : COLOUR)
{
    VS_OUT output = (VS_OUT)0;
    output.pos = mul(position, World);
    output.pos = mul(output.pos, View);
    output.pos = mul(output.pos, Projection);
    output.pos.w = 1.0f;
    output.colour = colour;

    return output;
}

float4 linePixelShader(VS_OUT input) : SV_TARGET
{
    return input.colour;
}

The issue is that the start of the line (especially when it is set to (0,0,0)) is anchored to the viewport. When the start of the line is at (0,0,0), it will not leave the centre of the screen, even when it should be off-screen.

Upvotes: 2

Views: 2962

Answers (1)

kaiser

Reputation: 1009

I think the problem is in your matrix multiplication.

VS_OUT lineVertexShader(float3 position : POSITION, float4 colour : COLOUR)
{
    VS_OUT output = (VS_OUT)0;
    output.pos = mul(position, World);
    output.pos = mul(output.pos, View);
    output.pos = mul(output.pos, Projection);
    output.pos.w = 1.0f;
    output.colour = colour;
    return output;
}

You use a float3 as the position input. A position is a point, so it should be extended to

float4(position, 1.0f)

before it is multiplied by a 4×4 matrix. If your float3 represented a direction vector instead, you would extend it with

float4(position, 0.0f)

So your Vertex Shader should look like the following:

VS_OUT lineVertexShader(float3 position : POSITION, float4 colour : COLOUR)
{
    VS_OUT output;
    output.pos = mul(float4(position,1.0), mul(mul(World,View),Projection));
    output.colour = colour;
    return output;
}

One more thing: do not set pos.w to 1! The rasterizer automatically performs the perspective divide on the value written to SV_POSITION, and the resulting homogeneous value is then passed to the pixel shader. Occasionally you really do want to force z and w to 1 (for example, for a cube map rendered at maximum depth), but here it is simply another error.

If you don't need the world, view, and projection matrices separately in the shader, why not combine them once on the CPU and upload a single worldViewProj matrix instead?

Good Luck

Upvotes: 3
