NoonanRosenblum

Reputation: 533

D3D11 CreateTexture2D in 32 bit format

Surprisingly, I am not able to find any help on this subject, even though I am fairly certain it is a basic question. So maybe somebody here can help me make progress.

I am trying to create a D3D11 texture, single-channel, 32 bits per pixel. The official documentation is interesting ( http://msdn.microsoft.com/en-us/library/windows/desktop/ff476521(v=vs.85).aspx ) and I found this example for a four-channel, 8-bits-per-channel texture:

ID3D11Texture2D *MakeCheckerboard(ID3D11Device *myDevice)
{
    ID3D11Texture2D *tex;
    D3D11_TEXTURE2D_DESC tdesc;
    D3D11_SUBRESOURCE_DATA tbsd;

    int w = 512;
    int h = 512;
    int bpp = 4;
    int *buf = new int[w*h];

    // filling the image
    for(int i=0;i<h;i++)
        for(int j=0;j<w;j++)
        {
            if((i&32)==(j&32))
                buf[i*w+j] = 0x00000000;
            else
                buf[i*w+j] = 0xffffffff;
        }

    // setting up D3D11_SUBRESOURCE_DATA 
    tbsd.pSysMem = (void *)buf;
    tbsd.SysMemPitch = w*bpp ;
    tbsd.SysMemSlicePitch = w*h*bpp ; // Not needed since this is a 2d texture

    // setting up D3D11_TEXTURE2D_DESC 
    tdesc.Width  = w;
    tdesc.Height = h;
    tdesc.MipLevels = 1;
    tdesc.ArraySize = 1;
    tdesc.SampleDesc.Count   = 1;
    tdesc.SampleDesc.Quality = 0;
    tdesc.Usage     = D3D11_USAGE_DEFAULT;
    tdesc.Format    = DXGI_FORMAT_R8G8B8A8_UNORM;
    tdesc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
    tdesc.CPUAccessFlags = 0;
    tdesc.MiscFlags      = 0;

    // checking inputs
    if ( myDevice->CreateTexture2D(&tdesc, &tbsd, NULL) == S_FALSE )
        std::cout << "Inputs correct" << std::endl;
    else
        std::cout << "wrong inputs" << std::endl;

    // create the texture
    if(FAILED(myDevice->CreateTexture2D(&tdesc,&tbsd,&tex)))
    {
        std::cout << "Failed" << std::endl;
        delete[] buf; // avoid leaking the buffer on failure
        return(0);
    }
    else
        std::cout << "Success" << std::endl;

    delete[] buf;

    return(tex);
}

This example works fine; the output is:

Inputs correct
Success

Then, if I modify it a little for a 32-bit single-channel texture:

ID3D11Texture2D *MakeCheckerboard(ID3D11Device *myDevice)
{
    ID3D11Texture2D *tex;
    D3D11_TEXTURE2D_DESC tdesc;
    D3D11_SUBRESOURCE_DATA tbsd;

    int w = 512;
    int h = 512;
    int bpp = 4;
    int *buf = new int[w*h];

    for(int i=0;i<h;i++)
        for(int j=0;j<w;j++)
        {
            if((i&32)==(j&32))
                buf[i*w+j] = 0x00000000;
            else
                buf[i*w+j] = 0xffffffff;
        }

    tbsd.pSysMem = (void *)buf;
    tbsd.SysMemPitch = w*bpp ;
    tbsd.SysMemSlicePitch = w*h*bpp ; // Not needed since this is a 2d texture

    tdesc.Width = w;
    tdesc.Height = h;

    tdesc.MipLevels = 1;
    tdesc.ArraySize = 1;
    tdesc.SampleDesc.Count = 1;
    tdesc.SampleDesc.Quality = 0;
    tdesc.Usage = D3D11_USAGE_DEFAULT;
    tdesc.Format = DXGI_FORMAT_D32_FLOAT;
    tdesc.BindFlags = D3D11_BIND_SHADER_RESOURCE;

    tdesc.CPUAccessFlags = 0;
    tdesc.MiscFlags = 0;

    if ( myDevice->CreateTexture2D(&tdesc, &tbsd, NULL) == S_FALSE )
        std::cout << "Inputs correct" << std::endl;
    else
        std::cout << "wrong inputs" << std::endl;

    if(FAILED(myDevice->CreateTexture2D(&tdesc,&tbsd,&tex)))
    {
        std::cout << "Failed" << std::endl;
        delete[] buf; // avoid leaking the buffer on failure
        return(0);
    }
    else
        std::cout << "Success" << std::endl;

    delete[] buf;

    return(tex);
}

The corresponding output becomes:

wrong inputs
Failed

So I guess the inputs are wrong, but they seem fine to me, and the documentation is not very verbose on this point. Does somebody have a clue?

Edit - after Chuck Walbourn's answer

I see three points here:

1. The debug layer:

Thanks for the advice. I read the link you gave, as well as your answer on this question here, this doc (4), and the last paragraph from here (5):

msdn.microsoft.com/en-us/library/windows/desktop/jj200584(v=vs.85).aspx

msdn.microsoft.com/en-us/library/windows/desktop/jj863687(v=vs.85).aspx

(cannot post links because of low reputation)

For developers currently working on applications in Microsoft Visual Studio 2010 or earlier using the D3D11_CREATE_DEVICE_DEBUG flag, be aware that calls to D3D11CreateDevice will fail. This is because the D3D11.1 runtime now requires D3D11_1SDKLayers.dll instead of D3D11SDKLayers.dll.

I have to develop using VS 2010, and if I activate the D3D11_CREATE_DEVICE_DEBUG flag, the call to D3D11CreateDevice() does not produce any error, but there are no debug messages at runtime. So I searched my PC, and the DLL D3D11_1SDKLayers.dll is missing. Thus I assume I will have to install the Windows 8.1 SDK.

This raises two questions I will try to answer:

Can I install Windows 8.1 SDK on a Windows 7 PC?

Where should the debug messages appear? In my console at runtime? (A silly question, I guess, but I have to confess I am not a seasoned programmer, and things that are apparently trivial for you are not for me...)

EDIT: According to

The legacy DirectX SDK (June 2010) has the correct DirectX Debug Runtime for Windows Vista SP2 and Windows 7 aka DirectX 11.0

I should be able to use the debug layer. But activating the flag does not make the exe any more verbose. I am digging further. (I am using Visual Studio 2010 with the DirectX SDK (June 2010), on Windows 7, with DirectX 11 according to dxdiag.)

EDIT: solution OK, so the debug messages are indeed visible in the "Output" panel of Visual Studio, NOT in the console! THIS is a big step forward ^^. This point is solved and closed.

2. The DXGI_FORMAT_R32_FLOAT format:

I did not really understand what you meant by "initData". I guess you mean that I am creating the texture and filling it at the same time, whereas I could use an empty subresource data and call UpdateSubresource() afterwards. Anyway, I tried DXGI_FORMAT_R32_FLOAT. It seemed to work, as I did not get any errors, but the texture is black, or there is no texture at all. So I guess it is not working properly and I will need to work on that. I hope the debug layer will help. I am quite surprised because, again, there is no error at runtime.

EDIT: As the debug layer is now OK, I am now certain that the texture is correctly created using Chuck Walbourn's advice. GOOD! But another problem remains: the texture is either completely black or completely red. Digging deeper.

EDIT: solution The problem is simple: each pixel value has to be a float between 0.0 and 1.0. That's all. Problem solved. But then the image is red...

3. The Direct3D feature level above 9.2:

I don't know. This is starting to sound like a completely alien language to me. The PC uses an NVIDIA Quadro K4000, and I will just quietly assume this is enough. I hope...

EDIT: solution Apparently it is, so it's okay.

4. Thought of a hopeless mind:

I am not a hardcore programmer; I just want to use a single-channel 32-bit texture instead of a four-channel one. Seriously, why is this so complex?

Upvotes: 2

Views: 7714

Answers (2)

NoonanRosenblum

Reputation: 533

I simply found that DXGI_FORMAT_R32G32B32_FLOAT exists. So I am creating an R32G32B32 texture and, as I need a black-and-white image, I use the same level for each colour channel. Pretty simple.

Here is the example from my question, adapted for this solution:

ID3D11Texture2D *MakeCheckerboard(ID3D11Device *myDevice)
{
    ID3D11Texture2D*       tex;
    D3D11_TEXTURE2D_DESC   tdesc;
    D3D11_SUBRESOURCE_DATA tbsd;

    int width    = 3;   // three pixels wide
    int height   = 1;   // one pixel high
    int bpp      = 12;  // 3 channels * 4 bytes each
    int nb_color = 3;

    // CREATING THE IMAGE
    float* buf = new float[ width * height * nb_color ];

    // pixel white
    buf[0] = 1.0f; // red
    buf[1] = 1.0f; // green
    buf[2] = 1.0f; // blue

    // pixel black
    buf[3] = 0.0f;
    buf[4] = 0.0f;
    buf[5] = 0.0f;

    // pixel white
    buf[6] = 1.0f;
    buf[7] = 1.0f;
    buf[8] = 1.0f;

    tbsd.pSysMem          = (void *) buf;
    tbsd.SysMemPitch      = width * bpp ;
    tbsd.SysMemSlicePitch = width * height * bpp ; // Not needed since this is a 2d texture

    tdesc.Width              = width;
    tdesc.Height             = height;
    tdesc.MipLevels          = 1;
    tdesc.ArraySize          = 1;
    tdesc.SampleDesc.Count   = 1;
    tdesc.SampleDesc.Quality = 0;
    tdesc.Usage              = D3D11_USAGE_DEFAULT;
    tdesc.Format             = DXGI_FORMAT_R32G32B32_FLOAT;
    tdesc.BindFlags          = D3D11_BIND_SHADER_RESOURCE;
    tdesc.CPUAccessFlags     = 0;
    tdesc.MiscFlags          = 0;


    if(FAILED(myDevice->CreateTexture2D(&tdesc,&tbsd,&tex)))
    {
        delete[] buf; // avoid leaking the buffer on failure
        return(0);
    }

    delete[] buf;

    return(tex);
}

Upvotes: 2

Chuck Walbourn
Chuck Walbourn

Reputation: 41077

First thing you should try is to enable the DEBUG device to look for more verbose diagnostics, as that will likely tell you exactly what's wrong with respect to parameter validation.

That said, the problem is you are using a depth format DXGI_FORMAT_D32_FLOAT and then trying to initialize it with initData which is not supported. Try DXGI_FORMAT_R32_FLOAT instead.

Note that DXGI_FORMAT_R32_FLOAT requires Direct3D feature level 9.2 or greater.

Upvotes: 2
