Bobby Mah

Reputation: 33

DXGI_FORMAT_YUY2 textures return different RowPitch under Windows 8.1 and Windows 10

My build environment is as follows: Windows 8.1, VS2012, a C++ desktop application built with the Windows 8.0 SDK.

When I run my program on Windows 8.1, the RowPitch prints 2560, but under Windows 10 the same program prints 5120.

What am I doing wrong here?

Here is the code. Thanks for all the replies.

#include <d3d11.h>
#include <stdio.h>
#include <tchar.h>

static bool init_directx11(ID3D11Device **pDevice, ID3D11DeviceContext **pDeviceContext)
{
    D3D_FEATURE_LEVEL featureLevels[] = {D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_1, D3D_FEATURE_LEVEL_10_0, D3D_FEATURE_LEVEL_9_1};

    D3D_FEATURE_LEVEL featureLevel;

    UINT createDeviceFlags = D3D11_CREATE_DEVICE_BGRA_SUPPORT;
    HRESULT hr = D3D11CreateDevice(NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, createDeviceFlags, featureLevels, ARRAYSIZE(featureLevels), D3D11_SDK_VERSION, pDevice,
                                   &featureLevel, pDeviceContext);

    return SUCCEEDED(hr);
}

int _tmain(int argc, _TCHAR* argv[])
{
    ID3D11Device *pDevice = nullptr;
    ID3D11DeviceContext *pDeviceContext= nullptr;
    if (!init_directx11(&pDevice, &pDeviceContext))
    {
        return FALSE;
    }

    D3D11_TEXTURE2D_DESC desc;
    ZeroMemory(&desc, sizeof(D3D11_TEXTURE2D_DESC));

    desc.ArraySize = 1;
    desc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
    desc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;
    desc.Format = DXGI_FORMAT_YUY2;
    desc.MipLevels = 1;
    desc.MiscFlags = 0;
    desc.SampleDesc.Count = 1;
    desc.SampleDesc.Quality = 0;
    desc.Usage = D3D11_USAGE_DYNAMIC;
    desc.Width = 1280;
    desc.Height = 720;

    ID3D11Texture2D* pTexture2D = nullptr;
    HRESULT hr = pDevice->CreateTexture2D(&desc, NULL, &pTexture2D);

    D3D11_MAPPED_SUBRESOURCE mappedResource;
    ZeroMemory(&mappedResource, sizeof(D3D11_MAPPED_SUBRESOURCE));
    hr = pDeviceContext->Map(pTexture2D, 0, D3D11_MAP_WRITE_DISCARD, 0, &mappedResource);

    printf("RowPitch = %d\n", mappedResource.RowPitch);

    pDeviceContext->Unmap(pTexture2D, 0);


    pTexture2D->Release();
    pDeviceContext->Release();
    pDevice->Release();

    getchar();
    return 0;
}

Upvotes: 0

Views: 865

Answers (1)

Roman Ryltsov

Reputation: 69672

What am I doing wrong here?

This is not necessarily wrong. RowPitch depends on the layout the hardware and driver assign to the texture, so the pitch can vary between systems. You are supposed to read the pitch back once the resource is mapped, and use it accordingly when reading or writing the data.

See this thread and message for a code snippet showing how the pitch is used:

The texture resource will have its own pitch (the number of bytes in a row), which is probably different from the pitch of your source data. This pitch is given to you as the "RowPitch" member of D3D11_MAPPED_SUBRESOURCE. So typically you do something like this:

BYTE* mappedData = reinterpret_cast<BYTE*>(mappedResource.pData);
for(UINT i = 0; i < height; ++i)
{
  memcpy(mappedData, buffer, rowspan);   // copy one row of source data
  mappedData += mappedResource.RowPitch; // advance by the texture's pitch
  buffer += rowspan;                     // advance by the source row size
}
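Applied to the texture in the question, a minimal sketch might look like the following. The dimensions match the question's 1280x720 YUY2 texture; the srcFrame pointer and the error handling around Map are assumptions added for illustration, not part of the original code:

// Sketch only: upload one packed YUY2 frame, honoring the driver-reported RowPitch.
// srcFrame is a hypothetical pointer to tightly packed YUY2 data (2 bytes per pixel).
const UINT width = 1280;
const UINT height = 720;
const UINT rowspan = width * 2;          // bytes per row in the tightly packed source
const BYTE* src = srcFrame;              // assumption: caller-provided source frame

D3D11_MAPPED_SUBRESOURCE mapped = {};
if (SUCCEEDED(pDeviceContext->Map(pTexture2D, 0, D3D11_MAP_WRITE_DISCARD, 0, &mapped)))
{
    BYTE* dst = reinterpret_cast<BYTE*>(mapped.pData);
    for (UINT y = 0; y < height; ++y)
    {
        memcpy(dst, src, rowspan);       // copy only the meaningful bytes of each row
        dst += mapped.RowPitch;          // step by the pitch the driver chose (may exceed rowspan)
        src += rowspan;
    }
    pDeviceContext->Unmap(pTexture2D, 0);
}

For reference, 2560 is exactly the tightly packed row size of a 1280-pixel-wide YUY2 row (1280 × 2 bytes), while 5120 simply means the driver on the Windows 10 machine padded each row. Both are valid as long as the copy steps by RowPitch rather than by an assumed row width.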

Upvotes: 2
