Reputation: 35
I am trying to render a 2D texture from a cv::Mat into a DX11 texture, and then into a shader resource afterwards. The problem is that the program crashes in ID3D11Device::CreateTexture2D() and I can't figure out why. I have researched the whole day and just don't see what's wrong here.
Furthermore, the problem does not seem to be the cv::Mat as the resource, because I have also tried this example: D3D11 CreateTexture2D in 32 bit format, with the chess example as the texture resource, and the function still crashes when called with the 3rd parameter.
I found others who had problems with that function; sometimes the cause was that SysMemPitch was not set for 2D textures, but unfortunately that's not the case here.
Error Output: First-chance exception at 0x692EF11E (igd10iumd32.dll) in ARift.exe: 0xC0000005: Access violation reading location 0x03438000.
Here is the relevant code:
bool Texture::InitCameraStream(ID3D11Device* device, ARiftControl* arift_control)
{
  // Describe a dynamic, CPU-writable BGRA texture matching the camera image.
  D3D11_TEXTURE2D_DESC td;
  ZeroMemory(&td, sizeof(td));
  td.ArraySize = 1;
  td.BindFlags = D3D11_BIND_SHADER_RESOURCE;
  td.Usage = D3D11_USAGE_DYNAMIC;
  td.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;
  td.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
  td.Height = arift_control->picture_1_.size().height;
  td.Width = arift_control->picture_1_.size().width;
  td.MipLevels = 1;
  td.MiscFlags = 0;
  td.SampleDesc.Count = 1;
  td.SampleDesc.Quality = 0;

  // Initial data: the cv::Mat pixel buffer, assumed to be 4 bytes per pixel.
  D3D11_SUBRESOURCE_DATA srInitData;
  srInitData.pSysMem = arift_control->picture_1_.ptr();
  srInitData.SysMemPitch = arift_control->picture_1_.size().width * 4;

  // Passing NULL as the third parameter only validates the description;
  // on success it returns S_FALSE.
  if (device->CreateTexture2D(&td, &srInitData, NULL) == S_FALSE)
  {
    std::cerr << "Texture Description: OK" << std::endl << "Subresource: OK" << std::endl;
  }

  ID3D11Texture2D* tex = 0;
  if (FAILED(device->CreateTexture2D(&td, &srInitData, &tex)))
  {
    std::cerr << "Error: Texture could not be created!" << std::endl;
    return false;
  }

  // Create the shader-resource view (D3D11 types, not D3D10).
  D3D11_SHADER_RESOURCE_VIEW_DESC srDesc;
  ZeroMemory(&srDesc, sizeof(srDesc));
  srDesc.Format = td.Format;
  srDesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D;
  srDesc.Texture2D.MostDetailedMip = 0;
  srDesc.Texture2D.MipLevels = 1;
  if (FAILED(device->CreateShaderResourceView(tex, &srDesc, &texture_)))
  {
    std::cerr << "Can't create Shader Resource View" << std::endl;
    return false;
  }
  return true;
}
CreateTexture2D returns S_FALSE when the first 2 parameters are valid and 0 is passed as the 3rd parameter. So in my case it also returns S_FALSE the first time, and the debug output appears. When calling CreateTexture2D with the 3rd parameter (the texture COM object), it crashes. I have absolutely no idea anymore.
Furthermore, I tried to set up debugging with DirectX and followed this tutorial: http://blog.rthand.com/post/2010/10/25/Capture-DirectX-1011-debug-output-to-Visual-Studio.aspx - but I can't see a "Debug" window in my project properties in Visual Studio 2013, so I still get the "igd10iumd32.pdb not loaded" window after the program crashes.
Edit: at least I could fix the issue with the additional D3D debug output for now. In my Visual Studio 2013 I had to set the following: Project Properties -> Debugging -> Debug Type -> Mixed to get the additional D3D logs :)
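Independent of that setting, creating the device with the D3D11_CREATE_DEVICE_DEBUG flag also routes validation messages to the debugger output; a minimal sketch, assuming device creation roughly like this (not my project's actual creation code):

UINT flags = 0;
#ifdef _DEBUG
flags |= D3D11_CREATE_DEVICE_DEBUG;  // requires the SDK debug layers to be installed
#endif
ID3D11Device* device = nullptr;
ID3D11DeviceContext* context = nullptr;
HRESULT hr = D3D11CreateDevice(
    nullptr,                   // default adapter
    D3D_DRIVER_TYPE_HARDWARE,
    nullptr,                   // no software rasterizer module
    flags,
    nullptr, 0,                // default feature levels
    D3D11_SDK_VERSION,
    &device,
    nullptr,                   // returned feature level not needed here
    &context);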
Can anyone help here? It's really frustrating to be stuck on that single function the whole day.
Many thanks!
Upvotes: 2
Views: 6369
Reputation: 10039
Your input texture data passed in D3D11_SUBRESOURCE_DATA is not sufficiently sized. In your comment you said that the input image data is 900x1600 and the link is a JPEG. However, you are specifying to D3D that the data format is DXGI_FORMAT_B8G8R8A8_UNORM. JPEG is a compressed format, so the data stream will be smaller than it would be in BGRA format. When your driver (igd10iumd32.dll) attempts to read this input stream, it crashes because the buffer is not as large as you told D3D it was.
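To make the mismatch concrete, here is a rough sketch of how many bytes D3D will read for the dimensions from your comment (the variable names are just for illustration):

// D3D reads height * SysMemPitch bytes starting at pSysMem:
const UINT width  = 900;
const UINT height = 1600;
const UINT sysMemPitch = width * 4;            // 3600 bytes per BGRA row
const UINT totalBytes  = sysMemPitch * height; // 5,760,000 bytes expected
// A JPEG stream of the same image is typically far smaller, so the
// driver reads past the end of the buffer -> access violation.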
You can use D3DX11CreateTextureFromFile to load JPEG data. There are also some free image conversion libraries you can use to convert the JPEG into a format D3D natively understands.
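Since you are already using OpenCV, another option is to let it decode the JPEG and expand the channels before uploading; a minimal sketch, assuming the frame comes from a file (the path is hypothetical):

#include <opencv2/opencv.hpp>

cv::Mat bgr = cv::imread("camera_frame.jpg");   // decodes the JPEG to 8-bit BGR
cv::Mat bgra;
cv::cvtColor(bgr, bgra, cv::COLOR_BGR2BGRA);    // expand to 4 bytes per pixel

D3D11_SUBRESOURCE_DATA srInitData = {};
srInitData.pSysMem = bgra.ptr();
srInitData.SysMemPitch = static_cast<UINT>(bgra.step);  // actual bytes per row

With a continuous 4-channel cv::Mat, the buffer is exactly height * step bytes, which matches what DXGI_FORMAT_B8G8R8A8_UNORM requires.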
Upvotes: 1