Reputation: 31
I'd like to call the IDXGIDevice1::SetMaximumFrameLatency method from my DX12 app. To do that I need to get a valid IDXGIDevice1 from the current Direct3D 12 device, but querying the interface returns E_NOINTERFACE:
IDXGIDevice * pDXGIDevice;
HRESULT hr = myDevice->QueryInterface(__uuidof(IDXGIDevice), (void **)&pDXGIDevice);
assert(hr != S_OK); // returns E_NOINTERFACE
IDXGIDevice1 * pDXGIDevice1;
HRESULT hr1 = myDevice->QueryInterface(__uuidof(IDXGIDevice1), (void **)&pDXGIDevice1);
assert(hr1 != S_OK); // returns E_NOINTERFACE
Not sure if I'm missing something or whether there is some sequence of DXGI calls I need to go through to get a valid IDXGIDevice1 interface.
Would appreciate any hints & thanks in advance! Klip
Upvotes: 3
Views: 3613
Reputation: 41127
For Direct3D 12, this 'legacy pattern' of obtaining the DXGI factory from the device is not supported, which is why your code above fails at what would be the first step of it:
// Direct3D 11-era pattern: query the DXGI device from the Direct3D device,
// then walk up to its adapter and factory. An ID3D12Device is not a DXGI
// device, so the very first QueryInterface fails with E_NOINTERFACE.
ComPtr<IDXGIDevice3> dxgiDevice;
DX::ThrowIfFailed(m_d3dDevice.As(&dxgiDevice));

ComPtr<IDXGIAdapter> dxgiAdapter;
DX::ThrowIfFailed(dxgiDevice->GetAdapter(&dxgiAdapter));

ComPtr<IDXGIFactory4> dxgiFactory;
DX::ThrowIfFailed(dxgiAdapter->GetParent(IID_PPV_ARGS(&dxgiFactory)));
For Direct3D 12, you should always create the DXGI factory explicitly. See Anatomy of Direct3D 12 Create Device.
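A minimal sketch of that explicit creation path, using the same DX::ThrowIfFailed helper as the code above (debug-layer setup and the explicit adapter enumeration from the linked article are omitted for brevity):

#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Create the DXGI factory first; it is never obtained by querying the D3D12 device.
ComPtr<IDXGIFactory4> dxgiFactory;
DX::ThrowIfFailed(CreateDXGIFactory1(IID_PPV_ARGS(&dxgiFactory)));

// Then create the Direct3D 12 device on the default adapter.
ComPtr<ID3D12Device> d3dDevice;
DX::ThrowIfFailed(D3D12CreateDevice(
    nullptr,                 // default adapter
    D3D_FEATURE_LEVEL_11_0,  // minimum feature level
    IID_PPV_ARGS(&d3dDevice)));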
In Direct3D 12, you explicitly control the backbuffer swapping behavior of the swap chain. Ideally you'd create the swap chain with DXGI_SWAP_CHAIN_FLAG_FRAME_LATENCY_WAITABLE_OBJECT and then use the waitable object to throttle your rendering speed instead. You can set the latency count via IDXGISwapChain2::SetMaximumFrameLatency, which defaults to 3 (Microsoft Docs is currently wrong about the default).
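A rough sketch of that setup, building on the factory snippet above; dxgiFactory, m_commandQueue, hwnd, backBufferWidth and backBufferHeight are placeholder names for objects your app already owns, not anything from the question:

#include <dxgi1_4.h>

// Create the swap chain with the waitable-object flag (flip model is required).
DXGI_SWAP_CHAIN_DESC1 swapChainDesc = {};
swapChainDesc.Width = backBufferWidth;
swapChainDesc.Height = backBufferHeight;
swapChainDesc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
swapChainDesc.SampleDesc.Count = 1;
swapChainDesc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
swapChainDesc.BufferCount = 2;
swapChainDesc.SwapEffect = DXGI_SWAP_EFFECT_FLIP_DISCARD;
swapChainDesc.Flags = DXGI_SWAP_CHAIN_FLAG_FRAME_LATENCY_WAITABLE_OBJECT;

ComPtr<IDXGISwapChain1> swapChain1;
DX::ThrowIfFailed(dxgiFactory->CreateSwapChainForHwnd(
    m_commandQueue.Get(),  // for Direct3D 12 this is the command queue, not the device
    hwnd, &swapChainDesc, nullptr, nullptr, &swapChain1));

ComPtr<IDXGISwapChain2> swapChain2;
DX::ThrowIfFailed(swapChain1.As(&swapChain2));

// Cap the number of queued frames and grab the waitable handle.
DX::ThrowIfFailed(swapChain2->SetMaximumFrameLatency(2));
HANDLE frameLatencyWaitableObject = swapChain2->GetFrameLatencyWaitableObject();

// At the start of each frame, block until DXGI says it is safe to render.
WaitForSingleObjectEx(frameLatencyWaitableObject, 1000, TRUE);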
If you want to support 'higher-than-refresh-rate' updates (such as NVIDIA G-Sync or AMD FreeSync), then you use the new DXGI_PRESENT_ALLOW_TEARING flag for Present. For details on using this flag, see Microsoft Docs or this YouTube video.
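And a sketch of the tearing path; m_swapChain is assumed to be your existing swap chain, which must itself have been created with DXGI_SWAP_CHAIN_FLAG_ALLOW_TEARING for the present flag to be valid:

#include <dxgi1_5.h>

// Check once at startup whether tearing is supported
// (needs IDXGIFactory5, i.e. Windows 10 Anniversary Update or later).
BOOL allowTearing = FALSE;
ComPtr<IDXGIFactory5> factory5;
if (SUCCEEDED(dxgiFactory.As(&factory5)))
{
    if (FAILED(factory5->CheckFeatureSupport(DXGI_FEATURE_PRESENT_ALLOW_TEARING,
                                             &allowTearing, sizeof(allowTearing))))
    {
        allowTearing = FALSE;
    }
}

// With vsync off, present with sync interval 0 plus the tearing flag
// (the flag is only valid in windowed mode with a sync interval of 0).
UINT presentFlags = allowTearing ? DXGI_PRESENT_ALLOW_TEARING : 0;
DX::ThrowIfFailed(m_swapChain->Present(0, presentFlags));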
See also DirectX 12: Presentation Modes In Windows 10 (YouTube).
Upvotes: 3