Reputation: 30724
I am able to display a video stream from an mp4 video by writing byte samples directly to the Enhanced Video Renderer (EVR) sink (thanks to answer on Media Foundation EVR no video displaying).
I'd like to do the same thing but for a webcam source. The problem I've currently got is that my webcam only supports the RGB24 and I420 formats and, as far as I can tell, the EVR only supports RGB32. In some Media Foundation scenarios I believe the conversion will happen automatically provided the CColorConvertDMO class is registered in the process. I've done that, but I suspect that because of the way I'm writing samples to the EVR the colour conversion is not being invoked.
My question is: what sort of approach should I take to allow RGB24 samples read from my webcam's IMFSourceReader to be written to the EVR's IMFStreamSink?
My full sample program is here and is unfortunately rather long due to the Media Foundation plumbing required. The block where I attempt to match the EVR sink media type to the webcam source media type is below.
The problem is the setting of the MF_MT_SUBTYPE attribute. From what I can tell it has to be MFVideoFormat_RGB32 for the EVR, but my webcam will only accept MFVideoFormat_RGB24.
IMFMediaSource* pVideoSource = NULL;
IMFSourceReader* pVideoReader = NULL;
IMFMediaType* videoSourceOutputType = NULL, * pvideoSourceModType = NULL;
IMFMediaType* pVideoOutType = NULL;
IMFMediaType* pHintMediaType = NULL;
IMFMediaSink* pVideoSink = NULL;
IMFStreamSink* pStreamSink = NULL;
IMFSinkWriter* pSinkWriter = NULL;
IMFMediaTypeHandler* pSinkMediaTypeHandler = NULL, * pSourceMediaTypeHandler = NULL;
IMFPresentationDescriptor* pSourcePresentationDescriptor = NULL;
IMFStreamDescriptor* pSourceStreamDescriptor = NULL;
IMFVideoRenderer* pVideoRenderer = NULL;
IMFVideoDisplayControl* pVideoDisplayControl = NULL;
IMFGetService* pService = NULL;
IMFActivate* pActive = NULL;
IMFPresentationClock* pClock = NULL;
IMFPresentationTimeSource* pTimeSource = NULL;
IDirect3DDeviceManager9* pD3DManager = NULL;
IMFVideoSampleAllocator* pVideoSampleAllocator = NULL;
IMFSample* pD3DVideoSample = NULL;
RECT rc = { 0, 0, VIDEO_WIDTH, VIDEO_HEIGHT };
BOOL fSelected = false;
CHECK_HR(CoInitializeEx(NULL, COINIT_APARTMENTTHREADED | COINIT_DISABLE_OLE1DDE),
"COM initialisation failed.");
CHECK_HR(MFStartup(MF_VERSION),
"Media Foundation initialisation failed.");
//CHECK_HR(ListCaptureDevices(DeviceType::Video),
// "Error listing video capture devices.");
// Need the color converter DSP for conversions between YUV, RGB etc.
CHECK_HR(MFTRegisterLocalByCLSID(
__uuidof(CColorConvertDMO),
MFT_CATEGORY_VIDEO_PROCESSOR,
L"",
MFT_ENUM_FLAG_SYNCMFT,
0,
NULL,
0,
NULL),
"Error registering colour converter DSP.");
// Create a separate Window and thread to host the Video player.
CreateThread(NULL, 0, (LPTHREAD_START_ROUTINE)InitializeWindow, NULL, 0, NULL);
Sleep(1000);
if (_hwnd == nullptr)
{
    printf("Failed to initialise video window.\n");
    goto done;
}
// ----- Set up Video sink (Enhanced Video Renderer). -----
CHECK_HR(MFCreateVideoRendererActivate(_hwnd, &pActive),
"Failed to created video rendered activation context.");
CHECK_HR(pActive->ActivateObject(IID_IMFMediaSink, (void**)&pVideoSink),
"Failed to activate IMFMediaSink interface on video sink.");
// Initialize the renderer before doing anything else including querying for other interfaces,
// see https://msdn.microsoft.com/en-us/library/windows/desktop/ms704667(v=vs.85).aspx.
CHECK_HR(pVideoSink->QueryInterface(__uuidof(IMFVideoRenderer), (void**)&pVideoRenderer),
"Failed to get video Renderer interface from EVR media sink.");
CHECK_HR(pVideoRenderer->InitializeRenderer(NULL, NULL),
"Failed to initialise the video renderer.");
CHECK_HR(pVideoSink->QueryInterface(__uuidof(IMFGetService), (void**)&pService),
"Failed to get service interface from EVR media sink.");
CHECK_HR(pService->GetService(MR_VIDEO_RENDER_SERVICE, __uuidof(IMFVideoDisplayControl), (void**)&pVideoDisplayControl),
"Failed to get video display control interface from service interface.");
CHECK_HR(pVideoDisplayControl->SetVideoWindow(_hwnd),
"Failed to SetVideoWindow.");
CHECK_HR(pVideoDisplayControl->SetVideoPosition(NULL, &rc),
"Failed to SetVideoPosition.");
CHECK_HR(pVideoSink->GetStreamSinkByIndex(0, &pStreamSink),
"Failed to get video renderer stream by index.");
CHECK_HR(pStreamSink->GetMediaTypeHandler(&pSinkMediaTypeHandler),
"Failed to get media type handler for stream sink.");
DWORD sinkMediaTypeCount = 0;
CHECK_HR(pSinkMediaTypeHandler->GetMediaTypeCount(&sinkMediaTypeCount),
"Failed to get sink media type count.");
std::cout << "Sink media type count: " << sinkMediaTypeCount << "." << std::endl;
// ----- Set up Video source (is either a file or webcam capture device). -----
#if USE_WEBCAM_SOURCE
CHECK_HR(GetVideoSourceFromDevice(WEBCAM_DEVICE_INDEX, &pVideoSource, &pVideoReader),
"Failed to get webcam video source.");
#else
CHECK_HR(GetVideoSourceFromFile(MEDIA_FILE_PATH, &pVideoSource, &pVideoReader),
"Failed to get file video source.");
#endif
CHECK_HR(pVideoReader->GetCurrentMediaType((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, &videoSourceOutputType),
"Error retrieving current media type from first video stream.");
CHECK_HR(pVideoReader->SetStreamSelection((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, TRUE),
"Failed to set the first video stream on the source reader.");
CHECK_HR(pVideoSource->CreatePresentationDescriptor(&pSourcePresentationDescriptor),
"Failed to create the presentation descriptor from the media source.");
CHECK_HR(pSourcePresentationDescriptor->GetStreamDescriptorByIndex(0, &fSelected, &pSourceStreamDescriptor),
"Failed to get source stream descriptor from presentation descriptor.");
CHECK_HR(pSourceStreamDescriptor->GetMediaTypeHandler(&pSourceMediaTypeHandler),
"Failed to get source media type handler.");
DWORD srcMediaTypeCount = 0;
CHECK_HR(pSourceMediaTypeHandler->GetMediaTypeCount(&srcMediaTypeCount),
"Failed to get source media type count.");
std::cout << "Source media type count: " << srcMediaTypeCount << ", is first stream selected " << fSelected << "." << std::endl;
std::cout << "Default output media type for source reader:" << std::endl;
std::cout << GetMediaTypeDescription(videoSourceOutputType) << std::endl << std::endl;
// ----- Create a compatible media type and set on the source and sink. -----
// Set the video input type on the EVR sink.
CHECK_HR(MFCreateMediaType(&pVideoOutType), "Failed to create video output media type.");
CHECK_HR(pVideoOutType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video), "Failed to set video output media major type.");
CHECK_HR(pVideoOutType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_RGB32), "Failed to set video sub-type attribute on media type.");
CHECK_HR(pVideoOutType->SetUINT32(MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive), "Failed to set interlace mode attribute on media type.");
CHECK_HR(pVideoOutType->SetUINT32(MF_MT_ALL_SAMPLES_INDEPENDENT, TRUE), "Failed to set independent samples attribute on media type.");
CHECK_HR(MFSetAttributeRatio(pVideoOutType, MF_MT_PIXEL_ASPECT_RATIO, 1, 1), "Failed to set pixel aspect ratio attribute on media type.");
CHECK_HR(CopyAttribute(videoSourceOutputType, pVideoOutType, MF_MT_FRAME_SIZE), "Failed to copy video frame size attribute to media type.");
CHECK_HR(CopyAttribute(videoSourceOutputType, pVideoOutType, MF_MT_FRAME_RATE), "Failed to copy video frame rate attribute to media type.");
//CHECK_HR(GetSupportedMediaType(pMediaTypeHandler, &pVideoOutType),
// "Failed to get supported media type.");
std::cout << "Custom media type defined as:" << std::endl;
std::cout << GetMediaTypeDescription(pVideoOutType) << std::endl << std::endl;
auto doesSinkSupport = pSinkMediaTypeHandler->IsMediaTypeSupported(pVideoOutType, &pHintMediaType);
if (doesSinkSupport != S_OK) {
    std::cout << "Sink does not support desired media type." << std::endl;
    goto done;
}
else {
    CHECK_HR(pSinkMediaTypeHandler->SetCurrentMediaType(pVideoOutType),
        "Failed to set input media type on EVR sink.");
}
// The block below always failed during testing. My guess is the source media type handler
// is not aligned with the video reader somehow.
/*auto doesSrcSupport = pSourceMediaTypeHandler->IsMediaTypeSupported(pVideoOutType, &pHintMediaType);
if (doesSrcSupport != S_OK) {
    std::cout << "Source does not support desired media type." << std::endl;
    goto done;
}
else {
    CHECK_HR(pSourceMediaTypeHandler->SetCurrentMediaType(pVideoOutType),
        "Failed to set output media type on source reader.");
}*/
CHECK_HR(pVideoReader->SetCurrentMediaType((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, NULL, pVideoOutType),
"Failed to set output media type on source reader.");
// ----- Source and sink now configured. Set up remaining infrastructure and then start sampling. -----
Upvotes: 1
Views: 432
Reputation: 1515
The Source Reader is normally able to do this conversion: RGB24 -> RGB32.
"as far as I can tell the EVR only supports RGB32"
Not really, it just depends on your video processor: mofo7777 / Stackoverflow
Under the MFVideoEVR project, replace all MFVideoFormat_RGB32 with MFVideoFormat_NV12; it should work with an NVidia GPU card. Change Sleep(20); to Sleep(40); in Main.cpp (HRESULT DisplayVideo(...)), because the NV12 format is more optimized (the value suits a 25 fps video frame rate).
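Illustratively, those two edits amount to something like this (shown against a media type set up the way the question's pVideoOutType is, rather than the MFVideoEVR project's own code):

CHECK_HR(pVideoOutType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_NV12),
    "Failed to set NV12 video sub-type attribute on media type.");
// In HRESULT DisplayVideo(...) in Main.cpp:
Sleep(40); // was Sleep(20); 40 ms matches the 25 fps frame interval.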
About your question:
You can do it without handling the colour conversion MFT. Starting from MFVideoEVR, there are two things to update.
The source code is here: mofo7777 / Stackoverflow
Under the MFVideoCaptureEVR project.
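As a rough sketch of letting the Source Reader do the conversion (illustrative only, not code from the repository; it borrows the question's variable names and CHECK_HR macro, and assumes the reader is created directly from the media source): enable the reader's built-in video processing, then request RGB32 output.

IMFAttributes* pReaderAttributes = NULL;
CHECK_HR(MFCreateAttributes(&pReaderAttributes, 1),
    "Failed to create source reader attributes.");
// Lets the Source Reader insert converters (including colour conversion) itself.
CHECK_HR(pReaderAttributes->SetUINT32(MF_SOURCE_READER_ENABLE_ADVANCED_VIDEO_PROCESSING, TRUE),
    "Failed to enable advanced video processing on the source reader.");
CHECK_HR(MFCreateSourceReaderFromMediaSource(pVideoSource, pReaderAttributes, &pVideoReader),
    "Failed to create source reader from media source.");
// With processing enabled, asking for RGB32 succeeds even though the webcam
// only offers RGB24/I420; the reader converts each sample before returning it.
CHECK_HR(pVideoReader->SetCurrentMediaType((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, NULL, pVideoOutType),
    "Failed to set RGB32 output media type on source reader.");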
Upvotes: 1
Reputation: 30724
I needed to manually wire up a colour conversion MFT (I'm pretty sure some Media Foundation scenarios wire it in automatically, but probably only when using a topology) AND adjust the clock set on the Direct3D IMFSample provided to the EVR.
Working example.
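Roughly, the wiring looks like the sketch below (an outline rather than the code in the linked example; pWebcamRgb24Type, pWebcamSample and llSampleTime are placeholder names, and for brevity the converter writes straight into the Direct3D sample, whereas in practice the converted buffer may need to be copied across): create the colour converter DSP as an IMFTransform, set RGB24 in and RGB32 out, run each webcam sample through it, fix up the timestamp and hand the sample to the EVR stream sink.

IMFTransform* pColorConverter = NULL;
CHECK_HR(CoCreateInstance(__uuidof(CColorConvertDMO), NULL, CLSCTX_INPROC_SERVER,
    IID_PPV_ARGS(&pColorConverter)),
    "Failed to create colour converter MFT.");
CHECK_HR(pColorConverter->SetInputType(0, pWebcamRgb24Type, 0),
    "Failed to set colour converter input type (RGB24).");
CHECK_HR(pColorConverter->SetOutputType(0, pVideoOutType, 0),
    "Failed to set colour converter output type (RGB32).");

// For each sample read from the webcam source reader:
CHECK_HR(pColorConverter->ProcessInput(0, pWebcamSample, 0),
    "Colour converter ProcessInput failed.");
MFT_OUTPUT_DATA_BUFFER outputDataBuffer = { 0, pD3DVideoSample, 0, NULL };
DWORD processOutputStatus = 0;
CHECK_HR(pColorConverter->ProcessOutput(0, 1, &outputDataBuffer, &processOutputStatus),
    "Colour converter ProcessOutput failed.");
// The timestamp needs to be made relative to the EVR's presentation clock
// (the clock adjustment mentioned above) before handing the sample over.
CHECK_HR(pD3DVideoSample->SetSampleTime(llSampleTime),
    "Failed to set sample time on Direct3D sample.");
CHECK_HR(pStreamSink->ProcessSample(pD3DVideoSample),
    "Stream sink ProcessSample failed.");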
Upvotes: 0