Webcam source to EVR sink


By writing byte samples directly to the Enhanced Video Renderer (EVR) sink I can display the video stream from an mp4 file (thanks to the answer on Media Foundation EVR no video displaying).

I'd like to do the same thing but with a webcam. The problem I'm currently hitting is that my webcam only supports the RGB24 and I420 formats, and as far as I can tell the EVR only supports RGB32. I believe the conversion happens automatically in some Media Foundation scenarios, provided the CColorConvertDMO class is registered in the process. I have done that, but I suspect that, because of the way I write samples to the EVR, the colour conversion never gets invoked.

My question is what approach I should take to allow RGB24 samples read from the webcam IMFSourceReader to be written to the EVR IMFStreamSink.

My full sample program is here, but it is very long because of the Media Foundation pipeline plumbing it needs. Below is the block where I attempt to match the EVR sink media type to the webcam source media type.

The problem is setting the MF_MT_SUBTYPE attribute. As far as I can tell, for the EVR it must be MFVideoFormat_RGB32, but my webcam only accepts MFVideoFormat_RGB24.

IMFMediaSource* pVideoSource = NULL;
IMFSourceReader* pVideoReader = NULL;
IMFMediaType* videoSourceOutputType = NULL, * pvideoSourceModType = NULL;
IMFMediaType* pVideoOutType = NULL;
IMFMediaType* pHintMediaType = NULL;
IMFMediaSink* pVideoSink = NULL;
IMFStreamSink* pStreamSink = NULL;
IMFSinkWriter* pSinkWriter = NULL;
IMFMediaTypeHandler* pSinkMediaTypeHandler = NULL, * pSourceMediaTypeHandler = NULL;
IMFPresentationDescriptor* pSourcePresentationDescriptor = NULL;
IMFStreamDescriptor* pSourceStreamDescriptor = NULL;
IMFVideoRenderer* pVideoRenderer = NULL;
IMFVideoDisplayControl* pVideoDisplayControl = NULL;
IMFGetService* pService = NULL;
IMFActivate* pActive = NULL;
IMFPresentationClock* pClock = NULL;
IMFPresentationTimeSource* pTimeSource = NULL;
IDirect3DDeviceManager9* pD3DManager = NULL;
IMFVideoSampleAllocator* pVideoSampleAllocator = NULL;
IMFSample* pD3DVideoSample = NULL;
RECT rc = { 0, 0, VIDEO_WIDTH, VIDEO_HEIGHT };
BOOL fSelected = false;

CHECK_HR(CoInitializeEx(NULL, COINIT_APARTMENTTHREADED | COINIT_DISABLE_OLE1DDE),
"COM initialisation failed.");

CHECK_HR(MFStartup(MF_VERSION),
"Media Foundation initialisation failed.");

//CHECK_HR(ListCaptureDevices(DeviceType::Video), 
//  "Error listing video capture devices.");

// Need the color converter DSP for conversions between YUV, RGB etc.
CHECK_HR(MFTRegisterLocalByCLSID(
__uuidof(CColorConvertDMO),
MFT_CATEGORY_VIDEO_PROCESSOR,
L"",
MFT_ENUM_FLAG_SYNCMFT,
0,
NULL,
0,
NULL),
"Error registering colour converter DSP.");

// Create a separate Window and thread to host the Video player.
CreateThread(NULL, 0, (LPTHREAD_START_ROUTINE)InitializeWindow, NULL, 0, NULL);
Sleep(1000);
if (_hwnd == nullptr)
{
printf("Failed to initialise video window.\n");
goto done;
}

// ----- Set up Video sink (Enhanced Video Renderer). -----

CHECK_HR(MFCreateVideoRendererActivate(_hwnd, &pActive),
"Failed to created video rendered activation context.");

CHECK_HR(pActive->ActivateObject(IID_IMFMediaSink, (void**)&pVideoSink),
"Failed to activate IMFMediaSink interface on video sink.");

// Initialize the renderer before doing anything else including querying for other interfaces,
// see https://msdn.microsoft.com/en-us/library/windows/desktop/ms704667(v=vs.85).aspx.
CHECK_HR(pVideoSink->QueryInterface(__uuidof(IMFVideoRenderer), (void**)&pVideoRenderer),
"Failed to get video Renderer interface from EVR media sink.");

CHECK_HR(pVideoRenderer->InitializeRenderer(NULL, NULL),
"Failed to initialise the video renderer.");

CHECK_HR(pVideoSink->QueryInterface(__uuidof(IMFGetService), (void**)&pService),
"Failed to get service interface from EVR media sink.");

CHECK_HR(pService->GetService(MR_VIDEO_RENDER_SERVICE, __uuidof(IMFVideoDisplayControl), (void**)&pVideoDisplayControl),
"Failed to get video display control interface from service interface.");

CHECK_HR(pVideoDisplayControl->SetVideoWindow(_hwnd),
"Failed to SetVideoWindow.");

CHECK_HR(pVideoDisplayControl->SetVideoPosition(NULL, &rc),
"Failed to SetVideoPosition.");

CHECK_HR(pVideoSink->GetStreamSinkByIndex(0, &pStreamSink),
"Failed to get video renderer stream by index.");

CHECK_HR(pStreamSink->GetMediaTypeHandler(&pSinkMediaTypeHandler),
"Failed to get media type handler for stream sink.");

DWORD sinkMediaTypeCount = 0;
CHECK_HR(pSinkMediaTypeHandler->GetMediaTypeCount(&sinkMediaTypeCount),
"Failed to get sink media type count.");

std::cout << "Sink media type count: " << sinkMediaTypeCount << "." << std::endl;

// ----- Set up Video source (is either a file or webcam capture device). -----

#if USE_WEBCAM_SOURCE
CHECK_HR(GetVideoSourceFromDevice(WEBCAM_DEVICE_INDEX, &pVideoSource, &pVideoReader),
"Failed to get webcam video source.");
#else
CHECK_HR(GetVideoSourceFromFile(MEDIA_FILE_PATH, &pVideoSource, &pVideoReader),
"Failed to get file video source.");
#endif

CHECK_HR(pVideoReader->GetCurrentMediaType((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, &videoSourceOutputType),
"Error retrieving current media type from first video stream.");

CHECK_HR(pVideoReader->SetStreamSelection((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, TRUE),
"Failed to set the first video stream on the source reader.");

CHECK_HR(pVideoSource->CreatePresentationDescriptor(&pSourcePresentationDescriptor),
"Failed to create the presentation descriptor from the media source.");

CHECK_HR(pSourcePresentationDescriptor->GetStreamDescriptorByIndex(0, &fSelected, &pSourceStreamDescriptor),
"Failed to get source stream descriptor from presentation descriptor.");

CHECK_HR(pSourceStreamDescriptor->GetMediaTypeHandler(&pSourceMediaTypeHandler),
"Failed to get source media type handler.");

DWORD srcMediaTypeCount = 0;
CHECK_HR(pSourceMediaTypeHandler->GetMediaTypeCount(&srcMediaTypeCount),
"Failed to get source media type count.");

std::cout << "Source media type count: " << srcMediaTypeCount << ", is first stream selected " << fSelected << "." << std::endl;
std::cout << "Default output media type for source reader:" << std::endl;
std::cout << GetMediaTypeDescription(videoSourceOutputType) << std::endl << std::endl;

// ----- Create a compatible media type and set on the source and sink. -----

// Set the video input type on the EVR sink.
CHECK_HR(MFCreateMediaType(&pVideoOutType), "Failed to create video output media type.");
CHECK_HR(pVideoOutType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video), "Failed to set video output media major type.");
CHECK_HR(pVideoOutType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_RGB32), "Failed to set video sub-type attribute on media type.");
CHECK_HR(pVideoOutType->SetUINT32(MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive), "Failed to set interlace mode attribute on media type.");
CHECK_HR(pVideoOutType->SetUINT32(MF_MT_ALL_SAMPLES_INDEPENDENT, TRUE), "Failed to set independent samples attribute on media type.");
CHECK_HR(MFSetAttributeRatio(pVideoOutType, MF_MT_PIXEL_ASPECT_RATIO, 1, 1), "Failed to set pixel aspect ratio attribute on media type.");
CHECK_HR(CopyAttribute(videoSourceOutputType, pVideoOutType, MF_MT_FRAME_SIZE), "Failed to copy video frame size attribute to media type.");
CHECK_HR(CopyAttribute(videoSourceOutputType, pVideoOutType, MF_MT_FRAME_RATE), "Failed to copy video frame rate attribute to media type.");

//CHECK_HR(GetSupportedMediaType(pMediaTypeHandler, &pVideoOutType),
//  "Failed to get supported media type.");

std::cout << "Custom media type defined as:" << std::endl;
std::cout << GetMediaTypeDescription(pVideoOutType) << std::endl << std::endl;

auto doesSinkSupport = pSinkMediaTypeHandler->IsMediaTypeSupported(pVideoOutType, &pHintMediaType);
if (doesSinkSupport != S_OK) {
std::cout << "Sink does not support desired media type." << std::endl;
goto done;
}
else {
CHECK_HR(pSinkMediaTypeHandler->SetCurrentMediaType(pVideoOutType),
  "Failed to set input media type on EVR sink.");
}

// The block below always failed during testing. My guess is the source media type handler
// is not aligned with the video reader somehow.
/*auto doesSrcSupport = pSourceMediaTypeHandler->IsMediaTypeSupported(pVideoOutType, &pHintMediaType);
if (doesSrcSupport != S_OK) {
std::cout << "Source does not support desired media type." << std::endl;
goto done;
}
else {
CHECK_HR(pSourceMediaTypeHandler->SetCurrentMediaType(pVideoOutType),
  "Failed to set output media type on source reader.");
}*/

CHECK_HR(pVideoReader->SetCurrentMediaType((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, NULL, pVideoOutType),
"Failed to set output media type on source reader.");

// ----- Source and sink now configured. Set up remaining infrastructure and then start sampling. -----
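
For reference, the remainder of the program reads samples from the source reader and pushes them into the EVR stream sink. Below is a minimal sketch of what such a loop can look like (not the exact code from the full program): it assumes the variables declared above, that pD3DVideoSample was obtained from the IMFVideoSampleAllocator, that the presentation clock has been started, and that SAMPLE_DURATION is a hypothetical per-frame duration constant in 100-nanosecond units.

IMFSample* pSample = NULL;
IMFMediaBuffer* pSrcBuffer = NULL, * pDstBuffer = NULL;
DWORD streamIndex = 0, flags = 0;
LONGLONG llSampleTime = 0;
BYTE* pSrcData = NULL, * pDstData = NULL;
DWORD srcLength = 0, dstMaxLength = 0;

while (true)
{
    CHECK_HR(pVideoReader->ReadSample(
        MF_SOURCE_READER_FIRST_VIDEO_STREAM,
        0,
        &streamIndex,
        &flags,
        &llSampleTime,
        &pSample),
        "Error reading video sample.");

    if (flags & MF_SOURCE_READERF_ENDOFSTREAM) break;
    if (pSample == NULL) continue;

    // Copy the frame into the Direct3D sample supplied by the EVR's allocator
    // (assumes the source frame fits in the destination buffer).
    CHECK_HR(pSample->ConvertToContiguousBuffer(&pSrcBuffer), "Failed to get contiguous buffer.");
    CHECK_HR(pSrcBuffer->Lock(&pSrcData, NULL, &srcLength), "Failed to lock source buffer.");
    CHECK_HR(pD3DVideoSample->GetBufferByIndex(0, &pDstBuffer), "Failed to get D3D sample buffer.");
    CHECK_HR(pDstBuffer->Lock(&pDstData, &dstMaxLength, NULL), "Failed to lock D3D buffer.");
    memcpy(pDstData, pSrcData, srcLength);
    CHECK_HR(pDstBuffer->SetCurrentLength(srcLength), "Failed to set current length on D3D buffer.");
    pDstBuffer->Unlock();
    pSrcBuffer->Unlock();

    // Stamp the sample and hand it to the EVR stream sink.
    CHECK_HR(pD3DVideoSample->SetSampleTime(llSampleTime), "Failed to set sample time.");
    CHECK_HR(pD3DVideoSample->SetSampleDuration(SAMPLE_DURATION), "Failed to set sample duration.");
    CHECK_HR(pStreamSink->ProcessSample(pD3DVideoSample), "Stream sink ProcessSample failed.");

    pSrcBuffer->Release();
    pDstBuffer->Release();
    pSample->Release();
    pSample = NULL;
}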

2 Answers

0 votes

I needed to manually wire up the colour conversion MFT (I'm fairly sure some Media Foundation scenarios will connect it automatically, but perhaps only when a topology is used) and adjust the timestamps on the Direct3D IMFSample supplied to the EVR.
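
As a rough illustration of wiring the colour converter DSP by hand, here is a minimal sketch (not the code from the working example below). It assumes pInRgb24Type is an RGB24 media type matching the webcam output, pVideoOutType is the RGB32 type from the question (both with matching frame sizes), and pRgb24Sample is a sample read from the webcam; IMFTransform comes from mftransform.h and the DSP's class from wmcodecdsp.h.

IMFTransform* pColorConverter = NULL;
IMFSample* pRgb32Sample = NULL;
IMFMediaBuffer* pRgb32Buffer = NULL;
MFT_OUTPUT_DATA_BUFFER outputDataBuffer = {};
DWORD processOutputStatus = 0;
DWORD sampleSize = VIDEO_WIDTH * VIDEO_HEIGHT * 4; // RGB32 frame size in bytes.

// Create the colour converter DSP and set its input (RGB24) and output (RGB32) types.
CHECK_HR(CoCreateInstance(__uuidof(CColorConvertDMO), NULL, CLSCTX_INPROC_SERVER,
    IID_PPV_ARGS(&pColorConverter)), "Failed to create colour converter MFT.");
CHECK_HR(pColorConverter->SetInputType(0, pInRgb24Type, 0), "Failed to set colour converter input type.");
CHECK_HR(pColorConverter->SetOutputType(0, pVideoOutType, 0), "Failed to set colour converter output type.");

// The colour converter does not allocate output samples, so supply one to write into.
CHECK_HR(MFCreateSample(&pRgb32Sample), "Failed to create RGB32 sample.");
CHECK_HR(MFCreateMemoryBuffer(sampleSize, &pRgb32Buffer), "Failed to create RGB32 buffer.");
CHECK_HR(pRgb32Sample->AddBuffer(pRgb32Buffer), "Failed to add buffer to RGB32 sample.");

// For each RGB24 sample read from the webcam:
CHECK_HR(pColorConverter->ProcessInput(0, pRgb24Sample, 0), "Colour converter ProcessInput failed.");
outputDataBuffer.dwStreamID = 0;
outputDataBuffer.pSample = pRgb32Sample;
CHECK_HR(pColorConverter->ProcessOutput(0, 1, &outputDataBuffer, &processOutputStatus),
    "Colour converter ProcessOutput failed.");
// pRgb32Sample now holds the converted frame, ready to be copied into the EVR's D3D sample.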

Working example.


0 votes

The source reader is usually able to do this conversion itself: RGB24 -> RGB32.
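
That is, you can let the source reader insert the converter by enabling video processing when the reader is created and then asking for RGB32 output. A minimal sketch, assuming pVideoSource is the webcam media source from the question (MF_SOURCE_READER_ENABLE_ADVANCED_VIDEO_PROCESSING needs Windows 8 or later; MF_SOURCE_READER_ENABLE_VIDEO_PROCESSING is the older equivalent):

IMFAttributes* pReaderAttributes = NULL;
IMFMediaType* pReaderOutputType = NULL;

// Allow the source reader to insert converters (colour conversion, deinterlacing etc.) itself.
CHECK_HR(MFCreateAttributes(&pReaderAttributes, 1), "Failed to create source reader attributes.");
CHECK_HR(pReaderAttributes->SetUINT32(MF_SOURCE_READER_ENABLE_ADVANCED_VIDEO_PROCESSING, TRUE),
    "Failed to enable advanced video processing on source reader.");

CHECK_HR(MFCreateSourceReaderFromMediaSource(pVideoSource, pReaderAttributes, &pVideoReader),
    "Failed to create source reader from webcam media source.");

// Ask the reader for RGB32 output; the reader converts from the webcam's native RGB24/I420.
CHECK_HR(MFCreateMediaType(&pReaderOutputType), "Failed to create reader output media type.");
CHECK_HR(pReaderOutputType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video), "Failed to set major type.");
CHECK_HR(pReaderOutputType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_RGB32), "Failed to set RGB32 sub-type.");
CHECK_HR(pVideoReader->SetCurrentMediaType((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, NULL, pReaderOutputType),
    "Failed to set RGB32 output type on source reader.");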

"As far as I can tell the EVR only supports RGB32"

No, that only depends on your video processor: mofo7777 / Stackoverflow

Under the MFVideoEVR project, replace every MFVideoFormat_RGB32 with MFVideoFormat_NV12 and it should work with an NVidia GPU card. Change Sleep(20); to Sleep(40); in Main.cpp (HRESULT DisplayVideo(...)), because the NV12 format is handled more efficiently (that value corresponds to a 25 fps video frame rate).

Regarding your question:

You can do it without the colour conversion MFT. Starting from MFVideoEVR, two things need to be updated:

  • Set a video capture source instead of a video file source
  • Handle the sample times manually, because the capture sample times are not accurate (see the sketch after this list)
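
One possible way to handle the sample times manually (a sketch only, not the exact approach in the linked project) is to re-stamp each capture sample relative to the first one received and give it a fixed duration derived from the frame rate. Here llSampleTime is assumed to be the timestamp returned by IMFSourceReader::ReadSample and pD3DVideoSample the Direct3D sample handed to the EVR.

// Hypothetical re-stamping of capture samples; VIDEO_FRAME_DURATION is the
// per-frame duration in 100-nanosecond units (e.g. 10,000,000 / 30 for 30 fps).
const LONGLONG VIDEO_FRAME_DURATION = 10 * 1000 * 1000 / 30;
LONGLONG llBaseTime = -1;

// For each sample read from the capture source:
if (llBaseTime < 0) {
    llBaseTime = llSampleTime;   // The first sample defines time zero.
}
CHECK_HR(pD3DVideoSample->SetSampleTime(llSampleTime - llBaseTime), "Failed to set sample time.");
CHECK_HR(pD3DVideoSample->SetSampleDuration(VIDEO_FRAME_DURATION), "Failed to set sample duration.");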

Source code here: mofo7777 / Stackoverflow

Under the MFVideoCaptureEVR project.
