I'm decoding video and getting YUV 4:2:0 frames. To render them with D3D11 they need to be converted to RGB (or at least I assume the render target view itself can't be YUV).
The YUV frames are in planar format, i.e. the U and V samples are stored in separate planes rather than packed. I'm creating three textures and ShaderResourceViews of type DXGI_FORMAT_R8_UNORM and copying each plane of the frame into its own texture. I then rely on the sampler to account for the size difference between the Y plane and the U/V planes. Black and white looks great, but as soon as color is involved I get an overly green picture:
I'm at a loss as to what I might be doing wrong. I've tried swapping the U and V planes, and I've also tried tweaking the conversion coefficients. I'm following Microsoft's guide on image conversion.
Here's my shader:
min16float4 main(PixelShaderInput input) : SV_TARGET
{
    float y = YChannel.Sample(defaultSampler, input.texCoord).r;
    float u = UChannel.Sample(defaultSampler, input.texCoord).r - 0.5;
    float v = VChannel.Sample(defaultSampler, input.texCoord).r - 0.5;

    float r = y + 1.13983 * v;
    float g = y - 0.39465 * u - 0.58060 * v;
    float b = y + 2.03211 * u;

    return min16float4(r, g, b, 1.f);
}
Creating my ShaderResourceViews:
D3D11_TEXTURE2D_DESC texDesc;
ZeroMemory(&texDesc, sizeof(texDesc));
texDesc.Width = 1670;
texDesc.Height = 626;
texDesc.MipLevels = 1;
texDesc.ArraySize = 1;
texDesc.Format = DXGI_FORMAT_R8_UNORM;
texDesc.SampleDesc.Count = 1;
texDesc.SampleDesc.Quality = 0;
texDesc.Usage = D3D11_USAGE_DYNAMIC;
texDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
texDesc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;
dev->CreateTexture2D(&texDesc, NULL, &pYPictureTexture);
dev->CreateTexture2D(&texDesc, NULL, &pUPictureTexture);
dev->CreateTexture2D(&texDesc, NULL, &pVPictureTexture);
D3D11_SHADER_RESOURCE_VIEW_DESC shaderResourceViewDesc;
shaderResourceViewDesc.Format = DXGI_FORMAT_R8_UNORM;
shaderResourceViewDesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D;
shaderResourceViewDesc.Texture2D.MostDetailedMip = 0;
shaderResourceViewDesc.Texture2D.MipLevels = 1;
dev->CreateShaderResourceView(pYPictureTexture, &shaderResourceViewDesc, &pYPictureTextureResourceView);
dev->CreateShaderResourceView(pUPictureTexture, &shaderResourceViewDesc, &pUPictureTextureResourceView);
dev->CreateShaderResourceView(pVPictureTexture, &shaderResourceViewDesc, &pVPictureTextureResourceView);
And this is how I copy the decoded ffmpeg AVFrame:
int height = 626;
int width = 1670;
D3D11_MAPPED_SUBRESOURCE msY;
D3D11_MAPPED_SUBRESOURCE msU;
D3D11_MAPPED_SUBRESOURCE msV;
devcon->Map(pYPictureTexture, 0, D3D11_MAP_WRITE_DISCARD, 0, &msY);
memcpy(msY.pData, frame->data[0], height * width);
devcon->Unmap(pYPictureTexture, 0);
devcon->Map(pUPictureTexture, 0, D3D11_MAP_WRITE_DISCARD, 0, &msU);
memcpy(msU.pData, frame->data[1], (height*width) / 4);
devcon->Unmap(pUPictureTexture, 0);
devcon->Map(pVPictureTexture, 0, D3D11_MAP_WRITE_DISCARD, 0, &msV);
memcpy(msV.pData, frame->data[2], (height*width) / 4);
devcon->Unmap(pVPictureTexture, 0);
PS: Happy to provide more code on request! I just tried to keep this as concise as possible.
I found the problem! Basically, my equations were wrong.

For anyone trying to convert AV_PIX_FMT_YUVJ420P (planar YUV 4:2:0) to a DXGI_FORMAT_R8G8B8A8_UNORM Direct3D 2D texture:
// Derived from https://msdn.microsoft.com/en-us/library/windows/desktop/dd206750(v=vs.85).aspx
// Section: Converting 8-bit YUV to RGB888
// These offsets are calculated from (16 / 255) and (128 / 255)
float C = y - 0.062745f;
float D = u - 0.501960f;
float E = v - 0.501960f;

float3 rgb;
rgb.r = 1.164383f * C + 1.596027f * E;
rgb.g = 1.164383f * C - 0.391762f * D - 0.812968f * E;
rgb.b = 1.164383f * C + 2.017232f * D;

return saturate(float4(rgb, 1.f));
The exact cause of my problem was a miscalculated green value: it never went below 0.4 or above 0.5 (even for pure white or black). Make sure you always stick to the spec as closely as possible!