Encoding custom image frames to video with GStreamer


I am writing a C++ application in which I have a set of frames (as unsigned char*) that I need to encode into a video using a GStreamer H.265 encoder running on the GPU. Most GStreamer examples work directly with a camera, but in my case there is no camera.

Starting from some examples I put together a video encoder, but the frames are not being pushed into the video file and the output video is empty.

Here is the code I implemented:

GstElement *pipeline, *appsrc, *videoconvert, *x264enc, *mp4mux, *filesink, *autovideosink;
GstCaps *caps;
GstBuffer *buf;
GstMapInfo map;

gst_init(nullptr, nullptr);

pipeline = gst_pipeline_new("mypipeline");

// Create elements
appsrc = gst_element_factory_make("appsrc", "mysource");
videoconvert = gst_element_factory_make("videoconvert", "myconvert");
x264enc = gst_element_factory_make("x264enc", "myencoder");
mp4mux = gst_element_factory_make("mp4mux", "mymux");
filesink = gst_element_factory_make("filesink", "myfileoutput");

if (!pipeline || !appsrc || !videoconvert || !x264enc || !mp4mux || !filesink) {
    g_printerr("Not all elements could be created.\n");
    // return -1;
}

// Set the properties for filesink
g_object_set(filesink, "location", "output.mp4", NULL);

// Build the pipeline
gst_bin_add(GST_BIN(pipeline), appsrc);
gst_bin_add(GST_BIN(pipeline), videoconvert);
gst_bin_add(GST_BIN(pipeline), x264enc);
gst_bin_add(GST_BIN(pipeline), mp4mux);
gst_bin_add(GST_BIN(pipeline), filesink);

// Link the elements
gst_element_link(appsrc, videoconvert);
gst_element_link(videoconvert, x264enc);
gst_element_link(x264enc, mp4mux);
gst_element_link(mp4mux, filesink);

caps = gst_caps_from_string("video/x-raw, format=(string)BGR, width=(int)800, height=(int)600, framerate=(fraction)30/1");

gst_element_set_state(pipeline, GST_STATE_PLAYING);

for (int i = 0; i < 10; i++) {
    buf = gst_buffer_new_and_alloc(800 * 600 * 3); // Assuming BGR format
    gst_buffer_map(buf, &map, GST_MAP_WRITE);
    memset(map.data, i, 800 * 600 * 3); // Filling with dummy data
    gst_buffer_unmap(buf, &map);
    gst_app_src_push_buffer(GST_APP_SRC(appsrc), buf);
}

gst_app_src_end_of_stream(GST_APP_SRC(appsrc));

GstBus *bus = gst_element_get_bus(pipeline);
GstMessage *msg = gst_bus_timed_pop(bus, GST_CLOCK_TIME_NONE);
if (msg != NULL)
    gst_message_unref(msg);

gst_object_unref(bus);
gst_element_set_state(pipeline, GST_STATE_NULL);
gst_object_unref(pipeline);

It seems that the call

gst_app_src_push_buffer

does nothing, and I don't know why. Is there an error somewhere?

c++ gstreamer video-encoding
1 Answer

There are several problems in your code:

  1. The buffers are not properly timestamped (PTS/DTS/duration). The appsrc element has a do-timestamp property (set it to true) to timestamp buffers automatically, but you cannot use it here, because you push the buffers into the pipeline without first waiting for the pipeline to reach the PLAYING state. Alternatively, you could also set appsrc's is-live property to true, which makes appsrc push the buffers downstream once PLAYING is reached, but the auto-timestamped buffers would then play back at a much higher rate than the configured frame rate (30 fps). If you want 30 fps, something like the following sets the buffer timestamps correctly:
    buf->pts = GST_MSECOND * 30 * i;
    buf->dts = buf->pts;
    buf->duration = GST_MSECOND * 33;
    gst_app_src_push_buffer(GST_APP_SRC(appsrc), buf);
  2. The appsrc format property should be set to GST_FORMAT_TIME.
  3. Caps are created but never set on the appsrc element.

The following fixed version works correctly:

#include <gst/app/gstappsrc.h>
#include <gst/gst.h>

#include <cassert>

int main() {
  GstElement *pipeline, *appsrc, *videoconvert, *x264enc, *mp4mux, *filesink,
      *autovideosink;
  GstCaps *caps;
  GstBuffer *buf;
  GstMapInfo map;

  gst_init(nullptr, nullptr);

  pipeline = gst_pipeline_new("mypipeline");

  // Create elements
  appsrc = gst_element_factory_make("appsrc", "mysource");
  videoconvert = gst_element_factory_make("videoconvert", "myconvert");
  x264enc = gst_element_factory_make("x264enc", "myencoder");
  mp4mux = gst_element_factory_make("mp4mux", "mymux");
  filesink = gst_element_factory_make("filesink", "myfileoutput");

  if (!pipeline || !appsrc || !videoconvert || !x264enc || !mp4mux ||
      !filesink) {
    g_printerr("Not all elements could be created.\n");
    // return -1;
  }

  // Set the properties for filesink
  g_object_set(filesink, "location", "output.mp4", NULL);

  // Build the pipeline
  gst_bin_add(GST_BIN(pipeline), appsrc);
  gst_bin_add(GST_BIN(pipeline), videoconvert);
  gst_bin_add(GST_BIN(pipeline), x264enc);
  gst_bin_add(GST_BIN(pipeline), mp4mux);
  gst_bin_add(GST_BIN(pipeline), filesink);

  // Link the elements
  gst_element_link(appsrc, videoconvert);
  gst_element_link(videoconvert, x264enc);
  gst_element_link(x264enc, mp4mux);
  gst_element_link(mp4mux, filesink);

  caps =
      gst_caps_from_string("video/x-raw, format=(string)BGR, width=(int)800, "
                           "height=(int)600, framerate=(fraction)30/1");

  g_object_set(appsrc, "caps", caps, nullptr);
  gst_caps_unref(caps);
  g_object_set(appsrc, "format", GST_FORMAT_TIME, nullptr);

  gst_element_set_state(pipeline, GST_STATE_PLAYING);

  for (int i = 0; i < 10; i++) {
    buf = gst_buffer_new_allocate(nullptr, 800 * 600 * 3, nullptr); // Assuming BGR format
    gst_buffer_map(buf, &map, GST_MAP_WRITE);
    assert(map.size == 800*600*3);
    memset(map.data, i%255, 800 * 600 * 3); // Filling with dummy data
    gst_buffer_unmap(buf, &map);

    buf->pts = GST_MSECOND * 30 * i;
    buf->dts = buf->pts;
    buf->duration = GST_MSECOND * 33;
    gst_app_src_push_buffer(GST_APP_SRC(appsrc), buf);
  }

  gst_app_src_end_of_stream(GST_APP_SRC(appsrc));

  GstBus *bus = gst_element_get_bus(pipeline);
  GstMessage *msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE, static_cast<GstMessageType>(GST_MESSAGE_ERROR | GST_MESSAGE_EOS ));
  if (GST_MESSAGE_TYPE (msg) == GST_MESSAGE_ERROR) {
    g_error ("An error occurred! Re-run with the GST_DEBUG=*:WARN environment "
        "variable set for more details.");
  }
  
  if (msg != NULL)
    gst_message_unref(msg);

  gst_object_unref(bus);
  gst_element_set_state(pipeline, GST_STATE_NULL);
  gst_object_unref(pipeline);
}

Compiling, running, and testing (on macOS):

➜  ~ clang++ main.cpp $(pkg-config --libs --cflags  gstreamer-base-1.0 gstreamer-app-1.0)
➜  ~ ffprobe output.mp4
ffprobe version 6.0 Copyright (c) 2007-2023 the FFmpeg developers
  built with Apple clang version 14.0.3 (clang-1403.0.22.14.1)
  configuration: --prefix=/opt/homebrew/Cellar/ffmpeg/6.0_1 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libaribb24 --enable-libbluray --enable-libdav1d --enable-libjxl --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libsvtav1 --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-videotoolbox --enable-audiotoolbox --enable-neon
  libavutil      58.  2.100 / 58.  2.100
  libavcodec     60.  3.100 / 60.  3.100
  libavformat    60.  3.100 / 60.  3.100
  libavdevice    60.  1.100 / 60.  1.100
  libavfilter     9.  3.100 /  9.  3.100
  libswscale      7.  1.100 /  7.  1.100
  libswresample   4. 10.100 /  4. 10.100
  libpostproc    57.  1.100 / 57.  1.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'output.mp4':
  Metadata:
    major_brand     : mp42
    minor_version   : 0
    compatible_brands: mp42mp41isomiso2
    creation_time   : 2023-11-03T23:15:15.000000Z
    encoder         : x264
  Duration: 00:01:16.50, start: 0.000000, bitrate: 14 kb/s
  Stream #0:0[0x1](und): Video: h264 (High 4:4:4 Predictive) (avc1 / 0x31637661), yuv444p(tv, bt709, progressive), 800x600 [SAR 1:1 DAR 4:3], 11 kb/s, 33.33 fps, 33.33 tbr, 3k tbn (default)
    Metadata:
      creation_time   : 2023-11-03T23:15:15.000000Z
      handler_name    : VideoHandler
      vendor_id       : [0][0][0][0]
➜  ~

I see you are writing the code in C++. Consider also using RAII techniques to avoid managing resources by hand. A simple RAII wrapper can help, simplifying the code and making it less error-prone. For example:

#include <memory>
#include <type_traits>

template <auto Fn>
using FunctionObj = std::integral_constant<decltype(Fn), Fn>;

template <typename T, auto Fun>
using ResourceReleasedByFunction = std::unique_ptr<T, FunctionObj<Fun>>;

using ScopedGChar = ResourceReleasedByFunction<gchar, g_free>;
using ScopedGstElement = ResourceReleasedByFunction<GstElement, gst_object_unref>;
using ScopedGstCaps = ResourceReleasedByFunction<GstCaps, gst_caps_unref>;
using ScopedGstSample = ResourceReleasedByFunction<GstSample, gst_sample_unref>;
using ScopedGstBuffer = ResourceReleasedByFunction<GstBuffer, gst_buffer_unref>;

Usage example:

ScopedGstElement appsrc{gst_element_factory_make("appsrc", "mysource")};
...
// no need to call gst_object_unref at the end.