Drawing YUV frames on a Surface/TextureView


I have a callback that is invoked continuously on its own Thread (not the main one), since this is 1920 x 1088 (yes, 88), 30 fps video:

@Override
public void onYuvDataReceived(MediaFormat mediaFormat, ByteBuffer byteBuffer, 
                              final int width, final int height) {

From mediaFormat I can deduce whether the colorFormat is COLOR_FormatYUV420SemiPlanar or COLOR_FormatYUV420Planar (I'd prefer to support both, but at least one of them).
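
For reference, this is roughly how I read the color format (a sketch; it assumes the producer fills the standard KEY_COLOR_FORMAT key in mediaFormat):

    int colorFormat = mediaFormat.getInteger(MediaFormat.KEY_COLOR_FORMAT);
    // NV12-style: one Y plane followed by one interleaved UV plane
    boolean semiPlanar = colorFormat == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar;
    // I420-style: one Y plane followed by separate U and V planes
    boolean planar = colorFormat == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar;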

Now I want to draw these frames, ideally on a TextureView, but a SurfaceView would also do. Converting to RGB/Bitmap is not efficient; in my case it can even take 60+ ms per frame (and this is 30 fps video...), so I should stick to some "native way" or "GPU way" (right?).
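
For comparison, the CPU path I am trying to avoid looks roughly like this (a sketch of the YuvImage route I otherwise only use for debug JPEGs; the NV21 byte[] input is an assumption):

    // NV21 bytes -> JPEG -> Bitmap: works, but takes 60+ ms per 1920x1088 frame here
    YuvImage yuvImage = new YuvImage(nv21Bytes, ImageFormat.NV21, width, height, null);
    ByteArrayOutputStream jpegStream = new ByteArrayOutputStream();
    yuvImage.compressToJpeg(new Rect(0, 0, width, height), 90, jpegStream);
    byte[] jpeg = jpegStream.toByteArray();
    Bitmap bitmap = BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length);
    // drawing this Bitmap onto a TextureView canvas is then easy, but far too slow for 30 fps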

The WebRTC way

I found the WebRTC lib very helpful; it contains some breadcrumbs for exactly my rendering case, but I could not get "video" out of it, only the first frame is drawn (correctly, no issues):

    int rowStrideY = width;
    int rowStrideU = width / 2;
    int rowStrideV = width / 2;

    // TODO adjust to colorFormat; this assumes planar I420: Y fills 4/6 of the buffer, U and V 1/6 each
    int basicOffset = byteBuffer.remaining() / 6;
    int offsetY = 0;
    int offsetU = basicOffset * 4;
    int offsetV = basicOffset * 5;

    ByteBuffer i420ByteBuffer = byteBuffer.duplicate();
    i420ByteBuffer.position(offsetY);
    final ByteBuffer dataY = i420ByteBuffer.slice();
    i420ByteBuffer.position(offsetU);
    final ByteBuffer dataU = i420ByteBuffer.slice();
    i420ByteBuffer.position(offsetV);
    final ByteBuffer dataV = i420ByteBuffer.slice();

    JavaI420Buffer javaI420Buffer = JavaI420Buffer.wrap(width, height,
            dataY, rowStrideY,
            dataU, rowStrideU,
            dataV, rowStrideV,
            () -> {
                JniCommon.nativeFreeByteBuffer(i420ByteBuffer);
            });

    VideoFrame frame = new VideoFrame(javaI420Buffer, 0, System.currentTimeMillis()); // note: the constructor expects timestampNs
    surfaceViewRenderer.onFrame(frame);
    //turnOffYuv(); // no crash, but only first frame drawn
}

(adapted from some sources HERE)

SurfaceViewRenderer throws as soon as the second/any further frame is fed:

FATAL EXCEPTION: SurfaceViewRendererEglRenderer
   Process: thats.my.package, PID: 12970
   java.lang.IllegalStateException: buffer is inaccessible
    at java.nio.DirectByteBuffer.slice(DirectByteBuffer.java:159)
    at org.webrtc.JavaI420Buffer.getDataY(JavaI420Buffer.java:118)
    at org.webrtc.VideoFrameDrawer$YuvUploader.uploadFromBuffer(VideoFrameDrawer.java:114)
    at org.webrtc.VideoFrameDrawer.drawFrame(VideoFrameDrawer.java:221)
    at org.webrtc.EglRenderer.renderFrameOnRenderThread(EglRenderer.java:664)
    at org.webrtc.EglRenderer.lambda$im8Sa54i366ODPy-soB9Bg4O-w4(Unknown Source:0)
    at org.webrtc.-$$Lambda$EglRenderer$im8Sa54i366ODPy-soB9Bg4O-w4.run(Unknown Source:2)
    at android.os.Handler.handleCallback(Handler.java:883)
    at android.os.Handler.dispatchMessage(Handler.java:100)
    at org.webrtc.EglRenderer$HandlerWithExceptionCallback.dispatchMessage(EglRenderer.java:103)
    at android.os.Looper.loop(Looper.java:214)
    at android.os.HandlerThread.run(HandlerThread.java:67)
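
My guess is that byteBuffer is only valid for the duration of the callback (e.g. a recycled MediaCodec output buffer), so by the time the render thread slices it again the memory is gone. A copying variant I sketched but have not verified (JavaI420Buffer.allocate is part of the WebRTC Android API; copyPlane is my own hypothetical helper):

    // Copy each plane into memory we own, so the frame outlives the callback.
    JavaI420Buffer copy = JavaI420Buffer.allocate(width, height);
    copyPlane(dataY, copy.getDataY(), width * height);
    copyPlane(dataU, copy.getDataU(), (width / 2) * (height / 2));
    copyPlane(dataV, copy.getDataV(), (width / 2) * (height / 2));
    VideoFrame frame = new VideoFrame(copy, 0, System.nanoTime()); // timestamp in ns
    surfaceViewRenderer.onFrame(frame);
    frame.release(); // onFrame retains what it needs; drop our own reference

    // Hypothetical helper: bulk-copies `size` bytes without disturbing buffer positions.
    private static void copyPlane(ByteBuffer src, ByteBuffer dst, int size) {
        ByteBuffer s = src.duplicate();
        s.limit(s.position() + size);
        dst.duplicate().put(s);
    }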

Some sources suggest creating a VideoTrack, calling addSink, and so on, but I did not manage to get my own working, and I was confused and scared by the native methods in that code.
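
For completeness, the shape of that route as far as I understand those sources (untested sketch; it assumes an already initialized PeerConnectionFactory called factory):

    VideoSource videoSource = factory.createVideoSource(/* isScreencast= */ false);
    VideoTrack videoTrack = factory.createVideoTrack("video0", videoSource);
    videoTrack.addSink(surfaceViewRenderer);
    // frames would then be pushed into the source like this:
    videoSource.getCapturerObserver().onFrameCaptured(frame);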

The OpenGL ES way

Scared off by WebRTC's native entanglements, I turned to some other, "purer" way to reach my goal: OpenGL ES and a GLSurfaceView. I found THIS Renderer and tweaked it a bit; I get video with scrambled colors but correct shapes, probably just a UV problem, but it is smooth...
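
(My current guess at the color problem: the callback delivers NV21, where chroma is one interleaved plane of V,U pairs, while the renderer below splits the tail of the array as if it were planar I420 with separate U and V planes. If my reading of the layout is right, a de-interleaving copy would look roughly like this:)

    // NV21 layout: [ Y: width*height bytes ][ V,U interleaved: width*height/2 bytes ]
    for (int i = 0; i < LENGTH_4; i++) {
        vData[i] = data[LENGTH + 2 * i];     // V comes first in NV21
        uData[i] = data[LENGTH + 2 * i + 1];
    }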

In the Fragment's onCreateView:

mGLSurfaceView = rootView.findViewById(R.id.GLSurfaceView);
mGLSurfaceView.setEGLContextClientVersion(2);
mRenderer = new NV21Renderer();
mGLSurfaceView.setRenderer(mRenderer);
mGLSurfaceView.setPreserveEGLContextOnPause(true);
mGLSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);

Frame delivery:

@Override
public void onYuvData(MediaFormat mediaFormat, byte[] data, int dataSize, int width, int height) {
    // data[] in here is NV21
    //YuvImage yuvImage = new YuvImage(data, ImageFormat.NV21, width, height, null);
    // mediaFormat contains "original" colorFormat
    mGLSurfaceView.queueEvent(new Runnable() {
        @Override
        public void run() {
            mRenderer.onPreviewFrame(data);
            mGLSurfaceView.requestRender();
        }
    });
}
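
(One more thing I am unsure about here: data may be recycled by the producer before the GL thread runs the posted Runnable. A defensive copy would be the cautious variant, assuming dataSize covers the whole frame; uses java.util.Arrays:)

    @Override
    public void onYuvData(MediaFormat mediaFormat, byte[] data, int dataSize, int width, int height) {
        final byte[] copy = Arrays.copyOf(data, dataSize); // own the bytes before leaving the callback
        mGLSurfaceView.queueEvent(() -> {
            mRenderer.onPreviewFrame(copy);
            mGLSurfaceView.requestRender();
        });
    }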

The Renderer:

public class NV21Renderer implements GLSurfaceView.Renderer {
    public static final int recWidth = 1920;
    public static final int recHeight = 1088;

    private static final int LENGTH = recWidth * recHeight;
    private static final int LENGTH_4 = recWidth * recHeight / 4;

    private static final int U_INDEX = LENGTH;
    private static final int V_INDEX = LENGTH + LENGTH_4;

    private int[] yTextureNames;
    private int[] uTextureNames;
    private int[] vTextureNames;

    private final FloatBuffer mVertices;
    private final ShortBuffer mIndices;

    private int mProgramObject;
    private int mPositionLoc;
    private int mTexCoordLoc;

    private int yTexture;
    private int uTexture;
    private int vTexture;

    private final ByteBuffer yBuffer;
    private final ByteBuffer uBuffer;
    private final ByteBuffer vBuffer;

    byte[] ydata = new byte[LENGTH];
    byte[] uData = new byte[LENGTH_4];
    byte[] vData = new byte[LENGTH_4];

    private boolean surfaceCreated = false;
    private boolean dirty = false; // prevent drawing when no frame

    public NV21Renderer() {

        mVertices = ByteBuffer.allocateDirect(mVerticesData.length * 4)
                .order(ByteOrder.nativeOrder()).asFloatBuffer();
        mVertices.put(mVerticesData).position(0);

        mIndices = ByteBuffer.allocateDirect(mIndicesData.length * 2)
                .order(ByteOrder.nativeOrder()).asShortBuffer();
        mIndices.put(mIndicesData).position(0);

        yBuffer = ByteBuffer.allocateDirect(LENGTH);
        uBuffer = ByteBuffer.allocateDirect(LENGTH_4/* * 2*/);
        vBuffer = ByteBuffer.allocateDirect(LENGTH_4);
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        Timber.d("onSurfaceCreated");

        GLES20.glEnable(GLES20.GL_TEXTURE_2D); // note: an ES 1.x call; in ES 2.0 this only raises GL_INVALID_ENUM

        GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);

        // Define a simple shader program for our point.
        final String vShaderStr = vertexShader;
        final String fShaderStr = fragmentShader;
        IntBuffer frameBuffer = IntBuffer.allocate(1);
        IntBuffer renderBuffer = IntBuffer.allocate(1);
        GLES20.glGenFramebuffers(1, frameBuffer);
        GLES20.glGenRenderbuffers(1, renderBuffer);
        GLES20.glActiveTexture(GLES20.GL_ACTIVE_TEXTURE); // note: GL_ACTIVE_TEXTURE is a query enum, not a texture unit; GL_TEXTURE0 was probably meant
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, frameBuffer.get(0));
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
        GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, renderBuffer.get(0));

        GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER, GLES20.GL_DEPTH_COMPONENT16,
                recWidth, recHeight);

        IntBuffer parameterBufferHeight = IntBuffer.allocate(1);
        IntBuffer parameterBufferWidth = IntBuffer.allocate(1);
        GLES20.glGetRenderbufferParameteriv(GLES20.GL_RENDERBUFFER, GLES20.GL_RENDERBUFFER_WIDTH, parameterBufferWidth);
        GLES20.glGetRenderbufferParameteriv(GLES20.GL_RENDERBUFFER, GLES20.GL_RENDERBUFFER_HEIGHT, parameterBufferHeight);
        GLES20.glFramebufferRenderbuffer(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_RENDERBUFFER, renderBuffer.get(0));
        if (GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER) != GLES20.GL_FRAMEBUFFER_COMPLETE) {
            Timber.w("gl frame buffer status != frame buffer complete %s",
                    GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER));
        }
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

        mProgramObject = loadProgram(vShaderStr, fShaderStr);

        // Get the attribute locations
        mPositionLoc = GLES20.glGetAttribLocation(mProgramObject, "a_position");
        mTexCoordLoc = GLES20.glGetAttribLocation(mProgramObject, "a_texCoord");

        GLES20.glEnable(GLES20.GL_TEXTURE_2D);
        yTexture = GLES20.glGetUniformLocation(mProgramObject, "y_texture");
        yTextureNames = new int[1];
        GLES20.glGenTextures(1, yTextureNames, 0);
        int yTextureName = yTextureNames[0];

        GLES20.glEnable(GLES20.GL_TEXTURE_2D);
        uTexture = GLES20.glGetUniformLocation(mProgramObject, "u_texture");
        uTextureNames = new int[1];
        GLES20.glGenTextures(1, uTextureNames, 0);
        int uTextureName = uTextureNames[0];

        GLES20.glEnable(GLES20.GL_TEXTURE_2D);
        vTexture = GLES20.glGetUniformLocation(mProgramObject, "v_texture");
        vTextureNames = new int[1];
        GLES20.glGenTextures(1, vTextureNames, 0);
        int vTextureName = vTextureNames[0];

        surfaceCreated = true;
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        Timber.d("onSurfaceChanged width:" + width + " height:" + height +
                " surfaceCreated:" + surfaceCreated + "dirty:" + dirty);
        GLES20.glActiveTexture(GLES20.GL_ACTIVE_TEXTURE);
        GLES20.glViewport(0, 0, width, height);
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
    }

    @Override
    public final void onDrawFrame(GL10 gl) {
        Timber.d("onDrawFrame surfaceCreated:" + surfaceCreated + " dirty:" + dirty);
        if (!surfaceCreated || !dirty) return;

        // Clear the color buffer
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

        // Use the program object
        GLES20.glUseProgram(mProgramObject);

        // Load the vertex position
        mVertices.position(0);
        GLES20.glVertexAttribPointer(mPositionLoc, 3, GLES20.GL_FLOAT, false, 5 * 4, mVertices);
        // Load the texture coordinate
        mVertices.position(3);
        GLES20.glVertexAttribPointer(mTexCoordLoc, 2, GLES20.GL_FLOAT, false, 5 * 4, mVertices);

        GLES20.glEnableVertexAttribArray(mPositionLoc);
        GLES20.glEnableVertexAttribArray(mTexCoordLoc);

        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, yTextureNames[0]);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE,
                recWidth, recHeight, 0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, yBuffer);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glActiveTexture(GLES20.GL_TEXTURE1);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, yTextureNames[0]);
        GLES20.glUniform1i(yTexture, 0);

        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, uTextureNames[0]);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE,
                recWidth, recHeight, 0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, uBuffer);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glActiveTexture(GLES20.GL_TEXTURE1 + 2);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, uTextureNames[0]);
        GLES20.glUniform1i(uTexture, 2);

        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, vTextureNames[0]);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE,
                recWidth, recHeight, 0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, vBuffer);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glActiveTexture(GLES20.GL_TEXTURE1 + 1);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, vTextureNames[0]);
        GLES20.glUniform1i(vTexture, 1);

        GLES20.glDrawElements(GLES20.GL_TRIANGLES, 6, GLES20.GL_UNSIGNED_SHORT, mIndices);

        dirty = false;
    }

    private int loadShader(int type, String shaderSrc) {
        int shader;
        int[] compiled = new int[1];

        // Create the shader object
        shader = GLES20.glCreateShader(type);
        if (shader == 0) {
            return 0;
        }
        // Load the shader source
        GLES20.glShaderSource(shader, shaderSrc);
        // Compile the shader
        GLES20.glCompileShader(shader);
        // Check the compile status
        GLES20.glGetShaderiv(shader, GLES20.GL_COMPILE_STATUS, compiled, 0);

        if (compiled[0] == 0) {
            Timber.d("loadShader %s", GLES20.glGetShaderInfoLog(shader));
            GLES20.glDeleteShader(shader);
            return 0;
        }
        return shader;
    }

    private int loadProgram(String vertShaderSrc, String fragShaderSrc) {
        int vertexShader;
        int fragmentShader;
        int programObject;
        int[] linked = new int[1];

        // Load the vertex/fragment shaders
        vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, vertShaderSrc);
        if (vertexShader == 0) {
            return 0;
        }

        fragmentShader = loadShader(GLES20.GL_FRAGMENT_SHADER, fragShaderSrc);
        if (fragmentShader == 0) {
            GLES20.glDeleteShader(vertexShader);
            return 0;
        }

        // Create the program object
        programObject = GLES20.glCreateProgram();

        if (programObject == 0) {
            return 0;
        }

        GLES20.glAttachShader(programObject, vertexShader);
        GLES20.glAttachShader(programObject, fragmentShader);

        // Link the program
        GLES20.glLinkProgram(programObject);

        // Check the link status
        GLES20.glGetProgramiv(programObject, GLES20.GL_LINK_STATUS, linked, 0);

        if (linked[0] == 0) {
            Timber.e("Error linking program:%s", GLES20.glGetProgramInfoLog(programObject));
            GLES20.glDeleteProgram(programObject);
            return 0;
        }

        // Free up no longer needed shader resources
        GLES20.glDeleteShader(vertexShader);
        GLES20.glDeleteShader(fragmentShader);

        return programObject;
    }

    public void onPreviewFrame(byte[] data) {
        System.arraycopy(data, 0, ydata, 0, LENGTH);
        yBuffer.put(ydata);
        yBuffer.position(0);

        System.arraycopy(data, U_INDEX, uData, 0, LENGTH_4);
        uBuffer.put(uData);
        uBuffer.position(0);

        System.arraycopy(data, V_INDEX, vData, 0, LENGTH_4);
        vBuffer.put(vData);
        vBuffer.position(0);

        dirty = true;
    }

    private static final String vertexShader =
            "attribute vec4 a_position;                         \n" +
                    "attribute vec2 a_texCoord;                         \n" +
                    "varying vec2 v_texCoord;                           \n" +

                    "void main(){                                       \n" +
                    "   gl_Position = a_position;                       \n" +
                    "   v_texCoord = a_texCoord;                        \n" +
                    "}                                                  \n";

    private static final String fragmentShader =
            "#ifdef GL_ES                                       \n" +
                    "precision highp float;                             \n" +
                    "#endif                                             \n" +

                    "varying vec2 v_texCoord;                           \n" +
                    "uniform sampler2D y_texture;                       \n" +
                    "uniform sampler2D u_texture;                       \n" +
                    "uniform sampler2D v_texture;                       \n" +

                    "void main (void){                                  \n" +
                    "   float r, g, b, y, u, v;                         \n" +

                    // GL_LUMINANCE replicates each Y byte into the R, G and B components,
                    // so we read it back from R (G or B would work equally well); see
                    //https://stackoverflow.com/questions/12130790/yuv-to-rgb-conversion-by-fragment-shader/17615696#17615696
                    //and https://stackoverflow.com/questions/22456884/how-to-render-androids-yuv-nv21-camera-image-on-the-background-in-libgdx-with-o
                    "   y = texture2D(y_texture, v_texCoord).r;         \n" +

                    //Since we use GL_LUMINANCE, each component sits in its own texture
                    "   u = texture2D(u_texture, v_texCoord).r - 0.5;  \n" +
                    "   v = texture2D(v_texture, v_texCoord).r - 0.5;  \n" +


                    //The numbers are just YUV to RGB conversion constants
                    "   r = y + 1.13983*v;                              \n" +
                    "   g = y - 0.39465*u - 0.58060*v;                  \n" +
                    "   b = y + 2.03211*u;                              \n" +

                    //We finally set the RGB color of our pixel
                    "   gl_FragColor = vec4(r, g, b, 1.0);              \n" +
                    "}                                                  \n";

    private static final float[] mVerticesData = {
            -1.f, 1.f, 0.0f, // Position 0
            0.0f, 0.0f, // TexCoord 0
            -1.f, -1.f, 0.0f, // Position 1
            0.0f, 1.0f, // TexCoord 1
            1.f, -1.f, 0.0f, // Position 2
            1.0f, 1.0f, // TexCoord 2
            1.f, 1.f, 0.0f, // Position 3
            1.0f, 0.0f // TexCoord 3
    };
    private static final short[] mIndicesData = {0, 1, 2, 0, 2, 3};
}

But on some Android 10 devices, drawing even the first frame can cause

signal 11 (SIGSEGV), code 2 (SEGV_ACCERR), fault addr 0xa4d4a710

with the stack pointing into glTexImage2D.
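
(A guess while re-reading the code: uBuffer and vBuffer hold only recWidth * recHeight / 4 bytes, yet their glTexImage2D calls declare a full recWidth x recHeight texture, so the driver may read far past the end of the buffer. Since 4:2:0 chroma planes are quarter-size, presumably the uploads should be:)

    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE,
            recWidth / 2, recHeight / 2, 0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, uBuffer);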

On Android 13 (a Pixel) I always get

signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x00000000

thrown inside onSurfaceCreated. I tried to make this safe with the surfaceCreated and dirty flags, but without success...

So: how do I draw YUV/NV21, from a plain byte array/buffer, as a picture/video on screen?

P.S. The stream itself is fine: I can, for example, encode it to h264 and drop it into an mp4 file or stream it out, no problem, or inspect it via YuvImage-generated JPEGs. I just want to draw it in real time.

android opengl-es-2.0 glsurfaceview webrtc-android android-textureview