ARCore Background Rendering

1. Introduction

SDKs such as Vuforia, ARCore, and EasyAR all composite the camera background with the virtual content. There are several ways to implement this: place a UI element or quad near the camera's far plane and assign the camera image to it; use a CommandBuffer so the camera image is rendered first; or rewrite the shader to change the render order or the depth values. ARCore takes the CommandBuffer approach, but there are quite a few details worth explaining. For comparison, a minimal sketch of the far-plane quad alternative is shown below.
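Here is a minimal sketch of that first alternative (a quad near the camera's far plane). This is not how ARCore works; the class name and the cameraFeed texture are illustrative placeholders, and the quad is simply sized to fill the view frustum at that distance:

using UnityEngine;

// Minimal sketch of the "quad near the far plane" alternative; not ARCore's approach.
public class FarPlaneBackgroundQuad : MonoBehaviour
{
    public Camera targetCamera;     // the AR camera
    public Texture cameraFeed;      // whatever texture holds the device camera image

    private void Start()
    {
        var quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
        quad.transform.SetParent(targetCamera.transform, false);

        // Place the quad just in front of the far plane and scale it to fill the frustum.
        float distance = targetCamera.farClipPlane * 0.95f;
        float height = 2.0f * distance * Mathf.Tan(targetCamera.fieldOfView * 0.5f * Mathf.Deg2Rad);
        quad.transform.localPosition = new Vector3(0.0f, 0.0f, distance);
        quad.transform.localScale = new Vector3(height * targetCamera.aspect, height, 1.0f);

        // An unlit material so the feed is not affected by scene lighting.
        var material = new Material(Shader.Find("Unlit/Texture"));
        material.mainTexture = cameraFeed;
        quad.GetComponent<MeshRenderer>().material = material;
    }
}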

2. ARCore Flow

The background rendering script in ARCore is ARCoreBackgroundRenderer. Its basic flow is:
1) Initialize the CommandBuffer in OnEnable
2) Update the material properties every frame in Update, e.g. the camera background texture
3) Remove the CommandBuffer in OnDisable.
The first and third steps are fairly simple, as the code below shows. Note that the same command buffer is registered for both CameraEvent.BeforeForwardOpaque and CameraEvent.BeforeGBuffer, so the background is drawn first whether the camera uses the forward or the deferred rendering path:

        private void EnableARBackgroundRendering()
        {
            if (BackgroundMaterial == null || m_Camera == null)
            {
                return;
            }

            m_CameraClearFlags = m_Camera.clearFlags;
            m_Camera.clearFlags = CameraClearFlags.Depth;

            m_CommandBuffer = new CommandBuffer();

#if UNITY_ANDROID
            if (SystemInfo.graphicsMultiThreaded && !InstantPreviewManager.IsProvidingPlatform)
            {
                m_CommandBuffer.IssuePluginEvent(ExternApi.ARCoreRenderingUtils_GetRenderEventFunc(),
                                                (int)ApiRenderEvent.WaitOnPostUpdateFence);
#if UNITY_2018_2_OR_NEWER
                // There is a bug in Unity that IssuePluginEvent will reset the opengl state but it
                // doesn't respect the value set to GL.invertCulling. Hence we need to reapply
                // the invert culling in the command buffer for front camera session.
                // Note that the CommandBuffer.SetInvertCulling is only available for 2018.2+.
                var sessionComponent = LifecycleManager.Instance.SessionComponent;
                if (sessionComponent != null &&
                    sessionComponent.DeviceCameraDirection == DeviceCameraDirection.FrontFacing)
                {
                    m_CommandBuffer.SetInvertCulling(true);
                }
#endif
            }

#endif
            m_CommandBuffer.Blit(null,
                BuiltinRenderTextureType.CameraTarget, BackgroundMaterial);

            m_Camera.AddCommandBuffer(CameraEvent.BeforeForwardOpaque, m_CommandBuffer);
            m_Camera.AddCommandBuffer(CameraEvent.BeforeGBuffer, m_CommandBuffer);
        }

        private void DisableARBackgroundRendering()
        {
            if (m_CommandBuffer == null || m_Camera == null)
            {
                return;
            }

            m_Camera.clearFlags = m_CameraClearFlags;

            m_Camera.RemoveCommandBuffer(CameraEvent.BeforeForwardOpaque, m_CommandBuffer);
            m_Camera.RemoveCommandBuffer(CameraEvent.BeforeGBuffer, m_CommandBuffer);
        }

3. Rendering Flow

The rendering flow is explained from two angles: the data side, i.e. the parameters updated every frame in the Update method, and the rendering side, i.e. the material (shader) used by the CommandBuffer.

3.1 Data Updates

The per-frame data update for background rendering works as follows:
1) Set _Brightness. This value drives the short fade animation from the default icon to the camera background when the background first appears.
2) Set _TransitionIconTexTransform, which is computed from the default icon size and the screen size. If the icon is wider or taller than the screen, this guarantees correct cropping; per the comments and the shader, the computation scales proportionally first and then translates so the icon stays centered. As a hypothetical example, a 1024x1024 icon on a 1080x1920 screen gives a transform of roughly (1.05, -0.03, 1.88, -0.44): the screen center maps to the texture center, the icon is drawn at its native pixel size, and anything falling outside [0, 1] is discarded. The computation is:

        /// <summary>
        /// Textures transform used in background shader to get texture uv coordinates based on
        /// screen uv.
        /// The transformation follows these equations:
        /// textureUv.x = transform[0] * screenUv.x + transform[1],
        /// textureUv.y = transform[2] * screenUv.y + transform[3].
        /// </summary>
        /// <returns>The transform.</returns>
        private Vector4 _TextureTransform()
        {
            float transitionWidthTransform = (m_TransitionImageTexture.width - Screen.width) /
                (2.0f * m_TransitionImageTexture.width);
            float transitionHeightTransform = (m_TransitionImageTexture.height - Screen.height) /
                (2.0f * m_TransitionImageTexture.height);
            return new Vector4(
                (float)Screen.width / m_TransitionImageTexture.width,
                transitionWidthTransform,
                (float)Screen.height / m_TransitionImageTexture.height,
                transitionHeightTransform);
        }

The doc comment explains how these four parameters are used.
3) Pass in the video-stream texture, i.e. set _MainTex. This _MainTex is an Android OES texture created with Texture2D.CreateExternalTexture, not a regular GL texture, which is why the shader samples it with samplerExternalOES. It is obtained from Frame.CameraImage.Texture, not from the AcquireCameraImageBytes method; the latter returns YUV data with a size of only 640x480.
4) Pass in _UvTopLeftRight and _UvBottomLeftRight. These play a similar role to 2), because the camera frame's uvs do not map exactly onto the screen. (A condensed sketch of this whole per-frame update follows this list.)
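To make the above concrete, here is a condensed sketch of what the per-frame update (step 2 in section 2) does, assembled from the four items above. Frame.CameraImage.Texture is the SDK call named in the text; the field names, the fade constant, and the Frame.CameraImage.TextureDisplayUvs call are assumptions made for illustration, not the exact ARCoreBackgroundRenderer implementation:

using GoogleARCore;
using UnityEngine;

public class BackgroundUpdateSketch : MonoBehaviour
{
    public Material BackgroundMaterial;
    public Texture2D m_TransitionImageTexture;   // the default transition icon

    private const float k_FadeDuration = 0.5f;   // illustrative fade time, not the SDK's value
    private float m_Brightness = 0.0f;

    private void Update()
    {
        // 1) Fade from the transition icon to the live camera feed.
        m_Brightness = Mathf.MoveTowards(m_Brightness, 1.0f, Time.deltaTime / k_FadeDuration);
        BackgroundMaterial.SetFloat("_Brightness", m_Brightness);

        // 2) Screen-uv to icon-uv transform (the _TextureTransform method above).
        BackgroundMaterial.SetTexture("_TransitionIconTex", m_TransitionImageTexture);
        BackgroundMaterial.SetVector("_TransitionIconTexTransform", _TextureTransform());

        // 3) The OES camera texture created via Texture2D.CreateExternalTexture.
        Texture backgroundTexture = Frame.CameraImage.Texture;
        if (backgroundTexture != null)
        {
            BackgroundMaterial.SetTexture("_MainTex", backgroundTexture);
        }

        // 4) Corner uvs of the camera frame, since it does not map 1:1 onto the screen.
        var uvQuad = Frame.CameraImage.TextureDisplayUvs;   // assumption, see lead-in
        BackgroundMaterial.SetVector("_UvTopLeftRight",
            new Vector4(uvQuad.TopLeft.x, uvQuad.TopLeft.y, uvQuad.TopRight.x, uvQuad.TopRight.y));
        BackgroundMaterial.SetVector("_UvBottomLeftRight",
            new Vector4(uvQuad.BottomLeft.x, uvQuad.BottomLeft.y, uvQuad.BottomRight.x, uvQuad.BottomRight.y));
    }

    private Vector4 _TextureTransform()
    {
        // Same formula as the method quoted above.
        float transitionWidthTransform = (m_TransitionImageTexture.width - Screen.width) /
            (2.0f * m_TransitionImageTexture.width);
        float transitionHeightTransform = (m_TransitionImageTexture.height - Screen.height) /
            (2.0f * m_TransitionImageTexture.height);
        return new Vector4(
            (float)Screen.width / m_TransitionImageTexture.width,
            transitionWidthTransform,
            (float)Screen.height / m_TransitionImageTexture.height,
            transitionHeightTransform);
    }
}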

3.2 Rendering

The shader used for rendering is ARCore/ARBackground. It is fairly simple and contains two passes. The first pass is written in GLSL so that it can sample the OES texture, and it crops the uvs based on the input parameters (both for the initial AR icon and for _MainTex). The second pass, used for Instant Preview, performs a plain texture sample, but swaps the uv axes and flips the x value because Instant Preview's texture is transformed differently.

Shader "ARCore/ARBackground"
{
    Properties
    {
        _MainTex ("Main Texture", 2D) = "white" {}
        _UvTopLeftRight ("UV of top corners", Vector) = (0, 1, 1, 1)
        _UvBottomLeftRight ("UV of bottom corners", Vector) = (0, 0, 1, 0)
    }

    // For GLES3 or GLES2 on device
    SubShader
    {
        Pass
        {
            ZWrite Off
            Cull Off

            GLSLPROGRAM

            #pragma only_renderers gles3 gles

            // #ifdef SHADER_API_GLES3 cannot take effect because
            // #extension is processed before any Unity defined symbols.
            // Use "enable" instead of "require" here, so it only gives a
            // warning but not compile error when the implementation does not
            // support the extension.
            #extension GL_OES_EGL_image_external_essl3 : enable
            #extension GL_OES_EGL_image_external : enable

            uniform vec4 _UvTopLeftRight;
            uniform vec4 _UvBottomLeftRight;

            // Use the same method in UnityCG.cginc to convert from gamma to
            // linear space in glsl.
            vec3 GammaToLinearSpace(vec3 color)
            {
                return color *
                    (color * (color * 0.305306011 + 0.682171111) + 0.012522878);
            }

            #ifdef VERTEX

            varying vec2 textureCoord;
            varying vec2 uvCoord;

            void main()
            {
                vec2 uvTop = mix(_UvTopLeftRight.xy,
                                 _UvTopLeftRight.zw,
                                 gl_MultiTexCoord0.x);
                vec2 uvBottom = mix(_UvBottomLeftRight.xy,
                                    _UvBottomLeftRight.zw,
                                    gl_MultiTexCoord0.x);
                textureCoord = mix(uvTop, uvBottom, gl_MultiTexCoord0.y);
                uvCoord = vec2(gl_MultiTexCoord0.x, gl_MultiTexCoord0.y);

                gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
            }

            #endif

            #ifdef FRAGMENT
            varying vec2 textureCoord;
            varying vec2 uvCoord;
            uniform samplerExternalOES _MainTex;
            uniform sampler2D _TransitionIconTex;
            uniform vec4 _TransitionIconTexTransform;
            uniform float _Brightness;

            void main()
            {
                vec3 mainTexColor;

                #ifdef SHADER_API_GLES3
                mainTexColor = texture(_MainTex, textureCoord).rgb;
                #else
                mainTexColor = textureExternal(_MainTex, textureCoord).rgb;
                #endif

                if (_Brightness < 1.0)
                {
                    mainTexColor = mainTexColor * _Brightness;

                    if (_TransitionIconTexTransform.x > 0.0 &&
                        _TransitionIconTexTransform.z > 0.0)
                    {
                        vec2 uvCoordTex = vec2(uvCoord.x *
                                               _TransitionIconTexTransform.x +
                                               _TransitionIconTexTransform.y,
                                               uvCoord.y *
                                               _TransitionIconTexTransform.z +
                                               _TransitionIconTexTransform.w);

                        vec4 transitionColor = vec4(0.0);
                        if (uvCoordTex.x >= 0.0 &&
                            uvCoordTex.x <= 1.0 &&
                            uvCoordTex.y >= 0.0 &&
                            uvCoordTex.y <= 1.0)
                        {
                            transitionColor = texture2D(_TransitionIconTex,
                                                        uvCoordTex);
                        }

                        if (transitionColor.a > 0.0)
                        {
                            mainTexColor = mix(transitionColor.rgb,
                                               mainTexColor,
                                               _Brightness);
                        }
                    }
                }

#ifndef UNITY_COLORSPACE_GAMMA

                mainTexColor = GammaToLinearSpace(mainTexColor);
#endif
                gl_FragColor = vec4(mainTexColor, 1.0);
            }

            #endif

            ENDGLSL
        }
    }

    // For Instant Preview
    Subshader
    {
        Pass
        {
            ZWrite Off

            CGPROGRAM

            #pragma exclude_renderers gles3 gles
            #pragma vertex vert
            #pragma fragment frag

            #include "UnityCG.cginc"

            uniform float4 _UvTopLeftRight;
            uniform float4 _UvBottomLeftRight;

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct v2f
            {
                float2 uv : TEXCOORD0;
                float4 vertex : SV_POSITION;
            };

            v2f vert(appdata v)
            {
                float2 uvTop = lerp(_UvTopLeftRight.xy,
                                    _UvTopLeftRight.zw, v.uv.x);
                float2 uvBottom = lerp(_UvBottomLeftRight.xy,
                                       _UvBottomLeftRight.zw, v.uv.x);

                v2f o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.uv = lerp(uvTop, uvBottom, v.uv.y);

                // Instant preview's texture is transformed differently.
                o.uv = o.uv.yx;
                o.uv.x = 1.0 - o.uv.x;

                return o;
            }

            sampler2D _MainTex;

            fixed4 frag(v2f i) : SV_Target
            {
                return tex2D(_MainTex, i.uv);
            }

            ENDCG
        }
    }

    FallBack Off
}

4. Conclusion

The above is the flow ARCore uses to render the camera background. The key point is rendering the OES texture; alternatively, the OES texture could first be converted into a regular texture internally.
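A minimal sketch of that last idea: copy the external OES texture into a regular RenderTexture once per frame so other systems can sample it like a normal texture. The copy material is assumed to use a shader that can sample samplerExternalOES (the ARCore/ARBackground shader above would do); the GoogleARCore namespace and the class and field names here are assumptions for illustration:

using GoogleARCore;
using UnityEngine;

// Sketch: convert the OES camera texture into a regular RenderTexture every frame.
public class CameraImageCopier : MonoBehaviour
{
    public Material oesCopyMaterial;   // assumed: a material whose shader samples samplerExternalOES
    private RenderTexture m_Copy;

    private void Update()
    {
        Texture oesTexture = Frame.CameraImage.Texture;
        if (oesTexture == null)
        {
            return;
        }

        if (m_Copy == null)
        {
            m_Copy = new RenderTexture(Screen.width, Screen.height, 0);
        }

        // Graphics.Blit draws a fullscreen quad with the copy material,
        // writing the camera image into the regular RenderTexture.
        Graphics.Blit(oesTexture, m_Copy, oesCopyMaterial);
    }
}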

Reposted from blog.csdn.net/qq_37833413/article/details/124613075