OpenGL.Shader: Zhige Teaches You to Build a Filtered Live-Streaming Client (1) Project Analysis

0. A Few Words Off Topic

2020 has been a surreal year. The Australian bushfires had barely burned out before the Year of the Rat arrived (my least favorite), and then the pneumonia outbreak caused by the novel coronavirus swept across the country, extending the Spring Festival holiday in a way people could once only dream of. The upside is that I can finally finish this series of articles properly, polish the companion project, and open-source it, as my own summary of the second decade of the 21st century. The sudden loss of Kobe Bryant (my youth!) was also a reminder that you never know whether tomorrow or an accident will arrive first, so cherish the present.

1. Project Analysis

A few years ago, during the boom in filtered live streaming, I was fortunate to work on several such projects. I studied the open-source GPUImage project, left a set of reading notes, and gradually found my way into OpenGL/OpenCV. Ever since, I had wanted to build a representative open-source project, both to consolidate my own knowledge and, hopefully, to help others.

The good news is that the project is now basically complete. Its highlights: it targets Android; it bundles a variety of filter effects (most ported from GPUImage and http://www.zealfilter.com/); it captures video from the camera and audio from the microphone, then encodes them into an output byte stream; most of it is written in C++ under the NDK, which also makes the shader algorithms harder to extract; and it uses no third-party libraries at all, which keeps the concepts approachable for newcomers and makes the code easy to port to iOS.

The overall structure of the project looks roughly like this:

With the big picture in mind, let's break down the modules:

CFEScheduler.java: the scheduler that coordinates the Camera, Filter, and Encoder. The business logic lives in this module, which keeps the rest of the code clean.

GpuFilterRender.java: the entry point into native code. It takes over the Surface lifecycle and exposes the filter-management interfaces.

Camera / AudioRecord: system APIs that capture video frames (NV21 format) and audio samples (PCM format).
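As a quick aside on the NV21 format mentioned above: each frame is a full-resolution Y plane followed by an interleaved V/U plane at half resolution, so a frame occupies width × height × 3/2 bytes. A minimal sketch of that layout math (the `Nv21Util` class name is my own, not part of the project):

```java
// Hypothetical helper illustrating the NV21 memory layout;
// the class name is illustrative, not part of the project.
public class Nv21Util {
    // NV21 = full-resolution Y plane + interleaved V/U samples at
    // quarter resolution each, i.e. 1.5 bytes per pixel overall.
    public static int bufferSize(int width, int height) {
        return width * height * 3 / 2;
    }

    // Byte offset where the interleaved VU plane starts
    // (immediately after the Y plane).
    public static int vuPlaneOffset(int width, int height) {
        return width * height;
    }
}
```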

Filter Manage About: a set of C++ files implementing the filter effects; this is the core of filter rendering. The algorithms come from GPUImage and http://www.zealfilter.com.

GL Render: a fully custom GL-Thread-Render module, the main body of the filter-rendering pipeline.

AMediaCodecEncoder: MediaCodec via the NDK. To me this is the most debatable, and most valuable, part: it is full of pitfalls and subtle details, but once you work through them you will have mastered a lot of advanced material, perhaps even enough to get started on Android system development.

2. Show me the code

Let's start with the simple Java layer; it really is simple. The screen layout is roughly as follows:

The logic in CameraFilterEncodeActivity is equally trivial (I almost didn't bother including it):

public class CameraFilterEncodeActivity extends Activity {
    private static final String TAG = "CameraFilterEncodeActivity";
    private CFEScheduler cfeScheduler;

    @Override
    protected void onCreate(@Nullable Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_camera_filter_encode);
        SurfaceView surfaceView = findViewById(R.id.camera_view);
        // Initialize CFEScheduler with the context and the surface used for rendering.
        if (cfeScheduler == null)
            cfeScheduler = new CFEScheduler(this, surfaceView);

        initFilterSpinner();
        initFilterAdjuster();
    }

    SeekBar filterAdjuster;
    private void initFilterAdjuster() {
        filterAdjuster = findViewById(R.id.value_seek_bar);
        filterAdjuster.setMax(100);
        filterAdjuster.setProgress(0);
        filterAdjuster.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
            @Override
            public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
                cfeScheduler.adjustFilterValue(progress, seekBar.getMax());
            }
            @Override
            public void onStartTrackingTouch(SeekBar seekBar) { }
            @Override
            public void onStopTrackingTouch(SeekBar seekBar) { }
        });
    }

    private void initFilterSpinner() {
        // Ask CFEScheduler for the list of currently supported filter names
        String[] mItems = cfeScheduler.getSupportedFiltersName();
        //String[] mItems = this.getResources().getStringArray(R.array.filter_name);
        Spinner filterSpinner = findViewById(R.id.filter_spinner);
        ArrayAdapter<String> adapter = new ArrayAdapter<>(this, android.R.layout.simple_spinner_dropdown_item, mItems);
        filterSpinner.setAdapter(adapter);
        filterSpinner.setOnItemSelectedListener(new AdapterView.OnItemSelectedListener() {
            @Override
            public void onItemSelected(AdapterView<?> parent, View view, int position, long id) {
                // Reset Filter.Adjuster
                filterAdjuster.setProgress(0);
                // Normally you would look up the typeId via getSupportedFilterTypeID(name);
                // here, position happens to be the index into the names[] array,
                // so we can query the typeId directly by index.
                int typeId = cfeScheduler.getSupportedFilterTypeID(position);
                // Set the current filter id, swapping in the new effect.
                cfeScheduler.setFilterType(typeId);
            }
            @Override
            public void onNothingSelected(AdapterView<?> parent) { }
        });
    }

    @Override
    protected void onResume() {
        super.onResume();
        if( cfeScheduler!=null) {
            cfeScheduler.onResume();
        }
    }

    @Override
    protected void onPause() {
        super.onPause();
        if( cfeScheduler!=null) {
            cfeScheduler.onPause();
        }
    }

}

The comments explain what each piece does, so the code should be easy to follow. Next, let's follow the thread into CFEScheduler and look at its implementation.

(PS: we'll follow the video path first; the audio side will be covered later.)

/**
 * Created by zzr on 2019/11/27.
 * CFE : Camera Filter Encode
 * If you don't mind the clutter, this logic could live directly in CameraFilterEncodeActivity.
 */
public class CFEScheduler implements Camera.PreviewCallback, SurfaceHolder.Callback {
    private static final String TAG = "CFEScheduler";
    private WeakReference<Activity> mActivityWeakRef;
    private GpuFilterRender mGpuFilterRender;
    /* Camera / SurfaceView related */
    CFEScheduler(Activity activity, SurfaceView view) {
        mActivityWeakRef = new WeakReference<>(activity);
        mGpuFilterRender = new GpuFilterRender(activity);

        view.getHolder().setFormat(PixelFormat.RGBA_8888);
        view.getHolder().addCallback(this);
    }
    public void onResume() {
        Log.d(TAG, "onResume ...");
        setUpCamera(mCurrentCameraId);
    }
    public void onPause() {
        Log.d(TAG, "onPause ...");
        releaseCamera();
    }

    // ... ... holder callback
    // ... ... (split out below to keep this listing short)

    private int mCurrentCameraId = Camera.CameraInfo.CAMERA_FACING_FRONT;
    private Camera mCameraInstance;

    private void setUpCamera(final int id) {
        mCameraInstance = getCameraInstance(id);
        Camera.Parameters parameters = mCameraInstance.getParameters();
        // Set the preview frame format to NV21 (the default)
        parameters.setPreviewFormat(ImageFormat.NV21);
        if (parameters.getSupportedFocusModes().contains(
                Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE)) {
            parameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
        }
        // Note that I deliberately don't pick a "best" preview size here;
        // adjust by getting supportedPreviewSizes and then choosing
        // the best one for screen size (best fill screen)
        mCameraInstance.setParameters(parameters);

        Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
        Camera.getCameraInfo(mCurrentCameraId, cameraInfo);
        Activity activity = mActivityWeakRef.get();
        int orientation = getCameraDisplayOrientation(activity, cameraInfo);
        Log.i(TAG, "getCameraDisplayOrientation : "+orientation);
        boolean flipHorizontal = cameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT;
        // !!! Set horizontal/vertical flipping here as needed
        mGpuFilterRender.setRotationCamera(orientation, flipHorizontal, false);
    }

    private int getCameraDisplayOrientation(final Activity activity,
                                            @NonNull Camera.CameraInfo info) {
        if(activity == null) return 0;
        int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
        int degrees = 0;
        switch (rotation) {
            case Surface.ROTATION_0:
                degrees = 0;
                break;
            case Surface.ROTATION_90:
                degrees = 90;
                break;
            case Surface.ROTATION_180:
                degrees = 180;
                break;
            case Surface.ROTATION_270:
                degrees = 270;
                break;
        }

        int result;
        if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
            result = (info.orientation + degrees) % 360;
        } else { // back-facing
            result = (info.orientation - degrees + 360) % 360;
        }
        return result;
    }
    /** A safe way to get an instance of the Camera object. */
    private Camera getCameraInstance(final int id) {
        Camera c = null;
        try {
            c = Camera.open(id);
        } catch (Exception e) {
            e.printStackTrace();
        }
        return c;
    }
    private void releaseCamera() {
        if (mCameraInstance != null) { // guard against onPause before setUpCamera succeeds
            mCameraInstance.setPreviewCallback(null);
            mCameraInstance.stopPreview();
            mCameraInstance.release();
            mCameraInstance = null;
        }
    }
}

By default we open the front camera as the video source. I don't pick a "best" preview size here because, following GPUImage's approach, that adjustment is handled inside the shader instead. Finally, GpuFilterRender.setRotationCamera sets the camera's rotation angle and whether horizontal and vertical flips are needed. Next up is the holder callback portion.
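The orientation math in getCameraDisplayOrientation can be lifted out and checked in isolation. This stand-alone sketch (class and method names are mine, purely illustrative) mirrors the formula above: for a front-facing camera the sensor orientation and display rotation add, for a back-facing camera they subtract:

```java
// Stand-alone version of the orientation math used in
// getCameraDisplayOrientation; names are illustrative only.
public class CameraOrientation {
    public static int displayOrientation(int sensorOrientation,
                                         int displayRotationDegrees,
                                         boolean frontFacing) {
        if (frontFacing) {
            // Front camera: sensor and display rotations compound.
            return (sensorOrientation + displayRotationDegrees) % 360;
        }
        // Back camera: compensate in the opposite direction;
        // +360 keeps the intermediate value non-negative.
        return (sensorOrientation - displayRotationDegrees + 360) % 360;
    }
}
```

For example, a typical front sensor mounted at 270° on an unrotated phone yields 270, while the same phone rotated 90° yields 0.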

    // ... holder callback
    private SurfaceTexture mCameraTexture = null;
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        Log.d(TAG, "surfaceCreated ... ");
        try {
            int[] textures = new int[1];
            GLES20.glGenTextures(1, textures, 0);
            mCameraTexture = new SurfaceTexture(textures[0]);
            mCameraInstance.setPreviewTexture(mCameraTexture);
            mCameraInstance.setPreviewCallback(this);
            mCameraInstance.startPreview();
        } catch (Exception e) {
            e.printStackTrace();
        }
        mGpuFilterRender.onSurfaceCreate(holder.getSurface());
    }
    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        Log.d(TAG, "surfaceChanged ... ");
        mGpuFilterRender.onSurfaceChange(holder.getSurface(), width, height);
    }
    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        Log.d(TAG, "surfaceDestroyed ... ");
        mGpuFilterRender.onSurfaceDestroy(holder.getSurface());
        if( mCameraTexture!=null){
            mCameraTexture.release();
        }
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        if( mGpuFilterRender!=null){
            final Camera.Size previewSize = camera.getParameters().getPreviewSize();
            mGpuFilterRender.feedVideoData(data.clone(), previewSize.width, previewSize.height);
        }
    }

In the SurfaceView's three lifecycle callbacks we forward each event to GpuFilterRender, so the native side can track the surface lifecycle. A SurfaceTexture is used as the camera's preview target, and in the preview-data callback a clone of each frame is handed to GpuFilterRender to be cached.
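Cloning each preview frame and handing it to GpuFilterRender implies some buffering between the camera callback thread and the GL thread. The project does this caching on the native side; as a rough idea of the hand-off, here is a bounded queue that drops the oldest frame under pressure instead of blocking the camera callback (the `FrameQueue` class is my own sketch, not the project's actual cache):

```java
import java.util.ArrayDeque;

// Illustrative bounded frame cache; the real project buffers NV21
// data natively, so this class only sketches the producer/consumer idea.
public class FrameQueue {
    private final ArrayDeque<byte[]> frames = new ArrayDeque<>();
    private final int capacity;

    public FrameQueue(int capacity) { this.capacity = capacity; }

    // Called from the camera callback thread.
    public synchronized void offer(byte[] frame) {
        if (frames.size() == capacity) {
            frames.pollFirst(); // drop the oldest frame rather than block
        }
        frames.addLast(frame);
    }

    // Called from the render thread; returns null when empty.
    public synchronized byte[] poll() {
        return frames.pollFirst();
    }

    public synchronized int size() { return frames.size(); }
}
```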

Now let's look at GpuFilterRender.java and its corresponding JNI interface file.

/**
 * Created by zzr on 2019/11/27.
 */
class GpuFilterRender {
    static {
        System.loadLibrary("gpu-filter");
    }

    private Context ctx;
    GpuFilterRender(Context context) {
        ctx = context;
    }

    public native void onSurfaceCreate(Surface surface);

    public native void onSurfaceChange(Surface surface, int width, int height);

    public native void onSurfaceDestroy(Surface surface);

    // Feed video data (NV21)
    public native void feedVideoData(byte[] data,int width,int height);
    // Feed audio data (PCM)
    public native void feedAudioData(byte[] data);
    /**
     * Set the camera rotation and orientation.
     * @param rotation rotation angle in degrees
     * @param flipHorizontal whether to flip horizontally
     * @param flipVertical whether to flip vertically
     */
    public native void setRotationCamera(final int rotation, final boolean flipHorizontal,
                                         final boolean flipVertical);
    // Set the filter type
    public native void setFilterType(int typeId);
    // Adjust the filter strength
    public native void adjustFilterValue(int value,int max);
}

And the corresponding JNI implementation file:

#include <jni.h>
#include <android/native_window.h>
#include <android/native_window_jni.h>
#include "../egl/GLThread.h"
#include "render/GpuFilterRender.h"

GLThread* glThread = NULL;
GpuFilterRender* render = NULL;

extern "C" {

JNIEXPORT void JNICALL
Java_org_zzrblog_gpufilter_GpuFilterRender_onSurfaceCreate(JNIEnv *env, jobject instance, jobject surface) {
    ANativeWindow *nativeWindow = ANativeWindow_fromSurface(env, surface);
    if (render == NULL) {
        render = new GpuFilterRender();
    }
    if (glThread == NULL) {
        glThread = new GLThread();
    }
    glThread->setGLRender(render);
    glThread->onSurfaceCreate(nativeWindow);
}
JNIEXPORT void JNICALL
Java_org_zzrblog_gpufilter_GpuFilterRender_onSurfaceChange(JNIEnv *env, jobject instance, jobject surface,
                                                           jint width, jint height) {
    glThread->onSurfaceChange(width, height);
}
JNIEXPORT void JNICALL
Java_org_zzrblog_gpufilter_GpuFilterRender_onSurfaceDestroy(JNIEnv *env, jobject instance, jobject surface) {
    glThread->onSurfaceDestroy();
    glThread->release();
    delete glThread;
    glThread = NULL;

    if (render != NULL) { // must be != NULL; == NULL here would leak the render object
        delete render;
        render = NULL;
    }
}
JNIEXPORT void JNICALL
Java_org_zzrblog_gpufilter_GpuFilterRender_setRotationCamera(JNIEnv *env, jobject instance,
                                                             jint rotation, jboolean flipHorizontal,
                                                             jboolean flipVertical) {
    // Note: flipVertical here is passed as render->setRotationCamera's flipHorizontal,
    // and flipHorizontal as its flipVertical, because Android delivers preview
    // frames sideways; this mirrors GPUImage's handling.
    if (render == NULL) {
        render = new GpuFilterRender();
    }
    render->setRotationCamera(rotation, flipVertical, flipHorizontal);
}
JNIEXPORT void JNICALL
Java_org_zzrblog_gpufilter_GpuFilterRender_setFilterType(JNIEnv *env, jobject instance, jint typeId) {
    if (render == NULL)
        render = new GpuFilterRender();
    render->setFilter(typeId);
}
JNIEXPORT void JNICALL
Java_org_zzrblog_gpufilter_GpuFilterRender_adjustFilterValue(JNIEnv *env, jobject instance, jint value, jint max) {
    if (render == NULL)
        render = new GpuFilterRender();
    render->adjustFilterValue(value, max);
}
JNIEXPORT void JNICALL
Java_org_zzrblog_gpufilter_GpuFilterRender_feedVideoData(JNIEnv *env, jobject instance,
                                                         jbyteArray array, jint width, jint height) {
    if (render == NULL) return;
    jbyte *nv21_buffer = env->GetByteArrayElements(array, NULL);
    jsize array_len = env->GetArrayLength(array);
    render->feedVideoData(nv21_buffer, array_len, width, height);
    env->ReleaseByteArrayElements(array, nv21_buffer, 0);
}
} // extern "C"

(For the GLThread part, see my earlier article: https://blog.csdn.net/a360940265a/article/details/88600962)
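The flag swap noted in setRotationCamera above can be verified with plain coordinate math: flipping a texture coordinate horizontally before a 90° rotation lands on exactly the same point as flipping it vertically after the rotation, which is why the horizontal/vertical roles trade places once the sideways preview frame is rotated upright. A small sketch (helper names are mine, not project code):

```java
// Illustrates why flip flags swap once the frame is rotated 90°:
// rotate90(flipH(p)) == flipV(rotate90(p)) for any point in [0,1]^2.
// Helper names are illustrative, not part of the project.
public class FlipSwapDemo {
    // Rotate a normalized texture coordinate 90 degrees clockwise.
    public static float[] rotate90(float[] p) {
        return new float[]{ p[1], 1f - p[0] };
    }
    public static float[] flipH(float[] p) {
        return new float[]{ 1f - p[0], p[1] };
    }
    public static float[] flipV(float[] p) {
        return new float[]{ p[0], 1f - p[1] };
    }
}
```

Working it out symbolically: flipH then rotate90 sends (s, t) to (t, s), and rotate90 then flipV also sends (s, t) to (t, s), so the two orderings coincide.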

In the next installment we'll step into GpuFilterRender.cpp, take a closer look at the caveats in Java_org_zzrblog_gpufilter_GpuFilterRender_setRotationCamera, and dig into the core of the project: the filters.

Reposted from blog.csdn.net/a360940265a/article/details/104203321