An Introduction to the Android Camera2 API and the Photo/Video Capture Workflow

Starting with Android 5.0, Google introduced a brand-new camera framework, Camera2 (android.hardware.camera2), and deprecated the old Camera1 framework (android.hardware.Camera). Camera1's sparse API and poor flexibility had long since stopped meeting the needs of increasingly complex camera features. Camera2 marked a major shift for camera applications: its goal is to expose far more camera control to the application layer, making it possible to build higher-quality camera apps.

Architecture of Camera1

android.hardware.Camera

Three operating modes:

  1. Preview
  2. Capture
  3. Video recording

Limitations of Camera1

  • Hard to add new features, e.g. fast capture or zero-shutter-lag capture
  • No per-frame control over capture parameters
  • No RAW-format capture

Architecture of Camera2

android.hardware.camera2

Features of Camera2

  • Gives the user finer control over focus, exposure, and so on
  • Each video frame can be controlled independently
  • Sensor RAW data can be saved
  • More flexible image post-processing

Camera1 vs Camera2


The main Camera2 classes

  • CameraManager: the top-level manager class; used to enumerate the system's cameras, open a camera, and so on
  • CameraCharacteristics: describes the capabilities of a specific camera; obtained through CameraManager
  • CameraDevice: represents a system camera device
  • CameraCaptureSession: the session between the app and the camera; preview, still capture, and recording all go through a session, with results delivered via the inner StateCallback and CaptureCallback classes
  • CaptureRequest and CaptureRequest.Builder: camera settings and control; still capture, preview, and recording are all driven by submitting requests


Camera2 in Practice – Taking Photos


The photo capture flow


Obtaining a CameraManager

mCameraManager = (CameraManager) 
                 activity.getSystemService(Context.CAMERA_SERVICE);

Querying for a camera

private String getCameraId(CAMERA camera) {
    int lensFacing = (camera == CAMERA.EXT) ?
            CameraCharacteristics.LENS_FACING_FRONT :
            CameraCharacteristics.LENS_FACING_BACK;
    try {
        for (String cameraId : mCameraManager.getCameraIdList()) {
            CameraCharacteristics characteristics
                    = mCameraManager.getCameraCharacteristics(cameraId);

            // LENS_FACING may be null on some devices; guard before unboxing.
            Integer facing = characteristics.get(CameraCharacteristics.LENS_FACING);
            if (facing != null && facing == lensFacing) {
                return cameraId;
            }
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    return "";
}

Opening the camera device

String cameraId = getCameraId(camera);
mCameraManager.openCamera(cameraId, mDeviceStateCallback, mBackgroundHandler);

Creating the session

SurfaceTexture texture = mTextureView.getSurfaceTexture();
texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
Surface surface = new Surface(texture);
mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(
                             CameraDevice.TEMPLATE_PREVIEW);
mPreviewRequestBuilder.addTarget(surface);
mCameraDevice.createCaptureSession(Arrays.asList(surface, mImageReader.getSurface()),
        new CameraCaptureSession.StateCallback() {
            @Override
            public void onConfigured(@NonNull CameraCaptureSession session) {
                mCaptureSession = session;
                updatePreview();
            }
            @Override
            public void onConfigureFailed(@NonNull CameraCaptureSession session) {
            }
        }, mBackgroundHandler);
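mPreviewSize above is assumed to have been chosen from the output sizes the camera reports via CameraCharacteristics. A minimal, framework-free sketch of that selection step (modeled on the samples' chooseVideoSize, using plain int[]{width, height} pairs instead of android.util.Size so it can be shown standalone) might look like:

```java
// Hypothetical helper, not part of the framework: pick the first candidate
// with a 16:9 aspect ratio that is no wider than maxWidth. Candidates are
// assumed to be listed largest-first, as the camera typically reports them.
final class SizeChooser {
    static int[] chooseSize(int[][] candidates, int maxWidth) {
        for (int[] size : candidates) {
            if (size[0] == size[1] * 16 / 9 && size[0] <= maxWidth) {
                return size;
            }
        }
        // Fall back to the last (smallest) reported size.
        return candidates[candidates.length - 1];
    }
}
```

With candidates {3840x2160, 1920x1080, 1280x720, 640x480} and a cap of 1920, this selects 1920x1080.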

Requesting preview

private void updatePreview() {
    ...
    mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,
            CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
    setAutoFlash(mPreviewRequestBuilder);
    mPreviewRequest = mPreviewRequestBuilder.build();
    mCaptureSession.setRepeatingRequest(mPreviewRequest,
            mCaptureCallback, mBackgroundHandler);
    ...
 }

Take Picture

final CaptureRequest.Builder captureBuilder =
        mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(mImageReader.getSurface());

captureBuilder.set(CaptureRequest.CONTROL_AF_MODE,
        CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
setAutoFlash(captureBuilder);

int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, getOrientation(rotation));

CameraCaptureSession.CaptureCallback captureCallback
        = new CameraCaptureSession.CaptureCallback() {

    @Override
    public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                                   @NonNull CaptureRequest request,
                                   @NonNull TotalCaptureResult result) {
    }
};

mCaptureSession.stopRepeating();
mCaptureSession.abortCaptures();
mCaptureSession.capture(captureBuilder.build(), captureCallback, null);
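The getOrientation(rotation) call above comes from the Camera2Basic sample: it combines the display rotation with the sensor orientation reported by CameraCharacteristics.SENSOR_ORIENTATION to compute JPEG_ORIENTATION. A framework-free sketch (a plain array standing in for the sample's SparseIntArray; ROTATION_0..ROTATION_270 are the constants 0..3):

```java
final class JpegOrientation {
    // Display rotation (Surface.ROTATION_0..ROTATION_270, i.e. 0..3) mapped to
    // a base rotation, as in the Camera2Basic sample's ORIENTATIONS table.
    static final int[] ORIENTATIONS = {90, 0, 270, 180};

    // Combine display rotation with the sensor orientation (usually 90 for
    // back-facing cameras); the sample's formula adds 270 and wraps at 360.
    static int getOrientation(int rotation, int sensorOrientation) {
        return (ORIENTATIONS[rotation] + sensorOrientation + 270) % 360;
    }
}
```

For a sensor mounted at 90 degrees and the device held upright (ROTATION_0), this yields 90.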

Saving the image

private final ImageReader.OnImageAvailableListener mOnImageAvailableListener
        = new ImageReader.OnImageAvailableListener() {

    @Override
    public void onImageAvailable(ImageReader reader) {
        mBackgroundHandler.post(new ImageSaver(reader.acquireNextImage(), mFile));
    }

};
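ImageSaver above is a small Runnable from the Camera2Basic sample that copies the JPEG bytes out of the Image's first plane into mFile. A framework-free sketch of the saving step (a byte[] standing in for the ByteBuffer that image.getPlanes()[0].getBuffer() would return) might look like:

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

// Hypothetical stand-in for the sample's ImageSaver: write an already
// extracted JPEG byte buffer to the target file on the background thread.
class JpegSaver implements Runnable {
    private final byte[] jpegBytes;
    private final File file;

    JpegSaver(byte[] jpegBytes, File file) {
        this.jpegBytes = jpegBytes;
        this.file = file;
    }

    @Override
    public void run() {
        try (FileOutputStream out = new FileOutputStream(file)) {
            out.write(jpegBytes);
        } catch (IOException e) {
            e.printStackTrace();
        }
        // With a real android.media.Image, image.close() must be called here
        // so the ImageReader can reuse the buffer.
    }
}
```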

Camera2 in Practice – Video Recording

Flow

  • Open the camera device, just as for still capture
  • Set the parameters and create the MediaRecorder
  • Obtain the MediaRecorder's input surface and create the capture session
  • Have the session send a repeating request to drive the video stream
  • Start the MediaRecorder

System block diagram


Configuring the MediaRecorder

mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mMediaRecorder.setOutputFile(mNextVideoAbsolutePath);
mMediaRecorder.setVideoEncodingBitRate(10000000);
mMediaRecorder.setVideoFrameRate(30);
mMediaRecorder.setVideoSize(mVideoSize.getWidth(), mVideoSize.getHeight());
mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
int rotation = activity.getWindowManager().getDefaultDisplay().getRotation();
switch (mSensorOrientation) {
    case SENSOR_ORIENTATION_DEFAULT_DEGREES:
        mMediaRecorder.setOrientationHint(DEFAULT_ORIENTATIONS.get(rotation));
        break;
    case SENSOR_ORIENTATION_INVERSE_DEGREES:
        mMediaRecorder.setOrientationHint(INVERSE_ORIENTATIONS.get(rotation));
        break;
}
mMediaRecorder.prepare();
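The DEFAULT_ORIENTATIONS and INVERSE_ORIENTATIONS tables above come from the Camera2Video sample: they map the display rotation to a setOrientationHint value for sensors mounted at 90 degrees ("default") versus 270 degrees ("inverse"). A framework-free sketch with plain arrays instead of SparseIntArray:

```java
final class OrientationHints {
    static final int SENSOR_ORIENTATION_DEFAULT_DEGREES = 90;
    static final int SENSOR_ORIENTATION_INVERSE_DEGREES = 270;
    // Indexed by display rotation 0..3 (ROTATION_0..ROTATION_270),
    // matching the Camera2Video sample's tables.
    static final int[] DEFAULT_ORIENTATIONS = {90, 0, 270, 180};
    static final int[] INVERSE_ORIENTATIONS = {270, 180, 90, 0};

    // Returns the hint MediaRecorder.setOrientationHint would receive.
    static int orientationHint(int sensorOrientation, int rotation) {
        switch (sensorOrientation) {
            case SENSOR_ORIENTATION_DEFAULT_DEGREES:
                return DEFAULT_ORIENTATIONS[rotation];
            case SENSOR_ORIENTATION_INVERSE_DEGREES:
                return INVERSE_ORIENTATIONS[rotation];
            default:
                return 0;
        }
    }
}
```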

Creating the session

SurfaceTexture texture = mTextureView.getSurfaceTexture();
texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
mPreviewBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
List<Surface> surfaces = new ArrayList<>();
Surface previewSurface = new Surface(texture);
surfaces.add(previewSurface);
mPreviewBuilder.addTarget(previewSurface);
Surface recorderSurface = mMediaRecorder.getSurface();
surfaces.add(recorderSurface);
mPreviewBuilder.addTarget(recorderSurface);
mCameraDevice.createCaptureSession(surfaces, new CameraCaptureSession.StateCallback() {
    @Override
    public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
        mPreviewSession = cameraCaptureSession;
        updatePreview();
        getActivity().runOnUiThread(new Runnable() {
            @Override
            public void run() {
                mMediaRecorder.start();
            }
        });
    }

    @Override
    public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
    }
}, mBackgroundHandler);

Recording with MediaCodec

Flow

  • Instantiate a MediaCodec as the H.264 encoder and obtain its input surface
  • Create a SurfaceTexture via OpenGL to receive the camera's image output
  • Create the capture session, passing in the OpenGL surface
  • Send a repeating request to obtain a continuous video stream
  • OpenGL renders the camera output texture onto the MediaCodec input surface
  • MediaCodec encodes the input surface and outputs an H.264 stream
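The encoder setup in the first step is not shown elsewhere in this post; per the framework documentation, configuring a MediaCodec for surface input looks roughly like the fragment below. The 1920x1080 size, bit rate, and frame rate are illustrative values, not requirements; this is device-only configuration and is shown as a sketch.

```java
// Configure an H.264 encoder that takes its input from a Surface.
MediaFormat format = MediaFormat.createVideoFormat(
        MediaFormat.MIMETYPE_VIDEO_AVC, 1920, 1080);
// Surface input: frames arrive via OpenGL, not via input byte buffers.
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, 10000000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

MediaCodec encoder = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
// createInputSurface() must be called after configure() and before start();
// OpenGL then renders camera frames onto this surface for encoding.
Surface encoderInputSurface = encoder.createInputSurface();
encoder.start();
```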

System block diagram


Creating the OpenGL surface

int texture = GLDrawer2D.initTex();
mInputSurface = new SurfaceTexture(texture);
mInputSurface.setDefaultBufferSize(1920, 1080);
mInputSurface.setOnFrameAvailableListener(EGLRenderer.this);

Creating the capture session

Surface surface = new Surface(surfaceTexture);
mCaptureBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
mCaptureBuilder.addTarget(surface);
mCameraDevice.createCaptureSession(Collections.singletonList(surface),
        new CameraCaptureSession.StateCallback() {
    @Override
    public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
        mCaptureSession = cameraCaptureSession;
        updatePreview();
    }
    @Override
    public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
    }
}, mBackgroundHandler);

Rendering frames

@Override //OnFrameAvailableListener
public void onFrameAvailable(SurfaceTexture surfaceTexture) {
    mRenderHandler.sendEmptyMessage(MSG_UPDATE_FRAME);
}

private void drawFrame() {
    mInputSurface.updateTexImage();
    mInputSurface.getTransformMatrix(mTmpMatrix);
    mTextureController.setMatrix(mTmpMatrix);
    mEncoderSurface.makeCurrent();
    GLES20.glViewport(0, 0, 1920, 1080);
    mTextureController.draw();
    mEncoderSurface.setPresentationTime(mInputSurface.getTimestamp());
    if (mGroupOsd != null) {
        mGroupOsd.draw();
    }
    if (mFrameListener != null) {
        mFrameListener.frameAvailableSoon();
    }
    mEncoderSurface.swapBuffers();
}

Differences from MediaRecorder recording

  • MediaRecorder: the MediaRecorder's input surface is handed to the camera, so image data is written directly to the MediaRecorder surface
  • MediaCodec: OpenGL rendering is required; the camera output must first go to an intermediate SurfaceTexture created via OpenGL, and OpenGL then renders that texture onto the MediaCodec input surface

Usage differences between Camera2 and Camera1

References:
googlesamples/android-Camera2Basic
googlesamples/android-Camera2Video
