The Image Class and YUV in Android Image and Video Programming

The Image class was introduced in Android API 19, but it only came into its own with the API 21 enhancements around CameraDevice and MediaCodec. API 21 introduced Camera2 and deprecated the old Camera API, establishing Image as the carrier for raw frames coming from the camera; the hardware codec class MediaCodec, together with ImageReader (which wraps Image), gained full support for it as well. It is likely that Image will come to unify the management of Android's otherwise messy intermediate image data ("intermediate image data" meaning things like the various YUV formats that are produced and destroyed during processing).

YUV represents a color space with three components: Y for luminance (brightness) and U and V for chrominance (color). Unlike RGB, where every pixel carries its own independent R, G and B values, YUV variants differ in how densely U and V are sampled, giving formats such as YUV444, YUV422 and YUV420. In YUV420, every pixel has its own luminance value (Y), while each group of four pixels shares a single U and a single V value. For example, a 4x4 image in YUV420 has 16 Y values, 4 U values and 4 V values.
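The arithmetic above can be captured in a few lines; a minimal standalone sketch (plain Java, no Android dependencies; the class and method names are illustrative):

```java
public class Yuv420Size {
    // For YUV420, every group of four pixels shares one U and one V sample,
    // so a frame needs width*height Y bytes plus (width*height)/4 bytes each
    // of U and V: 1.5 bytes per pixel in total.
    static int ySize(int width, int height) { return width * height; }
    static int uSize(int width, int height) { return width * height / 4; }
    static int vSize(int width, int height) { return width * height / 4; }
    static int frameSize(int width, int height) { return width * height * 3 / 2; }

    public static void main(String[] args) {
        // The 4x4 example from the text: 16 Y values, 4 U values, 4 V values.
        System.out.println(ySize(4, 4) + " " + uSize(4, 4) + " " + vSize(4, 4));
    }
}
```

For a 1920x1080 frame this gives 1920 * 1080 * 3 / 2 = 3110400 bytes, which matches the usual "1.5x" rule of thumb.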


YUV420 is a family of formats rather than a single one: it does not by itself determine the order in which the color data is stored. Depending on storage order, YUV420 splits into several formats, such as YUV420Planar, YUV420PackedPlanar, YUV420SemiPlanar and YUV420PackedSemiPlanar, all of which carry exactly the same information. For a 4x4 image, any YUV420 format stores 16 Y values, 4 U values and 4 V values; only the arrangement of Y, U and V differs. I420 (a YUV420Planar format) stores YYYYYYYYYYYYYYYY UUUU VVVV, while NV21 (a YUV420SemiPlanar format) stores YYYYYYYYYYYYYYYY VU VU VU VU.
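The difference in ordering can be demonstrated with plain Java (a toy sketch; the class name and sample values are made up):

```java
public class Yuv420Layouts {
    // Same Y, U, V samples, two different byte orders.
    // I420 (planar):       Y...Y U...U V...V
    // NV21 (semi-planar):  Y...Y VU VU VU VU
    static byte[] toI420(byte[] y, byte[] u, byte[] v) {
        byte[] out = new byte[y.length + u.length + v.length];
        System.arraycopy(y, 0, out, 0, y.length);
        System.arraycopy(u, 0, out, y.length, u.length);
        System.arraycopy(v, 0, out, y.length + u.length, v.length);
        return out;
    }

    static byte[] toNV21(byte[] y, byte[] u, byte[] v) {
        byte[] out = new byte[y.length + u.length + v.length];
        System.arraycopy(y, 0, out, 0, y.length);
        for (int i = 0; i < u.length; i++) {
            out[y.length + 2 * i] = v[i];     // V first
            out[y.length + 2 * i + 1] = u[i]; // then U
        }
        return out;
    }
}
```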

Camera2 delivers Image data as YUV_420_888, while the legacy Camera API defaults to NV21, itself a YUV420 variant (Android also supports 422 encodings). Calling getFormat() on a Camera2 Image returns YUV_420_888. YUV_420_888 is exposed as three planes obtained via getPlanes(): plane 0 is Y, plane 1 is U and plane 2 is V. As in any YUV420 format, four Y values correspond to one U and one V, i.e. four pixels share four Y values, one U and one V. The key difference from other YUV420 layouts is that each U and each V sample may occupy two bytes (pixelStride == 2); the extra byte is padding, or on many devices it actually holds the corresponding sample of the other chroma channel because the U and V buffers overlap. The total size is then width x height x 2 rather than the usual x 1.5. Converting to another YUV format therefore requires stripping out those extra bytes first, or alternatively overwriting U's extra bytes with the V samples (or V's with U) to build an interleaved plane.
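Extracting the compact chroma samples from such a plane amounts to reading every pixelStride-th byte; a sketch with made-up stand-in data (no Android APIs involved, the plane contents imitate what Plane.getBuffer() would hold):

```java
public class ChromaDeinterleave {
    // Reads the valid chroma samples out of a plane whose pixelStride is 2:
    // samples sit at byte offsets 0, 2, 4, ...; the bytes in between belong to
    // the other chroma channel (or are padding) and are skipped.
    static byte[] compact(byte[] plane, int pixelStride, int sampleCount) {
        byte[] out = new byte[sampleCount];
        for (int i = 0; i < sampleCount; i++) {
            out[i] = plane[i * pixelStride];
        }
        return out;
    }
}
```

Note that for a 4x4 frame the plane holds only 7 bytes for 4 samples, because the final interleaved byte is omitted, which is why the loop indexes by sample count rather than by plane length.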

Planar

image format: 35
get data from 3 planes
pixelStride 1
rowStride 1920
width 1920
height 1080
buffer size 2088960
Finished reading data from plane 0
pixelStride 1
rowStride 960
width 1920
height 1080
buffer size 522240
Finished reading data from plane 1
pixelStride 1
rowStride 960
width 1920
height 1080
buffer size 522240
Finished reading data from plane 2

A 4x4 image: data size 16 + 4 + 4

YYYY
YYYY
YYYY
YYYY

UUUU

VVVV

SemiPlanar, the layout of the YUV_420_888 that Android Image outputs by default (on most devices)

image format: 35
get data from 3 planes
pixelStride 1
rowStride 1920
width 1920
height 1080
buffer size 2088960
Finished reading data from plane 0
pixelStride 2
rowStride 1920
width 1920
height 1080
buffer size 1044479
Finished reading data from plane 1
pixelStride 2
rowStride 1920
width 1920
height 1080
buffer size 1044479
Finished reading data from plane 2

A 4x4 image: data size 16 + 7 + 7; the final interleaved byte of U and of V is omitted

YYYY
YYYY
YYYY
YYYY

U0xffU0xff 
U0xffU

V0xffV0xff
V0xffV

 

Common YUV420 storage formats

P (planar) means U and V are stored separately; SP (semi-planar) means U and V are interleaved.

Among the planar formats, YV12 stores V first and YU12 stores U first; YU12 is also called I420 (IYUV in OpenCV). Both belong to 420p.

Among the interleaved formats, NV21 stores V first and NV12 stores U first. The legacy Android Camera API uses NV21, while iOS uses NV12; both belong to 420sp.

 

In summary

Converting Image YUV_420_888 to another format takes roughly three steps:

1. Obtain the Y, U and V plane data via getPlanes().
2. Strip the extra bytes from the U and V data to get compact U and V arrays.
3. Recombine Y, U and V in the order the target format requires.

Note: when the output needs to be 420sp, it is tempting to skip step 2 and instead overwrite U's extra bytes with the V samples (or V's with U) before concatenating with Y. This fails: because the last extra byte of each chroma plane is omitted, the write runs one byte past the end of the buffer. Allocating a new buffer and interleaving U and V into it works fine.
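The safe alternative mentioned in the note, allocating a fresh buffer and interleaving into it, is only a few lines (an illustrative sketch; the inputs are the compact U and V arrays from step 2):

```java
public class UvInterleave {
    // Interleaves compact U and V arrays into a new VU buffer (NV21 chroma
    // order). Writing V into the U plane's padding bytes instead would run one
    // byte past the end, since the plane omits its final padding byte.
    static byte[] interleaveVu(byte[] u, byte[] v) {
        byte[] out = new byte[u.length + v.length];
        for (int i = 0; i < u.length; i++) {
            out[2 * i] = v[i];
            out[2 * i + 1] = u[i];
        }
        return out;
    }
}
```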

Converting YUV_420_888 to NV21:

    // Copies an Image into a tightly packed NV21 ByteBuffer, honoring each
    // plane's rowStride and pixelStride as well as the crop rectangle.
    private ByteBuffer imageToByteBuffer(final Image image) {
        final Rect crop = image.getCropRect();
        final int width = crop.width();
        final int height = crop.height();

        final Image.Plane[] planes = image.getPlanes();
        final int bufferSize = width * height * ImageFormat.getBitsPerPixel(ImageFormat.YUV_420_888) / 8;
        final ByteBuffer output = ByteBuffer.allocateDirect(bufferSize);

        int channelOffset = 0;
        int outputStride = 0;

        for (int planeIndex = 0; planeIndex < planes.length; planeIndex++) {
            if (planeIndex == 0) {
                channelOffset = 0;
                outputStride = 1;
            } else if (planeIndex == 1) {
                channelOffset = width * height + 1;
                outputStride = 2;
            } else if (planeIndex == 2) {
                channelOffset = width * height;
                outputStride = 2;
            }

            final ByteBuffer buffer = planes[planeIndex].getBuffer();
            final int rowStride = planes[planeIndex].getRowStride();
            final int pixelStride = planes[planeIndex].getPixelStride();
            byte[] rowData = new byte[rowStride];

            final int shift = (planeIndex == 0) ? 0 : 1;
            final int widthShifted = width >> shift;
            final int heightShifted = height >> shift;

            buffer.position(rowStride * (crop.top >> shift) + pixelStride * (crop.left >> shift));

            for (int row = 0; row < heightShifted; row++) {
                final int length;

                if (pixelStride == 1 && outputStride == 1) {
                    length = widthShifted;
                    buffer.get(output.array(), channelOffset, length);
                    channelOffset += length;
                } else {
                    length = (widthShifted - 1) * pixelStride + 1;
                    buffer.get(rowData, 0, length);

                    for (int col = 0; col < widthShifted; col++) {
                        output.array()[channelOffset] = rowData[col * pixelStride];
                        channelOffset += outputStride;
                    }
                }

                if (row < heightShifted - 1) {
                    buffer.position(buffer.position() + rowStride - length);
                }
            }
        }

        return output;
    }

Using OpenCV for the conversion improves efficiency; in testing it was close to an order of magnitude faster than the Java version.

extern "C" JNIEXPORT void JNICALL
Java_com_hxdl_coco_ai_HXAIEngineNative_RGBA2BGRA(JNIEnv *env, jclass type, jbyteArray rgba,
                                                 jint width, jint height, jintArray bgra) {
    jbyte *_rgba = env->GetByteArrayElements(rgba, 0);
    jint *_bgra = env->GetIntArrayElements(bgra, 0);

    // cvtColor with CV_RGBA2BGRA expects a 4-channel source Mat
    cv::Mat mRgba(height, width, CV_8UC4, (unsigned char *) _rgba);
    cv::Mat mBgra(height, width, CV_8UC4, (unsigned char *) _bgra);

    cvtColor(mRgba, mBgra, CV_RGBA2BGRA);

    env->ReleaseIntArrayElements(bgra, _bgra, 0);
    env->ReleaseByteArrayElements(rgba, _rgba, 0);
}
//420sp, nv21: Y first, then interleaved VU: YYYYYYYY VU VU
extern "C" JNIEXPORT void JNICALL
Java_com_hxdl_coco_ai_AIEngineNative_yuv2rgbaNv21(JNIEnv *env, jclass type, jint width, jint height,
                                                  jbyteArray yuv, jintArray rgba) {
    jbyte *_yuv = env->GetByteArrayElements(yuv, 0);
    jint *_bgra = env->GetIntArrayElements(rgba, 0);

    cv::Mat myuv(height + height / 2, width, CV_8UC1, (uchar *) _yuv);
    cv::Mat mrgba(height, width, CV_8UC4, (uchar *) _bgra);

    //cvtColor(myuv, mrgba, CV_YUV420sp2RGBA);
    cvtColor(myuv, mrgba, CV_YUV2RGBA_NV21);
    env->ReleaseIntArrayElements(rgba, _bgra, 0);
    env->ReleaseByteArrayElements(yuv, _yuv, 0);
}
//420p, corresponds to I420: planar, U before V: YYYYYYYY UU VV
extern "C" JNIEXPORT void JNICALL
Java_com_hxdl_coco_ai_AIEngineNative_yuv2rgbaI420(JNIEnv *env, jclass type, jint width, jint height,
                                                  jbyteArray yuv, jintArray rgba) {
    jbyte *_yuv = env->GetByteArrayElements(yuv, 0);
    jint *_bgra = env->GetIntArrayElements(rgba, 0);

    cv::Mat myuv(height + height / 2, width, CV_8UC1, (uchar *) _yuv);
    cv::Mat mrgba(height, width, CV_8UC4, (uchar *) _bgra);

    cvtColor(myuv, mrgba, CV_YUV2RGBA_I420);  //CV_YUV2RGBA_IYUV
    env->ReleaseIntArrayElements(rgba, _bgra, 0);
    env->ReleaseByteArrayElements(yuv, _yuv, 0);
}

//420sp, corresponds to NV12: interleaved, U before V: YYYYYYYY UV UV
extern "C" JNIEXPORT void JNICALL
Java_com_hxdl_coco_ai_AIEngineNative_yuv2rgbaNv12(JNIEnv *env, jclass type, jint width, jint height,
                                                  jbyteArray yuv, jintArray rgba) {
    jbyte *_yuv = env->GetByteArrayElements(yuv, 0);
    jint *_bgra = env->GetIntArrayElements(rgba, 0);

    cv::Mat myuv(height + height / 2, width, CV_8UC1, (uchar *) _yuv);
    cv::Mat mrgba(height, width, CV_8UC4, (uchar *) _bgra);

    //cvtColor(myuv, mrgba, CV_YUV420sp2RGBA);
    cvtColor(myuv, mrgba, CV_YUV2RGBA_NV12);
    env->ReleaseIntArrayElements(rgba, _bgra, 0);
    env->ReleaseByteArrayElements(yuv, _yuv, 0);
}

//to BGRA
extern "C" JNIEXPORT void JNICALL
Java_com_hxdl_coco_ai_AIEngineNative_yuv2bgraNv12(JNIEnv *env, jclass type, jint width, jint height,
                                                  jbyteArray yuv, jintArray bgra) {
    jbyte *_yuv = env->GetByteArrayElements(yuv, 0);
    jint *_bgra = env->GetIntArrayElements(bgra, 0);

    cv::Mat myuv(height + height / 2, width, CV_8UC1, (uchar *) _yuv);
    cv::Mat mbgra(height, width, CV_8UC4, (uchar *) _bgra);

    cvtColor(myuv, mbgra, CV_YUV2BGRA_NV12);
    env->ReleaseIntArrayElements(bgra, _bgra, 0);
    env->ReleaseByteArrayElements(yuv, _yuv, 0);
}
//bgr
extern "C" JNIEXPORT void JNICALL
Java_com_hxdl_coco_ai_AIEngineNative_yuv2bgrNv12(JNIEnv *env, jclass type, jint width, jint height,
                                                 jbyteArray yuv, jintArray bgr) {
    jbyte *_yuv = env->GetByteArrayElements(yuv, 0);
    jint *_bgr = env->GetIntArrayElements(bgr, 0);

    cv::Mat myuv(height + height / 2, width, CV_8UC1, (uchar *) _yuv);
    cv::Mat mbgr(height, width, CV_8UC3, (uchar *) _bgr);

    cvtColor(myuv, mbgr, CV_YUV2BGR_NV12);
    env->ReleaseIntArrayElements(bgr, _bgr, 0);
    env->ReleaseByteArrayElements(yuv, _yuv, 0);
}
//bgra
extern "C" JNIEXPORT void JNICALL
Java_com_hxdl_coco_ai_AIEngineNative_yuv2bgraNv21(JNIEnv *env, jclass type, jint width,
                                                  jint height, jbyteArray yuv, jintArray bgra) {
    jbyte *_yuv = env->GetByteArrayElements(yuv, 0);
    jint *_bgra = env->GetIntArrayElements(bgra, 0);

    cv::Mat myuv(height + height / 2, width, CV_8UC1, (uchar *) _yuv);
    cv::Mat mbgra(height, width, CV_8UC4, (uchar *) _bgra);

    cvtColor(myuv, mbgra, CV_YUV2BGRA_NV21);
    env->ReleaseIntArrayElements(bgra, _bgra, 0);
    env->ReleaseByteArrayElements(yuv, _yuv, 0);
}
//bgr
extern "C" JNIEXPORT void JNICALL
Java_com_hxdl_coco_ai_AIEngineNative_yuv2bgrNv21(JNIEnv *env, jclass type, jint width,
                                                 jint height, jbyteArray yuv, jintArray bgr) {
    jbyte *_yuv = env->GetByteArrayElements(yuv, 0);
    jint *_bgr = env->GetIntArrayElements(bgr, 0);

    cv::Mat myuv(height + height / 2, width, CV_8UC1, (uchar *) _yuv);
    cv::Mat mbgr(height, width, CV_8UC3, (uchar *) _bgr);

    cvtColor(myuv, mbgr, CV_YUV2BGR_NV21);
    env->ReleaseIntArrayElements(bgr, _bgr, 0);
    env->ReleaseByteArrayElements(yuv, _yuv, 0);
}

Java conversion library

public final class ImageUtil {
    private static final String TAG = ImageUtil.class.getSimpleName();

    /**
     * YUV420p I420
     */
    public static final int YUV420PI420 = 0;

    /**
     * YUV420SP NV12
     */
    public static final int YUV420SPNV12 = 1;

    /**
     * YUV420SP NV21
     */
    public static final int YUV420SPNV21 = 2;
    

    /**
     * Reused scratch buffers: fast repeated conversion with less allocation overhead and no OOM risk
     */
    private static byte[] Bytes_y = null;
    private static byte[] uBytes = null;
    private static byte[] vBytes = null;
    private static byte[] Bytes_uv = null;


    /**
     * Converts an Image in YUV_420_888 to NV12, NV21 or I420 (YU12)
     * @param image source image
     * @param type target format: YUV420PI420, YUV420SPNV12 or YUV420SPNV21
     * @param yuvBytes output buffer, width * height * 3 / 2 bytes
     */
    @RequiresApi(api = Build.VERSION_CODES.KITKAT)
    public static void getBytesFromImageAsType(Image image, int type, byte[] yuvBytes) {
        try {
            //Get the source planes; for YUV data, planes.length == 3
            //The actual data in plane[i] may have byte[].length <= capacity (total buffer size)
            final Image.Plane[] planes = image.getPlanes();

            //Valid data width: in general, image width <= rowStride, which is also
            //why byte[].length <= capacity, so we only take the width portion
            int width = image.getWidth();
            int height = image.getHeight();

            //The final YUV data is assembled here; it needs 1.5x the pixel count since Y:U:V is 4:1:1
            //byte[] yuvBytes = new byte[width * height * ImageFormat.getBitsPerPixel(ImageFormat.YUV_420_888) / 8];
            //Current write position in the destination array
            int dstIndex = 0;

            //Temporary storage for the U and V samples
            //byte[] uBytes = new byte[width * height / 4];
            //byte[] vBytes = new byte[width * height / 4];
            if (uBytes == null) {
                uBytes = new byte[width * height / 4];
            }
            if (vBytes == null) {
                vBytes = new byte[width * height / 4];
            }

            if (Bytes_y == null){
                Bytes_y = new byte[width * height];
            }
            if (Bytes_uv == null){
                Bytes_uv = new byte[width * height / 4];
            }

            int uIndex = 0;
            int vIndex = 0;

            int pixelsStride, rowStride;
            for (int i = 0; i < planes.length; i++) {
                pixelsStride = planes[i].getPixelStride();
                rowStride = planes[i].getRowStride();

                ByteBuffer buffer = planes[i].getBuffer();

                //If pixelsStride == 2, typically Y's buffer length = 640*480 while UV's = 640*480/2 - 1
                //Y's data is contiguous; the U buffer is the V buffer shifted left by one byte,
                //and in both only the even positions hold valid data
                byte[] bytes;
                if (buffer.capacity() == width*height) {
                    bytes = Bytes_y;
                }else if(buffer.capacity() == width*height/4){
                    bytes = Bytes_uv;
                }else{
                    bytes = new byte[buffer.capacity()];
                }
                buffer.get(bytes);

                int srcIndex = 0;
                if (i == 0) {
                    //Copy out the entire valid Y region directly; it could also be staged in a temporary array and copied in the next step
                    for (int j = 0; j < height; j++) {
                        System.arraycopy(bytes, srcIndex, yuvBytes, dstIndex, width);
                        srcIndex += rowStride;
                        dstIndex += width;
                    }
                } else if (i == 1) {
                    //Pick out the U samples according to pixelsStride
                    for (int j = 0; j < height / 2; j++) {
                        for (int k = 0; k < width / 2; k++) {
                            uBytes[uIndex++] = bytes[srcIndex];
                            srcIndex += pixelsStride;
                        }
                        if (pixelsStride == 2) {
                            srcIndex += rowStride - width;
                        } else if (pixelsStride == 1) {
                            srcIndex += rowStride - width / 2;
                        }
                    }
                } else if (i == 2) {
                    //Pick out the V samples according to pixelsStride
                    for (int j = 0; j < height / 2; j++) {
                        for (int k = 0; k < width / 2; k++) {
                            vBytes[vIndex++] = bytes[srcIndex];
                            srcIndex += pixelsStride;
                        }
                        if (pixelsStride == 2) {
                            srcIndex += rowStride - width;
                        } else if (pixelsStride == 1) {
                            srcIndex += rowStride - width / 2;
                        }
                    }
                }
            }

            //   image.close();

            //Fill the output according to the requested type
            switch (type) {
                case YUV420PI420:
                    System.arraycopy(uBytes, 0, yuvBytes, dstIndex, uBytes.length);
                    System.arraycopy(vBytes, 0, yuvBytes, dstIndex + uBytes.length, vBytes.length);
                    break;
                case YUV420SPNV12:
                    for (int i = 0; i < vBytes.length; i++) {
                        yuvBytes[dstIndex++] = uBytes[i];
                        yuvBytes[dstIndex++] = vBytes[i];
                    }
                    break;
                case YUV420SPNV21:
                    for (int i = 0; i < vBytes.length; i++) {
                        yuvBytes[dstIndex++] = vBytes[i];
                        yuvBytes[dstIndex++] = uBytes[i];
                    }
                    break;
            }
        } catch (final Exception e) {
            if (image != null) {
                image.close();
            }
            Log.i(TAG, e.toString());
        }
    }

    /**
     * Converts an Image in YUV_420_888 to YU12 (I420)
     * @param image Image
     * @param yuvBytes data
     */
    @SuppressWarnings("unused")
    public static void getBytes420PYu12(Image image, byte[] yuvBytes) {
        try {
            //Get the source planes; for YUV data, planes.length == 3
            final Image.Plane[] planes = image.getPlanes();

            //Only the width portion of each row is valid (width <= rowStride)
            int width = image.getWidth();
            int height = image.getHeight();

            //Destination offsets: Y occupies [0, i0), U [i0, i1), V [i1, i2);
            //the output needs 1.5x the pixel count since Y:U:V is 4:1:1
            int i0 = width*height;
            int i2 = i0 + i0/2;
            int i1 = i0 + i0/4;
            //Plane buffers are direct (no backing array), so copy via get();
            //assumes rowStride == width and chroma pixelStride == 2
            planes[0].getBuffer().get(yuvBytes, 0, width*height);
            //System.arraycopy(planes[1], 0, yuvBytes, width*height, width*height/2);
            for (int i = i0; i < i1; i++){
                yuvBytes[i] = planes[1].getBuffer().get((i-i0)*2);
            }
            for (int i = i1; i < i2; i++){
                yuvBytes[i] = planes[2].getBuffer().get((i- i1)*2);
            }
        } catch (final Exception e) {
            if (image != null) {
                image.close();
            }
            Log.i(TAG, e.toString());
        }
    }

    /**
     * Converts an Image in YUV_420_888 to YV12
     * @param image Image
     * @param yuvBytes data
     */
    @SuppressWarnings("unused")
    public static void getBytes420PYv12(Image image, byte[] yuvBytes) {
        try {
            //Get the source planes; for YUV data, planes.length == 3
            final Image.Plane[] planes = image.getPlanes();

            //Only the width portion of each row is valid (width <= rowStride)
            int width = image.getWidth();
            int height = image.getHeight();

            //Destination offsets: Y occupies [0, i0), V [i0, i1), U [i1, i2);
            //the output needs 1.5x the pixel count since Y:U:V is 4:1:1
            int i0 = width*height;
            int i2 = i0 + i0/2;
            int i1 = i0 + i0/4;
            //Plane buffers are direct (no backing array), so copy via get();
            //assumes rowStride == width and chroma pixelStride == 2
            planes[0].getBuffer().get(yuvBytes, 0, width*height);
            //System.arraycopy(planes[1], 0, yuvBytes, width*height, width*height/2);
            for (int i = i0; i < i1; i++){
                yuvBytes[i] = planes[2].getBuffer().get((i-i0)*2);
            }
            for (int i = i1; i < i2; i++){
                yuvBytes[i] = planes[1].getBuffer().get((i- i1)*2);
            }
        } catch (final Exception e) {
            if (image != null) {
                image.close();
            }
            Log.i(TAG, e.toString());
        }
    }

    /**
     * Converts an Image in YUV_420_888 to NV12
     * @param image Image
     * @param yuvBytes data
     */
    @SuppressWarnings("unused")
    public static void getBytes420SPNv12(Image image, byte[] yuvBytes) {
        try {
            //Get the source planes; for YUV data, planes.length == 3
            final Image.Plane[] planes = image.getPlanes();

            //Only the width portion of each row is valid (width <= rowStride)
            int width = image.getWidth();
            int height = image.getHeight();

            //The output needs 1.5x the pixel count since Y:U:V is 4:1:1

            //Plane buffers are direct (no backing array), so copy via get(); assumes
            //rowStride == width and chroma pixelStride == 2. On such devices the U
            //plane already holds interleaved UV except its final byte, which the
            //V plane supplies below.
            planes[0].getBuffer().get(yuvBytes, 0, width*height);
            ByteBuffer uBuffer = planes[1].getBuffer();
            uBuffer.get(yuvBytes, width*height, Math.min(uBuffer.remaining(), width*height/2));
            for (int i = width*height; i < width*height + width*height/2; i+=2){
                yuvBytes[i+1] = planes[2].getBuffer().get(i-width*height);
            }
        } catch (final Exception e) {
            if (image != null) {
                image.close();
            }
            Log.i(TAG, e.toString());
        }
    }

    /**
     * Converts an Image in YUV_420_888 to NV21
     * @param image Image
     * @param yuvBytes data
     */
    @SuppressWarnings("unused")
    public static void getBytesNv21(Image image, byte[] yuvBytes) {
        try {
            //Get the source planes; for YUV data, planes.length == 3
            final Image.Plane[] planes = image.getPlanes();

            //Only the width portion of each row is valid (width <= rowStride)
            int width = image.getWidth();
            int height = image.getHeight();

            //The output needs 1.5x the pixel count since Y:U:V is 4:1:1

            //Plane buffers are direct (no backing array), so copy via get(); assumes
            //rowStride == width and chroma pixelStride == 2. On such devices the V
            //plane already holds interleaved VU except its final byte, which the
            //U plane supplies below.
            planes[0].getBuffer().get(yuvBytes, 0, width*height);
            ByteBuffer vBuffer = planes[2].getBuffer();
            vBuffer.get(yuvBytes, width*height, Math.min(vBuffer.remaining(), width*height/2));
            for (int i = width*height; i < width*height + width*height/2; i+=2){
                yuvBytes[i+1] = planes[1].getBuffer().get(i-width*height);
            }
        } catch (final Exception e) {
            if (image != null) {
                image.close();
            }
            Log.i(TAG, e.toString());
        }
    }

    @SuppressWarnings("unused")
    public static byte[] NV21toJPEG(byte[] nv21, int width, int height, int quality) {
        ByteStreamWrapper out = ByteStreamPool.get();
        YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, width, height, null);
        yuv.compressToJpeg(new Rect(0, 0, width, height), quality, out);
        byte[] result = out.toByteArray();
        ByteStreamPool.ret2pool(out);
        return result;
    }

    @SuppressWarnings("unused")
    public static void Rgba2Bgr(byte[] src, byte[] dest, boolean isAlpha){
        if (isAlpha) {
            //RGBA TO BGRA
            if (src != null && src.length == dest.length) {
                for (int i = 0; i < src.length / 4; i++) {
                    dest[i * 4] = src[i * 4 + 2];        //B
                    dest[i * 4 + 1] = src[i * 4 + 1];    //G
                    dest[i * 4 + 2] = src[i * 4];        //R
                    dest[i * 4 + 3] = src[i * 4 + 3];        //a
                }
            }else{
                LogUtil.d(TAG, "RGBA2BGR error, dest length too short");
            }
        }else{
            //RGBA TO BGR
            if (src != null && dest.length == (src.length / 4) * 3) {
                for (int i = 0; i < src.length / 4; i++) {
                    dest[i * 3] = src[i * 4 + 2];        //B
                    dest[i * 3 + 1] = src[i * 4 + 1];    //G
                    dest[i * 3 + 2] = src[i * 4];        //R
                }
            }else{
                LogUtil.d(TAG, "RGBA2BGR error, dest length too short");
            }
        }
    }

    //bitmap
    @SuppressWarnings("unused")
    public static Bitmap getOriBitmap(byte[] jpgArray){
        return BitmapFactory.decodeByteArray(jpgArray,
                0, jpgArray.length);
    }
    //RGBA
    public static byte[] getOriBitmapRgba(byte[] jpgArray){
        Bitmap bitmap = BitmapFactory.decodeByteArray(jpgArray,
                0, jpgArray.length);
        int bytes = bitmap.getByteCount();
        ByteBuffer buffer = ByteBuffer.allocate(bytes);
        bitmap.copyPixelsToBuffer(buffer);
        return buffer.array();
    }

    @SuppressWarnings("unused")
    public static byte[] getJpegByte(byte[] rgba, int w, int h){
        Bitmap bm = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
        ByteBuffer buf = ByteBuffer.wrap(rgba);
        bm.copyPixelsFromBuffer(buf);
        ByteArrayOutputStream fos = new ByteArrayOutputStream();
        bm.compress(Bitmap.CompressFormat.JPEG, 100, fos);
        return fos.toByteArray();
    }
    @SuppressWarnings("unused")
    public static int[] byte2int(byte[] byteArray){
        IntBuffer intBuf =
                ByteBuffer.wrap(byteArray)
                        .order(ByteOrder.LITTLE_ENDIAN)
                        .asIntBuffer();
        int[] array = new int[intBuf.remaining()];
        intBuf.get(array);

        return array;
    }
    @SuppressWarnings("unused")
    public byte[] int2byte(int[] intArray){
        ByteBuffer byteBuffer = ByteBuffer.allocate(intArray.length * 4);
        IntBuffer intBuffer = byteBuffer.asIntBuffer();
        intBuffer.put(intArray);
        byte[] byteConverted = byteBuffer.array();
        for (int i = 0; i < Math.min(840, byteConverted.length); i++) {
            Log.d("Bytes after insert", "" + byteConverted[i]);
        }

        return byteConverted;
    }
    @SuppressWarnings("unused")
    public void testXXX(String path){
        Bitmap bm = BitmapFactory.decodeFile(path);
        ByteStreamWrapper baos = ByteStreamPool.get();
        bm.compress(Bitmap.CompressFormat.JPEG, 100, baos);
        byte[] bytes = baos.toByteArray();
        ByteStreamPool.ret2pool(baos);
        long x = System.currentTimeMillis();
        System.out.println("jiaXXX"+ "testXXX");
        getOriBitmapRgba(bytes);
        System.out.println("jiaXXX"+ "testXXX"+(System.currentTimeMillis()-x));
    }

    //Image to nv21
    @SuppressWarnings("unused")
    private ByteBuffer imageToByteBuffer(final Image image) {
        final Rect crop = image.getCropRect();
        final int width = crop.width();
        final int height = crop.height();

        final Image.Plane[] planes = image.getPlanes();
        final int bufferSize = width * height * ImageFormat.getBitsPerPixel(ImageFormat.YUV_420_888) / 8;
        final ByteBuffer output = ByteBuffer.allocateDirect(bufferSize);

        int channelOffset = 0;
        int outputStride = 0;

        for (int planeIndex = 0; planeIndex < planes.length; planeIndex++) {
            if (planeIndex == 0) {
                channelOffset = 0;
                outputStride = 1;
            } else if (planeIndex == 1) {
                channelOffset = width * height + 1;
                outputStride = 2;
            } else if (planeIndex == 2) {
                channelOffset = width * height;
                outputStride = 2;
            }

            final ByteBuffer buffer = planes[planeIndex].getBuffer();
            final int rowStride = planes[planeIndex].getRowStride();
            final int pixelStride = planes[planeIndex].getPixelStride();
            byte[] rowData = new byte[rowStride];

            final int shift = (planeIndex == 0) ? 0 : 1;
            final int widthShifted = width >> shift;
            final int heightShifted = height >> shift;

            buffer.position(rowStride * (crop.top >> shift) + pixelStride * (crop.left >> shift));

            for (int row = 0; row < heightShifted; row++) {
                final int length;

                if (pixelStride == 1 && outputStride == 1) {
                    length = widthShifted;
                    buffer.get(output.array(), channelOffset, length);
                    channelOffset += length;
                } else {
                    length = (widthShifted - 1) * pixelStride + 1;
                    buffer.get(rowData, 0, length);

                    for (int col = 0; col < widthShifted; col++) {
                        output.array()[channelOffset] = rowData[col * pixelStride];
                        channelOffset += outputStride;
                    }
                }

                if (row < heightShifted - 1) {
                    buffer.position(buffer.position() + rowStride - length);
                }
            }
        }

        return output;
    }

    @SuppressWarnings("unused")
    private static byte[] NV21toJPEG(byte[] nv21, int width, int height) {
        ByteStreamWrapper out = ByteStreamPool.get();
        try {
            YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, width, height, null);
            yuv.compressToJpeg(new Rect(0, 0, width, height), 100, out);
            return out.toByteArray();
        }finally{
            ByteStreamPool.ret2pool(out);
        }
    }

    @SuppressWarnings("unused")
    private static byte[] NV21toRgba(byte[] nv21, int width, int height) {
        ByteStreamWrapper out = ByteStreamPool.get();
        try {
            YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21, width, height, null);
            yuv.compressToJpeg(new Rect(0, 0, width, height), 100, out);

            final Bitmap bitmap = BitmapFactory.decodeStream(out.getInputStream());
            int bytes = bitmap.getByteCount();
            ByteBuffer buffer = ByteBuffer.allocate(bytes);
            bitmap.copyPixelsToBuffer(buffer); //Move the byte data to the buffer
            return buffer.array();
        }finally{
            ByteStreamPool.ret2pool(out);
        }
    }

    @SuppressWarnings("unused")
    private static Bitmap getBitmapFromByte(byte[] rgba, int w, int h){
        ByteBuffer buffer = ByteBuffer.wrap(rgba);
        Bitmap bm = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888);
        bm.copyPixelsFromBuffer(buffer);
        return bm;
    }
}

 

Reference: https://blog.csdn.net/bd_zengxinxin/article/details/37388325
