Android JavaCV Camera2



I am trying to record video from the camera using JavaCV:

```java
// recorder settings:
private int imageWidth  = 320;
private int imageHeight = 240;
private int frameRate   = 30;

recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
recorder.setFormat("mp4");
recorder.setFrameRate(frameRate);

// frame settings:
IplImage yuvIplimage = null;
yuvIplimage = IplImage.create(320, 320, IPL_DEPTH_16U, 1); // 32 not supported

// image reader:
private ImageReader mImageReader;
mImageReader = ImageReader.newInstance(320, 320, ImageFormat.YUV_420_888, 10);
mImageReader.setOnImageAvailableListener(mOnImageAvailableListener, mBackgroundHandler);

private final ImageReader.OnImageAvailableListener mOnImageAvailableListener
        = new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image image = reader.acquireNextImage(); // acquireLatestImage() - also tried
        if (image == null)
            return;

        final ByteBuffer buffer = image.getPlanes()[0].getBuffer();
        byte[] bytes = new byte[buffer.remaining()];
        buffer.get(bytes, 0, bytes.length);

        if (yuvIplimage != null) {
            // OPTION 1
            yuvIplimage.getByteBuffer().put(convertYUV420ToNV21(image));

            // OPTION 2
            //yuvIplimage.getByteBuffer().put(decodeYUV420SP(bytes, 320, 320));
            try {
                if (started) {
                    recorder.record(yuvIplimage);
                }
            } catch (Exception e) {
                e.printStackTrace();
            }
        }

        image.close();
    }
};
```

Option 1 decodes the image to NV21 with the following code:

```java
private byte[] convertYUV420ToNV21(Image imgYUV420) {
    ByteBuffer buffer0 = imgYUV420.getPlanes()[0].getBuffer();
    ByteBuffer buffer2 = imgYUV420.getPlanes()[2].getBuffer();
    int buffer0_size = buffer0.remaining();
    int buffer2_size = buffer2.remaining();
    byte[] rez = new byte[buffer0_size + buffer2_size];
    buffer0.get(rez, 0, buffer0_size);
    buffer2.get(rez, buffer0_size, buffer2_size);
    return rez;
}
```

Option 2 converts to RGB, if I understand it correctly:

```java
public byte[] decodeYUV420SP(byte[] yuv420sp, int width, int height) {
    final int frameSize = width * height;
    byte[] rgb = new byte[width * height];
    for (int j = 0, yp = 0; j < height; j++) {
        int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
        for (int i = 0; i < width; i++, yp++) {
            int y = (0xff & ((int) yuv420sp[yp])) - 16;
            if (y < 0) y = 0;
            if ((i & 1) == 0) {
                v = (0xff & yuv420sp[uvp++]) - 128;
                u = (0xff & yuv420sp[uvp++]) - 128;
            }
            int y1192 = 1192 * y;
            int r = (y1192 + 1634 * v);
            int g = (y1192 - 833 * v - 400 * u);
            int b = (y1192 + 2066 * u);
            if (r < 0) r = 0; else if (r > 262143) r = 262143;
            if (g < 0) g = 0; else if (g > 262143) g = 262143;
            if (b < 0) b = 0; else if (b > 262143) b = 262143;
            rgb[yp] = (byte) (0xff000000 | ((r << 6) & 0xff0000)
                    | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff));
        }
    }
    return rgb;
}
```

It does not look correct either.
What is the right way to convert a camera2 Image into an IplImage?
And is it possible at all?

Solution:

If the recorder expects NV21, converting the image to that format rather than to RGB is probably your fastest option.

But why not simply use android.media.MediaRecorder? It is much more efficient and can use the hardware encoders.
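For reference, a minimal MediaRecorder configuration might look roughly like the sketch below. This is an illustration only, not from the original post: the method name, output path, and the 320x240 / 30 fps values are placeholders matching the question's settings, and release() and error handling are omitted.

```java
import android.media.MediaRecorder;
import java.io.IOException;

// Hypothetical helper: configures a MediaRecorder fed by a Camera2 surface.
MediaRecorder setUpRecorder(String outputPath) throws IOException {
    MediaRecorder recorder = new MediaRecorder();
    recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
    recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
    recorder.setVideoSize(320, 240);
    recorder.setVideoFrameRate(30);
    recorder.setOutputFile(outputPath);
    recorder.prepare();
    // Add recorder.getSurface() as a target of the Camera2 capture session,
    // then call recorder.start() once the session is configured.
    return recorder;
}
```

With this approach the YUV-to-NV21 conversion disappears entirely, since the frames go straight from the camera to the hardware encoder.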

However, if you need to stick with ffmpeg: for many devices your first option is incorrect. Also, be sure to remove that earlier buffer.get call, since it prevents the subsequent read of plane 0 from working; that is probably your current problem. Once you have read plane 0, .remaining() returns 0.
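The exhaustion described above is plain ByteBuffer behavior and easy to demonstrate outside of Camera2. A small sketch (the class and method names are illustrative, not from the post):

```java
import java.nio.ByteBuffer;

public class BufferExhaustion {

    /** Reads everything out of a buffer (like the stray buffer.get call in
     *  the listener above) and reports how much is left afterwards. */
    static int remainingAfterFullRead(byte[] data) {
        ByteBuffer buffer = ByteBuffer.wrap(data);
        byte[] bytes = new byte[buffer.remaining()];
        buffer.get(bytes, 0, bytes.length); // advances position to the limit
        return buffer.remaining();          // now 0: a later read sees nothing
    }

    public static void main(String[] args) {
        // After the full get(), a later convertYUV420ToNV21(image) call
        // would find plane 0's buffer empty.
        System.out.println(remainingAfterFullRead(new byte[]{1, 2, 3, 4})); // prints 0
    }
}
```

Calling buffer.rewind() would reset the position if a second pass were really needed, but in the question's listener the first read is simply unused and should be deleted.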

A YUV image has three planes, and unless you have verified that the underlying format is actually NV21, you should not blindly assume that the row stride equals the width.
To be safe, when copying the three planes into the semi-planar byte[], you need to honor both the row stride and the pixel stride.
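A stride-aware copy can be written against raw arrays, independent of the Android classes. A minimal sketch, assuming the plane data has already been pulled into a byte[] (the class and method names are illustrative, not from the post):

```java
public class PlaneCopy {

    /**
     * Copies width x height samples from a possibly padded / interleaved
     * plane into a tightly packed array, honoring both the row stride
     * (bytes per row) and the pixel stride (bytes between samples).
     */
    public static byte[] packPlane(byte[] plane, int width, int height,
                                   int rowStride, int pixelStride) {
        byte[] packed = new byte[width * height];
        int out = 0;
        for (int row = 0; row < height; row++) {
            int in = row * rowStride;      // rows may be padded past the width
            for (int col = 0; col < width; col++) {
                packed[out++] = plane[in];
                in += pixelStride;         // samples may be interleaved
            }
        }
        return packed;
    }

    public static void main(String[] args) {
        // A 2x2 chroma plane with rowStride 4 and pixelStride 2,
        // as a semi-planar device might report it:
        byte[] plane = {10, 0, 11, 0, 12, 0, 13, 0};
        byte[] packed = packPlane(plane, 2, 2, 4, 2);
        System.out.println(java.util.Arrays.toString(packed)); // [10, 11, 12, 13]
    }
}
```

For YUV_420_888 the Y plane always has a pixel stride of 1, while the U and V planes may report a pixel stride of 2 on semi-planar devices; NV21 additionally requires the chroma samples interleaved as V,U. That is why blindly concatenating planes 0 and 2, as in Option 1, only happens to work on devices whose buffers already overlap in that layout.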
