So I'm getting [Image](https://developer.android.com/reference/android/media/Image.html) objects from Android's Camera2 API, then converting them to OpenCV Mat objects via their byte buffers. I set the camera's output format to [YUV_420_888](http://developer.android.com/reference/android/graphics/ImageFormat.html#YUV_420_888), as recommended by the docs, but when I try converting the Mat from YUV to RGB, all it shows is green.
Following the answers from [this](https://groups.google.com/forum/#!topic/android-opencv/WIHBLu6FF1w) thread, this is how I create and convert the Mat:
Image image = reader.acquireLatestImage();
// Only the first plane (the Y plane) is copied here
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.remaining()];
buffer.get(bytes);
// Height * 3/2 rows of single-channel data to hold a full YUV 4:2:0 frame
Mat mat = new Mat(image.getHeight() + image.getHeight() / 2, image.getWidth(), CvType.CV_8UC1);
mat.put(0, 0, bytes);
Mat rgb = new Mat(image.getHeight(), image.getWidth(), CvType.CV_8UC4);
Imgproc.cvtColor(mat, rgb, Imgproc.COLOR_YUV420sp2BGR, 4);
After these lines, the only other thing I did was write the Mats to disk with imwrite (roughly as sketched after the links below). For reference, here are some sample images resulting from the writes:
YUV - http://i.imgur.com/qm765AZ.jpg (straight from the Camera2 API, no processing yet)
RGB - http://i.imgur.com/FzLx2Cc.jpg (the exact same image, but converted from YUV to RGB)
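The write step itself is roughly the following (the paths are just placeholders, and on older OpenCV Java bindings it would be Highgui.imwrite instead of Imgcodecs.imwrite):

// Dump both Mats to disk for inspection (paths are placeholders)
Imgcodecs.imwrite("/sdcard/debug_yuv.jpg", mat);
Imgcodecs.imwrite("/sdcard/debug_rgb.jpg", rgb);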
Any insights into why the RGB image looks the way it does? I've also tried quite a few other conversion codes besides COLOR_YUV420sp2BGR, but they all seem to produce the same green image. Thank you in advance!
EDIT: As has been pointed out in the comments, it seems I need to use all three planes of the YUV image, not just the first one. I know how to convert each plane into a byte array, so I now have three byte arrays, one per plane, but how do I create a Mat from them? The put() method I'm familiar with only accepts a single byte array. Do I concatenate them, or combine them some other way?
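Here is a rough sketch of what I'm thinking of trying: copy all three plane buffers into one contiguous array and call put() once. I'm assuming each plane's row stride equals the image width and that the chroma pixel stride is 1, which I gather isn't guaranteed by YUV_420_888, so I'm not sure this is correct:

// Sketch only: assumes rowStride == width and chroma pixelStride == 1
Image.Plane[] planes = image.getPlanes();
ByteBuffer yBuffer = planes[0].getBuffer();
ByteBuffer uBuffer = planes[1].getBuffer();
ByteBuffer vBuffer = planes[2].getBuffer();

int ySize = yBuffer.remaining();
int uSize = uBuffer.remaining();
int vSize = vBuffer.remaining();

byte[] yuvBytes = new byte[ySize + uSize + vSize];
yBuffer.get(yuvBytes, 0, ySize);
// V before U would be closer to NV21 ordering, U before V to I420;
// which one is needed presumably depends on the cvtColor code used later
vBuffer.get(yuvBytes, ySize, vSize);
uBuffer.get(yuvBytes, ySize + vSize, uSize);

// Same height * 3/2 single-channel layout as before, filled in one put()
Mat yuvMat = new Mat(image.getHeight() + image.getHeight() / 2, image.getWidth(), CvType.CV_8UC1);
yuvMat.put(0, 0, yuvBytes);

Is something along these lines the right idea, or is there a better way to assemble the Mat from the three planes?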