/* This is the contributed code:

File:             cvcap_v4l.cpp
Current Location: ../opencv-0.9.6/otherlibs/videoio

Original Version: 2003-03-12  Magnus Lundin lundin@mlu.mine.nu
Original Comments:

ML: This set of files adds support for firewire and USB cameras.
First it tries to install a firewire camera;
if that fails, it tries a V4L/USB camera.
It has been tested with the motempl sample program.

First Patch: August 24, 2004 Travis Wood TravisOCV@tkwood.com
For Release:  OpenCV-Linux Beta4  opencv-0.9.6
Tested On:    LMLBT44 with 8 video inputs
Patched Comments:

TW: The cv cam utils that came with the initial release of OpenCV for Linux Beta4
were not working. I have rewritten them so they work for me. At the same time I tried
to keep the original code as ML wrote it as unchanged as possible. No one likes to debug
someone else's code, so I resisted changes as much as possible. I have tried to keep the
same "ideas" where applicable, that is, where I could figure out what the previous author
intended. In some areas I just could not help myself and had to "spiffy it up" my way.

These drivers should work with V4L frame capture cards other than my bttv-driven
frame capture card.

Rewrote the driver for standard V4L mode. Tested using an LMLBT44 video capture card.
Standard bttv drivers run the LMLBT44 with up to 8 inputs.

This utility was written with the help of the document:
http://pages.cpsc.ucalgary.ca/~sayles/VFL_HowTo
as a general guide for interfacing with the V4L standard.

Made the index value passed to icvOpenCAM_V4L(index) be the number of the
video device source in the /dev tree. An index of -1 uses the original /dev/video.

Index   Device
  0     /dev/video0
  1     /dev/video1
  2     /dev/video2
  3     /dev/video3
  ...
  7     /dev/video7
with
  -1    /dev/video
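
The index-to-node mapping above can be sketched as a small helper. This is an
illustration only; deviceNameFromIndex is a hypothetical name, not a function in
this file:

```cpp
#include <string>

// Hypothetical helper mirroring the table above: non-negative indices select
// /dev/videoN, while -1 selects the legacy catch-all /dev/video node.
static std::string deviceNameFromIndex(int index)
{
    if (index < 0)
        return "/dev/video";               // original behavior for index -1
    return "/dev/video" + std::to_string(index);
}
```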

TW: You can select any video source, but this package was limited from the start to
only ONE camera opened at any ONE time.
This is an original program limitation.
If you are interested, I will make my version available to other OpenCV users. The big
difference in mine is that you may pass the camera number as part of the cv argument,
but this convention is non-standard for current OpenCV calls and the camera number is
not currently passed into the called routine.

Second Patch: August 28, 2004 Sfuncia Fabio fiblan@yahoo.it
For Release:  OpenCV-Linux Beta4 Opencv-0.9.6

FS: This patch fixes non-sequential device indices (unplugged devices) and reports the
    real numCameras. For index -1 (icvOpenCAM_V4L) I don't use /dev/video but the first
    real device available, because if /dev/video is a link to /dev/video0 and the device
    on /dev/video0 is unplugged, /dev/video is a dangling link. I search for the first
    available device with indexList.

Third Patch: December 9, 2004 Frederic Devernay Frederic.Devernay@inria.fr
For Release:  OpenCV-Linux Beta4 Opencv-0.9.6

[FD] I modified the following:
  - handle YUV420P, YUV420, and YUV411P palettes (for many webcams) without using floating point
  - cvGrabFrame should not wait for the end of the first frame, and should return quickly
    (see videoio doc)
  - cvRetrieveFrame should in turn wait for the end of frame capture, and should not
    trigger the capture of the next frame (the user chooses when to do it using GrabFrame)
    To get the old behavior, re-call cvRetrieveFrame just after cvGrabFrame.
  - having global bufferIndex and FirstCapture variables makes the code non-reentrant
    (e.g. when using several cameras), so put these in the CvCapture struct.
  - according to the V4L HowTo, incrementing the buffer index must be done before VIDIOCMCAPTURE.
  - the VID_TYPE_SCALES stuff from the V4L HowTo is wrong: the image size can be changed
    even if the hardware does not support scaling (e.g. webcams can have several
    resolutions available). Just don't try to set the size to 640x480 if the hardware supports
    scaling: open with the default (probably best) image size, and let the user scale it
    using SetProperty.
  - image size can be changed by two subsequent calls to SetProperty (for width and height)
  - bug fix: if the image size changes, realloc the new image only when it is grabbed
  - issue errors only when necessary, fix error message formatting.
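
The "without using floating point" item above refers to YUV-to-RGB conversion done in
integer arithmetic. A minimal per-pixel sketch of that idea, using BT.601-style
coefficients scaled by 256 (an illustration, not the exact code used in this file):

```cpp
#include <algorithm>

// Integer-only YUV -> RGB for one pixel. Coefficients approximate
// R = Y + 1.402*(V-128), G = Y - 0.344*(U-128) - 0.714*(V-128),
// B = Y + 1.772*(U-128), each scaled by 256 and applied with a shift.
static void yuv2rgb(int y, int u, int v, int& r, int& g, int& b)
{
    int c = y, d = u - 128, e = v - 128;
    r = std::min(255, std::max(0, c + ((359 * e) >> 8)));           // ~1.402 * V'
    g = std::min(255, std::max(0, c - ((88 * d + 183 * e) >> 8)));  // ~0.344*U' + 0.714*V'
    b = std::min(255, std::max(0, c + ((454 * d) >> 8)));           // ~1.772 * U'
}
```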

Fourth Patch: Sept 7, 2005 Csaba Kertesz sign@freemail.hu
For Release:  OpenCV-Linux Beta5 OpenCV-0.9.7

I modified the following:
  - Additional Video4Linux2 support :)
  - Use mmap functions (v4l2)
  - New internal methods:
    try_palette_v4l2 -> rewrite of try_palette for v4l2
    mainloop_v4l2, read_image_v4l2 -> these methods are moved from the official v4l2 capture.c example
    try_init_v4l -> device v4l initialisation
    try_init_v4l2 -> device v4l2 initialisation
    autosetup_capture_mode_v4l -> autodetect capture modes for v4l
    autosetup_capture_mode_v4l2 -> autodetect capture modes for v4l2
  - Modifications are consistent with the old Video4Linux code
  - Video4Linux handling is used automatically if a Video4Linux2 device is not recognized
  - Tested successfully with Logitech Quickcam Express (V4L), Creative Vista (V4L) and Genius VideoCam Notebook (V4L2)
  - Corrected source lines that produced compiler warnings
  - Added an information message for v4l/v4l2 detection

Fifth Patch: Sept 7, 2005 Csaba Kertesz sign@freemail.hu
For Release:  OpenCV-Linux Beta5 OpenCV-0.9.7

I modified the following:
  - Support for SN9C10x chip based webcams
  - New internal methods:
    bayer2rgb24, sonix_decompress -> decoder routines for SN9C10x decoding from Takafumi Mizuno <taka-qce@ls-a.jp> with his pleasure :)
  - Tested successfully with Genius VideoCam Notebook (V4L2)
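
The bayer2rgb24 routine mentioned above interpolates a full RGB image from the
sensor's Bayer mosaic. As a hedged illustration of the general technique (not the
actual decoder), here is a trivial per-tile version that fills a 2x2 BGGR tile from
its own four samples:

```cpp
// One RGB pixel produced from a 2x2 BGGR Bayer tile:
//   B  G1
//   G2 R
// A real demosaic interpolates across neighboring tiles; this sketch only
// averages the two green samples and takes R and B directly.
struct Rgb { unsigned char r, g, b; };

static Rgb demosaicTile(unsigned char B, unsigned char G1,
                        unsigned char G2, unsigned char R)
{
    Rgb out;
    out.r = R;
    out.g = (unsigned char)((G1 + G2) / 2);  // average the two green samples
    out.b = B;
    return out;
}
```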

Sixth Patch: Sept 10, 2005 Csaba Kertesz sign@freemail.hu
For Release:  OpenCV-Linux Beta5 OpenCV-0.9.7

I added the following:
  - Capture control support (hue, saturation, brightness, contrast, gain)
  - Get and change V4L capture controls (hue, saturation, brightness, contrast)
  - New internal method:
    icvSetControl -> set capture controls
  - Tested successfully with Creative Vista (V4L)

Seventh Patch: Sept 10, 2005 Csaba Kertesz sign@freemail.hu
For Release:  OpenCV-Linux Beta5 OpenCV-0.9.7

I added the following:
  - Detect, get and change V4L2 capture controls (hue, saturation, brightness, contrast, gain)
  - New internal methods:
    v4l2_scan_controls_enumerate_menu, v4l2_scan_controls -> detect capture control intervals
  - Tested successfully with Genius VideoCam Notebook (V4L2)

8th patch: Jan 5, 2006, Olivier.Bornet@idiap.ch
Added support for V4L2_PIX_FMT_YUYV and V4L2_PIX_FMT_MJPEG.
With this patch, newer webcams from Logitech, like the QuickCam Fusion, work.
Note: to use these webcams, look at the UVC driver at
http://linux-uvc.berlios.de/

9th patch: Mar 4, 2006, Olivier.Bornet@idiap.ch
- try V4L2 before V4L, because some devices are V4L2 by default
  but only try to implement the V4L compatibility layer.
  So I think it is better to probe V4L2 before V4L.
- better separation between V4L2 and V4L initialization. (This was needed to support
  some drivers that work, but not fully, with V4L2, since we do not know when we
  need to switch from V4L2 to V4L.)

10th patch: July 02, 2008, Mikhail Afanasyev fopencv@theamk.com
Fixed reliability problems with high-resolution UVC cameras on Linux;
the symptoms were damaged images and 'Corrupt JPEG data: premature end of data segment' on stderr.
- V4L_ABORT_BADJPEG detects JPEG warnings and turns them into errors, so bad images
  can be filtered out
- USE_TEMP_BUFFER fixes the main problem (improper buffer management) and
  prevents bad images in the first place

11th patch: April 2, 2013, Forrest Reiling forrest.reiling@gmail.com
Added v4l2 support for getting the capture property CAP_PROP_POS_MSEC.
Returns the millisecond timestamp of the last frame grabbed, or 0 if no frames have been grabbed.
Used to synchronize two Logitech C310 USB webcams to within 16 ms of one another.
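
The CAP_PROP_POS_MSEC value described above is derived from the struct timeval
timestamp that V4L2 attaches to each dequeued buffer. A sketch of the conversion,
assuming the capture stores the timestamp of the last grabbed frame (timevalToMsec
is an illustrative name, not a function in this file):

```cpp
#include <sys/time.h>

// Convert a v4l2 buffer timestamp (struct timeval) to whole milliseconds.
static long timevalToMsec(const timeval& ts)
{
    return ts.tv_sec * 1000L + ts.tv_usec / 1000L;
}
```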

12th patch: March 9, 2018, Taylor Lanclos <tlanclos@live.com>
  added support for CAP_PROP_BUFFERSIZE

make & enjoy!

*/

/*M///////////////////////////////////////////////////////////////////////////////////////
//
//  IMPORTANT: READ BEFORE DOWNLOADING, COPYING, INSTALLING OR USING.
//
//  By downloading, copying, installing or using the software you agree to this license.
//  If you do not agree to this license, do not download, install,
//  copy or use the software.
//
//
//                        Intel License Agreement
//                For Open Source Computer Vision Library
//
// Copyright (C) 2000, Intel Corporation, all rights reserved.
// Third party copyrights are property of their respective owners.
//
// Redistribution and use in source and binary forms, with or without modification,
// are permitted provided that the following conditions are met:
//
//   * Redistributions of source code must retain the above copyright notice,
//     this list of conditions and the following disclaimer.
//
//   * Redistributions in binary form must reproduce the above copyright notice,
//     this list of conditions and the following disclaimer in the documentation
//     and/or other materials provided with the distribution.
//
//   * The name of Intel Corporation may not be used to endorse or promote products
//     derived from this software without specific prior written permission.
//
// This software is provided by the copyright holders and contributors "as is" and
// any express or implied warranties, including, but not limited to, the implied
// warranties of merchantability and fitness for a particular purpose are disclaimed.
// In no event shall the Intel Corporation or contributors be liable for any direct,
// indirect, incidental, special, exemplary, or consequential damages
// (including, but not limited to, procurement of substitute goods or services;
// loss of use, data, or profits; or business interruption) however caused
// and on any theory of liability, whether in contract, strict liability,
// or tort (including negligence or otherwise) arising in any way out of
// the use of this software, even if advised of the possibility of such damage.
//
//M*/

#include "precomp.hpp"

#if !defined _WIN32 && (defined HAVE_CAMV4L2 || defined HAVE_VIDEOIO)

#include <stdio.h>
#include <unistd.h>
#include <fcntl.h>
#include <errno.h>
#include <sys/types.h>
#include <sys/mman.h>

#include <string.h>
#include <stdlib.h>
#include <sys/stat.h>
#include <sys/ioctl.h>
#include <limits>

#include <poll.h>

#ifdef HAVE_CAMV4L2
#include <asm/types.h>          /* for videodev2.h */
#include <linux/videodev2.h>
#endif

#ifdef HAVE_VIDEOIO
// NetBSD compatibility layer with V4L2
#include <sys/videoio.h>
#endif

#ifdef __OpenBSD__
typedef uint32_t __u32;
#endif

// https://github.com/opencv/opencv/issues/13335
#ifndef V4L2_CID_ISO_SENSITIVITY
#define V4L2_CID_ISO_SENSITIVITY (V4L2_CID_CAMERA_CLASS_BASE+23)
#endif

// https://github.com/opencv/opencv/issues/13929
#ifndef V4L2_CID_MPEG_VIDEO_H264_VUI_EXT_SAR_HEIGHT
#define V4L2_CID_MPEG_VIDEO_H264_VUI_EXT_SAR_HEIGHT (V4L2_CID_MPEG_BASE+364)
#endif
#ifndef V4L2_CID_MPEG_VIDEO_H264_VUI_EXT_SAR_WIDTH
#define V4L2_CID_MPEG_VIDEO_H264_VUI_EXT_SAR_WIDTH (V4L2_CID_MPEG_BASE+365)
#endif

#ifndef V4L2_CID_ROTATE
#define V4L2_CID_ROTATE (V4L2_CID_BASE+34)
#endif
#ifndef V4L2_CID_IRIS_ABSOLUTE
#define V4L2_CID_IRIS_ABSOLUTE (V4L2_CID_CAMERA_CLASS_BASE+17)
#endif

#ifndef v4l2_fourcc_be
#define v4l2_fourcc_be(a, b, c, d) (v4l2_fourcc(a, b, c, d) | (1U << 31))
#endif

#ifndef V4L2_PIX_FMT_Y10
#define V4L2_PIX_FMT_Y10 v4l2_fourcc('Y', '1', '0', ' ')
#endif

#ifndef V4L2_PIX_FMT_Y12
#define V4L2_PIX_FMT_Y12 v4l2_fourcc('Y', '1', '2', ' ')
#endif

#ifndef V4L2_PIX_FMT_Y16
#define V4L2_PIX_FMT_Y16 v4l2_fourcc('Y', '1', '6', ' ')
#endif

#ifndef V4L2_PIX_FMT_Y16_BE
#define V4L2_PIX_FMT_Y16_BE v4l2_fourcc_be('Y', '1', '6', ' ')
#endif

#ifndef V4L2_PIX_FMT_ABGR32
#define V4L2_PIX_FMT_ABGR32 v4l2_fourcc('A', 'R', '2', '4')
#endif
#ifndef V4L2_PIX_FMT_XBGR32
#define V4L2_PIX_FMT_XBGR32 v4l2_fourcc('X', 'R', '2', '4')
#endif

/* Defaults - If your board can do better, set it here.  Set for the most common type inputs. */
#define DEFAULT_V4L_WIDTH  640
#define DEFAULT_V4L_HEIGHT 480
#define DEFAULT_V4L_FPS 30

#define MAX_CAMERAS 8

// default and maximum number of V4L buffers, not including the last, 'special' buffer
#define MAX_V4L_BUFFERS 10
#define DEFAULT_V4L_BUFFERS 4

// types of memory in the 'special' buffer
enum {
    MEMORY_ORIG = 0, // Image data in the original (driver) format.
    MEMORY_RGB  = 1, // Image data converted to RGB format.
};

// if enabled, bad JPEG warnings become errors and NULL is returned instead of an image
#define V4L_ABORT_BADJPEG

namespace cv {

static const char* decode_ioctl_code(unsigned long ioctlCode)
{
    switch (ioctlCode)
    {
#define CV_ADD_IOCTL_CODE(id) case id: return #id
        CV_ADD_IOCTL_CODE(VIDIOC_G_FMT);
        CV_ADD_IOCTL_CODE(VIDIOC_S_FMT);
        CV_ADD_IOCTL_CODE(VIDIOC_REQBUFS);
        CV_ADD_IOCTL_CODE(VIDIOC_DQBUF);
        CV_ADD_IOCTL_CODE(VIDIOC_QUERYCAP);
        CV_ADD_IOCTL_CODE(VIDIOC_S_PARM);
        CV_ADD_IOCTL_CODE(VIDIOC_G_PARM);
        CV_ADD_IOCTL_CODE(VIDIOC_QUERYBUF);
        CV_ADD_IOCTL_CODE(VIDIOC_QBUF);
        CV_ADD_IOCTL_CODE(VIDIOC_STREAMON);
        CV_ADD_IOCTL_CODE(VIDIOC_STREAMOFF);
        CV_ADD_IOCTL_CODE(VIDIOC_ENUMINPUT);
        CV_ADD_IOCTL_CODE(VIDIOC_G_INPUT);
        CV_ADD_IOCTL_CODE(VIDIOC_S_INPUT);
        CV_ADD_IOCTL_CODE(VIDIOC_G_CTRL);
        CV_ADD_IOCTL_CODE(VIDIOC_S_CTRL);
#undef CV_ADD_IOCTL_CODE
    }
    return "unknown";
}

struct Memory
{
    void * start;
    size_t length;

    Memory() : start(NULL), length(0) {}
};

/* Device Capture Objects */
/* V4L2 structure */
struct Buffer
{
    Memory memories[VIDEO_MAX_PLANES];
    v4l2_plane planes[VIDEO_MAX_PLANES] = {};
    // Total number of bytes occupied by data in all planes (payload)
    __u32 bytesused;
    // This is the dequeued buffer; it is kept so it can be put back in the queue.
    // The buffer is valid only if capture->bufferIndex >= 0
    v4l2_buffer buffer;

    Buffer()
    {
        buffer = v4l2_buffer();
    }
};

struct CvCaptureCAM_V4L CV_FINAL : public IVideoCapture
{
    int getCaptureDomain() /*const*/ CV_OVERRIDE { return cv::CAP_V4L; }

    int deviceHandle;
    bool v4l_buffersRequested;
    bool v4l_streamStarted;

    int bufferIndex;
    bool FirstCapture;
    String deviceName;

    Mat frame;

    __u32 palette;
    int width, height;
    int width_set, height_set;
    int bufferSize;
    __u32 fps;
    bool convert_rgb;
    bool returnFrame;
    // To select a video input, set cv::CAP_PROP_CHANNEL to the channel number.
    // If the new channel number is less than 0, the video input will not be changed.
    int channelNumber;
    // Normalize properties. If set, parameters will be converted to/from the [0,1) range.
    // Enabled by default (as in OpenCV 3.x).
    // The value is initialized from the environment variable `OPENCV_VIDEOIO_V4L_RANGE_NORMALIZED`.
    // To select real-parameters mode after the device is open, set cv::CAP_PROP_MODE to 0;
    // any other value reverts to the backward-compatibility mode (with normalized properties).
    // Range normalization affects the following parameters:
    // cv::CAP_PROP_*: BRIGHTNESS,CONTRAST,SATURATION,HUE,GAIN,EXPOSURE,FOCUS,AUTOFOCUS,AUTO_EXPOSURE.
    bool normalizePropRange;

    /* V4L2 variables */
    Buffer buffers[MAX_V4L_BUFFERS + 1];
    v4l2_capability capability;
    v4l2_input videoInput;
    v4l2_format form;
    v4l2_requestbuffers req;
    v4l2_buf_type type;
    unsigned char num_planes;

    timeval timestamp;

    bool open(int _index);
    bool open(const std::string & filename);
    bool isOpened() const CV_OVERRIDE;

    void closeDevice();

    virtual double getProperty(int) const CV_OVERRIDE;
    virtual bool setProperty(int, double) CV_OVERRIDE;
    virtual bool grabFrame() CV_OVERRIDE;
    virtual bool retrieveFrame(int, OutputArray) CV_OVERRIDE;

    CvCaptureCAM_V4L();
    virtual ~CvCaptureCAM_V4L();
    bool requestBuffers();
    bool requestBuffers(unsigned int buffer_number);
    bool createBuffers();
    void releaseBuffers();
    bool initCapture();
    bool streaming(bool startStream);
    bool setFps(int value);
    bool tryIoctl(unsigned long ioctlCode, void *parameter, bool failIfBusy = true, int attempts = 10) const;
    bool controlInfo(int property_id, __u32 &v4l2id, cv::Range &range) const;
    bool icvControl(__u32 v4l2id, int &value, bool isSet) const;
    void initFrameNonBGR();

    bool icvSetFrameSize(int _width, int _height);
    bool v4l2_reset();
    bool setVideoInputChannel();
    bool try_palette_v4l2();
    bool try_init_v4l2();
    bool autosetup_capture_mode_v4l2();
    bool read_frame_v4l2();
    bool convertableToRgb() const;
    void convertToRgb(const Buffer &currentBuffer);

    bool havePendingFrame;  // true if the next .grab() should be a no-op; .retrieve() resets this flag
};

/*********************** Implementations ***************************************/

CvCaptureCAM_V4L::CvCaptureCAM_V4L() :
    deviceHandle(-1),
    v4l_buffersRequested(false),
    v4l_streamStarted(false),
    bufferIndex(-1),
    FirstCapture(true),
    palette(0),
    width(0), height(0), width_set(0), height_set(0),
    bufferSize(DEFAULT_V4L_BUFFERS),
    fps(0), convert_rgb(false), returnFrame(false),
    channelNumber(-1), normalizePropRange(false),
    type(V4L2_BUF_TYPE_VIDEO_CAPTURE),
    num_planes(0),
    havePendingFrame(false)
{
    memset(&timestamp, 0, sizeof(timestamp));
}

CvCaptureCAM_V4L::~CvCaptureCAM_V4L()
{
    try
    {
        closeDevice();
    }
    catch (...)
    {
        CV_LOG_WARNING(NULL, "VIDEOIO(V4L2): unable to properly close device: " << deviceName);
        if (deviceHandle != -1)
            close(deviceHandle);
    }
}

void CvCaptureCAM_V4L::closeDevice()
{
    if (v4l_streamStarted)
        streaming(false);
    if (v4l_buffersRequested)
        releaseBuffers();
    if (deviceHandle != -1)
    {
        CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): close(" << deviceHandle << ")");
        close(deviceHandle);
    }
    deviceHandle = -1;
}

bool CvCaptureCAM_V4L::isOpened() const
{
    return deviceHandle != -1;
}

bool CvCaptureCAM_V4L::try_palette_v4l2()
{
    form = v4l2_format();
    form.type = type;
    if (V4L2_TYPE_IS_MULTIPLANAR(type)) {
        form.fmt.pix_mp.pixelformat = palette;
        form.fmt.pix_mp.field = V4L2_FIELD_ANY;
        form.fmt.pix_mp.width = width;
        form.fmt.pix_mp.height = height;
    } else {
        form.fmt.pix.pixelformat = palette;
        form.fmt.pix.field = V4L2_FIELD_ANY;
        form.fmt.pix.width = width;
        form.fmt.pix.height = height;
    }
    if (!tryIoctl(VIDIOC_S_FMT, &form, true))
    {
        return false;
    }
    if (V4L2_TYPE_IS_MULTIPLANAR(type))
        return palette == form.fmt.pix_mp.pixelformat;
    return palette == form.fmt.pix.pixelformat;
}

bool CvCaptureCAM_V4L::setVideoInputChannel()
{
    if (channelNumber < 0)
        return true;
    /* Query the current channel number */
    int channel = 0;
    if (!tryIoctl(VIDIOC_G_INPUT, &channel))
        return false;

    if (channel == channelNumber)
        return true;

    /* Query information about the new input channel */
    videoInput = v4l2_input();
    videoInput.index = channelNumber;
    if (!tryIoctl(VIDIOC_ENUMINPUT, &videoInput))
        return false;

    // To select a video input, applications store the number of the desired input in an integer
    // and call the VIDIOC_S_INPUT ioctl with a pointer to this integer. Side effects are possible.
    // For example, inputs may support different video standards, so the driver may implicitly
    // switch the current standard.
    // It is good practice to select an input before querying or negotiating any other parameters.
    return tryIoctl(VIDIOC_S_INPUT, &channelNumber);
}

bool CvCaptureCAM_V4L::try_init_v4l2()
{
    /* The following code sets the CHANNEL_NUMBER of the video input.  Some video sources
    have sub "Channel Numbers".  For a typical V4L TV capture card, this is usually 1.
    I myself am using a simple NTSC video input capture card that uses the value of 1.
    If you are not in North America or have a different video standard, you WILL have to change
    the following settings and recompile/reinstall.  This set of settings is based on
    the most commonly encountered input video source types (like my bttv card) */

    // cv::CAP_PROP_MODE is used to set the video input channel number
    if (!setVideoInputChannel())
    {
        CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): Unable to set Video Input Channel");
        return false;
    }

    // Test the device for V4L2 compatibility
    capability = v4l2_capability();
    if (!tryIoctl(VIDIOC_QUERYCAP, &capability))
    {
        CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): Unable to query capability");
        return false;
    }

    if ((capability.capabilities & (V4L2_CAP_VIDEO_CAPTURE | V4L2_CAP_VIDEO_CAPTURE_MPLANE)) == 0)
    {
        /* Nope. */
        CV_LOG_INFO(NULL, "VIDEOIO(V4L2:" << deviceName << "): not supported - device is unable to capture video (missing V4L2_CAP_VIDEO_CAPTURE or V4L2_CAP_VIDEO_CAPTURE_MPLANE)");
        return false;
    }

    if (capability.capabilities & V4L2_CAP_VIDEO_CAPTURE_MPLANE)
        type = V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE;
    return true;
}

bool CvCaptureCAM_V4L::autosetup_capture_mode_v4l2()
{
    // in case the palette is already set and works, there is no need to set it up again
    if (palette != 0)
    {
        if (try_palette_v4l2())
        {
            return true;
        }
        else if (errno == EBUSY)
        {
            CV_LOG_INFO(NULL, "VIDEOIO(V4L2:" << deviceName << "): device is busy");
            closeDevice();
            return false;
        }
    }
    __u32 try_order[] = {
            V4L2_PIX_FMT_BGR24,
            V4L2_PIX_FMT_RGB24,
            V4L2_PIX_FMT_YVU420,
            V4L2_PIX_FMT_YUV420,
            V4L2_PIX_FMT_YUV411P,
            V4L2_PIX_FMT_YUYV,
            V4L2_PIX_FMT_UYVY,
            V4L2_PIX_FMT_NV12,
            V4L2_PIX_FMT_NV21,
            V4L2_PIX_FMT_SBGGR8,
            V4L2_PIX_FMT_SGBRG8,
            V4L2_PIX_FMT_SGRBG8,
            V4L2_PIX_FMT_XBGR32,
            V4L2_PIX_FMT_ABGR32,
            V4L2_PIX_FMT_SN9C10X,
#ifdef HAVE_JPEG
            V4L2_PIX_FMT_MJPEG,
            V4L2_PIX_FMT_JPEG,
#endif
            V4L2_PIX_FMT_Y16,
            V4L2_PIX_FMT_Y16_BE,
            V4L2_PIX_FMT_Y12,
            V4L2_PIX_FMT_Y10,
            V4L2_PIX_FMT_GREY,
    };

    for (size_t i = 0; i < sizeof(try_order) / sizeof(__u32); i++) {
        palette = try_order[i];
        if (try_palette_v4l2()) {
            return true;
        } else if (errno == EBUSY) {
            CV_LOG_INFO(NULL, "VIDEOIO(V4L2:" << deviceName << "): device is busy");
            closeDevice();
            return false;
        }
    }
    return false;
}

bool CvCaptureCAM_V4L::setFps(int value)
{
    if (!isOpened())
        return false;

    v4l2_streamparm streamparm = v4l2_streamparm();
    streamparm.type = type;
    streamparm.parm.capture.timeperframe.numerator = 1;
    streamparm.parm.capture.timeperframe.denominator = __u32(value);
    if (!tryIoctl(VIDIOC_S_PARM, &streamparm) || !tryIoctl(VIDIOC_G_PARM, &streamparm))
    {
        CV_LOG_INFO(NULL, "VIDEOIO(V4L2:" << deviceName << "): can't set FPS: " << value);
        return false;
    }

    CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): FPS="
            << streamparm.parm.capture.timeperframe.denominator << "/"
            << streamparm.parm.capture.timeperframe.numerator);
    fps = streamparm.parm.capture.timeperframe.denominator;  // TODO use numerator
    return true;
}

bool CvCaptureCAM_V4L::convertableToRgb() const
{
    switch (palette) {
    case V4L2_PIX_FMT_YVU420:
    case V4L2_PIX_FMT_YUV420:
    case V4L2_PIX_FMT_NV12:
    case V4L2_PIX_FMT_NV21:
    case V4L2_PIX_FMT_YUV411P:
#ifdef HAVE_JPEG
    case V4L2_PIX_FMT_MJPEG:
    case V4L2_PIX_FMT_JPEG:
#endif
    case V4L2_PIX_FMT_YUYV:
    case V4L2_PIX_FMT_UYVY:
    case V4L2_PIX_FMT_SBGGR8:
    case V4L2_PIX_FMT_SN9C10X:
    case V4L2_PIX_FMT_SGBRG8:
    case V4L2_PIX_FMT_SGRBG8:
    case V4L2_PIX_FMT_RGB24:
    case V4L2_PIX_FMT_Y16:
    case V4L2_PIX_FMT_Y16_BE:
    case V4L2_PIX_FMT_Y10:
    case V4L2_PIX_FMT_GREY:
    case V4L2_PIX_FMT_BGR24:
    case V4L2_PIX_FMT_XBGR32:
    case V4L2_PIX_FMT_ABGR32:
        return true;
    default:
        break;
    }
    return false;
}

bool CvCaptureCAM_V4L::initCapture()
{
    if (!isOpened())
        return false;

    if (!try_init_v4l2())
    {
        CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): init failed: errno=" << errno << " (" << strerror(errno) << ")");
        return false;
    }

    /* Find window info */
    form = v4l2_format();
    form.type = type;

    if (!tryIoctl(VIDIOC_G_FMT, &form))
    {
        CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): Could not obtain specifics of capture window (VIDIOC_G_FMT): errno=" << errno << " (" << strerror(errno) << ")");
        return false;
    }

    if (!autosetup_capture_mode_v4l2())
    {
        if (errno != EBUSY)
        {
            CV_LOG_INFO(NULL, "VIDEOIO(V4L2:" << deviceName << "): Pixel format of the incoming image is unsupported by OpenCV");
        }
        return false;
    }

    /* try to set the frame rate */
    setFps(fps);

    /* Buggy driver paranoia. */
    if (V4L2_TYPE_IS_MULTIPLANAR(type)) {
        // TODO: add size adjustment if needed
    } else {
        unsigned int min;

        min = form.fmt.pix.width * 2;

        if (form.fmt.pix.bytesperline < min)
            form.fmt.pix.bytesperline = min;

        min = form.fmt.pix.bytesperline * form.fmt.pix.height;

        if (form.fmt.pix.sizeimage < min)
            form.fmt.pix.sizeimage = min;
    }

    if (V4L2_TYPE_IS_MULTIPLANAR(type))
        num_planes = form.fmt.pix_mp.num_planes;
    else
        num_planes = 1;

    if (!requestBuffers())
        return false;

    if (!createBuffers()) {
        /* free capture resources and return an error */
        releaseBuffers();
        return false;
    }

    // reinitialize buffers
    FirstCapture = true;

    return true;
}

bool CvCaptureCAM_V4L::requestBuffers()
{
    unsigned int buffer_number = bufferSize;
    while (buffer_number > 0) {
        if (requestBuffers(buffer_number) && req.count >= buffer_number)
        {
            break;
        }

        buffer_number--;
        CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): Insufficient buffer memory -- decreasing buffers: " << buffer_number);
    }
    if (buffer_number < 1) {
        CV_LOG_WARNING(NULL, "VIDEOIO(V4L2:" << deviceName << "): Insufficient buffer memory");
        return false;
    }
    bufferSize = req.count;
    return true;
}

bool CvCaptureCAM_V4L::requestBuffers(unsigned int buffer_number)
{
    if (!isOpened())
        return false;

    req = v4l2_requestbuffers();
    req.count = buffer_number;
    req.type = type;
    req.memory = V4L2_MEMORY_MMAP;

    if (!tryIoctl(VIDIOC_REQBUFS, &req)) {
        int err = errno;
        if (EINVAL == err)
        {
            CV_LOG_WARNING(NULL, "VIDEOIO(V4L2:" << deviceName << "): no support for memory mapping");
        }
        else
        {
            CV_LOG_WARNING(NULL, "VIDEOIO(V4L2:" << deviceName << "): failed VIDIOC_REQBUFS: errno=" << err << " (" << strerror(err) << ")");
        }
        return false;
    }
    v4l_buffersRequested = true;
    return true;
}

bool CvCaptureCAM_V4L::createBuffers()
{
    size_t maxLength = 0;
    for (unsigned int n_buffers = 0; n_buffers < req.count; ++n_buffers) {
        v4l2_buffer buf = v4l2_buffer();
        v4l2_plane mplanes[VIDEO_MAX_PLANES];
        size_t length = 0;
        off_t offset = 0;
        buf.type = type;
        buf.memory = V4L2_MEMORY_MMAP;
        buf.index = n_buffers;
        if (V4L2_TYPE_IS_MULTIPLANAR(type)) {
            buf.m.planes = mplanes;
            buf.length = VIDEO_MAX_PLANES;
        }

        if (!tryIoctl(VIDIOC_QUERYBUF, &buf)) {
            CV_LOG_WARNING(NULL, "VIDEOIO(V4L2:" << deviceName << "): failed VIDIOC_QUERYBUF: errno=" << errno << " (" << strerror(errno) << ")");
            return false;
        }

        CV_Assert(1 <= num_planes && num_planes <= VIDEO_MAX_PLANES);
        for (unsigned char n_planes = 0; n_planes < num_planes; n_planes++) {
            if (V4L2_TYPE_IS_MULTIPLANAR(type)) {
                length = buf.m.planes[n_planes].length;
                offset = buf.m.planes[n_planes].m.mem_offset;
            } else {
                length = buf.length;
                offset = buf.m.offset;
            }

            buffers[n_buffers].memories[n_planes].length = length;
            buffers[n_buffers].memories[n_planes].start =
                mmap(NULL /* start anywhere */,
                     length,
                     PROT_READ /* required */,
                     MAP_SHARED /* recommended */,
                     deviceHandle, offset);
            if (MAP_FAILED == buffers[n_buffers].memories[n_planes].start) {
                CV_LOG_WARNING(NULL, "VIDEOIO(V4L2:" << deviceName << "): failed mmap(" << length << "): errno=" << errno << " (" << strerror(errno) << ")");
                return false;
            }
        }

        maxLength = maxLength > length ? maxLength : length;
    }
    if (maxLength > 0) {
        maxLength *= num_planes;
        buffers[MAX_V4L_BUFFERS].memories[MEMORY_ORIG].start = malloc(maxLength);
        buffers[MAX_V4L_BUFFERS].memories[MEMORY_ORIG].length = maxLength;
        buffers[MAX_V4L_BUFFERS].memories[MEMORY_RGB].start = malloc(maxLength);
        buffers[MAX_V4L_BUFFERS].memories[MEMORY_RGB].length = maxLength;
    }
    return (buffers[MAX_V4L_BUFFERS].memories[MEMORY_ORIG].start != 0) &&
           (buffers[MAX_V4L_BUFFERS].memories[MEMORY_RGB].start != 0);
}

/**
 * Some properties cannot be changed while the device is in streaming mode.
 * This method stops the stream and re-initializes capture to restart it.
 * It also causes buffers to be reallocated if the frame size was changed.
 */
bool CvCaptureCAM_V4L::v4l2_reset()
{
    streaming(false);
    releaseBuffers();
    return initCapture();
}

bool CvCaptureCAM_V4L::open(int _index)
{
    cv::String name;
    /* Select a camera, or rather, a V4L video source */
    if (_index < 0) // Asking for the first device available
    {
        for (int autoindex = 0; autoindex < MAX_CAMERAS; ++autoindex)
        {
            name = cv::format("/dev/video%d", autoindex);
            /* Test with an open() to see if this new device name really does exist. */
            int h = ::open(name.c_str(), O_RDONLY);
            if (h != -1)
            {
                ::close(h);
                _index = autoindex;
                break;
            }
        }
        if (_index < 0)
        {
            CV_LOG_WARNING(NULL, "VIDEOIO(V4L2): can't find camera device");
            name.clear();
            return false;
        }
    }
    else
    {
        name = cv::format("/dev/video%d", _index);
    }

    bool res = open(name);
    if (!res)
    {
        CV_LOG_WARNING(NULL, "VIDEOIO(V4L2:" << deviceName << "): can't open camera by index");
    }
    return res;
}

bool CvCaptureCAM_V4L::open(const std::string & _deviceName)
{
    CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << _deviceName << "): opening...");
    FirstCapture = true;
    width = utils::getConfigurationParameterSizeT("OPENCV_VIDEOIO_V4L_DEFAULT_WIDTH", DEFAULT_V4L_WIDTH);
    height = utils::getConfigurationParameterSizeT("OPENCV_VIDEOIO_V4L_DEFAULT_HEIGHT", DEFAULT_V4L_HEIGHT);
    width_set = height_set = 0;
    bufferSize = DEFAULT_V4L_BUFFERS;
    fps = DEFAULT_V4L_FPS;
    convert_rgb = true;
    deviceName = _deviceName;
    returnFrame = true;
    normalizePropRange = utils::getConfigurationParameterBool("OPENCV_VIDEOIO_V4L_RANGE_NORMALIZED", false);
    channelNumber = -1;
    bufferIndex = -1;

    deviceHandle = ::open(deviceName.c_str(), O_RDWR /* required */ | O_NONBLOCK, 0);
    CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << _deviceName << "): deviceHandle=" << deviceHandle);
    if (deviceHandle == -1)
        return false;

    return initCapture();
}
942
bool CvCaptureCAM_V4L::read_frame_v4l2()
{
    v4l2_buffer buf = v4l2_buffer();
    v4l2_plane mplanes[VIDEO_MAX_PLANES];
    buf.type = type;
    buf.memory = V4L2_MEMORY_MMAP;
    if (V4L2_TYPE_IS_MULTIPLANAR(type)) {
        buf.m.planes = mplanes;
        buf.length = VIDEO_MAX_PLANES;
    }

    while (!tryIoctl(VIDIOC_DQBUF, &buf)) {
        int err = errno;
        if (err == EIO && !(buf.flags & (V4L2_BUF_FLAG_QUEUED | V4L2_BUF_FLAG_DONE))) {
            // Maybe the buffer is not in the queue? Try to put it there.
            if (!tryIoctl(VIDIOC_QBUF, &buf))
                return false;
            continue;
        }
        /* display the error and stop processing */
        returnFrame = false;
        CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): can't read frame (VIDIOC_DQBUF): errno=" << err << " (" << strerror(err) << ")");
        return false;
    }

    CV_Assert(buf.index < req.count);

    if (V4L2_TYPE_IS_MULTIPLANAR(type)) {
        for (unsigned char n_planes = 0; n_planes < num_planes; n_planes++)
            CV_Assert(buffers[buf.index].memories[n_planes].length == buf.m.planes[n_planes].length);
    } else
        CV_Assert(buffers[buf.index].memories[MEMORY_ORIG].length == buf.length);

    // Don't re-queue this buffer until the frame has been retrieved from it.
    buffers[buf.index].buffer = buf;
    bufferIndex = buf.index;

    if (V4L2_TYPE_IS_MULTIPLANAR(type)) {
        __u32 offset = 0;

        buffers[buf.index].buffer.m.planes = buffers[buf.index].planes;
        memcpy(buffers[buf.index].planes, buf.m.planes, sizeof(mplanes));

        for (unsigned char n_planes = 0; n_planes < num_planes; n_planes++) {
            __u32 bytesused;
            bytesused = buffers[buf.index].planes[n_planes].bytesused -
                        buffers[buf.index].planes[n_planes].data_offset;
            offset += bytesused;
        }
        buffers[buf.index].bytesused = offset;
    } else
        buffers[buf.index].bytesused = buffers[buf.index].buffer.bytesused;

    // Set the timestamp in the capture struct to the timestamp of the most recent frame.
    timestamp = buf.timestamp;
    return true;
}

bool CvCaptureCAM_V4L::tryIoctl(unsigned long ioctlCode, void *parameter, bool failIfBusy, int attempts) const
{
    CV_Assert(attempts > 0);
    CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): tryIoctl(" << deviceHandle << ", "
            << decode_ioctl_code(ioctlCode) << "(" << ioctlCode << "), failIfBusy=" << failIfBusy << ")"
    );
    while (true)
    {
        errno = 0;
        int result = ioctl(deviceHandle, ioctlCode, parameter);
        int err = errno;
        CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): call ioctl(" << deviceHandle << ", "
                << decode_ioctl_code(ioctlCode) << "(" << ioctlCode << "), ...) => "
                << result << " errno=" << err << " (" << strerror(err) << ")"
        );

        if (result != -1)
            return true; // success

        const bool isBusy = (err == EBUSY);
        if (isBusy && failIfBusy)
        {
            CV_LOG_INFO(NULL, "VIDEOIO(V4L2:" << deviceName << "): ioctl returns with errno=EBUSY");
            return false;
        }
        if (!(isBusy || err == EAGAIN)) // test the saved errno; the log call above may clobber the global
            return false;

        if (--attempts == 0) {
            return false;
        }

        fd_set fds;
        FD_ZERO(&fds);
        FD_SET(deviceHandle, &fds);

        /* Timeout. */
        static int param_v4l_select_timeout = (int)utils::getConfigurationParameterSizeT("OPENCV_VIDEOIO_V4L_SELECT_TIMEOUT", 10);
        struct timeval tv;
        tv.tv_sec = param_v4l_select_timeout;
        tv.tv_usec = 0;

        errno = 0;
        result = select(deviceHandle + 1, &fds, NULL, NULL, &tv);
        err = errno;

        if (0 == result)
        {
            CV_LOG_WARNING(NULL, "VIDEOIO(V4L2:" << deviceName << "): select() timeout.");
            return false;
        }

        CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): select(" << deviceHandle << ") => "
                << result << " errno = " << err << " (" << strerror(err) << ")"
        );

        if (EINTR == err) // don't loop if a signal occurred, e.g. Ctrl+C
        {
            return false;
        }
    }
    return true;
}

bool CvCaptureCAM_V4L::grabFrame()
{
    if (havePendingFrame) // frame has already been grabbed during preroll
    {
        return true;
    }

    if (FirstCapture)
    {
        /* Some general initialization must take place the first time through */

        /* This is just a technicality, but all buffers must be filled up before any
           staggered SYNC is applied. So, fill them up. (see V4L HowTo) */
        bufferIndex = -1;
        for (__u32 index = 0; index < req.count; ++index) {
            v4l2_buffer buf = v4l2_buffer();
            v4l2_plane mplanes[VIDEO_MAX_PLANES];

            buf.type = type;
            buf.memory = V4L2_MEMORY_MMAP;
            buf.index = index;
            if (V4L2_TYPE_IS_MULTIPLANAR(type)) {
                buf.m.planes = mplanes;
                buf.length = VIDEO_MAX_PLANES;
            }

            if (!tryIoctl(VIDIOC_QBUF, &buf)) {
                CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): failed VIDIOC_QBUF (buffer=" << index << "): errno=" << errno << " (" << strerror(errno) << ")");
                return false;
            }
        }

        if (!streaming(true)) {
            return false;
        }

        // No need to skip this if the first read returns false
        /* preparation is ok */
        FirstCapture = false;

#if defined(V4L_ABORT_BADJPEG)
        // Skip the first frame. It is often bad -- this goes unnoticed in traditional apps,
        // but could be fatal if bad JPEG is enabled.
        if (!read_frame_v4l2())
            return false;
#endif
    }
    // In case the previous grabFrame() was not followed by retrieveFrame(), re-queue the buffer.
    if (bufferIndex >= 0)
    {
        if (!tryIoctl(VIDIOC_QBUF, &buffers[bufferIndex].buffer))
        {
            CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): failed VIDIOC_QBUF (buffer=" << bufferIndex << "): errno=" << errno << " (" << strerror(errno) << ")");
        }
    }
    return read_frame_v4l2();
}

/*
 * Turn a YUV4:2:0 block into an RGB block
 *
 * Video4Linux seems to use the blue, green, red channel
 * order convention -- rgb[0] is blue, rgb[1] is green, rgb[2] is red.
 *
 * Color space conversion coefficients taken from the excellent
 * http://www.inforamp.net/~poynton/ColorFAQ.html
 * In his terminology, this is a CCIR 601.1 YCbCr -> RGB.
 * Y values are given for all 4 pixels, but the U (Pb)
 * and V (Pr) are assumed constant over the 2x2 block.
 *
 * To avoid floating point arithmetic, the color conversion
 * coefficients are scaled into 16.16 fixed-point integers.
 * They were determined as follows:
 *
 *  double brightness = 1.0;  (0->black; 1->full scale)
 *  double saturation = 1.0;  (0->greyscale; 1->full color)
 *  double fixScale = brightness * 256 * 256;
 *  int rvScale = (int)(1.402 * saturation * fixScale);
 *  int guScale = (int)(-0.344136 * saturation * fixScale);
 *  int gvScale = (int)(-0.714136 * saturation * fixScale);
 *  int buScale = (int)(1.772 * saturation * fixScale);
 *  int yScale = (int)(fixScale);
 */

/* LIMIT: convert a 16.16 fixed-point value to a byte, with clipping. */
#define LIMIT(x) ((x)>0xffffff?0xff: ((x)<=0xffff?0:((x)>>16)))

static inline void
move_411_block(int yTL, int yTR, int yBL, int yBR, int u, int v,
               int /*rowPixels*/, unsigned char * rgb)
{
    const int rvScale = 91881;
    const int guScale = -22553;
    const int gvScale = -46801;
    const int buScale = 116129;
    const int yScale  = 65536;
    int r, g, b;

    g = guScale * u + gvScale * v;
//  if (force_rgb) {
//      r = buScale * u;
//      b = rvScale * v;
//  } else {
        r = rvScale * v;
        b = buScale * u;
//  }

    yTL *= yScale; yTR *= yScale;
    yBL *= yScale; yBR *= yScale;

    /* Write out the first two pixels */
    rgb[0] = LIMIT(b+yTL); rgb[1] = LIMIT(g+yTL);
    rgb[2] = LIMIT(r+yTL);

    rgb[3] = LIMIT(b+yTR); rgb[4] = LIMIT(g+yTR);
    rgb[5] = LIMIT(r+yTR);

    /* Write out the last two pixels */
    rgb += 6;
    rgb[0] = LIMIT(b+yBL); rgb[1] = LIMIT(g+yBL);
    rgb[2] = LIMIT(r+yBL);

    rgb[3] = LIMIT(b+yBR); rgb[4] = LIMIT(g+yBR);
    rgb[5] = LIMIT(r+yBR);
}

// Consider a YUV411P image of 8x2 pixels.
//
// A plane of Y values as before.
//
// A plane of U values   1   2
//                       3   4
//
// A plane of V values   1   2
//                       3   4
//
// The U1/V1 samples correspond to the ABCD pixels.
//     U2/V2 samples correspond to the EFGH pixels.
//
/* Converts from planar YUV411P to RGB24. */
/* [FD] untested... */
static void
yuv411p_to_rgb24(int width, int height,
                 unsigned char *pIn0, unsigned char *pOut0)
{
    const int numpix = width * height;
    const int bytes = 24 >> 3; // 3 bytes per RGB24 pixel
    int i, j, y00, y01, y10, y11, u, v;
    unsigned char *pY = pIn0;
    unsigned char *pU = pY + numpix;
    unsigned char *pV = pU + numpix / 4;
    unsigned char *pOut = pOut0;

    for (j = 0; j < height; j++) { // '<', not '<=': one extra row would overrun the buffers
        for (i = 0; i <= width - 4; i += 4) {
            y00 = *pY;
            y01 = *(pY + 1);
            y10 = *(pY + 2);
            y11 = *(pY + 3);
            u = (*pU++) - 128;
            v = (*pV++) - 128;

            move_411_block(y00, y01, y10, y11, u, v,
                           width, pOut);

            pY += 4;
            pOut += 4 * bytes;
        }
    }
}

#define CLAMP(x) ((x)<0?0:((x)>255)?255:(x))

typedef struct {
    int is_abs;
    int len;
    int val;
} code_table_t;


/* local storage */
static code_table_t table[256];
static int init_done = 0;


/*
    sonix_decompress_init
    =====================
    Pre-calculates a locally stored table for efficient Huffman decoding.

    Each entry at index x in the table represents the codeword
    present at the MSB of byte x.
*/
static void sonix_decompress_init(void)
{
    int i;
    int is_abs, val, len;

    for (i = 0; i < 256; i++) {
        is_abs = 0;
        val = 0;
        len = 0;
        if ((i & 0x80) == 0) {
            /* code 0 */
            val = 0;
            len = 1;
        }
        else if ((i & 0xE0) == 0x80) {
            /* code 100 */
            val = +4;
            len = 3;
        }
        else if ((i & 0xE0) == 0xA0) {
            /* code 101 */
            val = -4;
            len = 3;
        }
        else if ((i & 0xF0) == 0xD0) {
            /* code 1101 */
            val = +11;
            len = 4;
        }
        else if ((i & 0xF0) == 0xF0) {
            /* code 1111 */
            val = -11;
            len = 4;
        }
        else if ((i & 0xF8) == 0xC8) {
            /* code 11001 */
            val = +20;
            len = 5;
        }
        else if ((i & 0xFC) == 0xC0) {
            /* code 110000 */
            val = -20;
            len = 6;
        }
        else if ((i & 0xFC) == 0xC4) {
            /* code 110001xx: unknown */
            val = 0;
            len = 8;
        }
        else if ((i & 0xF0) == 0xE0) {
            /* code 1110xxxx */
            is_abs = 1;
            val = (i & 0x0F) << 4;
            len = 8;
        }
        table[i].is_abs = is_abs;
        table[i].val = val;
        table[i].len = len;
    }

    init_done = 1;
}


/*
    sonix_decompress
    ================
    Decompresses an image encoded by a SN9C101 camera controller chip.

    IN    width
          height
          inp    pointer to compressed frame (with header already stripped)
    OUT   outp   pointer to decompressed frame

    Returns 0 if the operation was successful.
    Returns <0 if the operation failed.
*/
static int sonix_decompress(int width, int height, unsigned char *inp, unsigned char *outp)
{
    int row, col;
    int val;
    int bitpos;
    unsigned char code;
    unsigned char *addr;

    if (!init_done) {
        /* do sonix_decompress_init first! */
        return -1;
    }

    bitpos = 0;
    for (row = 0; row < height; row++) {

        col = 0;

        /* first two pixels in first two rows are stored as raw 8-bit */
        if (row < 2) {
            addr = inp + (bitpos >> 3);
            code = (addr[0] << (bitpos & 7)) | (addr[1] >> (8 - (bitpos & 7)));
            bitpos += 8;
            *outp++ = code;

            addr = inp + (bitpos >> 3);
            code = (addr[0] << (bitpos & 7)) | (addr[1] >> (8 - (bitpos & 7)));
            bitpos += 8;
            *outp++ = code;

            col += 2;
        }

        while (col < width) {
            /* get bitcode from bitstream */
            addr = inp + (bitpos >> 3);
            code = (addr[0] << (bitpos & 7)) | (addr[1] >> (8 - (bitpos & 7)));

            /* update bit position */
            bitpos += table[code].len;

            /* calculate pixel value */
            val = table[code].val;
            if (!table[code].is_abs) {
                /* value is relative to top and left pixel */
                if (col < 2) {
                    /* left column: relative to top pixel */
                    val += outp[-2*width];
                }
                else if (row < 2) {
                    /* top row: relative to left pixel */
                    val += outp[-2];
                }
                else {
                    /* main area: average of left pixel and top pixel */
                    val += (outp[-2] + outp[-2*width]) / 2;
                }
            }

            /* store pixel */
            *outp++ = CLAMP(val);
            col++;
        }
    }

    return 0;
}

void CvCaptureCAM_V4L::convertToRgb(const Buffer &currentBuffer)
{
    cv::Size imageSize;
    unsigned char *start;

    if (V4L2_TYPE_IS_MULTIPLANAR(type)) {
        __u32 offset = 0;
        start = (unsigned char*)buffers[MAX_V4L_BUFFERS].memories[MEMORY_ORIG].start;
        for (unsigned char n_planes = 0; n_planes < num_planes; n_planes++) {
            __u32 data_offset, bytesused;
            data_offset = currentBuffer.planes[n_planes].data_offset;
            bytesused = currentBuffer.planes[n_planes].bytesused - data_offset;
            memcpy(start + offset, (char *)currentBuffer.memories[n_planes].start + data_offset,
                   std::min(currentBuffer.memories[n_planes].length, (size_t)bytesused));
            offset += bytesused;
        }

        imageSize = cv::Size(form.fmt.pix_mp.width, form.fmt.pix_mp.height);
    } else {
        start = (unsigned char*)currentBuffer.memories[MEMORY_ORIG].start;

        imageSize = cv::Size(form.fmt.pix.width, form.fmt.pix.height);
    }

    frame.create(imageSize, CV_8UC3);

    switch (palette) {
    case V4L2_PIX_FMT_YUV411P:
        yuv411p_to_rgb24(imageSize.width, imageSize.height, start, frame.data);
        return;
    case V4L2_PIX_FMT_YVU420:
        cv::cvtColor(cv::Mat(imageSize.height * 3 / 2, imageSize.width, CV_8U, start), frame,
                     COLOR_YUV2BGR_YV12);
        return;
    case V4L2_PIX_FMT_YUV420:
        cv::cvtColor(cv::Mat(imageSize.height * 3 / 2, imageSize.width, CV_8U, start), frame,
                     COLOR_YUV2BGR_IYUV);
        return;
    case V4L2_PIX_FMT_NV12:
        cv::cvtColor(cv::Mat(imageSize.height * 3 / 2, imageSize.width, CV_8U, start), frame,
                     COLOR_YUV2BGR_NV12);
        return;
    case V4L2_PIX_FMT_NV21:
        cv::cvtColor(cv::Mat(imageSize.height * 3 / 2, imageSize.width, CV_8U, start), frame,
                     COLOR_YUV2BGR_NV21);
        return;
#ifdef HAVE_JPEG
    case V4L2_PIX_FMT_MJPEG:
    case V4L2_PIX_FMT_JPEG:
        CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): decoding JPEG frame: size=" << currentBuffer.bytesused);
        cv::imdecode(Mat(1, currentBuffer.bytesused, CV_8U, start), IMREAD_COLOR, &frame);
        return;
#endif
    case V4L2_PIX_FMT_YUYV:
        cv::cvtColor(cv::Mat(imageSize, CV_8UC2, start), frame, COLOR_YUV2BGR_YUYV);
        return;
    case V4L2_PIX_FMT_UYVY:
        cv::cvtColor(cv::Mat(imageSize, CV_8UC2, start), frame, COLOR_YUV2BGR_UYVY);
        return;
    case V4L2_PIX_FMT_RGB24:
        cv::cvtColor(cv::Mat(imageSize, CV_8UC3, start), frame, COLOR_RGB2BGR);
        return;
    case V4L2_PIX_FMT_Y16:
    {
        // https://www.kernel.org/doc/html/v4.10/media/uapi/v4l/pixfmt-y16.html
        // This is a grey-scale image with a depth of 16 bits per pixel. The least significant byte is stored at lower memory addresses (little-endian).
        // Note: 10-bit precision is not supported
        cv::Mat temp(imageSize, CV_8UC1, buffers[MAX_V4L_BUFFERS].memories[MEMORY_RGB].start);
        cv::extractChannel(cv::Mat(imageSize, CV_8UC2, start), temp, 1); // 1 - second channel
        cv::cvtColor(temp, frame, COLOR_GRAY2BGR);
        return;
    }
    case V4L2_PIX_FMT_Y16_BE:
    {
        // https://www.kernel.org/doc/html/v4.10/media/uapi/v4l/pixfmt-y16-be.html
        // This is a grey-scale image with a depth of 16 bits per pixel. The most significant byte is stored at lower memory addresses (big-endian).
        // Note: 10-bit precision is not supported
        cv::Mat temp(imageSize, CV_8UC1, buffers[MAX_V4L_BUFFERS].memories[MEMORY_RGB].start);
        cv::extractChannel(cv::Mat(imageSize, CV_8UC2, start), temp, 0); // 0 - first channel
        cv::cvtColor(temp, frame, COLOR_GRAY2BGR);
        return;
    }
    case V4L2_PIX_FMT_Y12:
    {
        cv::Mat temp(imageSize, CV_8UC1, buffers[MAX_V4L_BUFFERS].memories[MEMORY_RGB].start);
        cv::Mat(imageSize, CV_16UC1, start).convertTo(temp, CV_8U, 1.0 / 16);
        cv::cvtColor(temp, frame, COLOR_GRAY2BGR);
        return;
    }
    case V4L2_PIX_FMT_Y10:
    {
        cv::Mat temp(imageSize, CV_8UC1, buffers[MAX_V4L_BUFFERS].memories[MEMORY_RGB].start);
        cv::Mat(imageSize, CV_16UC1, start).convertTo(temp, CV_8U, 1.0 / 4);
        cv::cvtColor(temp, frame, COLOR_GRAY2BGR);
        return;
    }
    case V4L2_PIX_FMT_SN9C10X:
    {
        sonix_decompress_init();
        sonix_decompress(imageSize.width, imageSize.height,
                         start, (unsigned char*)buffers[MAX_V4L_BUFFERS].memories[MEMORY_RGB].start);

        cv::Mat cv_buf(imageSize, CV_8UC1, buffers[MAX_V4L_BUFFERS].memories[MEMORY_RGB].start);
        cv::cvtColor(cv_buf, frame, COLOR_BayerRG2BGR);
        return;
    }
    case V4L2_PIX_FMT_SRGGB8:
    {
        cv::cvtColor(cv::Mat(imageSize, CV_8UC1, start), frame, COLOR_BayerBG2BGR);
        return;
    }
    case V4L2_PIX_FMT_SBGGR8:
    {
        cv::cvtColor(cv::Mat(imageSize, CV_8UC1, start), frame, COLOR_BayerRG2BGR);
        return;
    }
    case V4L2_PIX_FMT_SGBRG8:
    {
        cv::cvtColor(cv::Mat(imageSize, CV_8UC1, start), frame, COLOR_BayerGR2BGR);
        return;
    }
    case V4L2_PIX_FMT_SGRBG8:
    {
        cv::cvtColor(cv::Mat(imageSize, CV_8UC1, start), frame, COLOR_BayerGB2BGR);
        return;
    }
    case V4L2_PIX_FMT_GREY:
        cv::cvtColor(cv::Mat(imageSize, CV_8UC1, start), frame, COLOR_GRAY2BGR);
        return;
    case V4L2_PIX_FMT_XBGR32:
    case V4L2_PIX_FMT_ABGR32:
        cv::cvtColor(cv::Mat(imageSize, CV_8UC4, start), frame, COLOR_BGRA2BGR);
        return;
    case V4L2_PIX_FMT_BGR24:
    default:
        Mat(1, currentBuffer.bytesused, CV_8U, start).reshape(frame.channels(), frame.rows).copyTo(frame);
        return;
    }
}

static inline cv::String capPropertyName(int prop)
{
    switch (prop) {
    case cv::CAP_PROP_POS_MSEC:
        return "pos_msec";
    case cv::CAP_PROP_POS_FRAMES:
        return "pos_frames";
    case cv::CAP_PROP_POS_AVI_RATIO:
        return "pos_avi_ratio";
    case cv::CAP_PROP_FRAME_COUNT:
        return "frame_count";
    case cv::CAP_PROP_FRAME_HEIGHT:
        return "height";
    case cv::CAP_PROP_FRAME_WIDTH:
        return "width";
    case cv::CAP_PROP_CONVERT_RGB:
        return "convert_rgb";
    case cv::CAP_PROP_FORMAT:
        return "format";
    case cv::CAP_PROP_MODE:
        return "mode";
    case cv::CAP_PROP_FOURCC:
        return "fourcc";
    case cv::CAP_PROP_AUTO_EXPOSURE:
        return "auto_exposure";
    case cv::CAP_PROP_EXPOSURE:
        return "exposure";
    case cv::CAP_PROP_TEMPERATURE:
        return "temperature";
    case cv::CAP_PROP_FPS:
        return "fps";
    case cv::CAP_PROP_BRIGHTNESS:
        return "brightness";
    case cv::CAP_PROP_CONTRAST:
        return "contrast";
    case cv::CAP_PROP_SATURATION:
        return "saturation";
    case cv::CAP_PROP_HUE:
        return "hue";
    case cv::CAP_PROP_GAIN:
        return "gain";
    case cv::CAP_PROP_RECTIFICATION:
        return "rectification";
    case cv::CAP_PROP_MONOCHROME:
        return "monochrome";
    case cv::CAP_PROP_SHARPNESS:
        return "sharpness";
    case cv::CAP_PROP_GAMMA:
        return "gamma";
    case cv::CAP_PROP_TRIGGER:
        return "trigger";
    case cv::CAP_PROP_TRIGGER_DELAY:
        return "trigger_delay";
    case cv::CAP_PROP_WHITE_BALANCE_RED_V:
        return "white_balance_red_v";
    case cv::CAP_PROP_ZOOM:
        return "zoom";
    case cv::CAP_PROP_FOCUS:
        return "focus";
    case cv::CAP_PROP_GUID:
        return "guid";
    case cv::CAP_PROP_ISO_SPEED:
        return "iso_speed";
    case cv::CAP_PROP_BACKLIGHT:
        return "backlight";
    case cv::CAP_PROP_PAN:
        return "pan";
    case cv::CAP_PROP_TILT:
        return "tilt";
    case cv::CAP_PROP_ROLL:
        return "roll";
    case cv::CAP_PROP_IRIS:
        return "iris";
    case cv::CAP_PROP_SETTINGS:
        return "dialog_settings";
    case cv::CAP_PROP_BUFFERSIZE:
        return "buffersize";
    case cv::CAP_PROP_AUTOFOCUS:
        return "autofocus";
    case cv::CAP_PROP_WHITE_BALANCE_BLUE_U:
        return "white_balance_blue_u";
    case cv::CAP_PROP_SAR_NUM:
        return "sar_num";
    case cv::CAP_PROP_SAR_DEN:
        return "sar_den";
    case CAP_PROP_AUTO_WB:
        return "auto wb";
    case CAP_PROP_WB_TEMPERATURE:
        return "wb temperature";
    case CAP_PROP_ORIENTATION_META:
        return "orientation meta";
    case CAP_PROP_ORIENTATION_AUTO:
        return "orientation auto";
    default:
        return cv::format("unknown (%d)", prop);
    }
}

static inline int capPropertyToV4L2(int prop)
{
    switch (prop) {
    case cv::CAP_PROP_FPS:
        return -1;
    case cv::CAP_PROP_FOURCC:
        return -1;
    case cv::CAP_PROP_FRAME_COUNT:
        return V4L2_CID_MPEG_VIDEO_B_FRAMES;
    case cv::CAP_PROP_FORMAT:
        return -1;
    case cv::CAP_PROP_MODE:
        return -1;
    case cv::CAP_PROP_BRIGHTNESS:
        return V4L2_CID_BRIGHTNESS;
    case cv::CAP_PROP_CONTRAST:
        return V4L2_CID_CONTRAST;
    case cv::CAP_PROP_SATURATION:
        return V4L2_CID_SATURATION;
    case cv::CAP_PROP_HUE:
        return V4L2_CID_HUE;
    case cv::CAP_PROP_GAIN:
        return V4L2_CID_GAIN;
    case cv::CAP_PROP_EXPOSURE:
        return V4L2_CID_EXPOSURE_ABSOLUTE;
    case cv::CAP_PROP_CONVERT_RGB:
        return -1;
    case cv::CAP_PROP_WHITE_BALANCE_BLUE_U:
        return V4L2_CID_BLUE_BALANCE;
    case cv::CAP_PROP_RECTIFICATION:
        return -1;
    case cv::CAP_PROP_MONOCHROME:
        return -1;
    case cv::CAP_PROP_SHARPNESS:
        return V4L2_CID_SHARPNESS;
    case cv::CAP_PROP_AUTO_EXPOSURE:
        return V4L2_CID_EXPOSURE_AUTO;
    case cv::CAP_PROP_GAMMA:
        return V4L2_CID_GAMMA;
    case cv::CAP_PROP_TEMPERATURE:
        return V4L2_CID_WHITE_BALANCE_TEMPERATURE;
    case cv::CAP_PROP_TRIGGER:
        return -1;
    case cv::CAP_PROP_TRIGGER_DELAY:
        return -1;
    case cv::CAP_PROP_WHITE_BALANCE_RED_V:
        return V4L2_CID_RED_BALANCE;
    case cv::CAP_PROP_ZOOM:
        return V4L2_CID_ZOOM_ABSOLUTE;
    case cv::CAP_PROP_FOCUS:
        return V4L2_CID_FOCUS_ABSOLUTE;
    case cv::CAP_PROP_GUID:
        return -1;
    case cv::CAP_PROP_ISO_SPEED:
        return V4L2_CID_ISO_SENSITIVITY;
    case cv::CAP_PROP_BACKLIGHT:
        return V4L2_CID_BACKLIGHT_COMPENSATION;
    case cv::CAP_PROP_PAN:
        return V4L2_CID_PAN_ABSOLUTE;
    case cv::CAP_PROP_TILT:
        return V4L2_CID_TILT_ABSOLUTE;
    case cv::CAP_PROP_ROLL:
        return V4L2_CID_ROTATE;
    case cv::CAP_PROP_IRIS:
        return V4L2_CID_IRIS_ABSOLUTE;
    case cv::CAP_PROP_SETTINGS:
        return -1;
    case cv::CAP_PROP_BUFFERSIZE:
        return -1;
    case cv::CAP_PROP_AUTOFOCUS:
        return V4L2_CID_FOCUS_AUTO;
    case cv::CAP_PROP_SAR_NUM:
        return V4L2_CID_MPEG_VIDEO_H264_VUI_EXT_SAR_HEIGHT;
    case cv::CAP_PROP_SAR_DEN:
        return V4L2_CID_MPEG_VIDEO_H264_VUI_EXT_SAR_WIDTH;
    case CAP_PROP_AUTO_WB:
        return V4L2_CID_AUTO_WHITE_BALANCE;
    case CAP_PROP_WB_TEMPERATURE:
        return V4L2_CID_WHITE_BALANCE_TEMPERATURE;
    default:
        break;
    }
    return -1;
}

static inline bool compatibleRange(int property_id)
{
    switch (property_id) {
    case cv::CAP_PROP_BRIGHTNESS:
    case cv::CAP_PROP_CONTRAST:
    case cv::CAP_PROP_SATURATION:
    case cv::CAP_PROP_HUE:
    case cv::CAP_PROP_GAIN:
    case cv::CAP_PROP_EXPOSURE:
    case cv::CAP_PROP_FOCUS:
    case cv::CAP_PROP_AUTOFOCUS:
    case cv::CAP_PROP_AUTO_EXPOSURE:
        return true;
    default:
        break;
    }
    return false;
}

bool CvCaptureCAM_V4L::controlInfo(int property_id, __u32 &_v4l2id, cv::Range &range) const
{
    /* initialisations */
    int v4l2id = capPropertyToV4L2(property_id);
    v4l2_queryctrl queryctrl = v4l2_queryctrl();
    queryctrl.id = __u32(v4l2id);
    if (v4l2id == -1 || !tryIoctl(VIDIOC_QUERYCTRL, &queryctrl)) {
        CV_LOG_INFO(NULL, "VIDEOIO(V4L2:" << deviceName << "): property '" << capPropertyName(property_id) << "' is not supported");
        return false;
    }
    _v4l2id = __u32(v4l2id);
    range = cv::Range(queryctrl.minimum, queryctrl.maximum);
    if (normalizePropRange) {
        switch(property_id)
        {
        case CAP_PROP_WB_TEMPERATURE:
        case CAP_PROP_AUTO_WB:
        case CAP_PROP_AUTOFOCUS:
            range = Range(0, 1); // do not convert
            break;
        case CAP_PROP_AUTO_EXPOSURE:
            range = Range(0, 4);
            break;
        default:
            break;
        }
    }
    return true;
}

bool CvCaptureCAM_V4L::icvControl(__u32 v4l2id, int &value, bool isSet) const
{
    /* set which control we want to set */
    v4l2_control control = v4l2_control();
    control.id = v4l2id;
    control.value = value;

    /* The driver may clamp the value or return ERANGE, ignored here */
    if (!tryIoctl(isSet ? VIDIOC_S_CTRL : VIDIOC_G_CTRL, &control)) {
        int err = errno;
        CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): failed " << (isSet ? "VIDIOC_S_CTRL" : "VIDIOC_G_CTRL") << ": errno=" << err << " (" << strerror(err) << ")");
        switch (err) {
#ifndef NDEBUG
        case EINVAL:
            fprintf(stderr,
                    "The struct v4l2_control id is invalid or the value is inappropriate for the given control (i.e. "
                    "if a menu item is selected that is not supported by the driver according to VIDIOC_QUERYMENU).");
            break;
        case ERANGE:
            fprintf(stderr, "The struct v4l2_control value is out of bounds.");
            break;
        case EACCES:
            fprintf(stderr, "Attempt to set a read-only control or to get a write-only control.");
            break;
#endif
        default:
            break;
        }
        return false;
    }
    if (!isSet)
        value = control.value;
    return true;
}

double CvCaptureCAM_V4L::getProperty(int property_id) const
{
    switch (property_id) {
    case cv::CAP_PROP_FRAME_WIDTH:
        if (V4L2_TYPE_IS_MULTIPLANAR(type))
            return form.fmt.pix_mp.width;
        else
            return form.fmt.pix.width;
    case cv::CAP_PROP_FRAME_HEIGHT:
        if (V4L2_TYPE_IS_MULTIPLANAR(type))
            return form.fmt.pix_mp.height;
        else
            return form.fmt.pix.height;
    case cv::CAP_PROP_FOURCC:
        return palette;
    case cv::CAP_PROP_FORMAT:
        return frame.type();
    case cv::CAP_PROP_MODE:
        if (normalizePropRange)
            return palette;
        return normalizePropRange;
    case cv::CAP_PROP_CONVERT_RGB:
        return convert_rgb;
    case cv::CAP_PROP_BUFFERSIZE:
        return bufferSize;
    case cv::CAP_PROP_FPS:
    {
        v4l2_streamparm sp = v4l2_streamparm();
        sp.type = type;
        if (!tryIoctl(VIDIOC_G_PARM, &sp)) {
            CV_LOG_WARNING(NULL, "VIDEOIO(V4L2:" << deviceName << "): Unable to get camera FPS");
            return -1;
        }
        return sp.parm.capture.timeperframe.denominator / (double)sp.parm.capture.timeperframe.numerator;
    }
    case cv::CAP_PROP_POS_MSEC:
        if (FirstCapture)
            return 0;

        return 1000 * timestamp.tv_sec + ((double)timestamp.tv_usec) / 1000;
    case cv::CAP_PROP_CHANNEL:
        return channelNumber;
    default:
    {
        cv::Range range;
        __u32 v4l2id;
        if (!controlInfo(property_id, v4l2id, range))
            return -1.0;
        int value = 0;
        if (!icvControl(v4l2id, value, false))
            return -1.0;
        if (normalizePropRange && compatibleRange(property_id))
            return ((double)value - range.start) / range.size();
        return value;
    }
    }
}

bool CvCaptureCAM_V4L::icvSetFrameSize(int _width, int _height)
{
    if (_width > 0)
        width_set = _width;

    if (_height > 0)
        height_set = _height;

    /* two subsequent calls setting WIDTH and HEIGHT will change
       the video size */
    if (width_set <= 0 || height_set <= 0)
        return true;

    width = width_set;
    height = height_set;
    width_set = height_set = 0;
    return v4l2_reset();
}

bool CvCaptureCAM_V4L::setProperty( int property_id, double _value )
{
    int value = cvRound(_value);
    switch (property_id) {
    case cv::CAP_PROP_FRAME_WIDTH:
        return icvSetFrameSize(value, 0);
    case cv::CAP_PROP_FRAME_HEIGHT:
        return icvSetFrameSize(0, value);
    case cv::CAP_PROP_FPS:
        if (fps == static_cast<__u32>(value))
            return true;
        return setFps(value);
    case cv::CAP_PROP_CONVERT_RGB:
        if (bool(value)) {
            convert_rgb = convertableToRgb();
            return convert_rgb;
        } else {
            convert_rgb = false;
            return true;
        }
    case cv::CAP_PROP_FOURCC:
    {
        __u32 new_palette = static_cast<__u32>(_value);
        if (palette == new_palette)
            return true;

        __u32 old_palette = palette;
        palette = new_palette;

        if (v4l2_reset())
            return true;

        palette = old_palette;
        v4l2_reset();
        return false;
    }
    case cv::CAP_PROP_MODE:
        normalizePropRange = bool(value);
        return true;
    case cv::CAP_PROP_BUFFERSIZE:
        if (bufferSize == value)
            return true;

        if (value > MAX_V4L_BUFFERS || value < 1) {
            CV_LOG_WARNING(NULL, "VIDEOIO(V4L2:" << deviceName << "): Bad buffer size " << value << ", buffer size must be from 1 to " << MAX_V4L_BUFFERS);
            return false;
        }
        bufferSize = value;
        return v4l2_reset();
    case cv::CAP_PROP_CHANNEL:
    {
        if (value < 0) {
            channelNumber = -1;
            return true;
        }
        if (channelNumber == value)
            return true;

        int old_channel = channelNumber;
        channelNumber = value;
        if (v4l2_reset())
            return true;

        channelNumber = old_channel;
        v4l2_reset();
        return false;
    }
    default:
    {
        cv::Range range;
        __u32 v4l2id;
        if (!controlInfo(property_id, v4l2id, range))
            return false;
        if (normalizePropRange && compatibleRange(property_id))
            value = cv::saturate_cast<int>(_value * range.size() + range.start);
        return icvControl(v4l2id, value, true);
    }
    }
    return false;
}

void CvCaptureCAM_V4L::releaseBuffers()
{
    if (buffers[MAX_V4L_BUFFERS].memories[MEMORY_ORIG].start) {
        free(buffers[MAX_V4L_BUFFERS].memories[MEMORY_ORIG].start);
        buffers[MAX_V4L_BUFFERS].memories[MEMORY_ORIG].start = 0;
    }

    if (buffers[MAX_V4L_BUFFERS].memories[MEMORY_RGB].start) {
        free(buffers[MAX_V4L_BUFFERS].memories[MEMORY_RGB].start);
        buffers[MAX_V4L_BUFFERS].memories[MEMORY_RGB].start = 0;
    }

    bufferIndex = -1;
    FirstCapture = true;

    if (!v4l_buffersRequested)
        return;
    v4l_buffersRequested = false;

    for (unsigned int n_buffers = 0; n_buffers < MAX_V4L_BUFFERS; ++n_buffers) {
        for (unsigned char n_planes = 0; n_planes < num_planes; n_planes++) {
            if (buffers[n_buffers].memories[n_planes].start) {
                if (-1 == munmap(buffers[n_buffers].memories[n_planes].start,
                                 buffers[n_buffers].memories[n_planes].length)) {
                    CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): failed munmap(): errno=" << errno << " (" << strerror(errno) << ")");
                } else {
                    buffers[n_buffers].memories[n_planes].start = 0;
                }
            }
        }
    }
    // Applications can call ioctl VIDIOC_REQBUFS again to change the number of buffers;
    // however, this cannot succeed while any buffers are still mapped. A count value of zero
    // frees all buffers, after aborting or finishing any DMA in progress (an implicit VIDIOC_STREAMOFF).
    requestBuffers(0);
}
2008
bool CvCaptureCAM_V4L::streaming(bool startStream)
{
    if (startStream != v4l_streamStarted)
    {
        if (!isOpened())
        {
            CV_Assert(v4l_streamStarted == false);
            return !startStream;
        }

        bool result = tryIoctl(startStream ? VIDIOC_STREAMON : VIDIOC_STREAMOFF, &type);
        if (result)
        {
            v4l_streamStarted = startStream;
            return true;
        }
        if (startStream)
        {
            CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): failed VIDIOC_STREAMON: errno=" << errno << " (" << strerror(errno) << ")");
        }
        return false;
    }
    return startStream;
}

void CvCaptureCAM_V4L::initFrameNonBGR()
{
    Size size;
    if (V4L2_TYPE_IS_MULTIPLANAR(type)) {
        CV_Assert(form.fmt.pix_mp.width <= (uint)std::numeric_limits<int>::max());
        CV_Assert(form.fmt.pix_mp.height <= (uint)std::numeric_limits<int>::max());
        size = Size{(int)form.fmt.pix_mp.width, (int)form.fmt.pix_mp.height};
    } else {
        CV_Assert(form.fmt.pix.width <= (uint)std::numeric_limits<int>::max());
        CV_Assert(form.fmt.pix.height <= (uint)std::numeric_limits<int>::max());
        size = Size{(int)form.fmt.pix.width, (int)form.fmt.pix.height};
    }

    int image_type = CV_8UC3;
    switch (palette) {
    case V4L2_PIX_FMT_BGR24:
    case V4L2_PIX_FMT_RGB24:
        image_type = CV_8UC3;
        break;
    case V4L2_PIX_FMT_XBGR32:
    case V4L2_PIX_FMT_ABGR32:
        image_type = CV_8UC4;
        break;
    case V4L2_PIX_FMT_YUYV:
    case V4L2_PIX_FMT_UYVY:
        image_type = CV_8UC2;
        break;
    case V4L2_PIX_FMT_YVU420:
    case V4L2_PIX_FMT_YUV420:
    case V4L2_PIX_FMT_NV12:
    case V4L2_PIX_FMT_NV21:
        image_type = CV_8UC1;
        size.height = size.height * 3 / 2; // "1.5" channels
        break;
    case V4L2_PIX_FMT_Y16:
    case V4L2_PIX_FMT_Y16_BE:
    case V4L2_PIX_FMT_Y12:
    case V4L2_PIX_FMT_Y10:
        image_type = CV_16UC1;
        break;
    case V4L2_PIX_FMT_GREY:
        image_type = CV_8UC1;
        break;
    default:
        image_type = CV_8UC1;
        if (bufferIndex < 0)
            size = Size(buffers[MAX_V4L_BUFFERS].memories[MEMORY_ORIG].length, 1);
        else {
            __u32 bytesused = 0;
            if (V4L2_TYPE_IS_MULTIPLANAR(type)) {
                __u32 data_offset;
                for (unsigned char n_planes = 0; n_planes < num_planes; n_planes++) {
                    data_offset = buffers[bufferIndex].planes[n_planes].data_offset;
                    bytesused += buffers[bufferIndex].planes[n_planes].bytesused - data_offset;
                }
            } else {
                bytesused = buffers[bufferIndex].buffer.bytesused;
            }
            size = Size(bytesused, 1);
        }
        break;
    }
    frame.create(size, image_type);
}

bool CvCaptureCAM_V4L::retrieveFrame(int, OutputArray ret)
{
    havePendingFrame = false; // unlock .grab()

    if (bufferIndex < 0)
    {
        frame.copyTo(ret);
        return true;
    }

    /* Copy out what has already been captured and return it */
    const Buffer &currentBuffer = buffers[bufferIndex];
    if (convert_rgb) {
        convertToRgb(currentBuffer);
    } else {
        // For MJPEG streams the frame size may change in between, so the header
        // has to be recreated: no memory was allocated when convert_rgb is off.
        CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): buffer input size=" << currentBuffer.bytesused);

        if (V4L2_TYPE_IS_MULTIPLANAR(type)) {
            // calculate total size
            __u32 bytestotal = 0;
            for (unsigned char n_planes = 0; n_planes < num_planes; n_planes++) {
                const v4l2_plane & cur_plane = currentBuffer.planes[n_planes];
                bytestotal += cur_plane.bytesused - cur_plane.data_offset;
            }
            // allocate frame data
            frame.create(Size(bytestotal, 1), CV_8U);
            // copy each plane into the frame, packed back to back
            __u32 offset = 0;
            for (unsigned char n_planes = 0; n_planes < num_planes; n_planes++) {
                const v4l2_plane & cur_plane = currentBuffer.planes[n_planes];
                const Memory & cur_mem = currentBuffer.memories[n_planes];
                size_t copy_bytes = std::min(cur_mem.length, (size_t)(cur_plane.bytesused - cur_plane.data_offset));
                memcpy(frame.data + offset,
                       (char*)cur_mem.start + cur_plane.data_offset,
                       copy_bytes);
                offset += copy_bytes; // advance so later planes do not overwrite earlier ones
            }
        } else {
            initFrameNonBGR();
            Mat(frame.size(), frame.type(), currentBuffer.memories[MEMORY_ORIG].start).copyTo(frame);
        }
    }
    // Return the buffer to the driver's queue
    if (!tryIoctl(VIDIOC_QBUF, &buffers[bufferIndex].buffer))
    {
        CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): failed VIDIOC_QBUF: errno=" << errno << " (" << strerror(errno) << ")");
    }

    bufferIndex = -1;
    frame.copyTo(ret);
    return true;
}

Ptr<IVideoCapture> create_V4L_capture_cam(int index)
{
    Ptr<CvCaptureCAM_V4L> ret = makePtr<CvCaptureCAM_V4L>();
    if (ret->open(index))
        return ret;
    return NULL;
}

Ptr<IVideoCapture> create_V4L_capture_file(const std::string &filename)
{
    auto ret = makePtr<CvCaptureCAM_V4L>();
    if (ret->open(filename))
        return ret;
    return NULL;
}

static
bool VideoCapture_V4L_deviceHandlePoll(const std::vector<int>& deviceHandles, std::vector<int>& ready, int64 timeoutNs)
{
    CV_Assert(!deviceHandles.empty());
    const size_t N = deviceHandles.size();

    ready.clear(); ready.reserve(N);

    const auto poll_flags = POLLIN | POLLRDNORM | POLLERR;

    std::vector<pollfd> fds; fds.reserve(N);

    for (size_t i = 0; i < N; ++i)
    {
        int handle = deviceHandles[i];
        CV_LOG_DEBUG(NULL, "camera" << i << ": handle = " << handle);
        CV_Assert(handle != 0);
        fds.push_back(pollfd{handle, poll_flags, 0});
    }

    // poll() takes milliseconds; round the nanosecond budget up so a small
    // positive timeout does not degenerate into a zero (non-blocking) wait.
    int timeoutMs = -1;
    if (timeoutNs > 0)
    {
        timeoutMs = saturate_cast<int>((timeoutNs + 999999) / 1000000);
    }

    int ret = poll(fds.data(), N, timeoutMs);
    if (ret == -1)
    {
        perror("poll error");
        return false;
    }

    if (ret == 0)
        return false; // just a timeout, no error

    for (size_t i = 0; i < N; ++i)
    {
        const auto& fd = fds[i];
        CV_LOG_DEBUG(NULL, "camera" << i << ": fd.revents = 0x" << std::hex << fd.revents);
        if ((fd.revents & (POLLIN | POLLRDNORM)) != 0)
        {
            ready.push_back((int)i);
        }
        else if ((fd.revents & POLLERR) != 0)
        {
            CV_Error_(Error::StsError, ("Error is reported for camera stream: %d (handle = %d)", (int)i, deviceHandles[i]));
        }
        // otherwise: not ready
    }
    return true;
}

bool VideoCapture_V4L_waitAny(const std::vector<VideoCapture>& streams, CV_OUT std::vector<int>& ready, int64 timeoutNs)
{
    CV_Assert(!streams.empty());

    const size_t N = streams.size();

    // unwrap the internal API
    std::vector<CvCaptureCAM_V4L*> capPtr(N, NULL);
    for (size_t i = 0; i < N; ++i)
    {
        IVideoCapture* iCap = internal::VideoCapturePrivateAccessor::getIVideoCapture(streams[i]);
        CvCaptureCAM_V4L *ptr_CvCaptureCAM_V4L = dynamic_cast<CvCaptureCAM_V4L*>(iCap);
        CV_Assert(ptr_CvCaptureCAM_V4L);
        capPtr[i] = ptr_CvCaptureCAM_V4L;
    }

    // initialize camera streams and collect the device handles
    std::vector<int> deviceHandles; deviceHandles.reserve(N);
    for (size_t i = 0; i < N; ++i)
    {
        CvCaptureCAM_V4L *ptr = capPtr[i];
        if (ptr->FirstCapture)
        {
            ptr->havePendingFrame = ptr->grabFrame();
            CV_Assert(ptr->havePendingFrame);
            // TODO: Need to filter these cameras out, because a frame is already available
        }
        CV_Assert(ptr->deviceHandle);
        deviceHandles.push_back(ptr->deviceHandle);
    }

    bool res = VideoCapture_V4L_deviceHandlePoll(deviceHandles, ready, timeoutNs);
    for (size_t i = 0; i < ready.size(); ++i)
    {
        int idx = ready[i];
        CvCaptureCAM_V4L *ptr = capPtr[idx];
        ptr->havePendingFrame = ptr->grabFrame();
        CV_Assert(ptr->havePendingFrame);
    }
    return res;
}

} // cv::

#endif

// Source: opencv/modules/videoio/src/cap_v4l.cpp