/* This is the contributed code:

File: cvcap_v4l.cpp
Current Location: ../opencv-0.9.6/otherlibs/videoio

Original Version: 2003-03-12 Magnus Lundin lundin@mlu.mine.nu
Original Comments:

ML: This set of files adds support for firewire and USB cameras.
First it tries to install a firewire camera;
if that fails, it tries a V4L/USB camera.
It has been tested with the motempl sample program.
13
First Patch: August 24, 2004 Travis Wood TravisOCV@tkwood.com
For Release: OpenCV-Linux Beta4 opencv-0.9.6
Tested On: LMLBT44 with 8 video inputs
Patched Comments:

TW: The cv cam utils that came with the initial release of OpenCV for Linux Beta4
were not working. I have rewritten them so they work for me, while at the same time trying
to keep the original code as ML wrote it as unchanged as possible. No one likes to debug
someone else's code, so I resisted changes as much as possible. I have tried to keep the
same "ideas" where applicable, that is, where I could figure out what the previous author
intended. In some areas I just could not help myself and had to "spiffy-it-up" my way.

These drivers should work with V4L frame capture cards other than my bttv-driven
frame capture card.

Rewritten driver for standard V4L mode. Tested using an LMLBT44 video capture card.
Standard bttv drivers are on the LMLBT44 with up to 8 inputs.

This utility was written with the help of the document:
http://pages.cpsc.ucalgary.ca/~sayles/VFL_HowTo
as a general guide for interfacing with the V4L standard.

Made the index value passed to icvOpenCAM_V4L(index) be the number of the
video device source in the /dev tree. An index of -1 uses the original /dev/video.

Index   Device
  0     /dev/video0
  1     /dev/video1
  2     /dev/video2
  3     /dev/video3
  ...
  7     /dev/video7
with
  -1    /dev/video
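
For illustration, a minimal sketch of how this index-to-device mapping is used through
the current OpenCV API (assuming a build with this V4L2 backend enabled):

    #include <opencv2/videoio.hpp>

    int main()
    {
        cv::VideoCapture cap(2, cv::CAP_V4L2);  // opens /dev/video2 via this backend
        if (!cap.isOpened())
            return 1;
        cv::Mat img;
        return (cap.read(img) && !img.empty()) ? 0 : 1;  // grab + retrieve one frame
    }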
48
TW: You can select any video source, but this package was limited from the start to only
ONE camera opened at any ONE time.
This is an original program limitation.
If you are interested, I will make my version available to other OpenCV users. The big
difference in mine is that you may pass the camera number as part of the cv argument, but this
convention is non-standard for current OpenCV calls and the camera number is not currently
passed into the called routine.

Second Patch: August 28, 2004 Sfuncia Fabio fiblan@yahoo.it
For Release: OpenCV-Linux Beta4 Opencv-0.9.6

FS: This patch fixes non-sequential device indices (unplugged devices) and reports the real numCameras.
    For index -1 (icvOpenCAM_V4L) I do not use /dev/video but the first real device available, because
    if /dev/video is a link to /dev/video0 and the device on /dev/video0 is unplugged, /dev/video
    becomes a broken link. I search for the first available device with indexList.
64
Third Patch: December 9, 2004 Frederic Devernay Frederic.Devernay@inria.fr
For Release: OpenCV-Linux Beta4 Opencv-0.9.6

[FD] I modified the following:
 - handle YUV420P, YUV420, and YUV411P palettes (for many webcams) without using floating-point
 - cvGrabFrame should not wait for the end of the first frame, and should return quickly
   (see videoio doc)
 - cvRetrieveFrame should in turn wait for the end of frame capture, and should not
   trigger the capture of the next frame (the user chooses when to do it using GrabFrame).
   To get the old behavior, re-call cvRetrieveFrame just after cvGrabFrame (see the sketch below).
 - having global bufferIndex and FirstCapture variables makes the code non-reentrant
   (e.g. when using several cameras), so these were moved into the CvCapture struct.
 - according to the V4L HowTo, incrementing the buffer index must be done before VIDIOCMCAPTURE.
 - the VID_TYPE_SCALES stuff from the V4L HowTo is wrong: the image size can be changed
   even if the hardware does not support scaling (e.g. webcams can have several
   resolutions available). Just don't try to set the size to 640x480 if the hardware supports
   scaling: open with the default (probably best) image size, and let the user scale it
   using SetProperty.
 - image size can be changed by two subsequent calls to SetProperty (for width and height)
 - bug fix: if the image size changes, realloc the new image only when it is grabbed
 - issue errors only when necessary, fix error message formatting.
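
A sketch of the grab/retrieve split described above, expressed with the current C++ API
(with this backend, cv::VideoCapture::grab()/retrieve() end up in grabFrame()/retrieveFrame() below):

    #include <opencv2/videoio.hpp>

    void capture_two(cv::VideoCapture& cap, cv::Mat& a, cv::Mat& b)
    {
        cap.grab();       // returns quickly; only queues/dequeues the V4L2 buffer
        cap.retrieve(a);  // waits for the frame and converts it; does not start the next capture
        // The old behavior (retrieving also triggered the next capture) is approximated
        // by simply grabbing again before the next retrieve:
        cap.grab();
        cap.retrieve(b);
    }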
86
Fourth Patch: Sept 7, 2005 Csaba Kertesz sign@freemail.hu
For Release: OpenCV-Linux Beta5 OpenCV-0.9.7

I modified the following:
  - Additional Video4Linux2 support :)
  - Use mmap functions (v4l2)
  - New internal methods:
    try_palette_v4l2 -> rewrite of try_palette for v4l2
    mainloop_v4l2, read_image_v4l2 -> these methods are adapted from the official v4l2 capture.c example
    try_init_v4l -> device v4l initialisation
    try_init_v4l2 -> device v4l2 initialisation
    autosetup_capture_mode_v4l -> autodetect capture modes for v4l
    autosetup_capture_mode_v4l2 -> autodetect capture modes for v4l2
  - Modifications are consistent with the old Video4Linux code
  - Video4Linux handling is used automatically if a Video4Linux2 device is not recognized
  - Tested successfully with Logitech Quickcam Express (V4L), Creative Vista (V4L) and Genius VideoCam Notebook (V4L2)
  - Corrected source lines that produced compiler warnings
  - Added an information message for v4l/v4l2 detection
105
Fifth Patch: Sept 7, 2005 Csaba Kertesz sign@freemail.hu
For Release: OpenCV-Linux Beta5 OpenCV-0.9.7

I modified the following:
  - Support for SN9C10x chip-based webcams
  - New internal methods:
    bayer2rgb24, sonix_decompress -> decoder routines for SN9C10x decoding, from Takafumi Mizuno <taka-qce@ls-a.jp>, with his pleasure :)
  - Tested successfully with Genius VideoCam Notebook (V4L2)

Sixth Patch: Sept 10, 2005 Csaba Kertesz sign@freemail.hu
For Release: OpenCV-Linux Beta5 OpenCV-0.9.7

I added the following:
  - Capture control support (hue, saturation, brightness, contrast, gain)
  - Get and change V4L capture controls (hue, saturation, brightness, contrast)
  - New internal method:
    icvSetControl -> set capture controls
  - Tested successfully with Creative Vista (V4L)

Seventh Patch: Sept 10, 2005 Csaba Kertesz sign@freemail.hu
For Release: OpenCV-Linux Beta5 OpenCV-0.9.7

I added the following:
  - Detect, get and change V4L2 capture controls (hue, saturation, brightness, contrast, gain)
  - New internal methods:
    v4l2_scan_controls_enumerate_menu, v4l2_scan_controls -> detect capture control intervals
  - Tested successfully with Genius VideoCam Notebook (V4L2)
133
8th patch: Jan 5, 2006, Olivier.Bornet@idiap.ch
Added support for V4L2_PIX_FMT_YUYV and V4L2_PIX_FMT_MJPEG.
With this patch, newer Logitech webcams, such as the QuickCam Fusion, work.
Note: to use these webcams, see the UVC driver at
http://linux-uvc.berlios.de/

9th patch: Mar 4, 2006, Olivier.Bornet@idiap.ch
- Try V4L2 before V4L, because some devices are V4L2 by default
  but only attempt to implement the V4L compatibility layer,
  so I think it is better to support V4L2 first.
- Better separation between V4L2 and V4L initialization. (This was needed to support
  some drivers that work, but not fully, with V4L2, since we do not know when we
  need to switch from V4L2 to V4L.)
147
10th patch: July 02, 2008, Mikhail Afanasyev fopencv@theamk.com
Fix reliability problems with high-resolution UVC cameras on Linux.
The symptoms were damaged images and 'Corrupt JPEG data: premature end of data segment' on stderr.
- V4L_ABORT_BADJPEG detects JPEG warnings and turns them into errors, so bad images
  can be filtered out
- USE_TEMP_BUFFER fixes the main problem (improper buffer management) and
  prevents bad images in the first place

11th patch: April 2, 2013, Forrest Reiling forrest.reiling@gmail.com
Added v4l2 support for getting the capture property CAP_PROP_POS_MSEC.
Returns the millisecond timestamp of the last frame grabbed, or 0 if no frames have been grabbed.
Used to successfully synchronize 2 Logitech C310 USB webcams to within 16 ms of one another.

12th patch: March 9, 2018, Taylor Lanclos <tlanclos@live.com>
Added support for CAP_PROP_BUFFERSIZE. (See the property sketch below.)
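
A small usage sketch for the two properties above (a minimal example; actual values and
support depend on the driver):

    #include <opencv2/videoio.hpp>
    #include <iostream>

    int main()
    {
        cv::VideoCapture cap(0, cv::CAP_V4L2);
        if (!cap.isOpened())
            return 1;
        cap.set(cv::CAP_PROP_BUFFERSIZE, 2);  // request a smaller V4L2 buffer queue
        cap.grab();
        std::cout << "last frame timestamp (ms): " << cap.get(cv::CAP_PROP_POS_MSEC) << std::endl;
        return 0;
    }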
163
164make & enjoy!
165
166*/
167
168/*M///////////////////////////////////////////////////////////////////////////////////////
169//
170// IMPORTANT: READ BEFORE DOWNLOADING, COPYING, INSTALLING OR USING.
171//
172// By downloading, copying, installing or using the software you agree to this license.
173// If you do not agree to this license, do not download, install,
174// copy or use the software.
175//
176//
177// Intel License Agreement
178// For Open Source Computer Vision Library
179//
180// Copyright (C) 2000, Intel Corporation, all rights reserved.
181// Third party copyrights are property of their respective owners.
182//
183// Redistribution and use in source and binary forms, with or without modification,
184// are permitted provided that the following conditions are met:
185//
186// * Redistribution's of source code must retain the above copyright notice,
187// this list of conditions and the following disclaimer.
188//
189// * Redistribution's in binary form must reproduce the above copyright notice,
190// this list of conditions and the following disclaimer in the documentation
191// and/or other materials provided with the distribution.
192//
193// * The name of Intel Corporation may not be used to endorse or promote products
194// derived from this software without specific prior written permission.
195//
196// This software is provided by the copyright holders and contributors "as is" and
197// any express or implied warranties, including, but not limited to, the implied
198// warranties of merchantability and fitness for a particular purpose are disclaimed.
199// In no event shall the Intel Corporation or contributors be liable for any direct,
200// indirect, incidental, special, exemplary, or consequential damages
201// (including, but not limited to, procurement of substitute goods or services;
202// loss of use, data, or profits; or business interruption) however caused
203// and on any theory of liability, whether in contract, strict liability,
204// or tort (including negligence or otherwise) arising in any way out of
205// the use of this software, even if advised of the possibility of such damage.
206//
207//M*/
208
209#include "precomp.hpp"
210
211#if !defined _WIN32 && (defined HAVE_CAMV4L2 || defined HAVE_VIDEOIO)
212
213#include <stdio.h>
214#include <unistd.h>
215#include <fcntl.h>
216#include <errno.h>
217#include <sys/ioctl.h>
218#include <sys/types.h>
219#include <sys/mman.h>
220
221#include <string.h>
222#include <stdlib.h>
223#include <sys/stat.h>
224#include <sys/ioctl.h>
225#include <limits>
226
227#include <poll.h>
228
229#ifdef HAVE_CAMV4L2
230#include <asm/types.h> /* for videodev2.h */
231#include <linux/videodev2.h>
232#endif
233
234#ifdef HAVE_VIDEOIO
235// NetBSD compatibility layer with V4L2
236#include <sys/videoio.h>
237#endif
238
239#ifdef __OpenBSD__
240typedef uint32_t __u32;
241#endif
242
243// https://github.com/opencv/opencv/issues/13335
244#ifndef V4L2_CID_ISO_SENSITIVITY
245#define V4L2_CID_ISO_SENSITIVITY (V4L2_CID_CAMERA_CLASS_BASE+23)
246#endif
247
248// https://github.com/opencv/opencv/issues/13929
249#ifndef V4L2_CID_MPEG_VIDEO_H264_VUI_EXT_SAR_HEIGHT
250#define V4L2_CID_MPEG_VIDEO_H264_VUI_EXT_SAR_HEIGHT (V4L2_CID_MPEG_BASE+364)
251#endif
252#ifndef V4L2_CID_MPEG_VIDEO_H264_VUI_EXT_SAR_WIDTH
253#define V4L2_CID_MPEG_VIDEO_H264_VUI_EXT_SAR_WIDTH (V4L2_CID_MPEG_BASE+365)
254#endif
255
256#ifndef V4L2_CID_ROTATE
257#define V4L2_CID_ROTATE (V4L2_CID_BASE+34)
258#endif
259#ifndef V4L2_CID_IRIS_ABSOLUTE
260#define V4L2_CID_IRIS_ABSOLUTE (V4L2_CID_CAMERA_CLASS_BASE+17)
261#endif
262
263#ifndef v4l2_fourcc_be
264#define v4l2_fourcc_be(a, b, c, d) (v4l2_fourcc(a, b, c, d) | (1U << 31))
265#endif
266
267#ifndef V4L2_PIX_FMT_Y10
268#define V4L2_PIX_FMT_Y10 v4l2_fourcc('Y', '1', '0', ' ')
269#endif
270
271#ifndef V4L2_PIX_FMT_Y12
272#define V4L2_PIX_FMT_Y12 v4l2_fourcc('Y', '1', '2', ' ')
273#endif
274
275#ifndef V4L2_PIX_FMT_Y16
276#define V4L2_PIX_FMT_Y16 v4l2_fourcc('Y', '1', '6', ' ')
277#endif
278
279#ifndef V4L2_PIX_FMT_Y16_BE
280#define V4L2_PIX_FMT_Y16_BE v4l2_fourcc_be('Y', '1', '6', ' ')
281#endif
282
283#ifndef V4L2_PIX_FMT_ABGR32
284#define V4L2_PIX_FMT_ABGR32 v4l2_fourcc('A', 'R', '2', '4')
285#endif
286#ifndef V4L2_PIX_FMT_XBGR32
287#define V4L2_PIX_FMT_XBGR32 v4l2_fourcc('X', 'R', '2', '4')
288#endif
289
290/* Defaults - If your board can do better, set it here. Set for the most common type inputs. */
291#define DEFAULT_V4L_WIDTH 640
292#define DEFAULT_V4L_HEIGHT 480
293#define DEFAULT_V4L_FPS 30
294
295#define MAX_CAMERAS 8
296
297// default and maximum number of V4L buffers, not including last, 'special' buffer
298#define MAX_V4L_BUFFERS 10
299#define DEFAULT_V4L_BUFFERS 4
300
301// types of memory in 'special' buffer
302enum {
303 MEMORY_ORIG = 0, // Image data in original format.
304 MEMORY_RGB = 1, // Image data converted to RGB format.
305};
306
// if enabled, bad JPEG warnings become errors and cause NULL to be returned instead of an image
308#define V4L_ABORT_BADJPEG
309
310namespace cv {
311
312static const char* decode_ioctl_code(unsigned long ioctlCode)
313{
314 switch (ioctlCode)
315 {
316#define CV_ADD_IOCTL_CODE(id) case id: return #id
317 CV_ADD_IOCTL_CODE(VIDIOC_G_FMT);
318 CV_ADD_IOCTL_CODE(VIDIOC_S_FMT);
319 CV_ADD_IOCTL_CODE(VIDIOC_REQBUFS);
320 CV_ADD_IOCTL_CODE(VIDIOC_DQBUF);
321 CV_ADD_IOCTL_CODE(VIDIOC_QUERYCAP);
322 CV_ADD_IOCTL_CODE(VIDIOC_S_PARM);
323 CV_ADD_IOCTL_CODE(VIDIOC_G_PARM);
324 CV_ADD_IOCTL_CODE(VIDIOC_QUERYBUF);
325 CV_ADD_IOCTL_CODE(VIDIOC_QBUF);
326 CV_ADD_IOCTL_CODE(VIDIOC_STREAMON);
327 CV_ADD_IOCTL_CODE(VIDIOC_STREAMOFF);
328 CV_ADD_IOCTL_CODE(VIDIOC_ENUMINPUT);
329 CV_ADD_IOCTL_CODE(VIDIOC_G_INPUT);
330 CV_ADD_IOCTL_CODE(VIDIOC_S_INPUT);
331 CV_ADD_IOCTL_CODE(VIDIOC_G_CTRL);
332 CV_ADD_IOCTL_CODE(VIDIOC_S_CTRL);
333#undef CV_ADD_IOCTL_CODE
334 }
335 return "unknown";
336}
337
338struct Memory
339{
340 void * start;
341 size_t length;
342
343 Memory() : start(NULL), length(0) {}
344};
345
346/* Device Capture Objects */
347/* V4L2 structure */
348struct Buffer
349{
350 Memory memories[VIDEO_MAX_PLANES];
351 v4l2_plane planes[VIDEO_MAX_PLANES] = {};
    // Total number of bytes occupied by data in all the planes (payload)
    __u32 bytesused;
    // This is the dequeued buffer. It is kept here so it can be put back in the queue.
    // The buffer is valid only if capture->bufferIndex >= 0
356 v4l2_buffer buffer;
357
358 Buffer()
359 {
360 buffer = v4l2_buffer();
361 }
362};
363
364struct CvCaptureCAM_V4L CV_FINAL : public IVideoCapture
365{
366 int getCaptureDomain() /*const*/ CV_OVERRIDE { return cv::CAP_V4L; }
367
368 int deviceHandle;
369 bool v4l_buffersRequested;
370 bool v4l_streamStarted;
371
372 int bufferIndex;
373 bool FirstCapture;
374 String deviceName;
375
376 Mat frame;
377
378 __u32 palette;
379 int width, height;
380 int width_set, height_set;
381 int bufferSize;
382 __u32 fps;
383 bool convert_rgb;
384 bool returnFrame;
    // To select a video input, set cv::CAP_PROP_CHANNEL to the channel number.
    // If the new channel number is less than 0, the video input will not be changed.
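    // Usage sketch (hypothetical multi-input card): after opening the device with cv::CAP_V4L2,
    //   cap.set(cv::CAP_PROP_CHANNEL, 1);  // switch to video input 1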
387 int channelNumber;
    // Normalize properties. If set, parameters will be converted to/from the [0,1) range.
    // The value is initialized from the environment variable `OPENCV_VIDEOIO_V4L_RANGE_NORMALIZED`
    // (disabled unless that variable is set); normalized mode matches the OpenCV 3.x behavior.
    // To select raw (driver-range) parameter mode after the device is open, set cv::CAP_PROP_MODE to 0;
    // any other value reverts to the backward-compatibility mode (with normalized properties).
    // Range normalization affects the following parameters:
    // cv::CAP_PROP_*: BRIGHTNESS,CONTRAST,SATURATION,HUE,GAIN,EXPOSURE,FOCUS,AUTOFOCUS,AUTO_EXPOSURE.
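    // Usage sketch of the two modes (values are illustrative): with normalization enabled,
    //   cap.set(cv::CAP_PROP_BRIGHTNESS, 0.5);   // mid-range, mapped onto the driver's min..max
    // after cap.set(cv::CAP_PROP_MODE, 0) the same property expects raw driver units, e.g.
    //   cap.set(cv::CAP_PROP_BRIGHTNESS, 128);   // for a driver exposing a 0..255 range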
395 bool normalizePropRange;
396
397 /* V4L2 variables */
398 Buffer buffers[MAX_V4L_BUFFERS + 1];
399 v4l2_capability capability;
400 v4l2_input videoInput;
401 v4l2_format form;
402 v4l2_requestbuffers req;
403 v4l2_buf_type type;
404 unsigned char num_planes;
405
406 timeval timestamp;
407
408 bool open(int _index);
409 bool open(const std::string & filename);
410 bool isOpened() const CV_OVERRIDE;
411
412 void closeDevice();
413
414 virtual double getProperty(int) const CV_OVERRIDE;
415 virtual bool setProperty(int, double) CV_OVERRIDE;
416 virtual bool grabFrame() CV_OVERRIDE;
417 virtual bool retrieveFrame(int, OutputArray) CV_OVERRIDE;
418
419 CvCaptureCAM_V4L();
420 virtual ~CvCaptureCAM_V4L();
421 bool requestBuffers();
422 bool requestBuffers(unsigned int buffer_number);
423 bool createBuffers();
424 void releaseBuffers();
425 bool initCapture();
426 bool streaming(bool startStream);
427 bool setFps(int value);
428 bool tryIoctl(unsigned long ioctlCode, void *parameter, bool failIfBusy = true, int attempts = 10) const;
429 bool controlInfo(int property_id, __u32 &v4l2id, cv::Range &range) const;
430 bool icvControl(__u32 v4l2id, int &value, bool isSet) const;
431
432 bool icvSetFrameSize(int _width, int _height);
433 bool v4l2_reset();
434 bool setVideoInputChannel();
435 bool try_palette_v4l2();
436 bool try_init_v4l2();
437 bool autosetup_capture_mode_v4l2();
438 bool read_frame_v4l2();
439 bool convertableToRgb() const;
440 void convertToRgb(const Buffer &currentBuffer);
441
    bool havePendingFrame; // true if the next .grab() should be a no-op; .retrieve() resets this flag
443};
444
445/*********************** Implementations ***************************************/
446
447CvCaptureCAM_V4L::CvCaptureCAM_V4L() :
448 deviceHandle(-1),
449 v4l_buffersRequested(false),
450 v4l_streamStarted(false),
451 bufferIndex(-1),
452 FirstCapture(true),
453 palette(0),
454 width(0), height(0), width_set(0), height_set(0),
455 bufferSize(DEFAULT_V4L_BUFFERS),
456 fps(0), convert_rgb(0), returnFrame(false),
457 channelNumber(-1), normalizePropRange(false),
458 type(V4L2_BUF_TYPE_VIDEO_CAPTURE),
459 num_planes(0),
460 havePendingFrame(false)
461{
    memset(&timestamp, 0, sizeof(timestamp));
463}
464
465CvCaptureCAM_V4L::~CvCaptureCAM_V4L()
466{
467 try
468 {
469 closeDevice();
470 }
471 catch (...)
472 {
        CV_LOG_WARNING(NULL, "VIDEOIO(V4L2): unable to properly close device: " << deviceName);
        if (deviceHandle != -1)
            close(deviceHandle);
476 }
477}
478
479void CvCaptureCAM_V4L::closeDevice()
480{
481 if (v4l_streamStarted)
        streaming(false);
483 if (v4l_buffersRequested)
484 releaseBuffers();
485 if(deviceHandle != -1)
486 {
487 CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): close(" << deviceHandle << ")");
        close(deviceHandle);
489 }
490 deviceHandle = -1;
491}
492
493bool CvCaptureCAM_V4L::isOpened() const
494{
495 return deviceHandle != -1;
496}
497
498bool CvCaptureCAM_V4L::try_palette_v4l2()
499{
500 form = v4l2_format();
501 form.type = type;
502 if (V4L2_TYPE_IS_MULTIPLANAR(type)) {
503 form.fmt.pix_mp.pixelformat = palette;
504 form.fmt.pix_mp.field = V4L2_FIELD_ANY;
505 form.fmt.pix_mp.width = width;
506 form.fmt.pix_mp.height = height;
507 } else {
508 form.fmt.pix.pixelformat = palette;
509 form.fmt.pix.field = V4L2_FIELD_ANY;
510 form.fmt.pix.width = width;
511 form.fmt.pix.height = height;
512 }
    if (!tryIoctl(VIDIOC_S_FMT, &form, true))
514 {
515 return false;
516 }
517 if (V4L2_TYPE_IS_MULTIPLANAR(type))
518 return palette == form.fmt.pix_mp.pixelformat;
519 return palette == form.fmt.pix.pixelformat;
520}
521
522bool CvCaptureCAM_V4L::setVideoInputChannel()
523{
524 if(channelNumber < 0)
525 return true;
526 /* Query channels number */
527 int channel = 0;
    if (!tryIoctl(VIDIOC_G_INPUT, &channel))
529 return false;
530
531 if(channel == channelNumber)
532 return true;
533
534 /* Query information about new input channel */
535 videoInput = v4l2_input();
536 videoInput.index = channelNumber;
    if (!tryIoctl(VIDIOC_ENUMINPUT, &videoInput))
538 return false;
539
540 //To select a video input applications store the number of the desired input in an integer
541 // and call the VIDIOC_S_INPUT ioctl with a pointer to this integer. Side effects are possible.
542 // For example inputs may support different video standards, so the driver may implicitly
543 // switch the current standard.
544 // It is good practice to select an input before querying or negotiating any other parameters.
    return tryIoctl(VIDIOC_S_INPUT, &channelNumber);
546}
547
548bool CvCaptureCAM_V4L::try_init_v4l2()
549{
550 /* The following code sets the CHANNEL_NUMBER of the video input. Some video sources
551 have sub "Channel Numbers". For a typical V4L TV capture card, this is usually 1.
552 I myself am using a simple NTSC video input capture card that uses the value of 1.
553 If you are not in North America or have a different video standard, you WILL have to change
554 the following settings and recompile/reinstall. This set of settings is based on
555 the most commonly encountered input video source types (like my bttv card) */
556
    // The video input channel number is selected via cv::CAP_PROP_CHANNEL (see channelNumber above)
558 if (!setVideoInputChannel())
559 {
560 CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): Unable to set Video Input Channel");
561 return false;
562 }
563
564 // Test device for V4L2 compatibility
565 capability = v4l2_capability();
    if (!tryIoctl(VIDIOC_QUERYCAP, &capability))
567 {
568 CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): Unable to query capability");
569 return false;
570 }
571
572 if ((capability.capabilities & (V4L2_CAP_VIDEO_CAPTURE | V4L2_CAP_VIDEO_CAPTURE_MPLANE)) == 0)
573 {
574 /* Nope. */
575 CV_LOG_INFO(NULL, "VIDEOIO(V4L2:" << deviceName << "): not supported - device is unable to capture video (missing V4L2_CAP_VIDEO_CAPTURE or V4L2_CAP_VIDEO_CAPTURE_MPLANE)");
576 return false;
577 }
578
579 if (capability.capabilities & V4L2_CAP_VIDEO_CAPTURE_MPLANE)
580 type = V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE;
581 return true;
582}
583
584bool CvCaptureCAM_V4L::autosetup_capture_mode_v4l2()
585{
    // if the palette is already set and works, there is no need to set it up again
587 if (palette != 0)
588 {
589 if (try_palette_v4l2())
590 {
591 return true;
592 }
593 else if (errno == EBUSY)
594 {
595 CV_LOG_INFO(NULL, "VIDEOIO(V4L2:" << deviceName << "): device is busy");
596 closeDevice();
597 return false;
598 }
599 }
600 __u32 try_order[] = {
601 V4L2_PIX_FMT_BGR24,
602 V4L2_PIX_FMT_RGB24,
603 V4L2_PIX_FMT_YVU420,
604 V4L2_PIX_FMT_YUV420,
605 V4L2_PIX_FMT_YUV411P,
606 V4L2_PIX_FMT_YUYV,
607 V4L2_PIX_FMT_UYVY,
608 V4L2_PIX_FMT_NV12,
609 V4L2_PIX_FMT_NV21,
610 V4L2_PIX_FMT_SBGGR8,
611 V4L2_PIX_FMT_SGBRG8,
612 V4L2_PIX_FMT_SGRBG8,
613 V4L2_PIX_FMT_XBGR32,
614 V4L2_PIX_FMT_ABGR32,
615 V4L2_PIX_FMT_SN9C10X,
616#ifdef HAVE_JPEG
617 V4L2_PIX_FMT_MJPEG,
618 V4L2_PIX_FMT_JPEG,
619#endif
620 V4L2_PIX_FMT_Y16,
621 V4L2_PIX_FMT_Y16_BE,
622 V4L2_PIX_FMT_Y12,
623 V4L2_PIX_FMT_Y10,
624 V4L2_PIX_FMT_GREY,
625 };
626
627 for (size_t i = 0; i < sizeof(try_order) / sizeof(__u32); i++) {
628 palette = try_order[i];
629 if (try_palette_v4l2()) {
630 return true;
631 } else if (errno == EBUSY) {
632 CV_LOG_INFO(NULL, "VIDEOIO(V4L2:" << deviceName << "): device is busy");
633 closeDevice();
634 return false;
635 }
636 }
637 return false;
638}
639
640bool CvCaptureCAM_V4L::setFps(int value)
641{
642 if (!isOpened())
643 return false;
644
645 v4l2_streamparm streamparm = v4l2_streamparm();
646 streamparm.type = type;
647 streamparm.parm.capture.timeperframe.numerator = 1;
648 streamparm.parm.capture.timeperframe.denominator = __u32(value);
    if (!tryIoctl(VIDIOC_S_PARM, &streamparm) || !tryIoctl(VIDIOC_G_PARM, &streamparm))
650 {
651 CV_LOG_INFO(NULL, "VIDEOIO(V4L2:" << deviceName << "): can't set FPS: " << value);
652 return false;
653 }
654
655 CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): FPS="
656 << streamparm.parm.capture.timeperframe.denominator << "/"
657 << streamparm.parm.capture.timeperframe.numerator);
658 fps = streamparm.parm.capture.timeperframe.denominator; // TODO use numerator
659 return true;
660}
661
662bool CvCaptureCAM_V4L::convertableToRgb() const
663{
664 switch (palette) {
665 case V4L2_PIX_FMT_YVU420:
666 case V4L2_PIX_FMT_YUV420:
667 case V4L2_PIX_FMT_NV12:
668 case V4L2_PIX_FMT_NV21:
669 case V4L2_PIX_FMT_YUV411P:
670#ifdef HAVE_JPEG
671 case V4L2_PIX_FMT_MJPEG:
672 case V4L2_PIX_FMT_JPEG:
673#endif
674 case V4L2_PIX_FMT_YUYV:
675 case V4L2_PIX_FMT_UYVY:
676 case V4L2_PIX_FMT_SBGGR8:
677 case V4L2_PIX_FMT_SN9C10X:
678 case V4L2_PIX_FMT_SGBRG8:
679 case V4L2_PIX_FMT_SGRBG8:
680 case V4L2_PIX_FMT_RGB24:
681 case V4L2_PIX_FMT_Y16:
682 case V4L2_PIX_FMT_Y16_BE:
683 case V4L2_PIX_FMT_Y10:
684 case V4L2_PIX_FMT_GREY:
685 case V4L2_PIX_FMT_BGR24:
686 case V4L2_PIX_FMT_XBGR32:
687 case V4L2_PIX_FMT_ABGR32:
688 return true;
689 default:
690 break;
691 }
692 return false;
693}
694
695bool CvCaptureCAM_V4L::initCapture()
696{
697 if (!isOpened())
698 return false;
699
700 if (!try_init_v4l2())
701 {
702 CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): init failed: errno=" << errno << " (" << strerror(errno) << ")");
703 return false;
704 }
705
706 /* Find Window info */
707 form = v4l2_format();
708 form.type = type;
709
    if (!tryIoctl(VIDIOC_G_FMT, &form))
711 {
712 CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): Could not obtain specifics of capture window (VIDIOC_G_FMT): errno=" << errno << " (" << strerror(errno) << ")");
713 return false;
714 }
715
716 if (!autosetup_capture_mode_v4l2())
717 {
718 if (errno != EBUSY)
719 {
720 CV_LOG_INFO(NULL, "VIDEOIO(V4L2:" << deviceName << "): Pixel format of incoming image is unsupported by OpenCV");
721 }
722 return false;
723 }
724
725 /* try to set framerate */
726 setFps(fps);
727
728 /* Buggy driver paranoia. */
729 if (V4L2_TYPE_IS_MULTIPLANAR(type)) {
730 // TODO: add size adjustment if needed
731 } else {
732 unsigned int min;
733
734 min = form.fmt.pix.width * 2;
735
736 if (form.fmt.pix.bytesperline < min)
737 form.fmt.pix.bytesperline = min;
738
739 min = form.fmt.pix.bytesperline * form.fmt.pix.height;
740
741 if (form.fmt.pix.sizeimage < min)
742 form.fmt.pix.sizeimage = min;
743 }
744
745 if (V4L2_TYPE_IS_MULTIPLANAR(type))
746 num_planes = form.fmt.pix_mp.num_planes;
747 else
748 num_planes = 1;
749
750 if (!requestBuffers())
751 return false;
752
753 if (!createBuffers()) {
754 /* free capture, and returns an error code */
755 releaseBuffers();
756 return false;
757 }
758
759 // reinitialize buffers
760 FirstCapture = true;
761
762 return true;
763};
764
765bool CvCaptureCAM_V4L::requestBuffers()
766{
767 unsigned int buffer_number = bufferSize;
768 while (buffer_number > 0) {
769 if (requestBuffers(buffer_number) && req.count >= buffer_number)
770 {
771 break;
772 }
773
774 buffer_number--;
775 CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): Insufficient buffer memory -- decreasing buffers: " << buffer_number);
776 }
777 if (buffer_number < 1) {
778 CV_LOG_WARNING(NULL, "VIDEOIO(V4L2:" << deviceName << "): Insufficient buffer memory");
779 return false;
780 }
781 bufferSize = req.count;
782 return true;
783}
784
785bool CvCaptureCAM_V4L::requestBuffers(unsigned int buffer_number)
786{
787 if (!isOpened())
788 return false;
789
790 req = v4l2_requestbuffers();
791 req.count = buffer_number;
792 req.type = type;
793 req.memory = V4L2_MEMORY_MMAP;
794
    if (!tryIoctl(VIDIOC_REQBUFS, &req)) {
796 int err = errno;
797 if (EINVAL == err)
798 {
799 CV_LOG_WARNING(NULL, "VIDEOIO(V4L2:" << deviceName << "): no support for memory mapping");
800 }
801 else
802 {
803 CV_LOG_WARNING(NULL, "VIDEOIO(V4L2:" << deviceName << "): failed VIDIOC_REQBUFS: errno=" << err << " (" << strerror(err) << ")");
804 }
805 return false;
806 }
807 v4l_buffersRequested = true;
808 return true;
809}
810
811bool CvCaptureCAM_V4L::createBuffers()
812{
813 size_t maxLength = 0;
814 for (unsigned int n_buffers = 0; n_buffers < req.count; ++n_buffers) {
815 v4l2_buffer buf = v4l2_buffer();
816 v4l2_plane mplanes[VIDEO_MAX_PLANES];
817 size_t length = 0;
818 off_t offset = 0;
819 buf.type = type;
820 buf.memory = V4L2_MEMORY_MMAP;
821 buf.index = n_buffers;
822 if (V4L2_TYPE_IS_MULTIPLANAR(type)) {
823 buf.m.planes = mplanes;
824 buf.length = VIDEO_MAX_PLANES;
825 }
826
        if (!tryIoctl(VIDIOC_QUERYBUF, &buf)) {
828 CV_LOG_WARNING(NULL, "VIDEOIO(V4L2:" << deviceName << "): failed VIDIOC_QUERYBUF: errno=" << errno << " (" << strerror(errno) << ")");
829 return false;
830 }
831
832 CV_Assert(1 <= num_planes && num_planes <= VIDEO_MAX_PLANES);
833 for (unsigned char n_planes = 0; n_planes < num_planes; n_planes++) {
834 if (V4L2_TYPE_IS_MULTIPLANAR(type)) {
835 length = buf.m.planes[n_planes].length;
836 offset = buf.m.planes[n_planes].m.mem_offset;
837 } else {
838 length = buf.length;
839 offset = buf.m.offset;
840 }
841
842 buffers[n_buffers].memories[n_planes].length = length;
            buffers[n_buffers].memories[n_planes].start =
                mmap(NULL /* start anywhere */,
                     length,
                     PROT_READ /* required */,
                     MAP_SHARED /* recommended */,
                     deviceHandle, offset);
849 if (MAP_FAILED == buffers[n_buffers].memories[n_planes].start) {
850 CV_LOG_WARNING(NULL, "VIDEOIO(V4L2:" << deviceName << "): failed mmap(" << length << "): errno=" << errno << " (" << strerror(errno) << ")");
851 return false;
852 }
853 }
854
855 maxLength = maxLength > length ? maxLength : length;
856 }
857 if (maxLength > 0) {
858 maxLength *= num_planes;
        buffers[MAX_V4L_BUFFERS].memories[MEMORY_ORIG].start = malloc(maxLength);
        buffers[MAX_V4L_BUFFERS].memories[MEMORY_ORIG].length = maxLength;
        buffers[MAX_V4L_BUFFERS].memories[MEMORY_RGB].start = malloc(maxLength);
862 buffers[MAX_V4L_BUFFERS].memories[MEMORY_RGB].length = maxLength;
863 }
864 return (buffers[MAX_V4L_BUFFERS].memories[MEMORY_ORIG].start != 0) &&
865 (buffers[MAX_V4L_BUFFERS].memories[MEMORY_RGB].start != 0);
866}
867
868/**
 * some properties can not be changed while the device is in streaming mode.
 * this method stops the stream and releases the buffers, then re-initializes capture.
 * this also causes buffers to be reallocated if the frame size was changed.
 */
bool CvCaptureCAM_V4L::v4l2_reset()
{
    streaming(false);
876 releaseBuffers();
877 return initCapture();
878}
879
880bool CvCaptureCAM_V4L::open(int _index)
881{
882 cv::String name;
883 /* Select camera, or rather, V4L video source */
884 if (_index < 0) // Asking for the first device available
885 {
886 for (int autoindex = 0; autoindex < MAX_CAMERAS; ++autoindex)
887 {
            name = cv::format("/dev/video%d", autoindex);
            /* Test using an open to see if this new device name really does exist. */
            int h = ::open(name.c_str(), O_RDONLY);
            if (h != -1)
            {
                ::close(h);
894 _index = autoindex;
895 break;
896 }
897 }
898 if (_index < 0)
899 {
900 CV_LOG_WARNING(NULL, "VIDEOIO(V4L2): can't find camera device");
901 name.clear();
902 return false;
903 }
904 }
905 else
906 {
        name = cv::format("/dev/video%d", _index);
908 }
909
    bool res = open(name);
911 if (!res)
912 {
913 CV_LOG_WARNING(NULL, "VIDEOIO(V4L2:" << deviceName << "): can't open camera by index");
914 }
915 return res;
916}
917
918bool CvCaptureCAM_V4L::open(const std::string & _deviceName)
919{
920 CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << _deviceName << "): opening...");
921 FirstCapture = true;
    width = utils::getConfigurationParameterSizeT("OPENCV_VIDEOIO_V4L_DEFAULT_WIDTH", DEFAULT_V4L_WIDTH);
    height = utils::getConfigurationParameterSizeT("OPENCV_VIDEOIO_V4L_DEFAULT_HEIGHT", DEFAULT_V4L_HEIGHT);
924 width_set = height_set = 0;
925 bufferSize = DEFAULT_V4L_BUFFERS;
926 fps = DEFAULT_V4L_FPS;
927 convert_rgb = true;
928 deviceName = _deviceName;
929 returnFrame = true;
    normalizePropRange = utils::getConfigurationParameterBool("OPENCV_VIDEOIO_V4L_RANGE_NORMALIZED", false);
931 channelNumber = -1;
932 bufferIndex = -1;
933
    deviceHandle = ::open(deviceName.c_str(), O_RDWR /* required */ | O_NONBLOCK, 0);
935 CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << _deviceName << "): deviceHandle=" << deviceHandle);
936 if (deviceHandle == -1)
937 return false;
938
939 return initCapture();
940}
941
942bool CvCaptureCAM_V4L::read_frame_v4l2()
943{
944 v4l2_buffer buf = v4l2_buffer();
945 v4l2_plane mplanes[VIDEO_MAX_PLANES];
946 buf.type = type;
947 buf.memory = V4L2_MEMORY_MMAP;
948 if (V4L2_TYPE_IS_MULTIPLANAR(type)) {
949 buf.m.planes = mplanes;
950 buf.length = VIDEO_MAX_PLANES;
951 }
952
    while (!tryIoctl(VIDIOC_DQBUF, &buf)) {
        int err = errno;
        if (err == EIO && !(buf.flags & (V4L2_BUF_FLAG_QUEUED | V4L2_BUF_FLAG_DONE))) {
            // Maybe the buffer is not in the queue? Try to put it there.
            if (!tryIoctl(VIDIOC_QBUF, &buf))
958 return false;
959 continue;
960 }
961 /* display the error and stop processing */
962 returnFrame = false;
963 CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): can't read frame (VIDIOC_DQBUF): errno=" << err << " (" << strerror(err) << ")");
964 return false;
965 }
966
967 CV_Assert(buf.index < req.count);
968
969 if (V4L2_TYPE_IS_MULTIPLANAR(type)) {
970 for (unsigned char n_planes = 0; n_planes < num_planes; n_planes++)
971 CV_Assert(buffers[buf.index].memories[n_planes].length == buf.m.planes[n_planes].length);
972 } else
973 CV_Assert(buffers[buf.index].memories[MEMORY_ORIG].length == buf.length);
974
    // We must not put this buffer back in the queue until the frame has been retrieved from it.
976 buffers[buf.index].buffer = buf;
977 bufferIndex = buf.index;
978
979 if (V4L2_TYPE_IS_MULTIPLANAR(type)) {
980 __u32 offset = 0;
981
982 buffers[buf.index].buffer.m.planes = buffers[buf.index].planes;
        memcpy(buffers[buf.index].planes, buf.m.planes, sizeof(mplanes));
984
985 for (unsigned char n_planes = 0; n_planes < num_planes; n_planes++) {
986 __u32 bytesused;
987 bytesused = buffers[buf.index].planes[n_planes].bytesused -
988 buffers[buf.index].planes[n_planes].data_offset;
989 offset += bytesused;
990 }
991 buffers[buf.index].bytesused = offset;
992 } else
993 buffers[buf.index].bytesused = buffers[buf.index].buffer.bytesused;
994
995 //set timestamp in capture struct to be timestamp of most recent frame
996 timestamp = buf.timestamp;
997 return true;
998}
999
1000bool CvCaptureCAM_V4L::tryIoctl(unsigned long ioctlCode, void *parameter, bool failIfBusy, int attempts) const
1001{
1002 CV_Assert(attempts > 0);
1003 CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): tryIoctl(" << deviceHandle << ", "
1004 << decode_ioctl_code(ioctlCode) << "(" << ioctlCode << "), failIfBusy=" << failIfBusy << ")"
1005 );
1006 while (true)
1007 {
1008 errno = 0;
        int result = ioctl(deviceHandle, ioctlCode, parameter);
1010 int err = errno;
1011 CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): call ioctl(" << deviceHandle << ", "
1012 << decode_ioctl_code(ioctlCode) << "(" << ioctlCode << "), ...) => "
1013 << result << " errno=" << err << " (" << strerror(err) << ")"
1014 );
1015
1016 if (result != -1)
1017 return true; // success
1018
1019 const bool isBusy = (err == EBUSY);
1020 if (isBusy && failIfBusy)
1021 {
1022 CV_LOG_INFO(NULL, "VIDEOIO(V4L2:" << deviceName << "): ioctl returns with errno=EBUSY");
1023 return false;
1024 }
1025 if (!(isBusy || errno == EAGAIN))
1026 return false;
1027
1028 if (--attempts == 0) {
1029 return false;
1030 }
1031
1032 fd_set fds;
1033 FD_ZERO(&fds);
1034 FD_SET(deviceHandle, &fds);
1035
1036 /* Timeout. */
        static int param_v4l_select_timeout = (int)utils::getConfigurationParameterSizeT("OPENCV_VIDEOIO_V4L_SELECT_TIMEOUT", 10);
1038 struct timeval tv;
1039 tv.tv_sec = param_v4l_select_timeout;
1040 tv.tv_usec = 0;
1041
1042 errno = 0;
        result = select(deviceHandle + 1, &fds, NULL, NULL, &tv);
1044 err = errno;
1045
1046 if (0 == result)
1047 {
1048 CV_LOG_WARNING(NULL, "VIDEOIO(V4L2:" << deviceName << "): select() timeout.");
1049 return false;
1050 }
1051
1052 CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): select(" << deviceHandle << ") => "
1053 << result << " errno = " << err << " (" << strerror(err) << ")"
1054 );
1055
1056 if (EINTR == err) // don't loop if signal occurred, like Ctrl+C
1057 {
1058 return false;
1059 }
1060 }
1061 return true;
1062}
1063
1064bool CvCaptureCAM_V4L::grabFrame()
1065{
1066 if (havePendingFrame) // frame has been already grabbed during preroll
1067 {
1068 return true;
1069 }
1070
1071 if (FirstCapture)
1072 {
1073 /* Some general initialization must take place the first time through */
1074
        /* This is just a technicality, but all buffers must be filled up before any
           staggered SYNC is applied. So, fill them up. (see V4L HowTo) */
1077 bufferIndex = -1;
1078 for (__u32 index = 0; index < req.count; ++index) {
1079 v4l2_buffer buf = v4l2_buffer();
1080 v4l2_plane mplanes[VIDEO_MAX_PLANES];
1081
1082 buf.type = type;
1083 buf.memory = V4L2_MEMORY_MMAP;
1084 buf.index = index;
1085 if (V4L2_TYPE_IS_MULTIPLANAR(type)) {
1086 buf.m.planes = mplanes;
1087 buf.length = VIDEO_MAX_PLANES;
1088 }
1089
            if (!tryIoctl(VIDIOC_QBUF, &buf)) {
1091 CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): failed VIDIOC_QBUF (buffer=" << index << "): errno=" << errno << " (" << strerror(errno) << ")");
1092 return false;
1093 }
1094 }
1095
        if (!streaming(true)) {
1097 return false;
1098 }
1099
1100 // No need to skip this if the first read returns false
1101 /* preparation is ok */
1102 FirstCapture = false;
1103
1104#if defined(V4L_ABORT_BADJPEG)
        // skip the first frame. it is often bad -- this goes unnoticed in traditional apps,
        // but could be fatal if bad-JPEG detection is enabled
1107 if (!read_frame_v4l2())
1108 return false;
1109#endif
1110 }
    // If the previous grab was not followed by retrieveFrame, put that buffer back in the queue
    if (bufferIndex >= 0)
    {
        if (!tryIoctl(VIDIOC_QBUF, &buffers[bufferIndex].buffer))
1115 {
1116 CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): failed VIDIOC_QBUF (buffer=" << bufferIndex << "): errno=" << errno << " (" << strerror(errno) << ")");
1117 }
1118 }
1119 return read_frame_v4l2();
1120}
1121
1122/*
1123 * Turn a YUV4:2:0 block into an RGB block
1124 *
1125 * Video4Linux seems to use the blue, green, red channel
1126 * order convention-- rgb[0] is blue, rgb[1] is green, rgb[2] is red.
1127 *
1128 * Color space conversion coefficients taken from the excellent
1129 * http://www.inforamp.net/~poynton/ColorFAQ.html
1130 * In his terminology, this is a CCIR 601.1 YCbCr -> RGB.
 * Y values are given for all 4 pixels, but the U (Pb)
 * and V (Pr) are assumed constant over the 4-pixel block.
1133 *
1134 * To avoid floating point arithmetic, the color conversion
1135 * coefficients are scaled into 16.16 fixed-point integers.
1136 * They were determined as follows:
1137 *
1138 * double brightness = 1.0; (0->black; 1->full scale)
1139 * double saturation = 1.0; (0->greyscale; 1->full color)
1140 * double fixScale = brightness * 256 * 256;
1141 * int rvScale = (int)(1.402 * saturation * fixScale);
1142 * int guScale = (int)(-0.344136 * saturation * fixScale);
1143 * int gvScale = (int)(-0.714136 * saturation * fixScale);
1144 * int buScale = (int)(1.772 * saturation * fixScale);
1145 * int yScale = (int)(fixScale);
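 *
 * With brightness = saturation = 1.0, these evaluate to the integer constants
 * used in move_411_block() below:
 *   rvScale =  91881   ( 1.402    * 65536)
 *   guScale = -22553   (-0.344136 * 65536)
 *   gvScale = -46801   (-0.714136 * 65536)
 *   buScale = 116129   ( 1.772    * 65536)
 *   yScale  =  65536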
1146 */
1147
1148/* LIMIT: convert a 16.16 fixed-point value to a byte, with clipping. */
1149#define LIMIT(x) ((x)>0xffffff?0xff: ((x)<=0xffff?0:((x)>>16)))
1150
1151static inline void
1152move_411_block(int yTL, int yTR, int yBL, int yBR, int u, int v,
1153 int /*rowPixels*/, unsigned char * rgb)
1154{
1155 const int rvScale = 91881;
1156 const int guScale = -22553;
1157 const int gvScale = -46801;
1158 const int buScale = 116129;
1159 const int yScale = 65536;
1160 int r, g, b;
1161
1162 g = guScale * u + gvScale * v;
1163 // if (force_rgb) {
1164 // r = buScale * u;
1165 // b = rvScale * v;
1166 // } else {
1167 r = rvScale * v;
1168 b = buScale * u;
1169 // }
1170
1171 yTL *= yScale; yTR *= yScale;
1172 yBL *= yScale; yBR *= yScale;
1173
    /* Write out the first two pixels */
1175 rgb[0] = LIMIT(b+yTL); rgb[1] = LIMIT(g+yTL);
1176 rgb[2] = LIMIT(r+yTL);
1177
1178 rgb[3] = LIMIT(b+yTR); rgb[4] = LIMIT(g+yTR);
1179 rgb[5] = LIMIT(r+yTR);
1180
    /* Write out the last two pixels */
1182 rgb += 6;
1183 rgb[0] = LIMIT(b+yBL); rgb[1] = LIMIT(g+yBL);
1184 rgb[2] = LIMIT(r+yBL);
1185
1186 rgb[3] = LIMIT(b+yBR); rgb[4] = LIMIT(g+yBR);
1187 rgb[5] = LIMIT(r+yBR);
1188}
1189
// Consider a YUV411P image of 8x2 pixels.
//
// A plane of Y values:   A B C D E F G H
//                        I J K L M N O P
//
// A plane of U values:   1   2
//                        3   4
//
// A plane of V values:   1   2
//                        3   4
//
// The U1/V1 samples correspond to the ABCD pixels,
// the U2/V2 samples correspond to the EFGH pixels, and so on.
1203/* Converts from planar YUV411P to RGB24. */
1204/* [FD] untested... */
1205static void
1206yuv411p_to_rgb24(int width, int height,
1207 unsigned char *pIn0, unsigned char *pOut0)
1208{
1209 const int numpix = width * height;
1210 const int bytes = 24 >> 3;
1211 int i, j, y00, y01, y10, y11, u, v;
1212 unsigned char *pY = pIn0;
1213 unsigned char *pU = pY + numpix;
1214 unsigned char *pV = pU + numpix / 4;
1215 unsigned char *pOut = pOut0;
1216
    for (j = 0; j < height; j++) {   // note: '<' (not '<='), otherwise we read/write one row past the buffers
1218 for (i = 0; i <= width - 4; i += 4) {
1219 y00 = *pY;
1220 y01 = *(pY + 1);
1221 y10 = *(pY + 2);
1222 y11 = *(pY + 3);
1223 u = (*pU++) - 128;
1224 v = (*pV++) - 128;
1225
            move_411_block(y00, y01, y10, y11, u, v,
                           width, pOut);
1228
1229 pY += 4;
1230 pOut += 4 * bytes;
1231
1232 }
1233 }
1234}
1235
1236#define CLAMP(x) ((x)<0?0:((x)>255)?255:(x))
1237
1238typedef struct {
1239 int is_abs;
1240 int len;
1241 int val;
1242} code_table_t;
1243
1244
1245/* local storage */
1246static code_table_t table[256];
1247static int init_done = 0;
1248
1249
1250/*
1251 sonix_decompress_init
1252 =====================
1253 pre-calculates a locally stored table for efficient huffman-decoding.
1254
1255 Each entry at index x in the table represents the codeword
1256 present at the MSB of byte x.
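
  Example: the byte 0xA7 = 1010 0111 starts with the codeword 101, so its table
  entry is val = -4, len = 3 (relative); the byte 0xE5 = 1110 0101 matches
  1110xxxx, so its entry is absolute, with val = (0x5 << 4) = 80 and len = 8.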
1257
1258 */
1259static void sonix_decompress_init(void)
1260{
1261 int i;
1262 int is_abs, val, len;
1263
1264 for (i = 0; i < 256; i++) {
1265 is_abs = 0;
1266 val = 0;
1267 len = 0;
1268 if ((i & 0x80) == 0) {
1269 /* code 0 */
1270 val = 0;
1271 len = 1;
1272 }
1273 else if ((i & 0xE0) == 0x80) {
1274 /* code 100 */
1275 val = +4;
1276 len = 3;
1277 }
1278 else if ((i & 0xE0) == 0xA0) {
1279 /* code 101 */
1280 val = -4;
1281 len = 3;
1282 }
1283 else if ((i & 0xF0) == 0xD0) {
1284 /* code 1101 */
1285 val = +11;
1286 len = 4;
1287 }
1288 else if ((i & 0xF0) == 0xF0) {
1289 /* code 1111 */
1290 val = -11;
1291 len = 4;
1292 }
1293 else if ((i & 0xF8) == 0xC8) {
1294 /* code 11001 */
1295 val = +20;
1296 len = 5;
1297 }
1298 else if ((i & 0xFC) == 0xC0) {
1299 /* code 110000 */
1300 val = -20;
1301 len = 6;
1302 }
1303 else if ((i & 0xFC) == 0xC4) {
1304 /* code 110001xx: unknown */
1305 val = 0;
1306 len = 8;
1307 }
1308 else if ((i & 0xF0) == 0xE0) {
1309 /* code 1110xxxx */
1310 is_abs = 1;
1311 val = (i & 0x0F) << 4;
1312 len = 8;
1313 }
1314 table[i].is_abs = is_abs;
1315 table[i].val = val;
1316 table[i].len = len;
1317 }
1318
1319 init_done = 1;
1320}
1321
1322
1323/*
1324 sonix_decompress
1325 ================
1326 decompresses an image encoded by a SN9C101 camera controller chip.
1327
1328 IN width
1329 height
1330 inp pointer to compressed frame (with header already stripped)
1331 OUT outp pointer to decompressed frame
1332
1333 Returns 0 if the operation was successful.
1334 Returns <0 if operation failed.
1335
1336 */
1337static int sonix_decompress(int width, int height, unsigned char *inp, unsigned char *outp)
1338{
1339 int row, col;
1340 int val;
1341 int bitpos;
1342 unsigned char code;
1343 unsigned char *addr;
1344
1345 if (!init_done) {
1346 /* do sonix_decompress_init first! */
1347 return -1;
1348 }
1349
1350 bitpos = 0;
1351 for (row = 0; row < height; row++) {
1352
1353 col = 0;
1354
1355
1356
1357 /* first two pixels in first two rows are stored as raw 8-bit */
1358 if (row < 2) {
1359 addr = inp + (bitpos >> 3);
1360 code = (addr[0] << (bitpos & 7)) | (addr[1] >> (8 - (bitpos & 7)));
1361 bitpos += 8;
1362 *outp++ = code;
1363
1364 addr = inp + (bitpos >> 3);
1365 code = (addr[0] << (bitpos & 7)) | (addr[1] >> (8 - (bitpos & 7)));
1366 bitpos += 8;
1367 *outp++ = code;
1368
1369 col += 2;
1370 }
1371
1372 while (col < width) {
1373 /* get bitcode from bitstream */
1374 addr = inp + (bitpos >> 3);
1375 code = (addr[0] << (bitpos & 7)) | (addr[1] >> (8 - (bitpos & 7)));
1376
1377 /* update bit position */
1378 bitpos += table[code].len;
1379
1380 /* calculate pixel value */
1381 val = table[code].val;
1382 if (!table[code].is_abs) {
1383 /* value is relative to top and left pixel */
1384 if (col < 2) {
1385 /* left column: relative to top pixel */
1386 val += outp[-2*width];
1387 }
1388 else if (row < 2) {
1389 /* top row: relative to left pixel */
1390 val += outp[-2];
1391 }
1392 else {
1393 /* main area: average of left pixel and top pixel */
1394 val += (outp[-2] + outp[-2*width]) / 2;
1395 }
1396 }
1397
1398 /* store pixel */
1399 *outp++ = CLAMP(val);
1400 col++;
1401 }
1402 }
1403
1404 return 0;
1405}
1406
1407void CvCaptureCAM_V4L::convertToRgb(const Buffer &currentBuffer)
1408{
1409 cv::Size imageSize;
1410 unsigned char *start;
1411
1412 if (V4L2_TYPE_IS_MULTIPLANAR(type)) {
1413 __u32 offset = 0;
1414 start = (unsigned char*)buffers[MAX_V4L_BUFFERS].memories[MEMORY_ORIG].start;
1415 for (unsigned char n_planes = 0; n_planes < num_planes; n_planes++) {
1416 __u32 data_offset, bytesused;
1417 data_offset = currentBuffer.planes[n_planes].data_offset;
1418 bytesused = currentBuffer.planes[n_planes].bytesused - data_offset;
            memcpy(start + offset, (char *)currentBuffer.memories[n_planes].start + data_offset,
                   std::min(currentBuffer.memories[n_planes].length, (size_t)bytesused));
1421 offset += bytesused;
1422 }
1423
1424 imageSize = cv::Size(form.fmt.pix_mp.width, form.fmt.pix_mp.height);
1425 } else {
1426 start = (unsigned char*)currentBuffer.memories[MEMORY_ORIG].start;
1427
1428 imageSize = cv::Size(form.fmt.pix.width, form.fmt.pix.height);
1429 }
    // Formats with no cv::cvtColor/imdecode path (handled by custom conversion code)
    switch (palette)
    {
    case V4L2_PIX_FMT_YUV411P:
        frame.create(imageSize, CV_8UC3);
        yuv411p_to_rgb24(imageSize.width, imageSize.height, start, frame.data);
        return;
1437 default:
1438 break;
1439 }
1440 // Converted by cvtColor or imdecode
    switch (palette) {
    case V4L2_PIX_FMT_YVU420:
        cv::cvtColor(cv::Mat(imageSize.height * 3 / 2, imageSize.width, CV_8U, start), frame,
                     COLOR_YUV2BGR_YV12);
        return;
    case V4L2_PIX_FMT_YUV420:
        cv::cvtColor(cv::Mat(imageSize.height * 3 / 2, imageSize.width, CV_8U, start), frame,
                     COLOR_YUV2BGR_IYUV);
        return;
    case V4L2_PIX_FMT_NV12:
        cv::cvtColor(cv::Mat(imageSize.height * 3 / 2, imageSize.width, CV_8U, start), frame,
                     COLOR_YUV2BGR_NV12);
        return;
    case V4L2_PIX_FMT_NV21:
        cv::cvtColor(cv::Mat(imageSize.height * 3 / 2, imageSize.width, CV_8U, start), frame,
                     COLOR_YUV2BGR_NV21);
        return;
#ifdef HAVE_JPEG
    case V4L2_PIX_FMT_MJPEG:
    case V4L2_PIX_FMT_JPEG:
        CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): decoding JPEG frame: size=" << currentBuffer.bytesused);
        cv::imdecode(Mat(1, currentBuffer.bytesused, CV_8U, start), IMREAD_COLOR, &frame);
        return;
#endif
    case V4L2_PIX_FMT_YUYV:
        cv::cvtColor(cv::Mat(imageSize, CV_8UC2, start), frame, COLOR_YUV2BGR_YUYV);
        return;
    case V4L2_PIX_FMT_UYVY:
        cv::cvtColor(cv::Mat(imageSize, CV_8UC2, start), frame, COLOR_YUV2BGR_UYVY);
        return;
    case V4L2_PIX_FMT_RGB24:
        cv::cvtColor(cv::Mat(imageSize, CV_8UC3, start), frame, COLOR_RGB2BGR);
1473 return;
1474 case V4L2_PIX_FMT_Y16:
1475 {
1476 // https://www.kernel.org/doc/html/v4.10/media/uapi/v4l/pixfmt-y16.html
1477 // This is a grey-scale image with a depth of 16 bits per pixel. The least significant byte is stored at lower memory addresses (little-endian).
1478 // Note: 10-bits precision is not supported
1479 cv::Mat temp(imageSize, CV_8UC1, buffers[MAX_V4L_BUFFERS].memories[MEMORY_RGB].start);
        cv::extractChannel(cv::Mat(imageSize, CV_8UC2, start), temp, 1); // 1 - second channel
        cv::cvtColor(temp, frame, COLOR_GRAY2BGR);
1482 return;
1483 }
1484 case V4L2_PIX_FMT_Y16_BE:
1485 {
1486 // https://www.kernel.org/doc/html/v4.10/media/uapi/v4l/pixfmt-y16-be.html
1487 // This is a grey-scale image with a depth of 16 bits per pixel. The most significant byte is stored at lower memory addresses (big-endian).
1488 // Note: 10-bits precision is not supported
1489 cv::Mat temp(imageSize, CV_8UC1, buffers[MAX_V4L_BUFFERS].memories[MEMORY_RGB].start);
        cv::extractChannel(cv::Mat(imageSize, CV_8UC2, start), temp, 0); // 0 - first channel
        cv::cvtColor(temp, frame, COLOR_GRAY2BGR);
1492 return;
1493 }
1494 case V4L2_PIX_FMT_Y12:
1495 {
1496 cv::Mat temp(imageSize, CV_8UC1, buffers[MAX_V4L_BUFFERS].memories[MEMORY_RGB].start);
        cv::Mat(imageSize, CV_16UC1, start).convertTo(temp, CV_8U, 1.0 / 16);
        cv::cvtColor(temp, frame, COLOR_GRAY2BGR);
1499 return;
1500 }
1501 case V4L2_PIX_FMT_Y10:
1502 {
1503 cv::Mat temp(imageSize, CV_8UC1, buffers[MAX_V4L_BUFFERS].memories[MEMORY_RGB].start);
        cv::Mat(imageSize, CV_16UC1, start).convertTo(temp, CV_8U, 1.0 / 4);
        cv::cvtColor(temp, frame, COLOR_GRAY2BGR);
1506 return;
1507 }
1508 case V4L2_PIX_FMT_SN9C10X:
1509 {
1510 sonix_decompress_init();
        sonix_decompress(imageSize.width, imageSize.height,
                         start, (unsigned char*)buffers[MAX_V4L_BUFFERS].memories[MEMORY_RGB].start);
1513
1514 cv::Mat cv_buf(imageSize, CV_8UC1, buffers[MAX_V4L_BUFFERS].memories[MEMORY_RGB].start);
        cv::cvtColor(cv_buf, frame, COLOR_BayerRG2BGR);
1516 return;
1517 }
1518 case V4L2_PIX_FMT_SRGGB8:
1519 {
        cv::cvtColor(cv::Mat(imageSize, CV_8UC1, start), frame, COLOR_BayerBG2BGR);
1521 return;
1522 }
1523 case V4L2_PIX_FMT_SBGGR8:
1524 {
        cv::cvtColor(cv::Mat(imageSize, CV_8UC1, start), frame, COLOR_BayerRG2BGR);
1526 return;
1527 }
1528 case V4L2_PIX_FMT_SGBRG8:
1529 {
        cv::cvtColor(cv::Mat(imageSize, CV_8UC1, start), frame, COLOR_BayerGR2BGR);
1531 return;
1532 }
1533 case V4L2_PIX_FMT_SGRBG8:
1534 {
        cv::cvtColor(cv::Mat(imageSize, CV_8UC1, start), frame, COLOR_BayerGB2BGR);
1536 return;
1537 }
1538 case V4L2_PIX_FMT_GREY:
        cv::cvtColor(cv::Mat(imageSize, CV_8UC1, start), frame, COLOR_GRAY2BGR);
1540 break;
1541 case V4L2_PIX_FMT_XBGR32:
1542 case V4L2_PIX_FMT_ABGR32:
        cv::cvtColor(cv::Mat(imageSize, CV_8UC4, start), frame, COLOR_BGRA2BGR);
1544 break;
1545 case V4L2_PIX_FMT_BGR24:
1546 default:
        Mat(1, currentBuffer.bytesused, CV_8U, start).copyTo(frame);
1548 break;
1549 }
1550}
1551
1552static inline cv::String capPropertyName(int prop)
1553{
1554 switch (prop) {
1555 case cv::CAP_PROP_POS_MSEC:
1556 return "pos_msec";
1557 case cv::CAP_PROP_POS_FRAMES:
1558 return "pos_frames";
1559 case cv::CAP_PROP_POS_AVI_RATIO:
1560 return "pos_avi_ratio";
1561 case cv::CAP_PROP_FRAME_COUNT:
1562 return "frame_count";
1563 case cv::CAP_PROP_FRAME_HEIGHT:
1564 return "height";
1565 case cv::CAP_PROP_FRAME_WIDTH:
1566 return "width";
1567 case cv::CAP_PROP_CONVERT_RGB:
1568 return "convert_rgb";
1569 case cv::CAP_PROP_FORMAT:
1570 return "format";
1571 case cv::CAP_PROP_MODE:
1572 return "mode";
1573 case cv::CAP_PROP_FOURCC:
1574 return "fourcc";
1575 case cv::CAP_PROP_AUTO_EXPOSURE:
1576 return "auto_exposure";
1577 case cv::CAP_PROP_EXPOSURE:
1578 return "exposure";
1579 case cv::CAP_PROP_TEMPERATURE:
1580 return "temperature";
1581 case cv::CAP_PROP_FPS:
1582 return "fps";
1583 case cv::CAP_PROP_BRIGHTNESS:
1584 return "brightness";
1585 case cv::CAP_PROP_CONTRAST:
1586 return "contrast";
1587 case cv::CAP_PROP_SATURATION:
1588 return "saturation";
1589 case cv::CAP_PROP_HUE:
1590 return "hue";
1591 case cv::CAP_PROP_GAIN:
1592 return "gain";
1593 case cv::CAP_PROP_RECTIFICATION:
1594 return "rectification";
1595 case cv::CAP_PROP_MONOCHROME:
1596 return "monochrome";
1597 case cv::CAP_PROP_SHARPNESS:
1598 return "sharpness";
1599 case cv::CAP_PROP_GAMMA:
1600 return "gamma";
1601 case cv::CAP_PROP_TRIGGER:
1602 return "trigger";
1603 case cv::CAP_PROP_TRIGGER_DELAY:
1604 return "trigger_delay";
1605 case cv::CAP_PROP_WHITE_BALANCE_RED_V:
1606 return "white_balance_red_v";
1607 case cv::CAP_PROP_ZOOM:
1608 return "zoom";
1609 case cv::CAP_PROP_FOCUS:
1610 return "focus";
1611 case cv::CAP_PROP_GUID:
1612 return "guid";
1613 case cv::CAP_PROP_ISO_SPEED:
1614 return "iso_speed";
1615 case cv::CAP_PROP_BACKLIGHT:
1616 return "backlight";
1617 case cv::CAP_PROP_PAN:
1618 return "pan";
1619 case cv::CAP_PROP_TILT:
1620 return "tilt";
1621 case cv::CAP_PROP_ROLL:
1622 return "roll";
1623 case cv::CAP_PROP_IRIS:
1624 return "iris";
1625 case cv::CAP_PROP_SETTINGS:
1626 return "dialog_settings";
1627 case cv::CAP_PROP_BUFFERSIZE:
1628 return "buffersize";
1629 case cv::CAP_PROP_AUTOFOCUS:
1630 return "autofocus";
1631 case cv::CAP_PROP_WHITE_BALANCE_BLUE_U:
1632 return "white_balance_blue_u";
1633 case cv::CAP_PROP_SAR_NUM:
1634 return "sar_num";
1635 case cv::CAP_PROP_SAR_DEN:
1636 return "sar_den";
1637 case CAP_PROP_AUTO_WB:
1638 return "auto wb";
1639 case CAP_PROP_WB_TEMPERATURE:
1640 return "wb temperature";
1641 case CAP_PROP_ORIENTATION_META:
1642 return "orientation meta";
1643 case CAP_PROP_ORIENTATION_AUTO:
1644 return "orientation auto";
1645 default:
        return cv::format("unknown (%d)", prop);
1647 }
1648}
1649
1650static inline int capPropertyToV4L2(int prop)
1651{
1652 switch (prop) {
1653 case cv::CAP_PROP_FPS:
1654 return -1;
1655 case cv::CAP_PROP_FOURCC:
1656 return -1;
1657 case cv::CAP_PROP_FRAME_COUNT:
1658 return V4L2_CID_MPEG_VIDEO_B_FRAMES;
1659 case cv::CAP_PROP_FORMAT:
1660 return -1;
1661 case cv::CAP_PROP_MODE:
1662 return -1;
1663 case cv::CAP_PROP_BRIGHTNESS:
1664 return V4L2_CID_BRIGHTNESS;
1665 case cv::CAP_PROP_CONTRAST:
1666 return V4L2_CID_CONTRAST;
1667 case cv::CAP_PROP_SATURATION:
1668 return V4L2_CID_SATURATION;
1669 case cv::CAP_PROP_HUE:
1670 return V4L2_CID_HUE;
1671 case cv::CAP_PROP_GAIN:
1672 return V4L2_CID_GAIN;
1673 case cv::CAP_PROP_EXPOSURE:
1674 return V4L2_CID_EXPOSURE_ABSOLUTE;
1675 case cv::CAP_PROP_CONVERT_RGB:
1676 return -1;
1677 case cv::CAP_PROP_WHITE_BALANCE_BLUE_U:
1678 return V4L2_CID_BLUE_BALANCE;
1679 case cv::CAP_PROP_RECTIFICATION:
1680 return -1;
1681 case cv::CAP_PROP_MONOCHROME:
1682 return -1;
1683 case cv::CAP_PROP_SHARPNESS:
1684 return V4L2_CID_SHARPNESS;
1685 case cv::CAP_PROP_AUTO_EXPOSURE:
1686 return V4L2_CID_EXPOSURE_AUTO;
1687 case cv::CAP_PROP_GAMMA:
1688 return V4L2_CID_GAMMA;
1689 case cv::CAP_PROP_TEMPERATURE:
1690 return V4L2_CID_WHITE_BALANCE_TEMPERATURE;
1691 case cv::CAP_PROP_TRIGGER:
1692 return -1;
1693 case cv::CAP_PROP_TRIGGER_DELAY:
1694 return -1;
1695 case cv::CAP_PROP_WHITE_BALANCE_RED_V:
1696 return V4L2_CID_RED_BALANCE;
1697 case cv::CAP_PROP_ZOOM:
1698 return V4L2_CID_ZOOM_ABSOLUTE;
1699 case cv::CAP_PROP_FOCUS:
1700 return V4L2_CID_FOCUS_ABSOLUTE;
1701 case cv::CAP_PROP_GUID:
1702 return -1;
1703 case cv::CAP_PROP_ISO_SPEED:
1704 return V4L2_CID_ISO_SENSITIVITY;
1705 case cv::CAP_PROP_BACKLIGHT:
1706 return V4L2_CID_BACKLIGHT_COMPENSATION;
1707 case cv::CAP_PROP_PAN:
1708 return V4L2_CID_PAN_ABSOLUTE;
1709 case cv::CAP_PROP_TILT:
1710 return V4L2_CID_TILT_ABSOLUTE;
1711 case cv::CAP_PROP_ROLL:
1712 return V4L2_CID_ROTATE;
1713 case cv::CAP_PROP_IRIS:
1714 return V4L2_CID_IRIS_ABSOLUTE;
1715 case cv::CAP_PROP_SETTINGS:
1716 return -1;
1717 case cv::CAP_PROP_BUFFERSIZE:
1718 return -1;
1719 case cv::CAP_PROP_AUTOFOCUS:
1720 return V4L2_CID_FOCUS_AUTO;
1721 case cv::CAP_PROP_SAR_NUM:
1722 return V4L2_CID_MPEG_VIDEO_H264_VUI_EXT_SAR_HEIGHT;
1723 case cv::CAP_PROP_SAR_DEN:
1724 return V4L2_CID_MPEG_VIDEO_H264_VUI_EXT_SAR_WIDTH;
1725 case CAP_PROP_AUTO_WB:
1726 return V4L2_CID_AUTO_WHITE_BALANCE;
1727 case CAP_PROP_WB_TEMPERATURE:
1728 return V4L2_CID_WHITE_BALANCE_TEMPERATURE;
1729 default:
1730 break;
1731 }
1732 return -1;
1733}
1734
1735static inline bool compatibleRange(int property_id)
1736{
1737 switch (property_id) {
1738 case cv::CAP_PROP_BRIGHTNESS:
1739 case cv::CAP_PROP_CONTRAST:
1740 case cv::CAP_PROP_SATURATION:
1741 case cv::CAP_PROP_HUE:
1742 case cv::CAP_PROP_GAIN:
1743 case cv::CAP_PROP_EXPOSURE:
1744 case cv::CAP_PROP_FOCUS:
1745 case cv::CAP_PROP_AUTOFOCUS:
1746 case cv::CAP_PROP_AUTO_EXPOSURE:
1747 return true;
1748 default:
1749 break;
1750 }
1751 return false;
1752}
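
/* Note: the properties listed in compatibleRange() are the ones remapped to a
 * normalized scale when normalizePropRange is enabled (toggled through
 * CAP_PROP_MODE in setProperty() below). A minimal user-side sketch, assuming
 * an illustrative device path and a driver that exposes brightness:
 *
 *   cv::VideoCapture cap("/dev/video0", cv::CAP_V4L2);
 *   cap.set(cv::CAP_PROP_MODE, 1);                 // enable normalized ranges
 *   cap.set(cv::CAP_PROP_BRIGHTNESS, 0.5);         // midpoint of the driver range
 *   double b = cap.get(cv::CAP_PROP_BRIGHTNESS);   // reported back as 0..1
 */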

bool CvCaptureCAM_V4L::controlInfo(int property_id, __u32 &_v4l2id, cv::Range &range) const
{
    /* initialisations */
    int v4l2id = capPropertyToV4L2(property_id);
    v4l2_queryctrl queryctrl = v4l2_queryctrl();
    queryctrl.id = __u32(v4l2id);
    if (v4l2id == -1 || !tryIoctl(VIDIOC_QUERYCTRL, &queryctrl)) {
        CV_LOG_INFO(NULL, "VIDEOIO(V4L2:" << deviceName << "): property '" << capPropertyName(property_id) << "' is not supported");
        return false;
    }
    _v4l2id = __u32(v4l2id);
    range = cv::Range(queryctrl.minimum, queryctrl.maximum);
    if (normalizePropRange) {
        switch(property_id)
        {
        case CAP_PROP_WB_TEMPERATURE:
        case CAP_PROP_AUTO_WB:
        case CAP_PROP_AUTOFOCUS:
            range = Range(0, 1); // do not convert
            break;
        case CAP_PROP_AUTO_EXPOSURE:
            range = Range(0, 4);
        default:
            break;
        }
    }
    return true;
}

bool CvCaptureCAM_V4L::icvControl(__u32 v4l2id, int &value, bool isSet) const
{
    /* set which control we want to set */
    v4l2_control control = v4l2_control();
    control.id = v4l2id;
    control.value = value;

    /* The driver may clamp the value or return ERANGE, ignored here */
    if (!tryIoctl(isSet ? VIDIOC_S_CTRL : VIDIOC_G_CTRL, &control)) {
        int err = errno;
        CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): failed " << (isSet ? "VIDIOC_S_CTRL" : "VIDIOC_G_CTRL") << ": errno=" << err << " (" << strerror(err) << ")");
        switch (err) {
#ifndef NDEBUG
        case EINVAL:
            fprintf(stderr,
                    "The struct v4l2_control id is invalid or the value is inappropriate for the given control (i.e. "
                    "if a menu item is selected that is not supported by the driver according to VIDIOC_QUERYMENU).");
            break;
        case ERANGE:
            fprintf(stderr, "The struct v4l2_control value is out of bounds.");
            break;
        case EACCES:
            fprintf(stderr, "Attempt to set a read-only control or to get a write-only control.");
            break;
#endif
        default:
            break;
        }
        return false;
    }
    if (!isSet)
        value = control.value;
    return true;
}

double CvCaptureCAM_V4L::getProperty(int property_id) const
{
    switch (property_id) {
    case cv::CAP_PROP_FRAME_WIDTH:
        if (V4L2_TYPE_IS_MULTIPLANAR(type))
            return form.fmt.pix_mp.width;
        else
            return form.fmt.pix.width;
    case cv::CAP_PROP_FRAME_HEIGHT:
        if (V4L2_TYPE_IS_MULTIPLANAR(type))
            return form.fmt.pix_mp.height;
        else
            return form.fmt.pix.height;
    case cv::CAP_PROP_FOURCC:
        return palette;
    case cv::CAP_PROP_FORMAT:
        return frame.type();
    case cv::CAP_PROP_MODE:
        if (normalizePropRange)
            return palette;
        return normalizePropRange;
    case cv::CAP_PROP_CONVERT_RGB:
        return convert_rgb;
    case cv::CAP_PROP_BUFFERSIZE:
        return bufferSize;
    case cv::CAP_PROP_FPS:
    {
        v4l2_streamparm sp = v4l2_streamparm();
        sp.type = type;
        if (!tryIoctl(VIDIOC_G_PARM, &sp)) {
            CV_LOG_WARNING(NULL, "VIDEOIO(V4L2:" << deviceName << "): Unable to get camera FPS");
            return -1;
        }
        return sp.parm.capture.timeperframe.denominator / (double)sp.parm.capture.timeperframe.numerator;
    }
    case cv::CAP_PROP_POS_MSEC:
        if (FirstCapture)
            return 0;

        return 1000 * timestamp.tv_sec + ((double)timestamp.tv_usec) / 1000;
    case cv::CAP_PROP_CHANNEL:
        return channelNumber;
    default:
    {
        cv::Range range;
        __u32 v4l2id;
        if (!controlInfo(property_id, v4l2id, range))
            return -1.0;
        int value = 0;
        if (!icvControl(v4l2id, value, false))
            return -1.0;
        if (normalizePropRange && compatibleRange(property_id))
            return ((double)value - range.start) / range.size();
        return value;
    }
    }
}
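
/* User-side sketch of the getters above (the device index and printed layout
 * are illustrative only): CAP_PROP_FPS is derived from VIDIOC_G_PARM as
 * timeperframe denominator/numerator, and CAP_PROP_POS_MSEC reports the V4L2
 * timestamp of the last captured buffer.
 *
 *   cv::VideoCapture cap(0, cv::CAP_V4L2);
 *   double fps = cap.get(cv::CAP_PROP_FPS);
 *   int fourcc = (int)cap.get(cv::CAP_PROP_FOURCC);
 *   printf("%.1f fps, FOURCC %c%c%c%c\n", fps,
 *          fourcc & 0xff, (fourcc >> 8) & 0xff,
 *          (fourcc >> 16) & 0xff, (fourcc >> 24) & 0xff);
 *   cv::Mat img;
 *   cap.read(img);
 *   double t_ms = cap.get(cv::CAP_PROP_POS_MSEC);  // driver timestamp, in ms
 */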

bool CvCaptureCAM_V4L::icvSetFrameSize(int _width, int _height)
{
    if (_width > 0)
        width_set = _width;

    if (_height > 0)
        height_set = _height;

    /* two subsequent calls setting WIDTH and HEIGHT will change
       the video size */
    if (width_set <= 0 || height_set <= 0)
        return true;

    width = width_set;
    height = height_set;
    width_set = height_set = 0;
    return v4l2_reset();
}
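
/* The resolution is therefore changed by two consecutive set() calls; the
 * device is only re-initialised once both dimensions are known. Sketch
 * (1280x720 is an example size; the driver may still pick the nearest mode it
 * supports):
 *
 *   cap.set(cv::CAP_PROP_FRAME_WIDTH, 1280);   // cached until the height arrives
 *   cap.set(cv::CAP_PROP_FRAME_HEIGHT, 720);   // triggers v4l2_reset()
 */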

bool CvCaptureCAM_V4L::setProperty( int property_id, double _value )
{
    int value = cvRound(_value);
    switch (property_id) {
    case cv::CAP_PROP_FRAME_WIDTH:
        return icvSetFrameSize(value, 0);
    case cv::CAP_PROP_FRAME_HEIGHT:
        return icvSetFrameSize(0, value);
    case cv::CAP_PROP_FPS:
        if (fps == static_cast<__u32>(value))
            return true;
        return setFps(value);
    case cv::CAP_PROP_CONVERT_RGB:
        if (bool(value)) {
            convert_rgb = convertableToRgb();
            return convert_rgb;
        } else {
            convert_rgb = false;
            return true;
        }
    case cv::CAP_PROP_FOURCC:
    {
        if (palette == static_cast<__u32>(value))
            return true;

        __u32 old_palette = palette;
        palette = static_cast<__u32>(value);
        if (v4l2_reset())
            return true;

        palette = old_palette;
        v4l2_reset();
        return false;
    }
    case cv::CAP_PROP_MODE:
        normalizePropRange = bool(value);
        return true;
    case cv::CAP_PROP_BUFFERSIZE:
        if (bufferSize == value)
            return true;

        if (value > MAX_V4L_BUFFERS || value < 1) {
            CV_LOG_WARNING(NULL, "VIDEOIO(V4L2:" << deviceName << "): Bad buffer size " << value << ", buffer size must be from 1 to " << MAX_V4L_BUFFERS);
            return false;
        }
        bufferSize = value;
        return v4l2_reset();
    case cv::CAP_PROP_CHANNEL:
    {
        if (value < 0) {
            channelNumber = -1;
            return true;
        }
        if (channelNumber == value)
            return true;

        int old_channel = channelNumber;
        channelNumber = value;
        if (v4l2_reset())
            return true;

        channelNumber = old_channel;
        v4l2_reset();
        return false;
    }
    default:
    {
        cv::Range range;
        __u32 v4l2id;
        if (!controlInfo(property_id, v4l2id, range))
            return false;
        if (normalizePropRange && compatibleRange(property_id))
            value = cv::saturate_cast<int>(_value * range.size() + range.start);
        return icvControl(v4l2id, value, true);
    }
    }
    return false;
}
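
/* User-side sketch of the set() paths above (MJPG is only an example format;
 * if the driver rejects it, the code above restores the previous palette and
 * set() returns false):
 *
 *   cap.set(cv::CAP_PROP_FOURCC, cv::VideoWriter::fourcc('M', 'J', 'P', 'G'));
 *   cap.set(cv::CAP_PROP_BUFFERSIZE, 4);        // must be within 1..MAX_V4L_BUFFERS
 *   cap.set(cv::CAP_PROP_CONVERT_RGB, false);   // deliver the raw buffer, not BGR
 */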

void CvCaptureCAM_V4L::releaseBuffers()
{
    if (buffers[MAX_V4L_BUFFERS].memories[MEMORY_ORIG].start) {
        free(buffers[MAX_V4L_BUFFERS].memories[MEMORY_ORIG].start);
        buffers[MAX_V4L_BUFFERS].memories[MEMORY_ORIG].start = 0;
    }

    if (buffers[MAX_V4L_BUFFERS].memories[MEMORY_RGB].start) {
        free(buffers[MAX_V4L_BUFFERS].memories[MEMORY_RGB].start);
        buffers[MAX_V4L_BUFFERS].memories[MEMORY_RGB].start = 0;
    }

    bufferIndex = -1;
    FirstCapture = true;

    if (!v4l_buffersRequested)
        return;
    v4l_buffersRequested = false;

    for (unsigned int n_buffers = 0; n_buffers < MAX_V4L_BUFFERS; ++n_buffers) {
        for (unsigned char n_planes = 0; n_planes < num_planes; n_planes++) {
            if (buffers[n_buffers].memories[n_planes].start) {
                if (-1 == munmap(buffers[n_buffers].memories[n_planes].start,
                                 buffers[n_buffers].memories[n_planes].length)) {
                    CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): failed munmap(): errno=" << errno << " (" << strerror(errno) << ")");
                } else {
                    buffers[n_buffers].memories[n_planes].start = 0;
                }
            }
        }
    }
    // Applications can call ioctl VIDIOC_REQBUFS again to change the number of buffers,
    // however this cannot succeed when any buffers are still mapped. A count value of zero
    // frees all buffers, after aborting or finishing any DMA in progress, an implicit VIDIOC_STREAMOFF.
    requestBuffers(0);
}

bool CvCaptureCAM_V4L::streaming(bool startStream)
{
    if (startStream != v4l_streamStarted)
    {
        if (!isOpened())
        {
            CV_Assert(v4l_streamStarted == false);
            return !startStream;
        }

        bool result = tryIoctl(startStream ? VIDIOC_STREAMON : VIDIOC_STREAMOFF, &type);
        if (result)
        {
            v4l_streamStarted = startStream;
            return true;
        }
        if (startStream)
        {
            CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): failed VIDIOC_STREAMON: errno=" << errno << " (" << strerror(errno) << ")");
        }
        return false;
    }
    return startStream;
}

bool CvCaptureCAM_V4L::retrieveFrame(int, OutputArray ret)
{
    havePendingFrame = false;  // unlock .grab()

    if (bufferIndex < 0) {
        // no freshly grabbed buffer: return the previously converted frame, if any
        frame.copyTo(ret);
        return !frame.empty();
    }

    /* Now get what has already been captured as a IplImage return */
    const Buffer &currentBuffer = buffers[bufferIndex];
    if (convert_rgb) {
        convertToRgb(currentBuffer);
    } else {
        // for mjpeg streams the size might change in between, so we have to change the header
        // We didn't allocate memory when not convert_rgb, but we have to recreate the header
        CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): buffer input size=" << currentBuffer.bytesused);

        if (V4L2_TYPE_IS_MULTIPLANAR(type)) {
            // calculate total size
            __u32 bytestotal = 0;
            for (unsigned char n_planes = 0; n_planes < num_planes; n_planes++) {
                const v4l2_plane & cur_plane = currentBuffer.planes[n_planes];
                bytestotal += cur_plane.bytesused - cur_plane.data_offset;
            }
            // allocate frame data
            frame.create(Size(bytestotal, 1), CV_8U);
            // copy each plane to the frame
            __u32 offset = 0;
            for (unsigned char n_planes = 0; n_planes < num_planes; n_planes++) {
                const v4l2_plane & cur_plane = currentBuffer.planes[n_planes];
                const Memory & cur_mem = currentBuffer.memories[n_planes];
                memcpy(frame.data + offset,
                       (char*)cur_mem.start + cur_plane.data_offset,
                       std::min(currentBuffer.memories[n_planes].length, (size_t)cur_plane.bytesused));
                offset += cur_plane.bytesused - cur_plane.data_offset;  // advance past this plane
            }
        } else {
            const Size sz(std::min(buffers[MAX_V4L_BUFFERS].memories[MEMORY_ORIG].length, (size_t)currentBuffer.buffer.bytesused), 1);
            frame = Mat(sz, CV_8U, currentBuffer.memories[MEMORY_ORIG].start);
        }
    }
    // Revert buffer to the queue
    if (!tryIoctl(VIDIOC_QBUF, &buffers[bufferIndex].buffer))
    {
        CV_LOG_DEBUG(NULL, "VIDEOIO(V4L2:" << deviceName << "): failed VIDIOC_QBUF: errno=" << errno << " (" << strerror(errno) << ")");
    }

    bufferIndex = -1;
    frame.copyTo(ret);
    return true;
}
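
/* When CAP_PROP_CONVERT_RGB is disabled, retrieveFrame() hands out the raw
 * driver payload as a 1xN CV_8U row (see the non-convert branch above). A
 * user-side sketch for an MJPEG stream, assuming the camera really delivers
 * MJPG and that the imgcodecs module is available:
 *
 *   cap.set(cv::CAP_PROP_CONVERT_RGB, false);
 *   cv::Mat raw, bgr;
 *   cap.read(raw);                              // 1xN CV_8U compressed buffer
 *   bgr = cv::imdecode(raw, cv::IMREAD_COLOR);  // decode on the application side
 */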

Ptr<IVideoCapture> create_V4L_capture_cam(int index)
{
    Ptr<CvCaptureCAM_V4L> ret = makePtr<CvCaptureCAM_V4L>();
    if (ret->open(index))
        return ret;
    return NULL;
}

Ptr<IVideoCapture> create_V4L_capture_file(const std::string &filename)
{
    auto ret = makePtr<CvCaptureCAM_V4L>();
    if (ret->open(filename))
        return ret;
    return NULL;
}

static
bool VideoCapture_V4L_deviceHandlePoll(const std::vector<int>& deviceHandles, std::vector<int>& ready, int64 timeoutNs)
{
    CV_Assert(!deviceHandles.empty());
    const size_t N = deviceHandles.size();

    ready.clear(); ready.reserve(N);

    const auto poll_flags = POLLIN | POLLRDNORM | POLLERR;

    std::vector<pollfd> fds; fds.reserve(N);

    for (size_t i = 0; i < N; ++i)
    {
        int handle = deviceHandles[i];
        CV_LOG_DEBUG(NULL, "camera" << i << ": handle = " << handle);
        CV_Assert(handle != 0);
        fds.push_back(pollfd{handle, poll_flags, 0});
    }

    int timeoutMs = -1;
    if (timeoutNs > 0)
    {
        timeoutMs = saturate_cast<int>((timeoutNs + 999999) / 1000000);
    }

    int ret = poll(fds.data(), N, timeoutMs);
    if (ret == -1)
    {
        perror("poll error");
        return false;
    }

    if (ret == 0)
        return false;  // just timeout

    for (size_t i = 0; i < N; ++i)
    {
        const auto& fd = fds[i];
        CV_LOG_DEBUG(NULL, "camera" << i << ": fd.revents = 0x" << std::hex << fd.revents);
        if ((fd.revents & (POLLIN | POLLRDNORM)) != 0)
        {
            ready.push_back(i);
        }
        else if ((fd.revents & POLLERR) != 0)
        {
            CV_Error_(Error::StsError, ("Error is reported for camera stream: %d (handle = %d)", (int)i, deviceHandles[i]));
        }
        else
        {
            // not ready
        }
    }
    return true;
}

bool VideoCapture_V4L_waitAny(const std::vector<VideoCapture>& streams, CV_OUT std::vector<int>& ready, int64 timeoutNs)
{
    CV_Assert(!streams.empty());

    const size_t N = streams.size();

    // unwrap internal API
    std::vector<CvCaptureCAM_V4L*> capPtr(N, NULL);
    for (size_t i = 0; i < N; ++i)
    {
        IVideoCapture* iCap = internal::VideoCapturePrivateAccessor::getIVideoCapture(streams[i]);
        CvCaptureCAM_V4L *ptr_CvCaptureCAM_V4L = dynamic_cast<CvCaptureCAM_V4L*>(iCap);
        CV_Assert(ptr_CvCaptureCAM_V4L);
        capPtr[i] = ptr_CvCaptureCAM_V4L;
    }

    // initialize cameras streams and get handles
    std::vector<int> deviceHandles; deviceHandles.reserve(N);
    for (size_t i = 0; i < N; ++i)
    {
        CvCaptureCAM_V4L *ptr = capPtr[i];
        if (ptr->FirstCapture)
        {
            ptr->havePendingFrame = ptr->grabFrame();
            CV_Assert(ptr->havePendingFrame);
            // TODO: Need to filter these cameras, because frame is available
        }
        CV_Assert(ptr->deviceHandle);
        deviceHandles.push_back(ptr->deviceHandle);
    }

    bool res = VideoCapture_V4L_deviceHandlePoll(deviceHandles, ready, timeoutNs);
    for (size_t i = 0; i < ready.size(); ++i)
    {
        int idx = ready[i];
        CvCaptureCAM_V4L *ptr = capPtr[idx];
        ptr->havePendingFrame = ptr->grabFrame();
        CV_Assert(ptr->havePendingFrame);
    }
    return res;
}
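
/* User-side sketch of the public entry point that lands here,
 * cv::VideoCapture::waitAny() (the two camera indices and the 100 ms timeout
 * are assumptions). Streams reported in 'ready' already have a frame grabbed,
 * so only retrieve() is needed:
 *
 *   std::vector<cv::VideoCapture> cams;
 *   cams.emplace_back(0, cv::CAP_V4L2);
 *   cams.emplace_back(1, cv::CAP_V4L2);
 *   std::vector<int> ready;
 *   if (cv::VideoCapture::waitAny(cams, ready, 100 * 1000000LL))  // timeout in ns
 *   {
 *       for (int idx : ready)
 *       {
 *           cv::Mat img;
 *           cams[idx].retrieve(img);
 *       }
 *   }
 */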

} // cv::

#endif
