sailus: thinking more about V4L2_CID_STREAMING_TRIGGER_EXTERNAL, how bad would it be to allow a s_frame_interval with a zero denominator?
grohne: the issue with s_frame_interval is that it's an ill-defined API
most sensors don't have a concept of frame interval
they have a pixel array sampling clock rate, and vertical and horizontal blanking
which all together result in a given frame rate
so there are multiple ways to achieve a given frame rate
the use of that API is thus discouraged, in favour of exposing the individual parameters explicitly, except when the sensor exposes a frame rate natively
ok
(which is the case of some sensors containing an ISP, they expose the frame rate through a register and offer no way to control the clock rate and blankings individually)
and I expect external triggers to be mostly used with raw sensors
can you tell me what I was looking for instead? is that some CID?
V4L2_CID_VBLANK maybe?
V4L2_CID_HBLANK and V4L2_CID_VBLANK
and possibly V4L2_CID_LINK_FREQ if the driver supports changing the frequency
looking into them. thanks
so when targeting a particular frame rate, a user is supposed to retrieve V4L2_CID_PIXEL_RATE, look up the requested image dimensions w * h and determine values vblank, hblank within their respective bounds such that (w + hblank) * (h + vblank) / pixel_rate becomes the desired frame interval?
is there any easier way to do this on the user side?
that's correct
someone has to perform that calculation, and having it on the driver side would effectively remove flexibility from applications that need fine-grained control of the sensor
that's why it's not considered to be the driver's responsibility
it kinda feels obvious in retrospect, but it wasn't. thank you.
is there some way to initiate a s_stream call from userspace on a subdev?
no, that's always done through a video node
hmm. can I do one without a whole vb2?
do you mean without actually capturing to memory ?
yes
err, without capturing to memory
in theory yes. what kind of hardware do you have, where do the frames produced by the camera sensor go to ?
it's a custom board. the camera sensor feeds into another piece of hardware before the data gets to the cpu
shouldn't you model the whole pipeline with V4L2 subdevs, up to the DMA engine that captures frames to memory, which should then be modelled as a video node, using vb2 ?
in theory, yes. that's a huge task however. so the idea was to work on the camera sensor and bolt the rest together
I actually guess that the dma engine has a mainline driver, but switching over to it is not exactly trivial
I've been working on getting this driver published for five years now.
it should be possible to hack something in the meantime, with a video node that doesn't actually perform capture
if you want the driver before the hardware is obsolete, I better hurry up :)
I actually do have a hacked up bolt-things-together-driver. it does the v4l2_async_notifier stuff to get the subdevs into /dev
does that mean I need a struct video_device?
yes
but you can implement the bare minimum of the ioctls
probably even just VIDIOC_STREAMON and VIDIOC_STREAMOFF
good that's up next then. thank you again
so I'll actually be able to test s_stream
pinchartl: sailus: we never managed to merge the pixfmt-to-string helper?
I can't find it.
I thought it was here already.
ezequielg: not sure, I haven't followed up
:'(
here it is https://lore.kernel.org/linux-media/20190916115207.GO843@valkosipuli.retiisi.org.uk/t/#m1dda89afb049bad0065e7be9dd3242af6e2165af
still stalled?
sailus: need help with those? i might be able to squeeze them in if it's not a lot of work.
ezequielg: Uh-oh.
did I open pandora's box?
There were patches, but I don't quite remember where it ended up to.
Let me find them.
It seems to have stalled in my inbox, yes.
Sorry about that.
Here:
https://lore.kernel.org/linux-media/20200427145303.29943-1-sakari.ailus@linux.intel.com/
I think it actually should be close to merging.
Lots of small things to fix still. No wonder that something else took over this in priority. :-P
Hi. I would like to ask how to check whether a V4L2 M2M kernel driver is compliant with the specification. I'm focused on "dynamic resolution change" as stated in https://www.kernel.org/doc/html/latest/userspace-api/media/v4l/dev-decoder.html#dynamic-resolution-change.
I would like to check whether the decoder is capable of playing back the crowd_run_1080X512_fr30_bd8_frm_resize_l3.webm stream from the VP9 test suite. This stream uses the frame resize feature, which changes the resolution of decoded frames twice a second (https://www.webmproject.org/vp9/levels/). I'm trying to check the stream on an Odroid N2 with Armbian Buster and a 5.8.16 mainline kernel.
I am trying to play it with the V4L2 GStreamer element and some abandoned patches. I see that the code is able to subscribe to the proper event and detect the source resolution change. I also see the expected "last buffer"/EPIPE behaviour, but I have a problem with resuming streaming. I think that maybe I should look into v4l2-compliance, but it seems that I can't use it.
or maybe I don't know how to use it
BTW, I tried to search the logs of this channel on https://linuxtv.org/irc/irclogger_log_search/v4l?search=VP9&action=search&error=0, but I got the error "*** ERROR: agrep not found!***"
sailus: ah, right. Last time I looked I wasn't a fan of that solution, but I'm probably wrong since it's the one that gained more traction.
ezequielg: Not my first choice either, no, but it seems people like it.
And having a way to do that is really nice.
Although I have to admit that this way the end result can be more verbose and flexible than the one based on format specifier macros.
sailus: yes, I guess something is better than nothing and better than each driver doing its own thing. the v3 is from April, do you plan to revive it ? :-)
ezequielg: I'm a little busy right now but I factored in the comments I got, need to test the end result still.
I pushed it to the fourcc branch in my linuxtv.org tree if you want to play with it.
There are no guarantees.