[04:49] <grohne> hi. is there a good way to expose embedded image statistics from an image sensor (extra image rows containing non-image data, statistics about the gathered image in a chip-specific format) over v4l? I looked and couldn't find prior art. I'm considering adding it as extra formats (set_fmt). ideas?
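[Editor's note: the modern way to expose sensor-embedded statistics is the V4L2 metadata interface (V4L2_BUF_TYPE_META_CAPTURE, roughly kernel 4.12+), where the chip-specific blob gets its own dataformat; tfiga mentions it below. A minimal sketch of querying such a node; the /dev/video1 path is a placeholder.]

    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>

    int main(void)
    {
        int fd = open("/dev/video1", O_RDWR); /* placeholder metadata node */
        if (fd < 0)
            return 1;

        /* fmt.fmt.meta is a struct v4l2_meta_format: a chip-specific
         * FourCC plus the size of each metadata buffer */
        struct v4l2_format fmt = { .type = V4L2_BUF_TYPE_META_CAPTURE };
        if (ioctl(fd, VIDIOC_G_FMT, &fmt) == 0)
            printf("meta dataformat=0x%08x buffersize=%u\n",
                   fmt.fmt.meta.dataformat, fmt.fmt.meta.buffersize);
        close(fd);
        return 0;
    }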
[07:04] <postmodern> reading the documentation on v4l2_format, where they specify based on the type field, which element in the union to access. Which union field do I access when type is VIDEO_OUTPUT_OVERLAY? .win for video overlays or .pix for video outputs?
[08:45] <postmodern> also am i correct that VIDIOC_G_PARM and VIDIOC_S_PARM only work on VIDEO_CAPTURE and VIDEO_OUTPUT type streams?
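[Editor's note: correct; struct v4l2_streamparm only defines union members for those stream types (.parm.capture and .parm.output), selected by its type field. A minimal sketch, assuming a capture device and a hypothetical 30 fps target.]

    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>

    /* Request 30 fps on a capture stream. The type field decides
     * which union member (.parm.capture here) the driver reads. */
    int set_fps_30(const char *dev)
    {
        int fd = open(dev, O_RDWR);
        if (fd < 0)
            return -1;

        struct v4l2_streamparm parm = {
            .type = V4L2_BUF_TYPE_VIDEO_CAPTURE,
        };
        parm.parm.capture.timeperframe.numerator = 1;    /* frame interval */
        parm.parm.capture.timeperframe.denominator = 30; /* of 1/30 s */

        int ret = ioctl(fd, VIDIOC_S_PARM, &parm);
        close(fd);
        return ret;
    }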
[13:33] <postmodern> when is BUF_TYPE_PRIVATE actually used?
[13:42] <hverkuil> postmodern: never. It's deprecated and no longer supported.
[13:42] <postmodern> excellent. one less thing to implement.
[13:55] <postmodern> if you write bindings for v4l2, particularly the VIDIOC interface and structs, which license can/should you use for the bindings library?
[14:14] <tfiga> postmodern: the kernel UAPI structs use __u32, because enum sizes are not fully specified by ABIs
[14:14] <tfiga> while the UAPI must be stable
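[Editor's note: a small demonstration of the point. The C standard only requires that an enum's underlying type can hold its values, so sizeof(enum) can vary between compilers and flags, while __u32 is fixed by definition.]

    #include <stdio.h>

    enum small { SMALL_A = 1 };
    enum big   { BIG_A = 0x7FFFFFFF };

    int main(void)
    {
        /* Typically both print 4, but nothing guarantees it:
         * e.g. gcc -fshort-enums shrinks `enum small` to 1 byte. */
        printf("sizeof(enum small) = %zu\n", sizeof(enum small));
        printf("sizeof(enum big)   = %zu\n", sizeof(enum big));
        return 0;
    }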
[14:14] <tfiga> the v4l2_sliced_vbi_format.service_set thing is suspicious indeed, but I'd just suggest skipping the VBI interface completely
[14:15] <tfiga> there is a lot of legacy stuff in the UAPI that isn't really relevant today
[14:15] <tfiga> we can't just remove them from the kernel, because of the UAPI stability promises
[14:15] <tfiga> but you'll help us and yourself a lot if you don't propagate them ;)
[14:16] <tfiga> basically, of the buf_type values, I'd implement support only for VIDEO_CAPTURE/OUTPUT(_MPLANE) and META_CAPTURE/OUTPUT
[14:17] <tfiga> overlays are a relic of the past and now handled by the DRM KMS interface
[14:17] <tfiga> VBI is also a relic of the analog days
[14:17] <postmodern> ah ha
[14:17] <tfiga> not sure about SDR
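[Editor's note: this also answers the earlier union question. Single-planar video uses .fmt.pix, multi-planar uses .fmt.pix_mp, and metadata uses .fmt.meta; the legacy members are .fmt.win for the overlay types (including VIDEO_OUTPUT_OVERLAY), .fmt.vbi/.fmt.sliced for VBI, and .fmt.sdr for SDR. A sketch of dispatching on the recommended types only:]

    #include <stdio.h>
    #include <linux/videodev2.h>

    static void print_fmt(const struct v4l2_format *f)
    {
        switch (f->type) {
        case V4L2_BUF_TYPE_VIDEO_CAPTURE:
        case V4L2_BUF_TYPE_VIDEO_OUTPUT:
            /* single-planar video: struct v4l2_pix_format */
            printf("pix: %ux%u fourcc=0x%08x\n", f->fmt.pix.width,
                   f->fmt.pix.height, f->fmt.pix.pixelformat);
            break;
        case V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE:
        case V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE:
            /* multi-planar video: struct v4l2_pix_format_mplane */
            printf("pix_mp: %ux%u planes=%u\n", f->fmt.pix_mp.width,
                   f->fmt.pix_mp.height, (unsigned)f->fmt.pix_mp.num_planes);
            break;
        case V4L2_BUF_TYPE_META_CAPTURE:
        case V4L2_BUF_TYPE_META_OUTPUT:
            /* metadata: struct v4l2_meta_format */
            printf("meta: fourcc=0x%08x size=%u\n",
                   f->fmt.meta.dataformat, f->fmt.meta.buffersize);
            break;
        }
    }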
[14:19] <tfiga> in general I wonder if you actually want to expose V4L2 as is to the higher level code
[14:19] <tfiga> it's a low level API to be used by video/camera middleware
[14:20] <tfiga> for example, ffmpeg or gstreamer would use it and expose their own high level APIs
[14:21] <tfiga> today there is also a lot of effort being put into PipeWire
[14:21] <postmodern> i ended up implementing Stream in a generic way. i specify the BUF_TYPE and it just passes it down to the C API, then maps the returned structs to the appropriate classes that provide access to the correct sub-structs
[14:21] <postmodern> i'll take a look at pipewire
[14:22] <tfiga> but of course it all depends on what you're trying to achieve :)
[14:22] <postmodern> my ultimate goal is to get raw access to the YUYV 422 frames and the pixels, to do very low-latency visual FX to live video
[14:23] <postmodern> i was trying to use ffmpeg video filters for this, but ffmpeg introduced latency, especially when i used multiple cameras. You would quickly see video de-sync between the two cameras.
[14:23] <postmodern> also was a good challenge to help me fully learn/get experience with the Crystal language
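[Editor's note: for reference, YUYV 4:2:2 packs two horizontally adjacent pixels into four bytes (Y0 U Y1 V, with U and V shared by the pair), so per-pixel luma access is cheap. A minimal sketch of an in-place effect on a mapped frame; invert_luma_yuyv is a hypothetical helper and stride would be the driver's bytesperline.]

    #include <stddef.h>
    #include <stdint.h>

    /* YUYV layout is [Y0 U Y1 V] per pixel pair, so Y bytes sit at
     * even offsets and a row of w pixels occupies 2*w bytes. */
    static void invert_luma_yuyv(uint8_t *buf, size_t stride,
                                 unsigned w, unsigned h)
    {
        for (unsigned y = 0; y < h; y++) {
            uint8_t *row = buf + y * stride;
            for (unsigned x = 0; x < 2 * w; x += 2)
                row[x] = 255 - row[x]; /* invert luma, leave chroma */
        }
    }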
[14:23] <tfiga> perhaps gstreamer could give you a better experience
[14:24] <tfiga> ndufresne would be the person most knowledgeable about it
[14:25] <tfiga> the problem with using V4L2 directly is that it's expected to evolve over time and if your library doesn't evolve together with it, you lose various improvements and keep exercising legacy code paths
[14:25] <tfiga> of course the legacy code paths would stay there, because of the Linux UAPI compatibility promise
[14:29] <ndufresne> postmodern: I see two options. GStreamer is one; it will give you maximum flexibility compared to ffmpeg, and will handle the resync of all this even if your effects have different or variable latency
[14:30] <postmodern> ndufresne, does gstreamer provide a pure C API that i can bind to, or do i also have to write bindings for GObject/glib?
[14:31] <ndufresne> GStreamer (just like glib) is pure C
[14:31] <postmodern> ndufresne, and can you sync two video streams together? i suspect ffmpeg was dropping frames from one device due to latency
[14:31] <ndufresne> of course bindings exist for other languages: rust, python, c#, etc.
[14:32] <postmodern> ndufresne, well i mean do i have to interact with GObject, or can i treat it like a void *
[14:32] <tfiga> FWIW, there is a tutorial
[14:32] <tfiga> https://gstreamer.freedesktop.org/documentation/tutorials/basic/hello-world.html?gi-language=c
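[Editor's note: in the spirit of that tutorial, a minimal sketch of showing a live camera through GStreamer's plain C API; the pipeline string and device path are assumptions, and effect elements would slot in between v4l2src and the sink. Build with: gcc demo.c $(pkg-config --cflags --libs gstreamer-1.0).]

    #include <gst/gst.h>

    int main(int argc, char *argv[])
    {
        gst_init(&argc, &argv);

        GstElement *pipeline = gst_parse_launch(
            "v4l2src device=/dev/video0 ! videoconvert ! autovideosink",
            NULL);
        gst_element_set_state(pipeline, GST_STATE_PLAYING);

        /* block until an error or end-of-stream reaches the bus */
        GstBus *bus = gst_element_get_bus(pipeline);
        GstMessage *msg = gst_bus_timed_pop_filtered(
            bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

        if (msg)
            gst_message_unref(msg);
        gst_object_unref(bus);
        gst_element_set_state(pipeline, GST_STATE_NULL);
        gst_object_unref(pipeline);
        return 0;
    }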
[14:32] <ndufresne> yes, gstreamer has a model made to support latency being different on various branches
[14:33] <ndufresne> postmodern: now that's option 1. option 2 is much more VG style, ultra low latency: you'd be writing your effect in vulkan. it's called PipeWire
[14:33] <ndufresne> it has support for v4l2 capture already, and you can hook various filters onto a real-time thread; some Vulkan filters already exist
[14:34] <ndufresne> you can capture from pipewire (zero-copy, dmabuf or memfd depending on the context) and render in one or more applications
[14:35] <ndufresne> as this is an ultra-low-latency solution with a step operation model, sync is not required
[14:35] <ndufresne> it's naturally in sync
[14:35] <ndufresne> it is a tad more early-stage tech of course, but for a VG-style app, that's gonna perform better
[14:36] <ndufresne> it's basically the future JackD of video
[14:36] <postmodern> well there is a crystal-gobject library, but nothing for gstreamer. Also it doesn't look like OBS supports GStreamer? In the past I used v4l2loopback to pipe ffmpeg back into OBS for streaming purposes.
[14:36] <ndufresne> (well, pipewire also implements the JackD and pulseaudio APIs)
[14:36] <postmodern> curious whether PipeWire will handle MIDI data as well?
[14:36] <ndufresne> indeed, OBS uses ffmpeg or native code; you won't find any helpers
[14:37] <ndufresne> but OBS does have some pipewire support, since they recently started experimental support for screen casting on Wayland (GNOME or KDE; sway is coming)
[14:38] <ndufresne> pipewire is used as a stream bridge between the compositor process and your authorized stream-cast software
[14:39] <ndufresne> OBS is plugin-based; in theory it should be possible to use gstreamer or another multimedia stack in there, it just hasn't happened so far
[14:39] <ndufresne> also, OBS is multi-platform software, so if you only need to bundle ffmpeg, it's much simpler than gstreamer, as ffmpeg has pretty few external deps
[14:40] <ndufresne> pipewire is linux only
[14:41] <postmodern> there does appear to be an obs-gstreamer plugin, never messed with plugins though
[14:42] <postmodern> might look into gstreamer or pipewire if v4l2 doesn't work out. so far i've implemented most of the v4l2 bindings, and can at least pull JPEG frames.
[14:42] <postmodern> (via both mmap and userptr buffers)
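[Editor's note: for readers following along, the mmap path is roughly REQBUFS, then QUERYBUF + mmap + QBUF per buffer, STREAMON, then a DQBUF/QBUF loop. A condensed sketch with all error handling omitted; don't copy as-is.]

    #include <sys/ioctl.h>
    #include <sys/mman.h>
    #include <linux/videodev2.h>

    void capture_one(int fd)
    {
        struct v4l2_requestbuffers req = {
            .count = 4,
            .type = V4L2_BUF_TYPE_VIDEO_CAPTURE,
            .memory = V4L2_MEMORY_MMAP,
        };
        ioctl(fd, VIDIOC_REQBUFS, &req);
        if (req.count > 4)
            req.count = 4; /* the driver may adjust the count */

        void *maps[4];
        for (unsigned i = 0; i < req.count; i++) {
            struct v4l2_buffer buf = {
                .index = i,
                .type = V4L2_BUF_TYPE_VIDEO_CAPTURE,
                .memory = V4L2_MEMORY_MMAP,
            };
            ioctl(fd, VIDIOC_QUERYBUF, &buf);
            maps[i] = mmap(NULL, buf.length, PROT_READ | PROT_WRITE,
                           MAP_SHARED, fd, buf.m.offset);
            ioctl(fd, VIDIOC_QBUF, &buf); /* hand the buffer to the driver */
        }

        enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        ioctl(fd, VIDIOC_STREAMON, &type);

        struct v4l2_buffer buf = {
            .type = V4L2_BUF_TYPE_VIDEO_CAPTURE,
            .memory = V4L2_MEMORY_MMAP,
        };
        ioctl(fd, VIDIOC_DQBUF, &buf); /* blocks until a frame is ready */
        /* maps[buf.index] now holds buf.bytesused bytes of image data */
        ioctl(fd, VIDIOC_QBUF, &buf);  /* recycle the buffer */
    }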
[14:52] <ndufresne> let us know if you are out of ideas ;-P
[14:53] <postmodern> btw what license should v4l2 bindings be under? noticed some people use MIT, GPL 2, GPL 3
[15:31] <tfiga> postmodern: I think it's up to you, this is your code :)
[15:32] <tfiga> https://www.kernel.org/doc/html/latest/process/license-rules.html
[16:11] <postmodern> https://github.com/postmodern/v4l2.cr btw, here's my code. might have gone overboard mapping in the full API...
[16:19] <shibboleth> there's been a lot of activity here lately, anyone ever come across and solved interlacing issues on bttv?
[16:20] <shibboleth> i found the demo videos from the LML (lmlbt44) site; somehow they are 640x360 with hardly any interlacing
[16:20] <shibboleth> if i go above 384x288 then all bets are off
[16:20] <shibboleth> interlacing/combing/artifacts
[20:49] <ndufresne> shibboleth: deinterlacing is likely not done by the HW, so if the kernel is involved at all, you have to check that the interlacing is signalled properly; after that it's up to userspace to deinterlace
[20:49] <shibboleth> go on, re: checking?
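[Editor's note: "checking that it is signalled" presumably means querying the format and looking at the field value. A PAL field is only 288 lines, so captures at or below 288 lines high come from a single field and show no combing; anything taller interleaves two fields and must be deinterlaced in userspace if the driver reports an interlaced field order. A minimal sketch; the device path is a placeholder.]

    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/videodev2.h>

    int main(void)
    {
        int fd = open("/dev/video0", O_RDWR); /* placeholder bttv node */
        struct v4l2_format fmt = { .type = V4L2_BUF_TYPE_VIDEO_CAPTURE };

        /* fmt.fmt.pix.field tells you how the frames are signalled,
         * e.g. V4L2_FIELD_INTERLACED means userspace must deinterlace */
        if (fd >= 0 && ioctl(fd, VIDIOC_G_FMT, &fmt) == 0)
            printf("%ux%u field=%u (V4L2_FIELD_INTERLACED=%u)\n",
                   fmt.fmt.pix.width, fmt.fmt.pix.height,
                   fmt.fmt.pix.field, (unsigned)V4L2_FIELD_INTERLACED);
        if (fd >= 0)
            close(fd);
        return 0;
    }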