[00:51] *** Bladelouse_ has quit IRC (*.net *.split)
[00:51] *** ribalda has quit IRC (*.net *.split)
[00:58] <sarnex> hi all, i am having an issue using v4l2 after upgrading to 4.16. i am trying to use v4l2loopback which creates a fake video device, and then i am trying to use x11grab with ffmpeg to send to it. in 4.15, it works fine. in 4.16, the VIDIOC_G_FMT IOCTL is returning invalid argument in ffmpeg. were there any changes here in 4.16?
[01:10] <sarnex> i dont know if ffmpeg is now wrong or v4l2loopback
[06:48] <mripard> sailus: pong
[07:18] <sailus> mripard: Bonjour! :-)
[07:18] <sailus> I was looking at the CSI-2 TX driver again and I have trouble understanding the context a bit.
[07:19] <sailus> What kind of device is it; where would you use that?
[07:19] <sailus> Some of the information is obtained from DT, and I was wondering whether what's there now is complete.
[07:20] <mripard> sailus: so far, this is an FPGA-only IP, meant to be synthesized in SoCs
[07:21] <mripard> that's also why the support is partial, some features (like the PHY support) are not available in the FPGAs or simulators yet
[07:22] <sailus> Ok. So you'd have DMA and PHY separate from this one?
[07:23] <sailus> I presume the CSI-2 transmitter would be directed out of the SoC, not the other way around, right?
[07:32] <mripard> sailus: it will never perform DMA, it's just a bridge between an RGB-like interface (with additional signals) and a CSI-2 bus
[07:32] <mripard> and yes, I guess the most common use-case would be to have the RGB interface internal to the SoCs, and the CSI-2 link coming out of it
[07:41] <sailus> We don't have generic PHY API for camera PHYs yet.
[07:41] <sailus> It might be nice to have one, one day...
[07:45] <mripard> the PHY should be inside the SoC as well, but yeah that would be nice
[07:45] <sailus> I wonder what's the use case for CSI-2 TX in an SoC.
[07:46] <sailus> And where should the endpoint parsing take place.
[07:46] <sailus> I suppose that information would be required in both the transmitter and the PHY configuration.
[07:46] <mripard> that's also why I left it out of the TX for now
[07:47] <mripard> since the way it would interact with the rest of the system in an actual SoC is not really clear to me at the moment
[07:48] <mripard> the setup I have in simulation is basically an IP to capture the frames <- CSI2-RX <- CSI2-TX <- Pattern generator
[07:55] <pinchartl> mripard: apart from IP core testing in an FPGA, do you see any use case for *not* having the PHY in the SoC ?
[07:57] <sailus> I'm wondering how to meaningfully support such devices.
[07:57] <sailus> We have only had capture pipelines in the past, but in this case you'd have the capture portion taking place somewhere else.
[07:58] <sailus> Would that somewhere else be part of the same MC pipeline or not?
[07:58] <sailus> I guess how to manage such a pipeline would depend on what kind of systems this hardware block would be a part of.
[08:05] <mripard> pinchartl: I heard about three cases: without PHY (but afaik it's only going to be in our simulation / FPGA), with the PHY part of the CSI2-TX controller, or with a D-PHY provided by the rest of the SoC
[08:05] <mripard> pinchartl: so I'd say no
[08:16] <mripard> sailus: and yeah, I couldn't really make up my mind (or gather more information on this), so I just assumed this was a regular subdev for now, and we will add the relevant parts once we have a clear use case
[08:17] <mripard> we'll have to do more anyway, even just to support the multiplexed links
[08:17] <mripard> and for example add a notifier to retrieve the upstream's link and associated VC and DT
[09:59] <mchehab> hverkuil: got the error with https://www.mail-archive.com/linux-media@vger.kernel.org/msg128701.html
[10:06] <mchehab> fix is: https://patchwork.linuxtv.org/patch/48408/
[10:06] <mchehab> it is actually a bug that has been there forever... the bitmap array there is too short!
[10:07] <mchehab> VFL_TYPE_foo starts with 0, and VFL_TYPE_MAX is equal to the last one
[10:07] <mchehab> so, the array should have VFL_TYPE_MAX + 1 elements
[10:08] <mchehab> to be able to store up to VFL_TYPE_MAX.
[10:08] <mchehab> basically, it doesn't have space for VFL_TYPE_TOUCH
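A simplified illustration of the off-by-one described above (the enum mirrors the kernel's naming pattern, but this is not the actual kernel code):

```c
/* When an enum's *_MAX member is set equal to the last valid value
 * rather than to a count, an array declared with [VFL_TYPE_MAX]
 * elements is one slot short: valid indices run from 0 up to and
 * including VFL_TYPE_MAX. Names are illustrative. */
enum vfl_type {
	VFL_TYPE_GRABBER = 0,
	VFL_TYPE_VBI,
	VFL_TYPE_RADIO,
	VFL_TYPE_SUBDEV,
	VFL_TYPE_SDR,
	VFL_TYPE_TOUCH,
	VFL_TYPE_MAX = VFL_TYPE_TOUCH,	/* equals the last value, not a count */
};

/* Buggy:  static unsigned long used[VFL_TYPE_MAX];   -- no slot for TOUCH */
/* Fixed: */
static unsigned long used[VFL_TYPE_MAX + 1];
```

With the buggy declaration, indexing `used[VFL_TYPE_TOUCH]` reads past the end of the array, which is exactly the missing-space problem described above.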
[10:08] <hverkuil> Actually, you introduced this bug in commit 4839c58f034ae41e2dfdd097240a69622cab4c73 :-)
[10:09] <mchehab> oh
[10:09] <mchehab> yeah, you're right
[10:09] <hverkuil> See my reply I just posted, it has some suggestions for improving this.
[10:10] <mchehab> that's actually good news, as it doesn't need to go all the way down to the -stable Kernels
[10:14] <mchehab> version 2 sent
[10:15] <mchehab> I'll add a:
[10:15] <mchehab> Fixes: 4839c58f034a ("media: v4l2-dev: convert VFL_TYPE_* into an enum")
[10:15] <mchehab> when applying it upstream
[10:15] <mchehab> (should have added it before sending the patch)
[10:15] <mchehab> $ git describe 4839c58f034a
[10:15] <mchehab> media/v4.15-2-222-g4839c58f034a
[10:16] <mchehab> ok, only Kernel 4.16 was affected
[10:19] <hverkuil> OK, that's not too bad.
[10:20] <mchehab> yep
[10:20] <mchehab> I was afraid that this had to go to all -stable Kernels
[10:21] <mchehab> with regards to pr_err(), I prefer to do such changes in a separate patch...
[10:21] <mchehab> there are several printk(KERN_ERR calls there already
[10:21] <mchehab> better to solve them all at once
[10:24] <hverkuil> I've added a topic to https://www.linuxtv.org/wiki/index.php/Media_Open_Source_Projects:_Looking_for_Volunteers to add v4l-touch emulation to vivid.
[10:34] <mchehab> yeah, that would be great
[10:35] <mchehab> printk patch sent
[10:35] <mchehab> that doesn't need to be backported
[10:40] <mchehab> I'll add this patch to the patchset to be sent upstream next week
[10:40] <mchehab> (the one with the fixes, not the one with the printk changes :-) )
[10:52] <hverkuil> ack
[11:01] <sailus> hverkuil: I just posted an RFC on the mediatext test program.
[11:02] <sailus> The last patch has two vim2m tests (the subject says vivid, I'll fix that).
[11:03] <sailus> I'd think this would be very practical for testing different kinds of use cases and functionality, not so much for torturing the API itself.
[12:48] <lucaceresoli> hi, I'm adding S_SELECTION support to an existing driver, but I'm confused
[12:48] <lucaceresoli> the driver can currently output:
[12:48] <lucaceresoli> - 2160p, native resolution
[12:48] <lucaceresoli> - 1080p, with 2x2 downsampling
[12:49] <lucaceresoli> - 720p with some downsampling, but I don't care about this one
[12:50] <lucaceresoli> the sensor can also send a sub-window (aka cropping), but it cannot rescale except for the couple of subsampling ratios mentioned above
[12:50] <lucaceresoli> thus I'd like to add this mode:
[12:51] <lucaceresoli> - 1080p output, native resolution, with cropping (pretty much like a 2x "digital zoom")
[12:51] <lucaceresoli> question: which APIs will my user be supposed to use?
[12:53] <lucaceresoli> apparently I should set 1080p both via set_fmt and set_selection; how do I tell it that it should crop (and not downsample) from 2160p?
[15:04] <sailus> hverkuil: The mediatext vim2m tests pass with your v10, both with and without requests.
[15:23] *** benjiG has left 
[20:40] <tp4> I'm kinda confused, is it a driver or library, can you use webcam without v4l2?
[20:40] *** tp4 has left "Leaving"
[22:38] <tp4> How do you convert yuyv to rgb
[22:40] <tp4> I think there is a convertToRGB function
[22:44] <tp4> You don't need to call a convert function, you can just request your fav format, and v4l2 will do the conversion for you seamlessly, as though your camera in fact supports the format.
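For reference, the YUYV-to-RGB conversion itself is mechanical: four bytes (Y0 U Y1 V) encode two pixels sharing one chroma pair, and each pixel converts with the BT.601 equations. A minimal fixed-point sketch (the function names are illustrative, not from any library):

```c
#include <stdint.h>

/* Clamp an intermediate value into the 0..255 byte range. */
static uint8_t clamp_u8(int v)
{
	return v < 0 ? 0 : v > 255 ? 255 : (uint8_t)v;
}

/* Convert one YUYV macropixel (Y0 U Y1 V -> two RGB pixels) using the
 * BT.601 equations, fixed-point with 8 fractional bits:
 *   R = Y + 1.402 (V-128),  G = Y - 0.344 (U-128) - 0.714 (V-128),
 *   B = Y + 1.772 (U-128). */
static void yuyv_to_rgb24(const uint8_t yuyv[4], uint8_t rgb[6])
{
	int u = yuyv[1] - 128, v = yuyv[3] - 128;

	for (int i = 0; i < 2; i++) {
		int y = yuyv[i * 2];

		rgb[i * 3 + 0] = clamp_u8(y + ((359 * v) >> 8));          /* R */
		rgb[i * 3 + 1] = clamp_u8(y - ((88 * u + 183 * v) >> 8)); /* G */
		rgb[i * 3 + 2] = clamp_u8(y + ((454 * u) >> 8));          /* B */
	}
}
```

Calling this across a frame, two pixels at a time, is all a convert helper does; a grey input (Y=128, U=V=128) maps to grey RGB (128, 128, 128).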
[22:56] <tp4> My camera, when I do v4l2-ctl --all, I see it supports yuyv, but when I change the format in the grabber.c example from rgb24 to yuyv, it doesn't work
[23:43] <tp4> I think v4l2 is too technical for me. I need a library built on top which is more user friendly, or a text book that teaches you how to use it in layman's terms.