[00:52] *** tfiga has quit IRC (Read error: Connection reset by peer)
[05:58] <jmondi> sailus: it really seems we should arrange a meeting time and date :)
[06:12] <LazyGrizzly> pinchartl: Hi, I wanted to ask when my patches will be merged or if there is anything blocking them.
[06:12] <LazyGrizzly> The patches in question are https://patchwork.linuxtv.org/patch/53833/ and https://patchwork.linuxtv.org/patch/53832/
[06:51] <sailus> jmondi: Do you think so?
[06:51] <sailus> No ad-hoc?
[06:59] <jmondi> sailus: well, we failed several times, but apparently we're both here now :)
[07:01] <sailus> jmondi: Yes.
[07:01] <sailus> Does now work for you?
[07:02] <jmondi> sailus: yes it does!
[07:02] <sailus> Great!
[07:02] <jmondi> please go ahead, I have read your emails, but I'm still a bit puzzled
[07:02] <sailus> I was going to ask you whether you had opens left... X-)
[07:02] <sailus> Let me read your last e-mail again.
[07:03] <jmondi> sure
[07:04] <sailus> To answer your question --- the data types aren't visible in uAPI; streams are, and streams have a regular V4L2 subdev format.
[07:05] <sailus> The user can enumerate the streams a transmitter sends, and then can configure the receiver to receive what's desired.
[07:05] <sailus> Does this get even close to answering your question? :)
[07:05] <jmondi> sailus: so I think I'm a bit mixing apples and pears here...
[07:06] <sailus> How?
[07:06] <jmondi> I see the issue of configuring the receiver based on the transmitter as a separate issue
[07:06] <jmondi> the series uses the remote frame desc as exchange mechanisms
[07:06] <jmondi> and this could be discussed as well
[07:06] <sailus> If this is clear there's no need to.
[07:07] <sailus> So could you explain what did you see as a problem?
[07:07] <jmondi> but the problem of enabling/disabling a specific DT in the transmitter video data stream should not be addressed by using stream_id to map to a DT imho
[07:07] <jmondi> how would you enable embedded data on a CSI-2 transmitter?
[07:08] <jmondi> (embedded data as an example)
[07:08] <jmondi> btw, I spoke with neg and pinchartl  when we met ~1 week ago about this, and I know they are interested, so, ping neg: pinchartl:
[07:08] <sailus> The smiapp driver uses a separate stream for it, as it's on a separate data type.
[07:09] <sailus> We decided to postpone the patches when Niklas began working on the set.
[07:09] <jmondi> so each DT maps to a stream_id, and to enable/disable the transmission of a DT we enable/disable one route directed to the CSI-2 source pad of the transmitter
[07:10] <sailus> jmondi: Each VC/DT pair.
[07:10] <jmondi> sailus: yes indeed
[07:10] <jmondi> and this is where the (64 * 32) * (nr src_pad) number comes from
[07:11] <jmondi> enabling/disabling transmission of a DT is something the user should be able to control, right?
[07:11] <sailus> If the hardware supports that, yes.
[07:11] <jmondi> (32 because, as you rightfully said, CSI-2 on C-PHY has up to 32 VC iirc)
[07:12] <jmondi> how would a user know which stream_id maps to which [VC-DT] combination?
[07:12] <sailus> Yes; we don't have support for C-PHY in drivers yet but the V4L2 fwnode framework has it.
[07:12] <sailus> jmondi: Why would the user need to know DT or VC?
[07:13] <sailus> We don't have other low level bus specific parameters in the API either.
[07:13] <jmondi> to enable/disable the transmission of a DT ?
[07:13] <sailus> The user can enable or disable a stream.
[07:14] <sailus> So the user needs to be able to identify the streams based on their V4L2 subdev format, not VC and DT.
[07:14] <jmondi> sailus: bear with me: subdev_g_fmt is not stream aware, is it?
[07:15] <sailus> Not as such, but each pad that supports format operations has only one stream.
[07:15] <sailus> So the user knows the relation between the formats and streams.
[07:17] <jmondi> this implies modelling, say, embedded data, with a dedicated sink pad?
[07:17] <sailus> On the receiver side, yes.
[07:18] <sailus> But the receiver is unlikely to know it's embedded data.
[07:19] <jmondi> what about the transmitter side? how does an application find out which route to enable to control embedded data transmission?
[07:38] <sailus> jmondi: The format on the respective pad is an embedded data format.
[07:38] <sailus> I don't think we have any defined right now.
[07:39] <sailus> The problem is that also the receiver needs to support such a format, so the format library is a very nice thing to have. I think Maxime is working on one but it's memory formats only AFAIR.
[07:39] <jmondi> sailus: so there will be one source pad for embedded data and one source pad for video data on the transmitter
[07:39] <jmondi> I don't think so, otherwise retrieving the remote frame_desc won't work
[07:42] <sailus> You'll still need a separate mux sub-device. The CSI-2 transmitter only has a single source pad.
[07:44] <jmondi> ah! there will then be a CSI-2 mux entity between the transmitter and the receiver?
[07:49] <sailus> The transmitter is the mux in this case.
[07:51] <jmondi> ok, this does not change the fact that the sensor entity is separate from the csi-2 transmitter one
[07:55] <sailus> Correct.
[07:55] <jmondi> I see
[07:55] <jmondi> I've missed a lot of this part, hence my confusion
[07:55] <sailus> So a sensor has three sub-devices as a result.
[07:55] <sailus> \o/
[07:56] <sailus> It's often hard to remotely pin down a difference in understanding of a non-trivial system. :-P
[07:56] <sailus> Great!
[07:56] <jmondi> while this models what actually happens in most sensors (they output data on a parallel bus and it gets muxed into CSI-2 internally), don't you think requiring CSI-2 sensors to be described with 2 separate entities is too much?
[07:57] <sailus> The smiapp driver would have five, as it already had three to begin with.
[07:57] <sailus> Three entities, in fact. Or, I guess you could do with two in principle.
[07:57] <sailus> If you use two source pads on the other one.
[07:58] <jmondi> three including the receiver? I count the data source, the csi-2 mux and the csi-2 receiver
[07:58] <jmondi> for sure there will be some DMA engine on the receiver side, which -might- be modeled differently
[08:00] <sailus> Just the sensor.
[08:00] <sailus> The pixel array, embedded data source and mux.
[08:01] <sailus> You could combine the pixel array and the embedded data source, and use two source pads.
[08:01] <sailus> I think it'd be cleaner to separate them though.
[08:01] <sailus> At least that's how it's implemented in the smiapp driver.
[08:20] <jmondi> sailus: I see... Before you presented this picture I was actually about to make a counter-proposal that would model each supported VC as one available stream on the source pad, and let the user control DT enablement/disablement with a control
[08:20] <jmondi> is controlling DT with a V4L2 control frowned upon in your opinion?
[08:22] <jmondi> in this way you would model a CSI-2 capable transmitter with a single entity, use routes to control which VC carries what, and control ancillary data enablement (such as embedded data) with a control
[10:23] <pinchartl> LazyGrizzly: thanks for pinging me, I'll have a look
[12:18] *** LazyGrizzly has left 
[14:44] <KitsuWhooa> Hey, some time ago I asked about submitting some patches to the mailing list regarding qv4l2. I haven't had the time until now, but since I'm new to this, I read that I should CC someone while also sending the patch to the mailing list.
[14:44] <KitsuWhooa> Looking at existing emails in the mailing list I couldn't get a definitive answer on who/if I should CC
[14:46] <KitsuWhooa> Apologies if this seems obvious, but it would be appreciated if I could get an answer on if I should CC someone, and if so, who.
[15:12] <hverkuil> KitsuWhooa: just post the patches to the linux-media mailinglist. I'll pick them up.
[15:13] <hverkuil> No need to CC anyone.
[15:18] <KitsuWhooa> Alright. Thank you
[15:18] <KitsuWhooa> I'll get to it in a bit
[15:19] *** benjiG has left 
[16:43] <pinchartl> hverkuil: any opinion about factoring out the format conversion code from libv4l to a standalone library that doesn't require a V4L2 device node to operate ?
[18:27] <hverkuil> pinchartl: fine by me!
[18:55] <pinchartl> hverkuil: thanks. we need format conversion as part of libcamera, so I think I'll spin that off at some point