#v4l 2019-05-06,Mon


***tfiga has quit IRC (Read error: Connection reset by peer) [00:52]
.............................................................. (idle for 5h6mn)
<jmondi> sailus: it really seems we should arrange a meeting time and date :) [05:58]
<LazyGrizzly> pinchartl: Hi, I wanted to ask when my patches will be merged or if there is anything blocking them.
The patches in question are https://patchwork.linuxtv.org/patch/53833/ and https://patchwork.linuxtv.org/patch/53832/
[06:12]
........ (idle for 39mn)
<sailus> jmondi: Do you think so?
No ad-hoc?
[06:51]
<jmondi> sailus: well, we failed several times, but apparently we're both here now :) [06:59]
<sailus> jmondi: Yes.
Does now work for you?
[07:01]
<jmondi> sailus: yes it does! [07:02]
<sailus> Great! [07:02]
<jmondi> please go ahead, I have read your emails, but I'm still a bit puzzled [07:02]
<sailus> I was going to ask you whether you had open questions left... X-)
Let me read your last e-mail again.
[07:02]
<jmondi> sure [07:03]
<sailus> To answer your question --- the data types aren't visible in the uAPI; streams are, and streams have a regular V4L2 subdev format.
The user can enumerate the streams a transmitter sends, and then configure the receiver to receive what's desired.
Does this get even close to answering your question? :)
[07:04]
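[Editorial note: to make the enumerate-then-configure flow above concrete, here is a toy sketch in C. All the structures, codes and the helper name are invented for illustration; they are not the real uAPI, which was still under discussion at the time of this log.]

```c
#include <stdint.h>

/* Toy model of the flow sailus describes: the transmitter exposes a
 * set of streams, each carrying a regular V4L2 subdev format; the
 * user enumerates them and picks the one matching the format it
 * wants the receiver to accept.  Nothing here is real uAPI. */

struct stream_format {
	uint32_t stream_id;
	uint32_t mbus_code;	/* stand-in for a MEDIA_BUS_FMT_* value */
	uint32_t width, height;
};

/* Streams a hypothetical transmitter advertises: image + embedded data. */
static const struct stream_format tx_streams[] = {
	{ .stream_id = 0, .mbus_code = 0x3001, .width = 4096, .height = 3072 },
	{ .stream_id = 1, .mbus_code = 0x9001, .width = 4096, .height = 2 },
};

/* Return the stream_id whose format matches what the user wants,
 * or -1 if the transmitter sends nothing of that format. */
static int configure_receiver(uint32_t wanted_code)
{
	for (unsigned int i = 0; i < sizeof(tx_streams) / sizeof(tx_streams[0]); i++)
		if (tx_streams[i].mbus_code == wanted_code)
			return (int)tx_streams[i].stream_id;
	return -1;
}
```

The point of the sketch is that the user selects by format, never by VC or DT, which is exactly the argument sailus makes later in the conversation.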
<jmondi> sailus: so I think I'm mixing apples and pears a bit here... [07:05]
<sailus> How? [07:06]
<jmondi> I see the issue of configuring the receiver based on the transmitter as a separate issue
the series uses the remote frame descriptor as the exchange mechanism
and this could be discussed as well
[07:06]
<sailus> If this is clear there's no need to.
So could you explain what you saw as a problem?
[07:06]
<jmondi> but the problem of enabling/disabling a specific DT in the transmitter video data stream should not be addressed by using a stream_id to map to a DT imho
how would you enable embedded data on a CSI-2 transmitter?
(embedded data as an example)
btw, I spoke with neg and pinchartl when we met ~1 week ago about this, and I know they are interested, so, ping neg: pinchartl:
[07:07]
<sailus> The smiapp driver uses a separate stream for it, as it's on a separate data type.
We decided to postpone the patches when Niklas began working on the set.
[07:08]
<jmondi> so each DT maps to a stream_id, and to enable/disable the transmission of a DT we enable/disable one route directed to the CSI-2 source pad of the transmitter [07:09]
<sailus> jmondi: Each VC/DT pair. [07:10]
<jmondi> sailus: yes indeed
and this is where the (64 * 32) * (nr of src pads) number comes from
enabling/disabling transmission of a DT is something the user should be able to control, right?
[07:10]
<sailus> If the hardware supports that, yes. [07:11]
<jmondi> (32 because, as you rightfully said, CSI-2 on C-PHY has up to 32 VCs iirc)
how would a user know which stream_id maps to which [VC-DT] combination?
[07:11]
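[Editorial note: the (64 * 32) * (nr of src pads) figure mentioned above comes from the CSI-2 DT field being 6 bits wide (64 values) and C-PHY allowing up to 32 virtual channels. A minimal C illustration of that arithmetic, not any kernel limit:]

```c
/* The stream-count arithmetic from the discussion: 64 possible data
 * types (6-bit DT field) times 32 virtual channels (C-PHY) gives the
 * number of distinct VC/DT pairs, i.e. potential streams, that each
 * source pad could in principle carry. */

#define CSI2_NUM_DATA_TYPES	64	/* 6-bit DT field */
#define CSI2_MAX_VC_CPHY	32	/* VCs on C-PHY */

static unsigned int max_vc_dt_pairs(unsigned int nr_source_pads)
{
	return CSI2_NUM_DATA_TYPES * CSI2_MAX_VC_CPHY * nr_source_pads;
}
```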
<sailus> Yes; we don't have support for C-PHY in drivers yet, but the V4L2 fwnode framework has it.
jmondi: Why would the user need to know the DT or VC?
We don't have other low-level bus-specific parameters in the API either.
[07:12]
<jmondi> to enable/disable the transmission of a DT ? [07:13]
<sailus> The user can enable or disable a stream.
So the user needs to be able to identify the streams based on their V4L2 subdev format, not VC and DT.
[07:13]
<jmondi> sailus: bear with me: subdev g_fmt is not stream-aware, is it? [07:14]
<sailus> Not as such, but each pad that supports format operations has only one stream.
So the user knows the relation between the formats and streams.
[07:15]
<jmondi> this implies modelling, say, embedded data, with a dedicated sink pad? [07:17]
<sailus> On the receiver side, yes.
But the receiver is unlikely to know it's embedded data.
[07:17]
<jmondi> what about the transmitter side? how does an application find out which route to enable to control embedded data transmission? [07:19]
.... (idle for 19mn)
<sailus> jmondi: The format on the respective pad is an embedded data format.
I don't think we have any defined right now.
The problem is that the receiver also needs to support such a format, so the format library is a very nice thing to have. I think Maxime is working on one, but it's memory formats only AFAIR.
[07:38]
<jmondi> sailus: so there will be one source pad for embedded data and one source pad for video data on the transmitter
I don't think so, otherwise retrieving the remote frame_desc won't work
[07:39]
<sailus> You'll still need a separate mux sub-device. The CSI-2 transmitter only has a single source pad. [07:42]
<jmondi> ah! there will then be a CSI-2 mux entity between the transmitter and the receiver? [07:44]
<sailus> The transmitter is the mux in this case. [07:49]
<jmondi> ok, this does not change the fact that the sensor entity is separate from the csi-2 transmitter one [07:51]
<sailus> Correct. [07:55]
<jmondi> I see
I've missed a lot of this part, hence my confusion
[07:55]
<sailus> So a sensor has three sub-devices as a result.
\o/
It's often hard to remotely pin down a difference in understanding of a non-trivial system. :-P
Great!
[07:55]
<jmondi> while this models what actually happens in most sensors (they output data on a parallel bus and it gets muxed into CSI-2 internally), don't you think it's a lot to require describing a CSI-2 sensor with 2 separate entities? [07:56]
<sailus> The smiapp driver would have five, as it already had three to begin with.
Three entities, in fact. Or, I guess you could do with two in principle.
If you use two source pads on the other one.
[07:57]
<jmondi> three including the receiver? I count the data source, the csi-2 mux and the csi-2 receiver
for sure there will be some DMA engine on the receiver side, which -might- be modeled differently
[07:58]
<sailus> Just the sensor.
The pixel array, embedded data source and mux.
You could combine the pixel array and the embedded data source, and use two source pads.
I think it'd be cleaner to separate them though.
At least that's how it's implemented in the smiapp driver.
[08:00]
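[Editorial note: the three-sub-device topology sailus describes (pixel array, embedded data source, CSI-2 mux) can be sketched as a toy link table in C. Entity and pad names are invented for illustration; this is not the media controller API, just a picture of the graph.]

```c
#include <string.h>

/* Sketch of the smiapp-style sensor topology: the pixel array and the
 * embedded data source each feed a sink pad of the CSI-2 mux, and the
 * mux exposes a single source pad towards the receiver. */

struct toy_link {
	const char *source;	/* "entity:pad" */
	const char *sink;
};

static const struct toy_link sensor_links[] = {
	{ "pixel-array:0",   "csi2-mux:0" },
	{ "embedded-data:0", "csi2-mux:1" },
	{ "csi2-mux:2",      "csi2-receiver:0" },
};

/* Count how many links feed into a given entity's sink pads. */
static unsigned int count_sinks(const char *entity)
{
	unsigned int n = 0;

	for (unsigned int i = 0; i < sizeof(sensor_links) / sizeof(sensor_links[0]); i++)
		if (!strncmp(sensor_links[i].sink, entity, strlen(entity)))
			n++;
	return n;
}
```

In this model the mux has two sink pads (one per data source) but only one source pad, which is why the single CSI-2 link can still carry both streams.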
.... (idle for 19mn)
<jmondi> sailus: I see... Before you presented this picture to me I was actually about to make a counter-proposal that would model each supported VC as one available stream on the source pad, and let the user control DT enablement/disablement with a control
is controlling DTs with a V4L2 control frowned upon in your opinion?
in this way you would model a CSI-2 capable transmitter with a single entity, use routes to control which VC carries what, and control the enablement of ancillary data (such as embedded data) with a control
[08:20]
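[Editorial note: jmondi's counter-proposal can be made concrete with a toy C model, where per-VC DT enablement is a 64-bit mask (one bit per 6-bit DT) that a hypothetical control would toggle. This is an illustration of the idea only; no such V4L2 control exists.]

```c
#include <stdint.h>

/* Toy version of the counter-proposal: one stream per virtual channel,
 * with data-type transmission toggled through a control modelled here
 * as a 64-bit enable mask per VC. */

static uint64_t vc_dt_enable[32];	/* one mask per VC (C-PHY max) */

static void dt_enable(unsigned int vc, unsigned int dt, int on)
{
	if (on)
		vc_dt_enable[vc] |= 1ULL << dt;
	else
		vc_dt_enable[vc] &= ~(1ULL << dt);
}

static int dt_is_enabled(unsigned int vc, unsigned int dt)
{
	return !!(vc_dt_enable[vc] & (1ULL << dt));
}
```

The trade-off the conversation circles around: this keeps the transmitter a single entity, but it reintroduces VC/DT as user-visible concepts, which the per-pad-format model deliberately avoids.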
......................... (idle for 2h1mn)
<pinchartl> LazyGrizzly: thanks for pinging me, I'll have a look [10:23]
........................ (idle for 1h55mn)
***LazyGrizzly has left [12:18]
.............................. (idle for 2h26mn)
<KitsuWhooa> Hey, some time ago I asked about submitting some patches to the mailing list regarding qv4l2. I haven't had the time until now, but since I'm new to this, I read that I should CC someone while also sending the patch to the mailing list.
Looking at existing emails in the mailing list I couldn't get a definitive answer on whom, or whether, I should CC
Apologies if this seems obvious, but it would be appreciated if I could get an answer on whether I should CC someone, and if so, who.
[14:44]
...... (idle for 26mn)
<hverkuil> KitsuWhooa: just post the patches to the linux-media mailing list. I'll pick them up.
No need to CC anyone.
[15:12]
<KitsuWhooa> Alright. Thank you
I'll get to it in a bit
[15:18]
***benjiG has left [15:19]
................. (idle for 1h24mn)
<pinchartl> hverkuil: any opinion about factoring out the format conversion code from libv4l to a standalone library that doesn't require a V4L2 device node to operate? [16:43]
..................... (idle for 1h44mn)
<hverkuil> pinchartl: fine by me! [18:27]
...... (idle for 28mn)
<pinchartl> hverkuil: thanks. we need format conversion as part of libcamera, so I think I'll spin that off at some point [18:55]
