Who | What | When |
---|---|---|
*** | mchehab has quit IRC (Ping timeout: 246 seconds) | [05:32] |
(idle for 5h8mn) | |
Whoopie has quit IRC (Ping timeout: 272 seconds) | [10:40] | |
(idle for 3h3mn) | |
pinchartl | hverkuil: ping | [13:43] |
hverkuil | pinchartl: pong | [13:49] |
pinchartl | do you have a few minutes for a controls-related question ? | [13:49] |
hverkuil | sure | [13:50] |
pinchartl | I'm implementing support for a rotation control
the hardware supports flipping, mirroring and 90° rotation. First of all, how to map that to controls isn't very clear. I need to expose the HFLIP and VFLIP controls, but HFLIP + VFLIP + rotate(90) = rotate(270) and HFLIP + VFLIP + rotate(0) = rotate(180). I'm thinking about restricting the rotation control to 0 or 90 only; 180 and 270 would be achieved using the rotation and [hv]flip controls. Does that seem fine to you? | [13:51] |
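A tiny illustrative sketch of the decomposition described above (plain C, hypothetical helper names, not code from any patch): any multiple of 90° is expressed as a ROTATE value restricted to 0/90 plus the two flips.

```c
#include <stdbool.h>

/* Values to apply to V4L2_CID_ROTATE, V4L2_CID_HFLIP and V4L2_CID_VFLIP. */
struct rotation_setting {
	int rotate;	/* 0 or 90 only, per the proposed restriction */
	bool hflip;
	bool vflip;
};

/*
 * Decompose an angle of 0, 90, 180 or 270 degrees, using
 * rotate(180) = rotate(0) + hflip + vflip and
 * rotate(270) = rotate(90) + hflip + vflip.
 */
static struct rotation_setting decompose(int angle)
{
	struct rotation_setting s = {
		.rotate = angle % 180,	/* 0, 90, 180, 270 -> 0, 90, 0, 90 */
		.hflip = angle >= 180,
		.vflip = angle >= 180,
	};
	return s;
}
```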
hverkuil | You know that there is a V4L2_CID_ROTATE control, right? | [13:53] |
pinchartl | yes
that's the control I'm thinking about restricting to 0 and 90 | [13:54] |
hverkuil | Ah, OK. You said 'a rotation control' instead of 'the rotation control'. So that made me wonder. | [13:54] |
pinchartl | maybe I should have started with the second question actually
rotation can't be changed during streaming. From a hardware point of view it could, but the result would be messed up. Would you return an error when attempting to change the rotation control during streaming, or accept the change but only apply it when the stream is stopped? | [13:55] |
hverkuil | Personally I think that it is confusing to limit the rotation control to 90 max. I believe the control should do exactly what it says, and the h/vflip would be in addition to that.
An app that wants to rotate just wants to put in the angle, not have to remember to flip as well if it wants to do more than 90 degrees. | [13:57] |
pinchartl | you can change hflip and vflip while the stream is active
but you can't change the rotation. Or rather, you can't change it between 0/180 and 90/270, but you can change 0 <-> 180 or 90 <-> 270, as that's flipping. It's an annoying control. If I restrict it to 0 and 90 it gets easier: you can't change rotation at all during streaming. | [13:58] |
hverkuil | Interesting. Normally you would block changes to a control when streaming, but here it depends on the value. | [14:00] |
pinchartl | it simplifies the userspace API | [14:00] |
hverkuil | I would just return EBUSY for invalid rotations. But the description of the control could be extended to explicitly mention the possibility that when streaming it can be possible on some hardware to switch between 0-180 and 90-270 on the fly.
This is something that is likely running in device specific code so it knows the limitations. I'm not sure you would even want to 'skip' the 90 degree rotates. As an end-user I would expect it to adjust anyway. Basically don't see a good use-case at the moment where it would be acceptable to do only flipping when streaming. | [14:07] |
pinchartl | I'm not sure if there's a use case, I agree
or rather, I don't know what the use case would be, but it has been requested, maybe just to tick a box in a feature sheet. Though you're right that the driver will run with device-specific code (we're talking about the Renesas VSP). That's why I don't think that restricting rotation to 0 or 90 would be an issue: userspace will flip or mirror as needed. | [14:12] |
hverkuil | I would support all four angles, and (when streaming) return EBUSY for invalid angles. The device code can then decide what to do (just use 0-90 and flipping, or know that you can't use any angle when streaming).
Supporting all four angles seems much easier to me for applications, since you only need to touch one control instead of three. | [14:14] |
pinchartl | that means I would need to put all three controls in a cluster then
and handle the 0/90 case separately. I'll give it a go, thanks. | [14:20] |
hverkuil | It's a bit more complex in the driver, but I think it is a lot easier that way in userspace. | [14:23] |
pinchartl | hverkuil: could you also look at my replies to "[PATCH/RFC v2 1/4] v4l: Add metadata buffer type and format" from Wednesday ?
I'd like to try and finalize that | [14:32] |
(idle for 29mn) | |
*** | benjiG has left | [15:01] |
(idle for 25mn) | |
hverkuil | pinchartl: I am a bit confused. You say that the metadata buffersize as reported for a video node doesn't depend on the width of the corresponding frame.
This is true if the metadata is copied from the framebuffer (which has the metadata in the first N lines). But what if you want to DMA the first N lines to the metadata buffer and the actual image data to the image buffer? Then it would be related to the image width. Or is this an unrealistic scenario? It would require a fancy DMA engine, so this may not happen in practice. | [15:26] |
pinchartl | the metadata size doesn't directly depend on the image size
the metadata size depends on the metadata format, and it could be that in some cases the metadata format is linked to the image size, as when metadata is transmitted embedded in an image. But as you mentioned, that would require a fancy DMA engine, yes. In practice what we see is metadata transmitted in lines before the image, using a different CSI-2 data type, and it happens that the sensor designer outputs metadata lines that have the same number of samples as the following image lines, for internal reasons. | [15:34] |
hverkuil | and today the metadata is just memcpy'd from those lines into the metadata buffers, so the buffersize is equal to the actual metadata without any padding, right? | [15:40] |
pinchartl | today we have no support for metadata upstream
and my use case is a histogram generator | [15:41] |
hverkuil | s/today/with this proposal/ | [15:42] |
pinchartl | my hardware doesn't embed data in the image stream
you can't really memcpy, because if you do it means your image will have a few lines of metadata at the beginning of the buffer, and it will very likely mess up sharing the buffer with other devices. If the metadata is contained in the image stream and can't be demultiplexed by hardware, then it will be provided to the application inside the image buffer, without a metadata video node. | [15:42] |
hverkuil | Ah, of course.
This proposal is for multiplexed streams. | [15:44] |
pinchartl | my patch series is for metadata in general
with one use case, which doesn't transmit metadata over a bus. The VSP histogram generator stores data in registers, which my driver memcpy's to the metadata buffer. The OMAP3 ISP histogram generator works the same way, but the driver uses a generic system mem-to-mem DMA engine to perform the memcpy. | [15:47] |
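As a rough illustration of how userspace would consume such a metadata node, here is a sketch assuming the RFC's buffer type and format look like what eventually landed in mainline (V4L2_BUF_TYPE_META_CAPTURE and struct v4l2_meta_format with a dataformat fourcc and a buffersize). The device path is a placeholder and all error handling is omitted.

```c
#include <fcntl.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <linux/videodev2.h>

int main(void)
{
	/* Placeholder path for the metadata video node. */
	int fd = open("/dev/video0", O_RDWR);

	/* The driver reports the metadata format and the buffer size it needs. */
	struct v4l2_format fmt = { .type = V4L2_BUF_TYPE_META_CAPTURE };
	ioctl(fd, VIDIOC_G_FMT, &fmt);

	struct v4l2_requestbuffers reqbufs = {
		.type = V4L2_BUF_TYPE_META_CAPTURE,
		.memory = V4L2_MEMORY_MMAP,
		.count = 1,
	};
	ioctl(fd, VIDIOC_REQBUFS, &reqbufs);

	struct v4l2_buffer buf = {
		.type = V4L2_BUF_TYPE_META_CAPTURE,
		.memory = V4L2_MEMORY_MMAP,
		.index = 0,
	};
	ioctl(fd, VIDIOC_QUERYBUF, &buf);
	void *mem = mmap(NULL, buf.length, PROT_READ, MAP_SHARED, fd,
			 buf.m.offset);

	ioctl(fd, VIDIOC_QBUF, &buf);

	int type = V4L2_BUF_TYPE_META_CAPTURE;
	ioctl(fd, VIDIOC_STREAMON, &type);

	/* Each dequeued buffer holds one histogram of fmt.fmt.meta.buffersize bytes. */
	ioctl(fd, VIDIOC_DQBUF, &buf);
	(void)mem;

	return 0;
}
```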
hverkuil | I think you need acks from sailus and Guennadi for this patch. My concern is that this handles one specific case, and I would like to see that they agree that this would also work for the hardware that they know of. | [15:52] |
pinchartl | of course
but I'd like to make sure that I also address your concerns | [15:53] |
hverkuil | My concern is that I don't know enough :-) So if Sakari and Guennadi agree with the patch as well, then I am satisfied. | [15:59] |
pinchartl | ok :-)
I'll ask both of them for an ack. Guennadi has sent a patch series for the uvcvideo driver that adds support for metadata using a metadata video node, so I assume he's at least partly satisfied already :-) In the uvcvideo case, metadata is transmitted over USB in packet headers. We're talking about a few bytes per frame; it seems really overkill to use video buffers :-/ | [16:01] |
hverkuil | I'm not sure it makes sense to use this for small amounts of data. A read-only control(s) might be more appropriate. | [16:04] |
pinchartl | it's per-frame data
that needs to be synchronized with frames | [16:05] |
hverkuil | How is this synchronization done anyway? How does an application know which meta data belongs to which frame? (Or if that relationship exists at all!)
Now that I think about it, I can't remember reading anything about that. Other than using timestamps, I guess. | [16:06] |
pinchartl | timestamps or sequence numbers | [16:11] |
hverkuil | today sequence numbers start at 0 when you start streaming. You'll need to change the spec to explicitly mention that metadata/video are synced.
I think a V4L2_BUF_FLAG_SEQNR_SYNCED flag would be useful. If both metadata and video set this, then the same seqnrs are used for both (just brainstorming). Timestamps are a poor choice, I think. It's hard to relate them, esp. since metadata can come at any time during a frame capture (or even before/after). uvc might be a good testbed for this. | [16:12] |
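A minimal userspace sketch of the sequence-number pairing being discussed, assuming the driver fills v4l2_buffer.sequence identically for the video and metadata queues (the guarantee hverkuil suggests spelling out in the spec, not something promised today). The metadata buffer type is the one assumed above; error handling and buffer requeuing are omitted.

```c
#include <sys/ioctl.h>
#include <linux/videodev2.h>

/* vid_fd and meta_fd are already-streaming capture file descriptors. */
static void match_frame_and_metadata(int vid_fd, int meta_fd)
{
	struct v4l2_buffer vid = {
		.type = V4L2_BUF_TYPE_VIDEO_CAPTURE,
		.memory = V4L2_MEMORY_MMAP,
	};
	struct v4l2_buffer meta = {
		.type = V4L2_BUF_TYPE_META_CAPTURE,
		.memory = V4L2_MEMORY_MMAP,
	};

	ioctl(vid_fd, VIDIOC_DQBUF, &vid);
	ioctl(meta_fd, VIDIOC_DQBUF, &meta);

	/* Drop whichever stream is behind until the sequence numbers line up. */
	while (meta.sequence < vid.sequence)
		ioctl(meta_fd, VIDIOC_DQBUF, &meta);
	while (vid.sequence < meta.sequence)
		ioctl(vid_fd, VIDIOC_DQBUF, &vid);

	/* vid and meta now describe the same frame. */
}
```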
pinchartl | you can have multiple video streams and metadata streams in a device, I'm not sure a single flag would be sufficient | [16:16] |
hverkuil | I'm signing off for the day. | [16:22] |
pinchartl | have a nice evening | [16:23] |
hverkuil | you too!
sorry for not just Acking this, I know how you feel :-) Been there... Hmm, how about adding support for metadata to vivid? That can also be used to add tests to v4l2-compliance and v4l2-ctl. And vivid is ideal for prototyping, so this would be nice to have. | [16:23] |
pinchartl | no worries, as long as there's progress, I can be patient :-)
I think the biggest issue here is that we want to make sure our API works with a wide variety of use cases. If I implement metadata support in vivid it will be for a single use case. I'd prefer focusing on gathering use cases from Guennadi and Sakari first, and then, after getting their approval, I can give vivid a go. | [16:29] |
(idle for 56mn) | |
hverkuil | well, you can implement any (or even multiple) use-case in vivid that you want. And it's great to use it to test applications without needing actual (and often difficult to find) hardware.
But collecting use-cases is a good starting point, I agree. | [17:26] |
*** | harrow has quit IRC (Ping timeout: 260 seconds) | [17:36] |
rjkm has quit IRC (Remote host closed the connection) | [17:43] | |
(idle for 2h15mn) | |
_franck__ has quit IRC (Ping timeout: 258 seconds) | [19:58] |