Who | What | When |
---|---|---|
hverkuil | ezequielg: what is the status of this rkvdec-vp9 patch series? https://patchwork.linuxtv.org/cover/60721/ | [10:19] |
..... (idle for 24mn) | ||
ezequielg | hverkuil: literally next on the todo list | [10:43] |
hverkuil | Can I mark it as 'Obsoleted' or 'Superseded'? | [10:43] |
ezequielg | Obsolete is fine, yes. | [10:44] |
hverkuil | done, thanks! | [10:46] |
ezequielg | hverkuil: how about https://patchwork.linuxtv.org/patch/61274/ ? | [10:50] |
hverkuil | ezequielg: processing that one as we speak. | [10:51] |
ezequielg | hverkuil: this was a nice cleanup https://patchwork.linuxtv.org/patch/50338/ but it seems it got stuck because we couldn't convert some old drivers.
i guess we can obsolete it, as well. | [10:53] |
*** | benjiG has left | [10:55] |
hverkuil | I'll obsolete that series, but it might be nice to revisit it at some point since it is a good idea. | [10:55] |
ezequielg | i will keep that in mind. | [11:04] |
....... (idle for 33mn) | ||
pinchartl | hverkuil: ping
what would you think of a version of v4l2_device_register_subdev_nodes() that would create read-only subdev nodes? Those would only expose the query/enum/get ioctls. The use case is to expose the actual fine-grained configuration of a pipeline for video-node-based drivers.
What I need to expose in particular is the cropping, binning, skipping and scaling configuration of "mode-based" camera sensors. Mode-based in this case means drivers that don't allow configuration of the sensor parameters directly, but expose a list of hardcoded modes with blobs of register addresses and values. There's a need for userspace to know, for each of those modes, how much cropping and scaling has been applied, so I'd like the get selection ioctl to be used by userspace for that | [11:37] |
......... (idle for 41mn) | ||
sailus | pinchartl: How would you report to the user what do these modes consist of (i.e. enumerate them) and how would you select them?
The uAPI we have to configure these is lower level than what drivers actually implement... (I need to leave in a few minutes.) | [12:23] |
pinchartl | sailus: those drivers have a fixed set of modes, and you can pick one using S_FMT
(or to be precise the subdev .set_fmt() pad operation) | [12:24] |
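The mode-selection behavior pinchartl describes can be sketched with a toy model; this is plain Python, not actual V4L2 code, and the mode table and names are invented for illustration. A mode-based driver's .set_fmt() essentially snaps the requested size to the closest hardcoded mode:

```python
# Toy model of a "mode-based" sensor driver's set_fmt: the requested
# size is snapped to one of a fixed set of hardcoded modes, each of
# which bundles an output size with an opaque register list.
# All names and values here are illustrative, not from a real driver.

MODES = [
    # (width, height, register blob placeholder)
    (640, 480, "regs_640x480"),
    (1280, 720, "regs_1280x720"),
    (1920, 1080, "regs_1920x1080"),
]

def set_fmt(width, height):
    """Pick the mode closest to the requested size, the way a typical
    mode-based driver's .set_fmt() pad operation does."""
    return min(MODES, key=lambda m: abs(m[0] - width) + abs(m[1] - height))
```

For example, requesting 1900x1000 selects the 1920x1080 mode; the register blob is then programmed wholesale, which is why the driver itself is the only place that knows what cropping or binning the mode implies.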
sailus | What if the output size is the same on multiple modes?
You can get the same by cropping and binning, for instance. | [12:24] |
pinchartl | that's not supported today
I'm talking about existing drivers. The vast majority of our sensor drivers have fixed modes | [12:25] |
sailus | Ah. | [12:25] |
pinchartl | and you select them through the output format | [12:25] |
sailus | Yeah, they're just silly. | [12:25] |
pinchartl | that's *not* the model I recommend :-)
but for those existing drivers | [12:25] |
sailus | Good question. | [12:25] |
pinchartl | I need a way to report the crop, binning, skipping and scaling parameters | [12:25] |
sailus | Hope for the best while fearing the worst? | [12:26] |
pinchartl | :-D
do you think a read-only userspace API for subdevs makes sense for this? Ideally the drivers should implement the full API, but we can't ask for all existing drivers to be converted, especially when no datasheet is available (publicly, or available at all sometimes :-)) | [12:26] |
sailus | I think ideally, in V4L2 terms, the sensor should be able to convey all configuration possibilities individually to the user space.
Back later on... | [12:27] |
................. (idle for 1h20mn) | ||
hverkuil | mtretter: ping | [13:48] |
mtretter | hverkuil: pong | [13:49] |
hverkuil | that's quick :-)
I'm looking at your 18-part allegro series. How does patch 11/18 relate to https://patchwork.linuxtv.org/patch/60902/ ? I gather from 11/18 that s_parm both sets the encoding speed and writes that same value into the coded stream, right? I.e. basically how every other encoder does this today in v4l2. | [13:49] |
mtretter | Yes, that's how every other encoder does this today. In the end I want to do this via the FRAME_RATE control, but as I have to implement the legacy behavior anyway, I implemented it using s_parm for now. | [13:58] |
hverkuil | OK, no problem. | [13:59] |
pinchartl | hverkuil: ping, second try :-) | [14:09] |
hverkuil | pinchartl: pong | [14:19] |
pinchartl | hverkuil: I wanted to discuss read-only subdev devnodes with you
can you read the short discussion I had with Sakari above ? | [14:21] |
hverkuil | yeah, I read it. I don't like it, but I can't think of a better alternative either. | [14:22] |
pinchartl | :-)
same for me. But I think it also solves the enumeration issue that Sakari mentioned, or at least part of it | [14:22] |
hverkuil | This is for a simple video pipeline, right? sensor -> bridge -> possibly some processing -> dma | [14:23] |
pinchartl | as we can set a TRY format (that would still be allowed on a read-only subdev)
and then get the TRY selection rectangles | [14:23] |
sailus | If requests were extended to cover formats and selections, you could pass back the rest of the information back to the user --- and even that would requires splitting the subdevs into something similar smiapp driver uses today. | [14:23] |
pinchartl | so userspace can enumerate the modes with all their parameters
the only thing that we wouldn't support is multiple modes having the same output resolution, but for that, I think we really need to configure the subdev from userspace fully, with a proper driver, not hardcoded modes | [14:23] |
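The enumeration flow pinchartl describes (set a TRY format, then read back the TRY selection rectangles) could be modeled as follows. This is a plain-Python sketch, not the real VIDIOC_SUBDEV_* ioctls, and the per-mode crop/binning numbers are invented:

```python
# Sketch of enumerating a mode-based sensor's modes through TRY state:
# for each advertised frame size, set a TRY format (which never touches
# the hardware) and read back the TRY crop rectangle and binning factor.
# The mode data below is invented for illustration.

MODES = {
    # (width, height): per-mode parameters the driver knows internally
    (3280, 2464): {"crop": (0, 0, 3280, 2464), "binning": 1},
    (1640, 1232): {"crop": (0, 0, 3280, 2464), "binning": 2},
    (1920, 1080): {"crop": (680, 692, 1920, 1080), "binning": 1},
}

class TrySubdev:
    """Per-file-handle TRY state, as on a V4L2 subdev node."""
    def __init__(self):
        self.try_fmt = None

    def set_fmt_try(self, size):
        if size not in MODES:
            raise ValueError("no such mode")
        self.try_fmt = size

    def get_selection_try(self):
        return MODES[self.try_fmt]["crop"]

    def get_binning_try(self):
        return MODES[self.try_fmt]["binning"]

def enumerate_modes():
    """What userspace could do on a read-only subdev node: walk the
    advertised sizes and collect each mode's crop and binning."""
    sd = TrySubdev()
    out = {}
    for size in MODES:
        sd.set_fmt_try(size)
        out[size] = (sd.get_selection_try(), sd.get_binning_try())
    return out
```

The key point is that TRY state is per file handle and side-effect free, so a read-only node that still accepts TRY set-format calls is enough for full mode enumeration.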
sailus | Alternatively, selection API could be extended to convey the order of different steps back to the user. | [14:24] |
pinchartl | hverkuil: correct, sensor -> CSI-2 receiver -> DMA | [14:24] |
sailus | Neither is trivial to implement in the framework nor drivers. | [14:24] |
pinchartl | it's for video-devnode-centric drivers
where everything is configured by a S_FMT on the video node (and again, it's really about existing drivers, not new ones) | [14:25] |
hverkuil | Doing this also requires such a driver to create a media device since userspace has to be able to find the subdev. | [14:25] |
pinchartl | correct | [14:25] |
hverkuil | I really don't like that. | [14:25] |
pinchartl | the video-devnode-centric driver would need to be extended to do so
but it's fairly trivial; uvcvideo does it already with ~120 lines of code | [14:26] |
sailus | pinchartl: I thought you meant MC-enabled drivers. | [14:26] |
pinchartl | there's no configuration of the links or subdevs from userspace | [14:26] |
hverkuil | Unless you create media devices for ALL devnode-centric drivers. (something that was always intended) | [14:26] |
pinchartl | yes, creating a media device for all drivers is best I think
but it can only be done automatically for the ones based on subdevs internally, and it would still require the subdevs to initialize their media entity, with pads, so it can't be done fully automatically to expose read-only subdevs | [14:26] |
hverkuil | It's a lot of work... | [14:28] |
pinchartl | sailus: for drivers that use MC and the V4L2 subdev userspace API, we can already access the subdev nodes from userspace, so all is fine
hverkuil: that's why I think for now it should be enabled manually per-driver, perhaps with some core helper, because otherwise we'll have to switch everything in one go and convert all subdev drivers; that's a bit too much :-) | [14:28] |
sailus | pinchartl: That's an interesting idea. | [14:32] |
pinchartl | I'll propose patches | [14:32] |
sailus | How do you determine what kind of subdevs to create, and which selections will be available and where?
I think drivers should have direct control of that, for this is somewhat device specific. | [14:32] |
hverkuil | I'm not sure this helps anyway: do we have a way to report binning/skipping at all today? | [14:33] |
sailus | Very probably there are common patterns though. | [14:33] |
pinchartl | I don't plan to create subdevs automatically :-)
we already have subdevs | [14:33] |
sailus | hverkuil: Smiapp does support configuring binning but not skipping. | [14:33] |
pinchartl | hverkuil: yes, but not fully documented in the API | [14:33] |
sailus | It'd require one or two more subdevs to implement everything with the current uAPI. | [14:33] |
pinchartl | it's configured through formats and selection rectangles on pads | [14:34] |
sailus | (Not counting embedded data.) | [14:34] |
hverkuil | I truly do not like the idea of read-only subdevs.
Since this is meant for simple pipelines, I'd much rather see some solution with G_SELECTION. | [14:35] |
pinchartl | but we can't do that
G_SELECTION isn't powerful enough. We have multiple pads in the pipeline, and they're all handled through a single video node. G_SELECTION doesn't take a pad parameter (and it shouldn't). There's cropping, binning, skipping and scaling in the sensor, possibly with different points where cropping can be applied, and userspace needs to be able to retrieve all that information. I don't think you can abstract all those consecutive operations, which can happen in an order that depends on the sensor, through a single G_SELECTION call. It's not just about adding a selection target; it's really a pipeline with multiple stages | [14:37] |
hverkuil | Just two stages: the source and the DMA engine: those are the only places where you can crop/scale. Anything more complex should use the MC API. For such simple pipelines you can create selection targets that report the sensor crop and sensor scale. (for binning/skipping you might need controls) | [14:41] |
sailus | hverkuil: Sensor drivers implement that implicitly, without telling about it, and the user space cannot figure it out.
I believe it is a problem in certain use cases, where drivers still have not been converted to MC-centric. | [14:41] |
hverkuil | 'without telling about it': do you mean the sensor driver doesn't know? Or we just do not have a way to expose it? | [14:43] |
sailus | In reality, if sensor processing would be within consideration there, pretty much every bridge driver should be MC enabled.
hverkuil: The sensor drivers mostly work on register lists, but they usually have descriptive names for those lists. But they also don't expose that, because they cannot. (Or, could not, before the sub-device uAPI.) Unless I'm mistaken, pinchartl's primary concern is those drivers. No other driver than the smiapp does this btw. | [14:43] |
pinchartl | sailus: correct
it's those drivers based on register lists not the drivers that accept arbitrary cropping and scaling configurations. those are very good, but we have too few of them :-) | [14:44] |
hverkuil | OK, I'm getting confused :-) | [14:46] |
pinchartl | should I try to explain it again ? | [14:46] |
hverkuil | So we are talking old crappy sensors drivers that have a fixed number of modes, right?
And we want to be able to let userspace know what these modes do (cropping/scaling/binning/skipping)? | [14:46] |
sailus | hverkuil: More or less every sensor driver apart from smiapp. | [14:47] |
pinchartl | we're talking about sensor drivers, in drivers/media/i2c/, that are based on register lists, with a fixed number of "modes". a mode in this context is a full sensor configuration that includes cropping (possibly in multiple stages), binning, skipping and/or scaling, to produce an output resolution
it's not about "old crappy sensor drivers" though we have a majorite of this kind of drivers | [14:47] |
sailus | They're still crappy, just not old. | [14:47] |
pinchartl | and we regularly merge new ones
so it's not just legacy. They're far from perfect, and ideally they should implement the subdev API fully and compute register values instead of hardcoding modes. We're pushing for that, but we're still getting a majority of mode-based sensors due to a combination of factors: lack of datasheets, lack of support from the vendor, laziness sometimes, ... | [14:47] |
hverkuil | OK. And I assume we DO know what each mode does, even though we may not know what the registers mean. So it may be a hardcoded list of meta information about the modes in these sensor drivers. | [14:48] |
pinchartl | fixing this is another issue
yes, let's assume the driver knows how much cropping, binning, ... is applied for each mode | [14:49] |
sailus | hverkuil: More or less so. | [14:49] |
pinchartl | we want to report that to userspace
(some drivers may not know; they may just have binary data without any knowledge. That's out of scope, we can't help them.) So the question is: how do we expose that information to userspace? Given the following: we have an internal API in the kernel to get and set those parameters, through formats and selection rectangles on subdev pads, and we can expose that API to userspace through V4L2 subdev nodes, but in this use case it's not exposed (due to the main driver being videonode-centric). How those parameters are applied to the sensor is sensor-specific: some sensors could apply crop before scaling, some could do it the other way around, some could do both. This is reflected, in the smiapp driver for instance, by multiple subdevs, to show the separate processing stages | [14:49] |
hverkuil | All the information you need is read-only and constant, right? I.e. the sensor has N modes, and for each mode you want to expose the mode properties. Based on that information userspace can pick a mode, which I think is done via VIDIOC_S_FMT.
Or is it frame size + frame rate that selects the mode in the sensor? | [15:00] |
pinchartl | in the cases I've seen, it's VIDIOC_S_FMT, so the information is fixed for a given format set by S_FMT
but I can't rule out that it could depend on the frame rate as I haven't analyzed all sensor drivers, just the few I'm working with right now | [15:03] |
hverkuil | I am wondering whether it isn't easier to create a read-only control array (or arrays) containing the mode information. It's useful regardless of whether the sensor is used in a MC or devnode-centric driver.
CID_SENSOR_MODES or whatever you call it. If I understand this correctly, that is really all you are interested in. | [15:05] |
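hverkuil's control-array idea could look roughly like this. The name CID_SENSOR_MODES, the field list, and the record layout are all hypothetical; no such control exists:

```python
# Hypothetical layout for a read-only CID_SENSOR_MODES control array:
# one fixed-width integer record per mode. The control, field order and
# values are invented for illustration.

FIELDS = ("width", "height", "crop_left", "crop_top",
          "crop_width", "crop_height", "binning")

def pack_modes(modes):
    """Flatten per-mode dicts into the flat integer array such a
    control would carry."""
    flat = []
    for m in modes:
        flat.extend(m[f] for f in FIELDS)
    return flat

def unpack_modes(flat):
    """What userspace would do after reading the control back."""
    n = len(FIELDS)
    return [dict(zip(FIELDS, flat[i:i + n])) for i in range(0, len(flat), n)]
```

The fixed record format is also where pinchartl's objection bites: a flat schema has to assume one processing topology for every sensor, which the subdev pad model does not.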
pinchartl | I think that would be a bad API
it would create more differences between MC and non-MC. What about smiapp, for instance? When it's used with a videonode-centric driver it will be configured by VIDIOC_S_FMT; still, userspace will need to know about cropping and all the other parameters. The smiapp driver shouldn't report modes, as it doesn't have modes. The videonode driver could create the CID_SENSOR_MODES control itself, sure, but that's more complicated for the videonode driver, and it would need to be done by all of them. We have a subdev API within the kernel that can report the information, and we have a way to expose that API to userspace | [15:07] |
hverkuil | One option is to just call VIDIOC_S_FMT, then read out the current mode information, i.e. what the sensor driver configured. | [15:10] |
pinchartl | why should we create a control that exposes information retrieved from that API inside the kernel ?
yes, but how do you read out the current mode information from userspace ? that's the question :-) | [15:10] |
hverkuil | It is a shitload of work for userspace to implement this with a media device and subdevs. It defeats the purpose of a devnode-centric interface. | [15:11] |
pinchartl | it's not that much work, and it will only be used by specialized userspace
furthermore, there's no definition of a sensor mode that would fit all sensors. As I said earlier, how a sensor applies the different processing parameters is sensor-specific | [15:11] |
hverkuil | I need to sleep on this :-) And I probably want to see an RFC with a bit more detail. Ping me tomorrow and we can discuss this further. | [15:23] |
pinchartl | hverkuil: thanks. I'll write the RFC, with a sample implementation | [15:23] |
hverkuil | That would be much appreciated. | [15:23] |
sailus | pinchartl: I'd like to see an implementation. | [15:24] |
pinchartl | I would also like to add that I think exposing read-only subdevs will make the videonode-centric and MC-centric cases closer to each other
the same API would be used by userspace in both cases | [15:24] |
sailus | This would not be perfect, but probably about as good as is possible, considering the existing interfaces. | [15:24] |
pinchartl | supporting both mode enumeration with the TRY API on subdevs, and retrieval of the current mode
while adding controls to support this would push the videonode and MC APIs further apart from each other. One last point is that, if we expose the sensor information on sensor subdev nodes, it's easier for userspace to handle, as the sensors can more easily be handled in isolation from the rest of the pipeline | [15:24] |
hverkuil | Also, I'm not clear on how userspace can know the binning/skipping state of the sensor, if you can add an example of how that's done, then that would be useful. If nothing else, I will have learned something new.
In any case, I have to leave. ttyl. | [15:28] |
pinchartl | thank you for your help
have a nice evening | [15:29] |
sailus | Kiitos samoin! | [15:37] |