#v4l 2020-02-25,Tue

<hverkuil> ezequielg: what is the status of this rkvdec-vp9 patch series? https://patchwork.linuxtv.org/cover/60721/ [10:19]
..... (idle for 24mn)
<ezequielg> hverkuil: literally next on the todo list [10:43]
<hverkuil> Can I mark it as 'Obsoleted' or 'Superseded'? [10:43]
<ezequielg> Obsolete is fine, yes. [10:44]
<hverkuil> done, thanks! [10:46]
<ezequielg> hverkuil: how about https://patchwork.linuxtv.org/patch/61274/ ? [10:50]
<hverkuil> ezequielg: processing that one as we speak. [10:51]
<ezequielg> hverkuil: this was a nice cleanup https://patchwork.linuxtv.org/patch/50338/ but it seems it got stuck because we couldn't convert some old drivers.
i guess we can obsolete it, as well.
[10:53]
*** benjiG has left [10:55]
<hverkuil> I'll obsolete that series, but it might be nice to revisit it at some point since it is a good idea. [10:55]
<ezequielg> i will keep that in mind. [11:04]
....... (idle for 33mn)
<pinchartl> hverkuil: ping
what would you think of a version of v4l2_device_register_subdev_nodes() that would create read-only subdev nodes? those would only expose the query/enum/get ioctls. the use case is to expose the actual fine-grained configuration of a pipeline for video-node-based drivers
what I need to expose in particular is the cropping, binning, skipping and scaling configuration of "mode-based" camera sensors
mode-based in this case means drivers that don't allow configuration of the sensor parameters directly, but expose a list of hardcoded modes with blobs of register addresses and values
there's a need for userspace to know, for each of those modes, how much cropping and scaling has been applied
so I'd like the get selection ioctl to be used by userspace for that
[11:37]
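[Editor's note: a mode-based sensor driver as described above could keep per-mode metadata roughly like this. This is a purely illustrative sketch; the struct, field names, and mode values are invented, and real drivers lay this out differently alongside their register blobs.]

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical per-mode metadata for a register-list based sensor driver. */
struct sensor_mode {
	unsigned int width, height;	/* output resolution */
	unsigned int crop_left, crop_top;
	unsigned int crop_width, crop_height;
	unsigned int hbin, vbin;	/* binning factors, 1 = none */
	unsigned int hskip, vskip;	/* skipping factors, 1 = none */
	/* const struct reg_sequence *regs;  register blob elided */
};

static const struct sensor_mode modes[] = {
	/* full-frame readout */
	{ 3280, 2464,    0,   0, 3280, 2464, 1, 1, 1, 1 },
	/* 2x2 binned full frame */
	{ 1640, 1232,    0,   0, 3280, 2464, 2, 2, 1, 1 },
	/* cropped + 2x2 binned low-resolution mode */
	{  640,  480, 1000, 752, 1280,  960, 2, 2, 1, 1 },
};

/* S_FMT-style lookup: pick the mode whose output size matches. */
static const struct sensor_mode *find_mode(unsigned int w, unsigned int h)
{
	for (size_t i = 0; i < sizeof(modes) / sizeof(modes[0]); i++)
		if (modes[i].width == w && modes[i].height == h)
			return &modes[i];
	return NULL;
}
```

The discussion is about exposing exactly this crop/binning/skipping information, which the driver already knows internally, to userspace.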
......... (idle for 41mn)
<sailus> pinchartl: How would you report to the user what these modes consist of (i.e. enumerate them), and how would you select them?
The uAPI we have to configure these is lower level than what drivers actually implement...
(I need to leave in a few minutes.)
[12:23]
<pinchartl> sailus: those drivers have a fixed set of modes, and you can pick one using S_FMT
(or to be precise the subdev .set_fmt() pad operation)
[12:24]
<sailus> What if the output size is the same in multiple modes?
You can get the same by cropping and binning, for instance.
[12:24]
<pinchartl> that's not supported today
I'm talking about existing drivers
the vast majority of our sensor drivers have fixed modes
[12:25]
<sailus> Ah. [12:25]
<pinchartl> and you select them through the output format [12:25]
<sailus> Yeah, they're just silly. [12:25]
<pinchartl> that's *not* the model I recommend :-)
but for those existing drivers
[12:25]
<sailus> Good question. [12:25]
<pinchartl> I need a way to report the crop, binning, skipping and scaling parameters [12:25]
<sailus> Hope for the best while fearing the worst? [12:26]
<pinchartl> :-D
do you think a read-only userspace API for subdevs makes sense for this?
ideally the drivers should implement the full API
but we can't ask for all existing drivers to be converted, especially when no datasheet is available
(publicly, or available at all sometimes :-))
[12:26]
<sailus> I think ideally, in V4L2 terms, the sensor should be able to convey all configuration possibilities individually to the user space.
Back later on...
[12:27]
................. (idle for 1h20mn)
<hverkuil> mtretter: ping [13:48]
<mtretter> hverkuil: pong [13:49]
<hverkuil> that's quick :-)
I'm looking at your 18-part allegro series.
How does patch 11/18 relate to https://patchwork.linuxtv.org/patch/60902/ ?
I gather from 11/18 that s_parm both sets the encoding speed and writes that same value into the coded stream, right?
I.e. basically how every other encoder does this today in v4l2.
[13:49]
<mtretter> Yes, that's how every other encoder does this today. In the end I want to do this via the FRAME_RATE control, but as I have to implement the legacy behavior anyway, I implemented it using s_parm for now. [13:58]
<hverkuil> OK, no problem. [13:59]
<pinchartl> hverkuil: ping, second try :-) [14:09]
<hverkuil> pinchartl: pong [14:19]
<pinchartl> hverkuil: I wanted to discuss read-only subdev devnodes with you
can you read the short discussion I had with Sakari above?
[14:21]
<hverkuil> yeah, I read it. I don't like it, but I can't think of a better alternative either. [14:22]
<pinchartl> :-)
same for me
but I think it also solves the enumeration issue that Sakari mentioned, or at least part of it
[14:22]
<hverkuil> This is for a simple video pipeline, right? sensor -> bridge -> possibly some processing -> dma [14:23]
<pinchartl> as we can set a TRY format (that would still be allowed on a read-only subdev)
and then get the TRY selection rectangles
[14:23]
<sailus> If requests were extended to cover formats and selections, you could pass the rest of the information back to the user --- and even that would require splitting the subdevs into something similar to what the smiapp driver uses today. [14:23]
<pinchartl> so userspace can enumerate the modes with all their parameters
the only thing that we wouldn't support is multiple modes having the same output resolution, but for that, I think we really need to configure the subdev fully from userspace, with a proper driver, not hardcoded modes
[14:23]
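[Editor's note: the TRY mechanism referred to here can be sketched with a toy model. This is not the kernel's actual subdev state handling, and the "centered crop of twice the output size" rule is invented; the point is only that setting a TRY format updates a scratch configuration, so userspace can probe what selection rectangles a given output size would imply without disturbing the active one.]

```c
#include <assert.h>

enum which { ACTIVE, TRY };

struct fmt  { unsigned int width, height; };
struct srect { unsigned int left, top, width, height; };

struct subdev_state {
	struct fmt fmt;
	struct srect crop;
};

/* one active and one scratch (TRY) configuration */
static struct subdev_state state[2];

/*
 * set_fmt: the imaginary sensor picks, for a requested output size, a
 * centered crop of twice that size on a 4000x3000 pixel array (standing
 * in for 2x2 binning).  Writing to TRY only touches the scratch state.
 */
static void set_fmt(enum which w, struct fmt f)
{
	state[w].fmt = f;
	state[w].crop.width = f.width * 2;
	state[w].crop.height = f.height * 2;
	state[w].crop.left = (4000 - state[w].crop.width) / 2;
	state[w].crop.top = (3000 - state[w].crop.height) / 2;
}

static struct srect get_crop(enum which w)
{
	return state[w].crop;
}

/* probe a 640x480 mode via TRY while 1920x1080 stays active */
static unsigned int try_probe_crop_width(void)
{
	set_fmt(ACTIVE, (struct fmt){1920, 1080});
	set_fmt(TRY, (struct fmt){640, 480});
	return get_crop(TRY).width;
}

static unsigned int active_crop_width_after_probe(void)
{
	return get_crop(ACTIVE).width;
}
```

Real userspace would issue the analogous sequence with the TRY flag through the subdev format and selection ioctls; the active configuration is left untouched.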
<sailus> Alternatively, the selection API could be extended to convey the order of the different steps back to the user. [14:24]
<pinchartl> hverkuil: correct, sensor -> CSI-2 receiver -> DMA [14:24]
<sailus> Neither is trivial to implement in the framework or in drivers. [14:24]
<pinchartl> it's for video-devnode-centric drivers
where everything is configured by a S_FMT on the video node
(and again, it's really about existing drivers, not new ones)
[14:25]
<hverkuil> Doing this also requires such a driver to create a media device since userspace has to be able to find the subdev. [14:25]
<pinchartl> correct [14:25]
<hverkuil> I really don't like that. [14:25]
<pinchartl> the video-devnode-centric driver would need to be extended to do so
but it's fairly trivial
uvcvideo does it already with ~120 lines of code
[14:26]
<sailus> pinchartl: I thought you meant MC-enabled drivers. [14:26]
<pinchartl> there's no configuration of the links or subdevs from userspace [14:26]
<hverkuil> Unless you create media devices for ALL devnode-centric drivers. (something that was always intended) [14:26]
<pinchartl> yes, creating a media device for all drivers is best I think
but it can only be done automatically for the ones based on subdevs internally
and would still require the subdevs to initialize their media entity, with pads
so it can't be done fully automatically to expose read-only subdevs
[14:26]
<hverkuil> It's a lot of work... [14:28]
<pinchartl> sailus: for drivers that use MC and the V4L2 subdev userspace API, we can already access the subdev nodes from userspace, so all is fine
hverkuil: that's why I think for now it should be enabled manually per-driver, perhaps with some core helper
because otherwise we'll have to switch everything in one go and convert all subdev drivers, and that's a bit too much :-)
[14:28]
<sailus> pinchartl: That's an interesting idea. [14:32]
<pinchartl> I'll propose patches [14:32]
<sailus> How do you determine what kind of subdevs to create, and which selections will be available and where?
I think drivers should have direct control of that, for this is somewhat device specific.
[14:32]
<hverkuil> I'm not sure this helps anyway: do we have a way to report binning/skipping at all today? [14:33]
<sailus> Very probably there are common patterns though. [14:33]
<pinchartl> I don't plan to create subdevs automatically :-)
we already have subdevs
[14:33]
<sailus> hverkuil: Smiapp does support configuring binning, but not skipping. [14:33]
<pinchartl> hverkuil: yes, but not fully documented in the API [14:33]
<sailus> It'd require one or two more subdevs to implement everything with the current uAPI. [14:33]
<pinchartl> it's configured through formats and selection rectangles on pads [14:34]
<sailus> (Not counting embedded data.) [14:34]
<hverkuil> I truly do not like the idea of read-only subdevs.
Since this is meant for simple pipelines, I'd much rather see some solution with G_SELECTION.
[14:35]
<pinchartl> but we can't do that
G_SELECTION isn't powerful enough
we have multiple pads in the pipeline
and they're all handled through a single subdev
sorry, video node
G_SELECTION doesn't take a pad parameter
(and it shouldn't)
there's cropping, binning, skipping and scaling in the sensor
possibly with different points where cropping can be applied
userspace needs to be able to retrieve all that information
I don't think you can abstract all those consecutive operations, which can happen in an order that depends on the sensor, through a single G_SELECTION call
it's not just about adding a selection target, it's really a pipeline with multiple stages
[14:37]
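[Editor's note: the ordering problem can be made concrete with a toy calculation, not kernel code and with invented numbers: mapping the same crop rectangle back to pixel-array coordinates gives different answers depending on whether binning happens before or after the crop, so one flat rectangle cannot describe both pipelines.]

```c
#include <assert.h>

struct area { unsigned int left, top, width, height; };

/*
 * Pixel-array footprint of a crop rectangle when f-factor binning is
 * applied *after* cropping: the rectangle is already expressed in
 * pixel-array coordinates, binning only shrinks the output.
 */
static struct area crop_then_bin(struct area crop, unsigned int f)
{
	(void)f;
	return crop;
}

/*
 * Footprint when binning happens *before* cropping: the crop is
 * expressed in binned coordinates, so it covers f times as many
 * physical pixels in each direction.
 */
static struct area bin_then_crop(struct area crop, unsigned int f)
{
	struct area r = {
		crop.left * f, crop.top * f,
		crop.width * f, crop.height * f,
	};
	return r;
}
```

Both orders can yield the same output resolution while reading different regions of the pixel array, which is the ambiguity a single selection rectangle cannot express.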
<hverkuil> Just two stages: the source and the DMA engine: those are the only places where you can crop/scale. Anything more complex should use the MC API. For such simple pipelines you can create selection targets that report the sensor crop and sensor scale. (for binning/skipping you might need controls) [14:41]
<sailus> hverkuil: Sensor drivers implement that implicitly, without telling about it, and the user space cannot figure it out.
I believe it is a problem in certain use cases, where drivers still have not been converted to being MC-centric.
[14:41]
<hverkuil> 'without telling about it': do you mean the sensor driver doesn't know? Or we just do not have a way to expose it? [14:43]
<sailus> In reality, if sensor processing were taken into consideration there, pretty much every bridge driver should be MC enabled.
hverkuil: The sensor drivers mostly work on register lists, but they usually have descriptive names for those lists.
But they also don't expose that, because they cannot.
(Or, could not, before the sub-device uAPI.)
Unless I'm mistaken, pinchartl's primary concern is those drivers.
No other driver than the smiapp does this, btw.
[14:43]
<pinchartl> sailus: correct
it's those drivers
based on register lists
not the drivers that accept arbitrary cropping and scaling configurations. those are very good, but we have too few of them :-)
[14:44]
<hverkuil> OK, I'm getting confused :-) [14:46]
<pinchartl> should I try to explain it again? [14:46]
<hverkuil> So we are talking about old crappy sensor drivers that have a fixed number of modes, right?
And we want to be able to let userspace know what these modes do (cropping/scaling/binning/skipping)?
[14:46]
<sailus> hverkuil: More or less every sensor driver apart from smiapp. [14:47]
<pinchartl> we're talking about sensor drivers, in drivers/media/i2c/, that are based on register lists, with a fixed number of "modes". a mode in this context is a full sensor configuration that includes cropping (possibly in multiple stages), binning, skipping and/or scaling, to produce an output resolution
it's not about "old crappy sensor drivers" though
we have a majority of this kind of drivers
[14:47]
<sailus> They're still crappy, just not old. [14:47]
<pinchartl> and we regularly merge new ones
so it's not just legacy
they're far from perfect, and ideally they should implement the subdev API fully and compute register values instead of hardcoding modes
we're pushing for that, but we're still getting a majority of mode-based sensors
due to a combination of factors: lack of datasheet, lack of support from the vendor, laziness sometimes, ...
[14:47]
<hverkuil> OK. And I assume we DO know what each mode does, even though we may not know what the registers mean. So it may be a hardcoded list of meta information about the modes in these sensor drivers. [14:48]
<pinchartl> fixing this is another issue
yes, let's assume the driver knows how much cropping, binning, ... is applied for each mode
[14:49]
<sailus> hverkuil: More or less so. [14:49]
<pinchartl> we want to report that to userspace
(some drivers may not know, they may just have binary data without any knowledge, that's out of scope, we can't help them)
so the question is
how do we expose that information to userspace?
given the following:
- we have an internal API in the kernel to get and set those parameters, through formats and selection rectangles on subdev pads
- we can expose that API to userspace through V4L2 subdev nodes, but in this use case, it's not exposed
(due to the main driver being videonode-centric)
how those parameters are applied to the sensor is sensor-specific. some sensors could apply crop before scaling, some could do it the other way around, some could do both
this is reflected, in the smiapp driver for instance, by multiple subdevs, to show the separate processing stages
[14:49]
<hverkuil> All the information you need is read-only and constant, right? I.e. the sensor has N modes, and for each mode you want to expose the mode properties. Based on that information userspace can pick a mode, which I think is done via VIDIOC_S_FMT.
Or is it frame size + frame rate that selects the mode in the sensor?
[15:00]
<pinchartl> in the cases I've seen, it's VIDIOC_S_FMT, so the information is fixed for a given format set by S_FMT
but I can't rule out that it could depend on the frame rate
as I haven't analyzed all sensor drivers, just the few I'm working with right now
[15:03]
<hverkuil> I am wondering whether it isn't easier to create a read-only control array (or arrays) containing the mode information. It's useful regardless of whether the sensor is used in an MC or devnode-centric driver.
CID_SENSOR_MODES or whatever you call it.
If I understand this correctly, that is really all you are interested in.
[15:05]
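[Editor's note: as a rough sketch only. CID_SENSOR_MODES does not exist in the V4L2 API; the name comes from the proposal above, and this per-element layout is entirely hypothetical. The payload of such a read-only array control might look like:]

```c
#include <assert.h>
#include <stdint.h>

/* hypothetical per-element payload of a read-only mode-list control */
struct sensor_mode_info {
	uint32_t width, height;			/* output resolution */
	uint32_t crop_left, crop_top;		/* crop origin */
	uint32_t crop_width, crop_height;	/* crop size */
	uint32_t hbin, vbin;			/* binning factors, 1 = none */
	uint32_t hskip, vskip;			/* skipping factors, 1 = none */
};

/* the driver would expose one array element per hardcoded mode */
static const struct sensor_mode_info mode_info[] = {
	{ 3280, 2464, 0, 0, 3280, 2464, 1, 1, 1, 1 },
	{ 1640, 1232, 0, 0, 3280, 2464, 2, 2, 1, 1 },
};
```

A fixed layout like this cannot express the per-sensor order of the processing stages, which is one of the objections raised against the control-based approach in this discussion.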
<pinchartl> I think that would be a bad API
it would create more differences between MC and non-MC
what about smiapp, for instance?
when it's used with a videonode-centric driver
it will be configured by VIDIOC_S_FMT
still, userspace will need to know about cropping and all the other parameters
the smiapp driver shouldn't report modes, as it doesn't have modes
the videonode driver could create the CID_SENSOR_MODES control itself, sure
but that's more complicated for the videonode driver
it would need to be done by all of them
we have a subdev API within the kernel that can report the information
and we have a way to expose that API to userspace
[15:07]
<hverkuil> One option is to just call VIDIOC_S_FMT, then read out the current mode information, i.e. what the sensor driver configured. [15:10]
<pinchartl> why should we create a control that exposes information retrieved from that API inside the kernel?
yes, but how do you read out the current mode information from userspace? that's the question :-)
[15:10]
<hverkuil> It is a shitload of work for userspace to implement this with a media device and subdevs. It defeats the purpose of a devnode-centric interface. [15:11]
<pinchartl> it's not that much work, and it will only be used by specialized userspace
furthermore, there's no definition of a sensor mode that would fit all sensors
as I said earlier, how a sensor applies the different processing parameters is sensor-specific
[15:11]
<hverkuil> I need to sleep on this :-) And I probably want to see an RFC with a bit more detail. Ping me tomorrow and we can discuss this further. [15:23]
<pinchartl> hverkuil: thanks. I'll write the RFC, with a sample implementation [15:23]
<hverkuil> That would be much appreciated. [15:23]
<sailus> pinchartl: I'd like to see an implementation. [15:24]
<pinchartl> I would also like to add that I think exposing read-only subdevs will make the videonode-centric and MC-centric cases closer to each other
the same API would be used by userspace in both cases
[15:24]
<sailus> This would not be perfect, but probably about as good as is possible, considering the existing interfaces. [15:24]
<pinchartl> supporting both mode enumeration with the TRY API on subdevs, and retrieval of the current mode
while adding controls to support this would push the videonode and MC APIs further apart from each other
one last point is that, if we expose the sensor information on sensor subdev nodes, it's easier for userspace to handle, as the sensors can more easily be handled in isolation from the rest of the pipeline
[15:24]
<hverkuil> Also, I'm not clear on how userspace can know the binning/skipping state of the sensor; if you can add an example of how that's done, then that would be useful. If nothing else, I will have learned something new.
In any case, I have to leave. ttyl.
[15:28]
<pinchartl> thank you for your help
have a nice evening
[15:29]
<sailus> Kiitos samoin! ("Thanks, likewise!") [15:37]
