Who | What | When |
---|---|---|
tfiga | hverkuil: I'm sorry, wasn't able to find time in the end ;/
have some urgent stuff for today, hopefully can look at other things from tomorrow | [04:14] |
...................................................................... (idle for 5h47mn) | ||
bbrezillon | hverkuil, pinchartl, tfiga: I think I have some basic v4l2-compliance tests for the EXT_FMT and EXT_BUF ioctls
+ vivid and vimc drivers patched to implement the ext hooks. Should I wait for your review on the RFC before sending a new version? | [10:02] |
hverkuil | You might just as well post the latest version, then that's what I'll review. | [10:04] |
bbrezillon | ok, I'm preparing the patches | [10:13] |
koike, hverkuil: might be something someone already fixed, but I had to fix a NULL pointer dereference in vimc http://code.bulix.org/9i9ho4-649196 | [10:21] | |
*** | mmattice has quit IRC (Ping timeout: 268 seconds) | [10:33] |
....... (idle for 31mn) | ||
hverkuil | bbrezillon: weird, I haven't come across that vimc issue. koike, can you take a look? | [11:04] |
mripard: ping | [11:13] | |
mripard | hverkuil: pong
hi! | [11:16] |
hverkuil | I'm looking at your v6 of the h264 series. Is this the final version or is a v7 planned?
I actually see some small issues in v6 (will reply), so I think a v7 is needed anyway. | [11:17] |
mripard | there haven't been any comments so far besides tfiga's Reviewed-by
ok, then there will be a v7 :) | [11:18] |
hverkuil | OK. Did you look at what is needed for a stateless h264 encoder?
(does cedrus support h264 encoding?) | [11:18] |
mripard | not yet
we don't, and it's not clear to me yet whether the encoder is stateful or stateless. It looks stateful, but it would be odd to have a stateless decoder and a stateful encoder | [11:19] |
hverkuil | Based on what I've seen for vicodec and mpeg2 it would need almost the same data structures, except that they are filled by the driver.
The main difference is that userspace wouldn't keep references to reference frames (i.e. the timestamp field in v4l2_h264_dpb_entry) since that needs to be done internally in the driver. This made me wonder whether the timestamps should be stored in a separate control that is only created for stateless decoders. Otherwise you would have a field that is ignored when used by the stateless encoder. | [11:20] |
mripard | that makes sense | [11:26] |
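For reference, a minimal sketch of the driver-side lookup discussed above: resolving the timestamp stored by userspace in a DPB entry back to the CAPTURE buffer holding that reference frame. The helper name and the reference_ts parameter are illustrative; vb2_find_timestamp() and vb2_get_buffer() are existing videobuf2 helpers, and the timestamp is the u64 nanosecond value derived from the buffer timestamp userspace set on the OUTPUT side.

```c
#include <media/videobuf2-v4l2.h>

/*
 * Illustrative only, not from any particular driver: look up the
 * CAPTURE buffer whose copied timestamp matches a DPB entry.
 * "cap_q" is the driver's CAPTURE vb2 queue.
 */
static struct vb2_buffer *get_ref_buffer(struct vb2_queue *cap_q,
					 u64 reference_ts)
{
	/* vb2_find_timestamp() returns the buffer index, or -1 if not found. */
	int idx = vb2_find_timestamp(cap_q, reference_ts, 0);

	if (idx < 0)
		return NULL;

	return vb2_get_buffer(cap_q, idx);
}
```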
.......... (idle for 49mn) | ||
koike | hverkuil: bbrezillon: this NULL pointer is weird, I'll take a look | [12:15] |
bbrezillon | koike: BTW, I think it's even simpler to match against dev instead of entity_name | [12:27] |
http://code.bulix.org/kzdyxj-649329 | [12:33] | |
hfr | sailus: Hi Sakari, are you there? I'd like to discuss the support of multiple subdevs in DCMI to support a CSI bridge: https://lkml.org/lkml/2019/4/1/298 | [12:40] |
sailus | hfr: Hi!
Yeah, I have a moment. Should I read the patches first? And please cc me next time. | [12:49] |
hfr | yes it's quite straightforward
it's related to the stmipid02 you're currently reviewing; this is the camera interface part | [12:51] |
sailus | Ok. | [12:52] |
hfr | I have several questions on how to deal with two subdevs now, compared to a single one previously | [12:52] |
sailus | Go ahead. | [12:53] |
hfr | for ex. for the formats and controls, previously I was exposing the single subdev's ones, but now I must find the "camera sensor" subdev in the whole pipeline to do the same
in dcmi_graph_notify_complete():
    media_device_for_each_entity(entity, &dcmi->mdev)
        if (entity->function == MEDIA_ENT_F_CAM_SENSOR)
            dcmi->entity.subdev = media_entity_to_v4l2_subdev(entity);
you already made the remark on stmipid02 that searching for a type of subdev is not good (what if it's not a camera sensor?), but I don't see how to do it any other way presently | [12:53] |
pinchartl | hfr: why does your CSI-2 receiver need to locate the camera sensor ?
it shouldn't matter much, it should just interact with whatever subdev is connected to its input regardless of whether it's a camera sensor, an HDMI to CSI-2 bridge or anything else | [12:55] |
hfr | to expose camera sensor controls on the V4L2 side: exposure, contrast, etc...
if I ask the subdev I'm connected to, it's the bridge, so there are no such controls... | [12:56] |
pinchartl | you shouldn't do so. controls should be exposed on the respective subdevs
or, alternatively, you can expose *all* controls of all subdevs on the video device node, using control handler inheritance. But not just selected sensor controls; it should be all or nothing (and I'd recommend nothing, exposing them on subdev nodes instead) | [12:57] |
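A minimal sketch of that all-or-nothing inheritance, typically done when a subdev is bound by the async notifier. The helper name is made up here; v4l2_ctrl_add_handler() with a NULL filter is the existing API that merges a whole subdev handler into the video device's handler.

```c
#include <media/v4l2-ctrls.h>
#include <media/v4l2-subdev.h>

/*
 * Illustrative helper (not actual DCMI code): merge every control of a
 * newly bound subdev into the video device's control handler, so they
 * all appear on the video node. A NULL filter means "take them all".
 */
static int inherit_subdev_controls(struct v4l2_ctrl_handler *vdev_hdl,
				   struct v4l2_subdev *sd)
{
	if (!sd->ctrl_handler)
		return 0;

	return v4l2_ctrl_add_handler(vdev_hdl, sd->ctrl_handler, NULL, true);
}
```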
hfr | let me explain what is current setup | [12:58] |
sailus | pinchartl: I think that could make sense on non-MC-enabled drivers, but not on the MC-enabled ones. | [12:59] |
hfr | DCMI => OV5640 //
quite simple: the V4L2 layer exposes the camera sensor controls, i.e. through G_/S_CTRL | [12:59] |
pinchartl | sailus: "that" = "exposing them all on the video device node", or = "exposing them on their respective subdevs" ? | [12:59] |
sailus | pinchartl: On the video node. | [13:00] |
pinchartl | sailus: agreed | [13:00] |
sailus | That is also the current state. | [13:00] |
hfr | now I have a bridge in between, but in my opinion this should change nothing on the user side
it's just a matter of data transmission; it's not changing any features, so I would expect the user to see the same controls as before | [13:01] |
sailus | That's a problematic situation, as you have an existing driver and you want to add more functionality.
One possibility could be to add a Kconfig option. Another is to accept that the interface will be different. I think it depends on the users as well, on what they expect. | [13:02] |
hfr | I just want to use the CSI version of the OV5640; there is no change in functionality at all | [13:03] |
sailus | Both interfaces are valid and fully supported. | [13:03] |
hfr | just device tree changes
to keep the legacy behaviour I can expose all controls of all subdevs, including the bridge; is that OK? (I don't know yet how I will redirect from one subdev to another depending on the control, but I will check that afterward) | [13:03] |
sailus | Depending on whether you have an MC-centric driver, you need changes to the driver's interfaces and functionality; it's not only about controls. | [13:05] |
hfr | this is where I'm not clear | [13:06] |
sailus | It's a curious case where a piece of hardware that was not considered MC-centric suddenly is.
Another option could be a module parameter. The driver isn't overly complicated at the moment so I guess this is a possibility, too; it's nearly the same as a Kconfig option anyway. | [13:06] |
hfr | module parameter or Kconfig to do what? | [13:07] |
sailus | To change the driver to be MC-centric or not. | [13:07] |
hfr | MC-centric means that the user may have to change their code and now use media-ctl to set formats and controls? | [13:09] |
sailus | It means the user needs to configure the MC pipeline before streaming on the device. | [13:10] |
hfr | ok, but the user will then set formats and controls on the V4L2 interface; what will happen then?
for example "gst-launch v4l2src ..." command line | [13:11] |
sailus | Streaming may be started on the video node, just as with plain V4L2. | [13:11] |
hfr | the first thing the v4l2src plugin will do is negotiate the format through the G_/S_FMT API, and when everything is negotiated STREAMON will be sent
so not just STREAMON; that's why I'm really confused about going to MC and subdevs compared to what is done currently with plain V4L2 | [13:13] |
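For context, "configuring the MC pipeline" here roughly means setting the active format on each subdev pad (what media-ctl -V does) before the usual S_FMT/STREAMON on the video node. A minimal userspace sketch, with the subdev path, pad number and bus code purely as examples:

```c
#include <fcntl.h>
#include <string.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>
#include <linux/v4l2-subdev.h>

/* Set the active format on one subdev pad, e.g. on the sensor then the bridge. */
static int set_pad_format(const char *subdev_path, unsigned int pad,
			  unsigned int width, unsigned int height,
			  unsigned int mbus_code)
{
	struct v4l2_subdev_format fmt;
	int ret, fd = open(subdev_path, O_RDWR);

	if (fd < 0)
		return -1;

	memset(&fmt, 0, sizeof(fmt));
	fmt.which = V4L2_SUBDEV_FORMAT_ACTIVE;
	fmt.pad = pad;
	fmt.format.width = width;
	fmt.format.height = height;
	fmt.format.code = mbus_code;	/* e.g. MEDIA_BUS_FMT_UYVY8_2X8 */
	fmt.format.field = V4L2_FIELD_NONE;

	ret = ioctl(fd, VIDIOC_SUBDEV_S_FMT, &fmt);
	close(fd);
	return ret;
}
```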
sailus | Sounds like you'll need libcamera there. :-)
That's really the problem (or one of the problems) it's intended to address. pinchartl: How do things stand with libcamera nowadays? | [13:15] |
pinchartl | sailus: for capture pipelines without an ISP it would be entirely feasible to write a generic gstreamer element, libcamera isn't required | [13:17] |
sailus | pinchartl: Yeah, you could do that, too. | [13:17] |
pinchartl | (I haven't checked if there's an ISP in the STM pipeline) | [13:17] |
hfr | none | [13:18] |
pinchartl | but otherwise, libcamera is doing fine. better than the IPU3 driver in any case ;-) | [13:18] |
sailus | X-) | [13:19] |
hfr | will libcamera change all the GStreamer V4L2 calls to MC calls transparently ?
has someone tested this already? | [13:19] |
sailus | I need to leave now, back tomorrow if not this evening. | [13:20] |
pinchartl | hfr: libcamera will require a specific gstreamer element | [13:20] |
sailus | Bye! | [13:20] |
hfr | ok thks Sakari, could we continue tomorrow ? | [13:20] |
pinchartl | it's work in progress, I don't expect support in gstreamer before Q4 | [13:20] |
hfr | ok thks Laurent
anyway I will try to keep the legacy behaviour with basic V4L2 as much as I can; I don't feel it's a big deal to change DCMI. Controls seem OK; I now need to dig more into format negotiation so that the formats of the bridge and the sensor match | [13:21] |
................................ (idle for 2h36mn) | ||
*** | benjiG has left | [15:59] |
bbrezillon | pinchartl, hverkuil: I'm currently trying to test the "multi-plane buf pointing to the same dmabuf, each plane at a different offset"
question is, how should we allocate such a buffer from v4l2-compliance? Should I extend the EXT_CREATE_BUF ioctl to support this case? | [16:07] |
pinchartl | I don't think so. this really aims at the import use case, I don't think we need to export such buffers | [16:11] |
bbrezillon | (add a flags field + a V4L2_EXT_CREATE_BUFS_FL_PLANES_SHARE_BUF flag) | [16:11] |
pinchartl | but that leaves your question unanswered :-) | [16:11] |
bbrezillon | there's UDMABUF | [16:11] |
pinchartl | we clearly need a central allocator... | [16:11] |
bbrezillon | never tried using it | [16:11] |
pinchartl | would UDMABUF support that ?
if it does, it's an option | [16:11] |
bbrezillon | and I don't know if it works the way we want
would be much simpler than trying to modify the VB2 core to support this use case :) Second question: should we expose the plane alignment constraints in v4l2_ext_format? I need them to know the buf size and then to pass the appropriate plane->start_offset to the EXT_QBUF request. But is it something we expect userspace to figure out on its own, or should the framework expose it? | [16:11] |
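A minimal sketch of allocating a dma-buf from userspace with udmabuf, which could then be imported once and referenced by each plane at a different offset. It uses the existing /dev/udmabuf UAPI; error handling is omitted for brevity, and whether this fits the v4l2-compliance use case is exactly the open question above.

```c
#define _GNU_SOURCE
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <linux/udmabuf.h>

/* Returns a dma-buf fd backed by anonymous memory, or -1 on error. */
static int alloc_udmabuf(size_t size)
{
	struct udmabuf_create create = { 0 };
	int devfd, memfd, buffd;

	/* Backing memory comes from a memfd sealed against shrinking. */
	memfd = memfd_create("v4l2-planes", MFD_ALLOW_SEALING);
	ftruncate(memfd, size);		/* size must be page aligned */
	fcntl(memfd, F_ADD_SEALS, F_SEAL_SHRINK);

	devfd = open("/dev/udmabuf", O_RDWR);

	create.memfd = memfd;
	create.offset = 0;
	create.size = size;

	/* The returned fd can be imported with V4L2_MEMORY_DMABUF. */
	buffd = ioctl(devfd, UDMABUF_CREATE, &create);

	close(devfd);
	close(memfd);
	return buffd;
}
```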
pinchartl | I don't think we can meaningfully expose it in a generic way
it's a largely unsolved problem, kernel-wide or Linux-wide. If you can find a solution, everybody will love you :-) | [16:25] |
.... (idle for 16mn) | ||
bbrezillon | pinchartl: I thought the video dev would at least be able to expose its own alignment constraints | [16:41] |
pinchartl | bbrezillon: can you come up with a reasonable API that can express all possible alignment constraints ? :-) | [16:44] |
bbrezillon | depends on what you mean by all possible alignment constraints | [16:46] |
pinchartl | any constraint that a device may have | [16:46] |
bbrezillon | I'm just interested in plane buf alignment constraint right now :)
so it's basically "plane buf should be aligned on X bytes", where X would probably be a power of 2. pinchartl: maybe it's simpler if you give me one of those funky use cases you have in mind :) | [16:46] |
pinchartl | can you guarantee that there will never be any device requiring another type of alignment constraint?
there are existing devices that require planes to be allocated from different DRAM banks for instance | [16:49] |
bbrezillon | hm, not sure this qualifies as an alignment constraint | [16:52] |
pinchartl | it qualifies as a constraint on memory allocation. I'm not sure reporting partial constraints would be that useful, as it won't solve the overall problem | [16:52] |
bbrezillon | anyway, if we don't expose the alignment constraint, what kind of policy should I use in the v4l2-compliance test that's supposed to test that? | [16:53] |
pinchartl | what do you want to test exactly ? | [16:54] |
bbrezillon | or should I give up on this generic "multiplane buffer, single dmabuf + different offsets"
test ? | [16:55] |
pinchartl | the get/set format API allows you to negotiate a bytesperline value, which reports some alignment constraints. could this be used? | [16:55] |
bbrezillon | yep, it should work
it's definitely not encoding the real alignment constraint, but it should be large enough to work for most use cases | [16:57] |
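A minimal sketch of that fallback policy for the compliance test: lay out the planes inside one dma-buf and round each start offset up to the plane's negotiated bytesperline, using it as a stand-in for the real (unexposed) alignment constraint. The helper name and the rounding choice are just one possible approach.

```c
#include <linux/videodev2.h>

/*
 * Illustrative only: compute per-plane start offsets inside a single
 * dma-buf from a negotiated multi-planar format, aligning each offset
 * to that plane's bytesperline.
 */
static void layout_planes(const struct v4l2_pix_format_mplane *pix_mp,
			  __u32 offsets[VIDEO_MAX_PLANES],
			  __u32 *total_size)
{
	__u32 offset = 0;
	unsigned int i;

	for (i = 0; i < pix_mp->num_planes; i++) {
		__u32 align = pix_mp->plane_fmt[i].bytesperline;

		/* Round the offset up to a multiple of bytesperline. */
		if (align)
			offset = ((offset + align - 1) / align) * align;

		offsets[i] = offset;
		offset += pix_mp->plane_fmt[i].sizeimage;
	}

	*total_size = offset;
}
```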
.................... (idle for 1h37mn) | ||
*** | Whoopie has quit IRC (Quit: ZNC - http://znc.in) | [18:35] |