#v4l 2019-04-02,Tue


<tfiga> hverkuil: I'm sorry, wasn't able to find time in the end ;/
<tfiga> have some urgent stuff for today, hopefully can look at other things from tomorrow [04:14]
...................................................................... (idle for 5h47mn)
<bbrezillon> hverkuil, pinchartl, tfiga: I think I have some basic v4l2-compliance tests for the EXT_FMT and EXT_BUF ioctls
<bbrezillon> + vivid and vimc drivers patched to implement the ext hooks
<bbrezillon> should I wait for your review on the RFC before sending a new version? [10:02]
<hverkuil> You might just as well post the latest version, then that's what I'll review. [10:04]
<bbrezillon> ok, I'm preparing the patches [10:13]
<bbrezillon> koike, hverkuil: might be something someone already fixed, but I had to fix a NULL pointer dereference in vimc http://code.bulix.org/9i9ho4-649196 [10:21]
***mmattice has quit IRC (Ping timeout: 268 seconds) [10:33]
....... (idle for 31mn)
<hverkuil> bbrezillon: weird, I haven't come across that vimc issue. koike, can you take a look? [11:04]
<hverkuil> mripard: ping [11:13]
<mripard> hverkuil: pong
<mripard> hi! [11:16]
<hverkuil> I'm looking at your v6 of the h264 series. Is this the final version or is a v7 planned?
<hverkuil> I actually see some small issues in v6 (will reply), so I think a v7 is needed anyway. [11:17]
<mripard> there haven't been any comments so far besides tfiga's Reviewed-by
<mripard> ok, then there will be a v7 :) [11:18]
<hverkuil> OK. Did you look at what is needed for a stateless h264 encoder?
<hverkuil> (does cedrus support h264 encoding?) [11:18]
<mripard> not yet
<mripard> we don't
<mripard> and it's not clear to me yet whether the encoder is stateful or stateless
<mripard> it looks stateful, but it would be odd to have a stateless decoder and a stateful encoder [11:19]
<hverkuil> Based on what I've seen for vicodec and mpeg2 it would need almost the same data structures, except that they are filled by the driver.
<hverkuil> The main difference is that userspace wouldn't keep references to reference frames (i.e. the timestamp field in v4l2_h264_dpb_entry) since that needs to be done internally in the driver.
<hverkuil> This made me wonder whether the timestamps should be stored in a separate control that is only created for stateless decoders. Otherwise you would have a field that is ignored when used by the stateless encoder. [11:20]
<mripard> that makes sense [11:26]
.......... (idle for 49mn)
<koike> hverkuil: bbrezillon: this NULL pointer is weird, I'll take a look [12:15]
<bbrezillon> koike: BTW, I think it's even simpler to match against dev instead of entity_name [12:27]
<bbrezillon> http://code.bulix.org/kzdyxj-649329 [12:33]
<hfr> sailus: Hi Sakari, are you there? I want to discuss the support of multiple subdevs in DCMI, to support a CSI bridge: https://lkml.org/lkml/2019/4/1/298 [12:40]
<sailus> hfr: Hi!
<sailus> Yeah, I have a moment. Should I read the patches first?
<sailus> And please cc me next time. [12:49]
<hfr> yes, it's quite straightforward
<hfr> it's related to the stmipid02 you're currently reviewing
<hfr> this is the camera interface part [12:51]
<sailus> Ok. [12:52]
<hfr> I have several questions on how to deal with two subdevs now, compared to a single one previously [12:52]
<sailus> Go ahead. [12:53]
<hfr> for example, for the formats and controls, previously I was exposing the single subdev's ones, but now I must find the "camera sensor" subdev in the whole pipeline to do the same:
<hfr>     media_device_for_each_entity(entity, &dcmi->mdev)
<hfr>         if (entity->function == MEDIA_ENT_F_CAM_SENSOR)
<hfr>             dcmi->entity.subdev =
<hfr>                 media_entity_to_v4l2_subdev(entity);
<hfr> in dcmi_graph_notify_complete()
<hfr> you already made the remark on stmipid02 that searching for a subdev by type is not good: what if it's not a camera sensor?
<hfr> but I don't see how to do it another way presently [12:53]
<pinchartl> hfr: why does your CSI-2 receiver need to locate the camera sensor?
<pinchartl> it shouldn't matter much, it should just interact with whatever subdev is connected to its input
<pinchartl> regardless of whether it's a camera sensor, an HDMI to CSI-2 bridge or anything else [12:55]
<hfr> to expose camera sensor controls on the V4L2 side: exposure, contrast, etc.
<hfr> if I ask the subdev I'm connected to, it's the bridge, so no such controls... [12:56]
<pinchartl> you shouldn't do so. controls should be exposed on the respective subdevs
<pinchartl> or, alternatively, you can expose *all* controls for all subdevs on the video device nodes, using control handler inheritance
<pinchartl> but not just selected sensor controls
<pinchartl> it should be all or nothing
<pinchartl> (and I'd recommend nothing, exposing them on subdev nodes instead) [12:57]
<hfr> let me explain what the current setup is [12:58]
<sailus> pinchartl: I think that could make sense on non-MC-enabled drivers, but not on the MC-enabled ones. [12:59]
<hfr> DCMI => OV5640 //
<hfr> quite simple
<hfr> the V4L2 layer exposes the camera sensor controls
<hfr> meaning G_/S_CTRLS [12:59]
<pinchartl> sailus: "that" = "exposing them all on the video device node", or = "exposing them on their respective subdevs"? [12:59]
<sailus> pinchartl: On the video node. [13:00]
<pinchartl> sailus: agreed [13:00]
<sailus> That is also the current state. [13:00]
<hfr> now I have the bridge in between: but in my opinion this should change nothing on the user side
<hfr> it's just a matter of data transmission, it's not changing any features
<hfr> so I would expect the user to see the same controls as before [13:01]
<sailus> That's a problematic situation, as you have an existing driver and you want to add more functionality.
<sailus> One possibility could be to add a Kconfig option. Another is to accept that the interface will be different.
<sailus> I think it depends on the users as well, on what they expect. [13:02]
<hfr> I just want to use the CSI version of the OV5640, there is no change in functionality at all [13:03]
<sailus> Both interfaces are valid and fully supported. [13:03]
<hfr> just device tree changes
<hfr> to keep the legacy behaviour I can expose all controls
<hfr> of all subdevs, including the bridge
<hfr> is that OK?
<hfr> (I don't know yet how I will redirect from one subdev to another depending on the control, but I will check that afterwards) [13:03]
<sailus> Depending on whether you have an MC-centric driver, you need changes to driver interfaces and functionality based on that; it's not only about controls. [13:05]
<hfr> this is where I'm not clear [13:06]
<sailus> It's a curious case where a piece of hardware that was not considered MC-centric suddenly is.
<sailus> Another option could be a module parameter.
<sailus> The driver isn't overly complicated at the moment, so I guess this is a possibility, too; it's nearly the same as a Kconfig option anyway. [13:06]
<hfr> module parameter or Kconfig to do what? [13:07]
<sailus> To change the driver to be MC-centric or not. [13:07]
<hfr> MC-centric means that the user may have to change their code and now use media-ctl to set formats and controls? [13:09]
<sailus> It means the user needs to configure the MC pipeline before streaming on the device. [13:10]
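[Editor's note: for context, "configuring the MC pipeline" means setting formats on subdev pads with media-ctl before opening the video node. A sketch with hypothetical entity and pad names, loosely modeled on a DCMI + stmipid02 + OV5640 setup; the real names must be read from `media-ctl -p`:]

```shell
# Inspect the media graph to find the actual entity and pad names.
media-ctl -d /dev/media0 -p

# Set the format on each pad along the pipeline (names are made up here).
media-ctl -d /dev/media0 -V '"ov5640 0-003c":0 [fmt:UYVY8_2X8/640x480]'
media-ctl -d /dev/media0 -V '"stmipid02 0":1 [fmt:UYVY8_2X8/640x480]'

# Only then stream on the video node, e.g. with GStreamer:
gst-launch-1.0 v4l2src device=/dev/video0 ! fakesink
```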
<hfr> ok, but the user will then set formats and controls on the V4L2 interface, what will happen then?
<hfr> for example a "gst-launch v4l2src ..." command line [13:11]
<sailus> Streaming may be started on the video node, just as with plain V4L2. [13:11]
<hfr> the first thing the v4l2src plugin will do is negotiate the format through the G_/S_FMT API, and once all is negotiated STREAMON will be sent
<hfr> not just STREAMON
<hfr> that's why I'm really confused about moving to MC and subdevs
<hfr> compared to what is done currently based on V4L2 [13:13]
<sailus> Sounds like you'll need libcamera there. :-)
<sailus> That's really the problem (or one of the problems) it's intended to address.
<sailus> pinchartl: How do things stand with libcamera nowadays? [13:15]
<pinchartl> sailus: for capture pipelines without an ISP it would be entirely feasible to write a generic gstreamer element, libcamera isn't required [13:17]
<sailus> pinchartl: Yeah, you could do that, too. [13:17]
<pinchartl> (I haven't checked if there's an ISP in the STM pipeline) [13:17]
<hfr> none [13:18]
<pinchartl> but otherwise, libcamera is doing fine. better than the IPU3 driver in any case ;-) [13:18]
<sailus> X-) [13:19]
<hfr> will libcamera change all the GStreamer V4L2 calls to MC calls transparently?
<hfr> has someone tested this already? [13:19]
<sailus> I need to leave now, back tomorrow if not this evening. [13:20]
<pinchartl> hfr: libcamera will require a specific gstreamer element [13:20]
<sailus> Bye! [13:20]
<hfr> ok, thanks Sakari, could we continue tomorrow? [13:20]
<pinchartl> it's work in progress, I don't expect support in gstreamer before Q4 [13:20]
<hfr> ok, thanks Laurent
<hfr> anyway, I will try to keep the legacy behaviour with basic V4L2 as much as I can, I don't feel it's a big deal to change DCMI
<hfr> controls seem OK, now I need to dig more into format negotiation so that the formats of the bridge and sensor match [13:21]
................................ (idle for 2h36mn)
***benjiG has left [15:59]
<bbrezillon> pinchartl, hverkuil: I'm currently trying to test the "multi-plane buf pointing to the same dmabuf, each plane at a different offset" case
<bbrezillon> the question is, how should we allocate such a buffer from v4l2-compliance?
<bbrezillon> should I extend the EXT_CREATE_BUF ioctl to support this case? [16:07]
<pinchartl> I don't think so. this really aims at the import use case, I don't think we need to export such buffers [16:11]
<bbrezillon> (add a flags field + a V4L2_EXT_CREATE_BUFS_FL_PLANES_SHARE_BUF flag) [16:11]
<pinchartl> but that leaves your question unanswered :-) [16:11]
<bbrezillon> there's UDMABUF [16:11]
<pinchartl> we clearly need a central allocator... [16:11]
<bbrezillon> never tried using it [16:11]
<pinchartl> would UDMABUF support that?
<pinchartl> if it does, it's an option [16:11]
<bbrezillon> and I don't know if it works the way we want
<bbrezillon> it would be much simpler than trying to modify the VB2 core to support this use case :)
<bbrezillon> second question: should we expose the plane alignment constraints in v4l2_ext_format?
<bbrezillon> I need to know them in order to know the buf size, and then to pass the appropriate plane->start_offset to the EXT_QBUF request
<bbrezillon> but is it something we expect userspace to figure out on its own, or should the framework expose it? [16:11]
<pinchartl> I don't think we can meaningfully expose it in a generic way
<pinchartl> it's a largely unsolved problem, kernel-wide
<pinchartl> or Linux-wide
<pinchartl> if you can find a solution everybody will love you :-) [16:25]
.... (idle for 16mn)
<bbrezillon> pinchartl: I thought the video dev would at least be able to expose its own alignment constraints [16:41]
<pinchartl> bbrezillon: can you come up with a reasonable API that can express all possible alignment constraints? :-) [16:44]
<bbrezillon> depends on what you mean by all possible alignment constraints [16:46]
<pinchartl> any constraint that a device may have [16:46]
<bbrezillon> I'm just interested in the plane buf alignment constraint right now :)
<bbrezillon> so it's basically "plane buf should be aligned on X bytes" where X would probably be a power of 2
<bbrezillon> pinchartl: maybe it's simpler if you give me one of those funky use cases you have in mind :) [16:46]
<pinchartl> can you guarantee that there will never be any device requiring other types of alignment constraints?
<pinchartl> there are existing devices that require planes to be allocated from different DRAM banks, for instance [16:49]
<bbrezillon> hm, not sure this qualifies as an alignment constraint [16:52]
<pinchartl> it qualifies as a constraint on memory allocation. I'm not sure reporting partial constraints would be that useful, as it won't solve the overall problem [16:52]
<bbrezillon> anyway, if we don't expose the alignment constraint, what kind of policy should I use in the v4l2-compliance test that is supposed to test that? [16:53]
<pinchartl> what do you want to test exactly? [16:54]
<bbrezillon> or should I give up on this generic "multiplane buffer, single dmabuf + different offsets" test? [16:55]
<pinchartl> the get/set format API allows you to negotiate a bytesperline value, which reports some alignment constraints. could this be used? [16:55]
<bbrezillon> yep, it should work
<bbrezillon> it's definitely not encoding the real alignment constraint, but it should be large enough to work for most use cases [16:57]
.................... (idle for 1h37mn)
***Whoopie has quit IRC (Quit: ZNC - http://znc.in) [18:35]
