[08:50] <grohne> I looked into this bridge driver thingy for a while now and observe that my bound callback is never called. now I'm wondering a) how do I debug that? b) do I need to use this media/pads stuff?
[08:54] <javier__> grohne: are you using DT?
[08:56] <grohne> javier__: good question. in my imager driver, yes. in the bridge driver I tried to initially reduce complexity and go without dt (adding it later)
[08:56] <grohne> i.e. I am trying an i2c async notifier match with a hard-coded bus/address
[08:57] <grohne> I'll likely have to look into dt graphs at some point, but taking on the full complexity upfront makes for difficult testing, and even with the much-minimized version here essentially nothing works
[08:59] <grohne> my current plan is to write an entirely ugly bridge driver with hard-coded assumptions so I can move the sensor driver to v4l2 async notifiers and clean up the sensor driver before delving into the bridge driver too much. is that reasonable?
[09:00] <javier__> grohne: I think it is reasonable to first focus on getting the sensor driver right for upstreaming and then move to the bridge driver
[09:00] <javier__> now getting to your original question, I would debug v4l2_async_register_subdev() then
[09:00] <grohne> that's what I try to do, but without any bridge driver I have no chance of testing the sensor driver
[09:01] <grohne> so all I need is just enough bridge driver to be able to test the sensor driver and nothing more.
[09:01] <javier__> v4l2_async_find_match() should match on i2c address for the V4L2_ASYNC_MATCH_I2C case
[09:02] <javier__> grohne: yes, I meant to have a bridge driver even if it isn't in an upstreamable state
[09:03] <grohne> looks like we agree on the approach then. thank you
[09:03] <javier__> grohne: but I would definitely go with Documentation/devicetree/bindings/media/video-interfaces.txt
[09:04] <grohne> initially or eventually?
[09:05] <grohne> the other pain point here is that I cannot route video data through the bridge driver yet. it isn't v4l2 yet, but has a second miscdevice driver
[09:06] <snawrocki> ezeqiuelg: hi, I don't have your patches in my inbox and for some reason I receive only selected patches through nntp from gmane (just subscribed to linux-media again). I have noticed your patches:
[09:07] <snawrocki> https://patchwork.linuxtv.org/patch/50358 , https://patchwork.linuxtv.org/patch/50358
[09:08] <snawrocki> ezeqielg: these patches seem seriously incorrect to me, VIDIOC_STREAMOFF should stop DMA, what was the motivation behind those patches?
[09:10] <snawrocki> with those patches it's possible we try to dequeue buffers from hardware while DMA is still running, isn't it?
[09:10] <javier__> grohne: yes, since you may need to set bus configuration in the DT (bus-type, bus-width, {h,v}sync-active, etc.) most bridge drivers expect those
[09:10] <javier__> grohne: and really, there are nice helper functions to use the OF graph / video interface bindings
[09:11] <javier__> it should mostly be using v4l2_async_notifier_parse_fwnode_endpoints() and v4l2_async_notifier_register()
[09:12] <javier__> and the good thing is that you will see what your DT should look like even if your bridge driver isn't really for upstream
[09:12] <snawrocki> ezeqielg: https://www.linuxtv.org/downloads/legacy/video4linux/API/V4L2_API/spec/rn01re61.html - "The VIDIOC_STREAMOFF ioctl, apart of aborting or finishing any DMA in progress..."
[09:12] <javier__> s/really/ready
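A minimal sketch of the kind of throwaway bridge grohne describes above, matching the sensor subdev by a hard-coded I2C adapter/address so that the bound callback fires without any DT graph. It assumes the async notifier API roughly as it looked around Linux 4.15-4.20 (the structures and function names were reworked in later kernels); the mybridge_* names and the adapter number/address are placeholders, the platform device would still need to be created from DT or at module init, and remove/teardown is elided.

#include <linux/module.h>
#include <linux/platform_device.h>
#include <media/v4l2-async.h>
#include <media/v4l2-device.h>

struct mybridge {
	struct v4l2_device v4l2_dev;
	struct v4l2_async_notifier notifier;
	struct v4l2_async_subdev asd;
	struct v4l2_async_subdev *asds[1];
	struct v4l2_subdev *sensor;
};

/* Called once the matching sensor subdev has been registered. */
static int mybridge_bound(struct v4l2_async_notifier *notifier,
			  struct v4l2_subdev *sd,
			  struct v4l2_async_subdev *asd)
{
	struct mybridge *bridge =
		container_of(notifier, struct mybridge, notifier);

	bridge->sensor = sd;
	dev_info(bridge->v4l2_dev.dev, "sensor %s bound\n", sd->name);
	return 0;
}

/* Called once all awaited subdevs are bound. */
static int mybridge_complete(struct v4l2_async_notifier *notifier)
{
	struct mybridge *bridge =
		container_of(notifier, struct mybridge, notifier);

	/*
	 * Create /dev/v4l-subdevX nodes for subdevs that set
	 * V4L2_SUBDEV_FL_HAS_DEVNODE, so userspace can talk to the
	 * sensor directly.
	 */
	return v4l2_device_register_subdev_nodes(&bridge->v4l2_dev);
}

static const struct v4l2_async_notifier_operations mybridge_notifier_ops = {
	.bound = mybridge_bound,
	.complete = mybridge_complete,
};

static int mybridge_probe(struct platform_device *pdev)
{
	struct mybridge *bridge;
	int ret;

	bridge = devm_kzalloc(&pdev->dev, sizeof(*bridge), GFP_KERNEL);
	if (!bridge)
		return -ENOMEM;

	ret = v4l2_device_register(&pdev->dev, &bridge->v4l2_dev);
	if (ret)
		return ret;

	/* Hard-coded match: sensor at address 0x3c on i2c adapter 1. */
	bridge->asd.match_type = V4L2_ASYNC_MATCH_I2C;
	bridge->asd.match.i2c.adapter_id = 1;
	bridge->asd.match.i2c.address = 0x3c;

	bridge->asds[0] = &bridge->asd;
	bridge->notifier.subdevs = bridge->asds;
	bridge->notifier.num_subdevs = 1;
	bridge->notifier.ops = &mybridge_notifier_ops;

	ret = v4l2_async_notifier_register(&bridge->v4l2_dev,
					   &bridge->notifier);
	if (ret)
		v4l2_device_unregister(&bridge->v4l2_dev);

	return ret;
}

static struct platform_driver mybridge_driver = {
	.probe = mybridge_probe,
	.driver = {
		.name = "mybridge",
	},
};
module_platform_driver(mybridge_driver);

MODULE_LICENSE("GPL");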
[09:13] <grohne> javier__: hmm. I tried to avoid it thus far because a) that's a lot of new APIs, and the number of APIs to learn is already steep b) it feels strange to add a new device node (for a platform device) that has no hardware attached to it and just references a sensor
[09:14] <grohne> javier__: also I would not have any ports, because the data is routed "elsewhere" for the time being. I see that dt-graph might make sense when you have more than one subdev, but for one subdev?
[09:15] <javier__> grohne: not sure if I understood (b), you still need the device node to instantiate your platform device and probe your driver? or are you doing it at module init?
[09:15] <javier__> grohne: your sensor has an output port, that's your ISP bus
[09:16] <javier__> grohne: agreed about (a), you are free to ignore my advice. I'm just telling you how I would do it :)
[09:17] <grohne> we certainly agree on the final destination. I'm just looking for a manageable path to get there.
[09:17] <javier__> grohne: yes, I got that. It's just that IME adding the DT as an afterthought is always more complex
[09:18] <javier__> and usually changing a DT binding in upstream isn't trivial because people expect backward compat
[09:18] <grohne> that still sounds like a non-argument to me, because the sensor has dt bindings; that's the part to be upstreamed
[09:19] <javier__> yes, but you said that you are not adding those for now
[09:20] <grohne> I'm trying to avoid bindings for the bridge driver, not the sensor driver
[09:20] <grohne> s/avoid/defer/
[09:23] <javier__> grohne: yeah, I just think that it's maybe going too far but it may work indeed
[09:24] <grohne> I wouldn't be able to tell now. it certainly is a tradeoff with disadvantages
[09:26] <javier__> probably the only thing that your driver couldn't have is a struct v4l2_subdev_video_ops .g_mbus_config callback that looks up the bridge config
[09:27] <javier__> but you can just hardcode to V4L2_MBUS_PARALLEL or something and only test that
[09:30] <javier__> grohne: btw, I was just giving my opinion. It doesn't mind that what I say is correct
[09:30] <javier__> *mean
[09:30] <grohne> that makes sense. so I'll likely have to add the dt-graph before posting the sensor driver, but getting there is still months away
[09:31] <grohne> but going from miscdevice to v4l2 async likely takes a while. I've rewritten it once already from platform_driver to i2c_driver...
[09:32] <grohne> I guess I'll be rewriting it like 2 or 3 times...
[11:16] <grohne> javier__: after adding V4L2_SUBDEV_FL_HAS_DEVNODE and writing a 100-line crap bridge driver, I was finally able to talk to my sensor via v4l2 from userspace. In retrospect, I think it was good to defer dt-graph, but I shouldn't defer it much longer.
[11:17] <javier__> grohne: cool
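A small sketch of the hard-coded bus reporting javier__ suggests above: instead of looking up the bridge configuration (which would normally come from the DT endpoint), the sensor reports a fixed parallel bus. It uses the .g_mbus_config video op as it existed around that time (later kernels moved this to a pad-level get_mbus_config op); the mysensor_* names and the exact flag combination are placeholders for whatever the real sensor drives.

#include <media/v4l2-mediabus.h>
#include <media/v4l2-subdev.h>

static int mysensor_g_mbus_config(struct v4l2_subdev *sd,
				  struct v4l2_mbus_config *cfg)
{
	/* Fixed parallel bus instead of a DT/bridge lookup. */
	cfg->type = V4L2_MBUS_PARALLEL;
	cfg->flags = V4L2_MBUS_HSYNC_ACTIVE_HIGH |
		     V4L2_MBUS_VSYNC_ACTIVE_HIGH |
		     V4L2_MBUS_PCLK_SAMPLE_RISING;
	return 0;
}

static const struct v4l2_subdev_video_ops mysensor_video_ops = {
	.g_mbus_config = mysensor_g_mbus_config,
	/* .s_stream and friends elided */
};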
[12:11] <mripard> sailus: thanks for the review on the v3s driver :)
[12:31] <sailus> mripard: You're welcome!
[12:31] <sailus> It wasn't meant to take this long. :-I
[12:32] <sailus> (I'll be back later.)
[14:00] <ttomov> sailus: thanks for the suggestion :)
[15:30] <learningc> How do I read my camera?
[15:56] *** prabhakarlad has left
[16:03] <ndufresne> learningc, in English it would be from left to right ;-P (just kidding)
[16:04] <learningc> ndufresne, well, seriously...
[16:04] * ndufresne looking for a link for you
[16:05] <ndufresne> this one is old, but accurate: https://lwn.net/Articles/203924/
[16:05] <ndufresne> full reference is here: https://linuxtv.org/downloads/v4l-dvb-apis-new/uapi/v4l/v4l2.html
[16:07] <ndufresne> learningc, and this one too: https://linuxtv.org/downloads/v4l-dvb-apis-new/uapi/v4l/capture.c.html
[16:09] <learningc> ndufresne, Thanks. Your links will prove helpful :)
[16:11] <ndufresne> this is of course all pretty low level, you can use GStreamer if you need something faster, but I guess you really want to learn this, right?
[16:12] <ndufresne> * something faster -> something higher level, just to read from your cam without having to learn all the low-level Linux kernel stuff
[16:20] *** benjiG has left
[17:22] <learningc> I need the low level :)
[17:25] <javier__> learningc: you can also check yavta
[17:25] <javier__> http://git.ideasonboard.org/yavta.git/blob/HEAD:/yavta.c
[19:23] *** philomath has left "WeeChat 2.0.1"
[20:10] <grohne> is it ok to have both a hwmon and a v4l driver (driving the same i2c device) in the same file under drivers/media/i2c?
[20:11] <grohne> that's certainly unusual
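On the hwmon question, a rough sketch only of what combining the two in one i2c driver could look like: the same probe wraps the client in a v4l2 subdev and also registers a hwmon temperature device. The mysensor_* names, the single temperature channel and the constant reading are all placeholders, the two-argument i2c probe matches the API of that era, the i2c_driver/module boilerplate is elided, and whether such a combined driver belongs under drivers/media/i2c is a question for the maintainers rather than something this sketch settles.

#include <linux/hwmon.h>
#include <linux/i2c.h>
#include <linux/module.h>
#include <media/v4l2-async.h>
#include <media/v4l2-common.h>
#include <media/v4l2-subdev.h>

struct mysensor {
	struct v4l2_subdev sd;
};

/* Real subdev ops (core/video/pad) would go here. */
static const struct v4l2_subdev_ops mysensor_subdev_ops = {
};

static umode_t mysensor_hwmon_is_visible(const void *data,
					  enum hwmon_sensor_types type,
					  u32 attr, int channel)
{
	return 0444;
}

static int mysensor_hwmon_read(struct device *dev,
			       enum hwmon_sensor_types type,
			       u32 attr, int channel, long *val)
{
	/* Placeholder: read the chip's temperature register instead. */
	*val = 25000;	/* millidegrees Celsius */
	return 0;
}

static const struct hwmon_ops mysensor_hwmon_ops = {
	.is_visible = mysensor_hwmon_is_visible,
	.read = mysensor_hwmon_read,
};

static const u32 mysensor_temp_config[] = {
	HWMON_T_INPUT,
	0
};

static const struct hwmon_channel_info mysensor_temp = {
	.type = hwmon_temp,
	.config = mysensor_temp_config,
};

static const struct hwmon_channel_info *mysensor_hwmon_info[] = {
	&mysensor_temp,
	NULL
};

static const struct hwmon_chip_info mysensor_hwmon_chip_info = {
	.ops = &mysensor_hwmon_ops,
	.info = mysensor_hwmon_info,
};

static int mysensor_probe(struct i2c_client *client,
			  const struct i2c_device_id *id)
{
	struct mysensor *sensor;
	struct device *hwmon;

	sensor = devm_kzalloc(&client->dev, sizeof(*sensor), GFP_KERNEL);
	if (!sensor)
		return -ENOMEM;

	/* V4L2 side: wrap the i2c client in a subdev, register it async. */
	v4l2_i2c_subdev_init(&sensor->sd, client, &mysensor_subdev_ops);
	sensor->sd.flags |= V4L2_SUBDEV_FL_HAS_DEVNODE;

	/* hwmon side: the same chip exposed as a temperature sensor. */
	hwmon = devm_hwmon_device_register_with_info(&client->dev,
						     "mysensor", sensor,
						     &mysensor_hwmon_chip_info,
						     NULL);
	if (IS_ERR(hwmon))
		return PTR_ERR(hwmon);

	return v4l2_async_register_subdev(&sensor->sd);
}

/* i2c_device_id table, i2c_driver and module_i2c_driver() elided. */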