[00:45] *** al has quit IRC (Quit: al)
[09:59] *** prabhakarlad has left
[11:03] *** bparrot has quit IRC (Remote host closed the connection)
[11:57] <hverkuil> ezequielg: ping
[14:22] *** prabhakarlad has left
[14:41] *** prabhakarlad has left
[15:13] <ezequielg> hverkuil: pong
[15:13] <ezequielg> sorry for the late pong :-)
[15:19] <ezequielg> hverkuil: oh, thanks for the feedback.
[15:19] <ezequielg> late start today.
[15:28] <hverkuil> ezequielg: I'd forgotten about my ping :-)
[15:28] <hverkuil> Quick question regarding the rockchip patches:
[15:29] <hverkuil> are the jpeg codec and the mpeg codec using the same video devices?
[15:29] <hverkuil> i.e. the decoder (say) uses videoX and the output format enumeration gives both jpeg and mpeg formats?
[15:34] <ezequielg> no.
[15:34] <ezequielg> as the cover letter shows in the dot graph.
[15:34] <ezequielg> encoding is e.g. videoX, decoding is e.g. videoY
[15:35] <ezequielg> it's a major pain to implement (from the codec API point of view) otherwise.
[15:35] <ezequielg> and a major pain for users.
[15:38] <ezequielg> in other words, I'd really like to have video devices with only one encoded pad/side
[15:40] <hverkuil> That's not what I mean. Given the video node for decoding (videoY in your example): can that be used for both JPEG decoding and MPEG decoding, depending on the selected output format?
[15:40] <hverkuil> Or are the JPEG and MPEG decoders handled in separate video nodes?
[15:40] <ezequielg> ah
[15:40] <ezequielg> yes, a given video node handles multiple video formats for decoding.
[15:42] <ezequielg> so, yes, it can be used for e.g. MPEG-2 decoding, and whatever other codec is supported by that VPU engine.
[15:42] <ezequielg> hverkuil: note that rockchip has two different engines, one is called VPU and the other is called VDEC.
[15:42] <ezequielg> both will be supported by the driver.
[15:42] <hverkuil> OK. This may cause problems for applications since JPEG is quite different from MPEG (frame-by-frame decoding vs a bitstream) and this may be hard to detect in a generic way.
[15:42] <ezequielg> Indeed.
[15:43] <ezequielg> So, what I meant is: it will support multiple stateless decoders.
[15:43] <hverkuil> I'm not aware of any codec drivers today that support both in a single video node. JPEG codecs are normally standalone devices.
[15:43] <ezequielg> I don't have plans to support JPEG decoding tbh.
[15:44] <ezequielg> but even if we do, I guess we can route that via its own video node.
[15:44] <ezequielg> as long as it goes through a single m2m, to serialize the jobs in a single engine, things will work
[15:45] <hverkuil> Ah, I see. The current rockchip driver supports a JPEG encoder and your patch series adds an MPEG decoder. Each has its own video node. Right?
[15:45] <ezequielg> yup
[15:46] <hverkuil> And if an MPEG encoder were added as well, it would probably be a good idea to give it its own video node and not mix it with the JPEG encoder. Ditto when adding support for a JPEG decoder.
[15:49] <ezequielg> even if we have a single video node per codec, I don't think that affects anything.
[15:50] <hverkuil> It should be fine to support h264/mpeg/etc in one video node, but adding jpeg to that mix is not a good idea. Just something to keep in mind for the future.
[15:57] <ezequielg> unless we add stateless JPEG decoding :-)
[15:57] <ezequielg> with the controls we discussed a while ago.
[15:57] <ezequielg> although, I am not sure I will open that can.
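A minimal sketch of the format enumeration hverkuil asks about at 15:29: how userspace would discover which coded formats a single decoder node (videoY in ezequielg's example) accepts. The device path and the use of the multiplanar buffer type are assumptions for illustration, not taken from the rockchip series.

/*
 * Enumerate the coded formats a decoder video node accepts on its
 * OUTPUT (bitstream) queue. "/dev/videoY" is a hypothetical path;
 * the MPLANE buffer type is an assumption.
 */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

int main(void)
{
	int fd = open("/dev/videoY", O_RDWR);
	if (fd < 0)
		return 1;

	struct v4l2_fmtdesc fmt;
	memset(&fmt, 0, sizeof(fmt));
	/* The coded bitstream goes in on the OUTPUT side of an m2m decoder. */
	fmt.type = V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE;

	for (fmt.index = 0; ioctl(fd, VIDIOC_ENUM_FMT, &fmt) == 0; fmt.index++)
		printf("coded format %u: %.32s\n", fmt.index,
		       (const char *)fmt.description);

	return 0;
}

A node that handles multiple stateless codecs would simply report several coded formats here; hverkuil's concern is that a JPEG entry mixed into that list is hard for a generic application to handle.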
[16:13] <ezequielg> hverkuil: if possible, please review the media controller changes first on that patchset.
[16:14] <ezequielg> because you will dislike that the most :)
[16:30] *** benjiG has left
[16:31] <ndufresne> hverkuil, in your review, you seem to emphasize that using the M variant of the pixel format implies that the memory planes are non-contiguous
[16:32] <ndufresne> hverkuil, I personally don't think the usage of the M plane variant should be bound to how many allocations you have
[16:32] <ndufresne> as this API can be used to represent a single allocation with arbitrary padding between planes
[16:37] <hverkuil> ndufresne: that's not the case today: the M variants all assume non-contiguous planes, each allocated separately.
[16:38] <ndufresne> that is sad, does that mean yet a third API for the case I describe?
[16:38] <ndufresne> we really need something cleaner
[16:39] <ndufresne> when it comes time to import external buffers, it's such a pain for userspace, I spent several weeks implementing this in gstreamer, and we still have cases where I forgot to validate something
[16:39] <hverkuil> We have plans to make something cleaner, yes, and yes, that will be a third unified API.
[16:39] <ndufresne> will we be able to handle multiple formats in a single queue?
[16:40] <hverkuil> I agree completely: we messed up at the time, we should have bitten the bullet and made new versions of the streaming I/O ioctls that can cope with such things.
[16:40] <ndufresne> sorry, not formats, but frame sizes?
[16:40] <hverkuil> That we can already do with CREATE_BUFS.
[16:40] <ndufresne> well, not really
[16:40] <ndufresne> I don't think it's feasible with the current API
[16:41] <ndufresne> or at least, I don't see how
[16:41] <ndufresne> and even if we could, you keep increasing the number of buffers, which is not what you want
[16:42] <ndufresne> let's say you are encoding
[16:42] <hverkuil> We're missing DELETE_BUF, I know.
[16:42] <hverkuil> And CREATE_BUFS isn't perfect either.
[16:42] <ndufresne> and you have a resolution change that you want to happen as gaplessly as possible, you want to simply start queuing/importing buffers of the new size
[16:43] <ndufresne> And then when the encoder reaches that new size, something should happen which may involve userspace, but you want the buffers to have been wrapped by the kernel already
[16:43] *** pdemier has left "Leaving"
[16:44] <hverkuil> Erm, let's get the codec API settled first, shall we? There is so much that can be improved, but one thing at a time.
[16:44] <ndufresne> but ok, I'm clearly speculating here, I'll check the proposed API when the time comes
[16:45] <ndufresne> hverkuil, btw, you should dig into why a multi-format codec can't have jpeg, it makes very little sense to me
[16:45] <ndufresne> if you look at the amlogic driver, it's multi-codec, and each codec has its own pixel format particularities, and it's surprisingly cleanly handled
[16:47] <ndufresne> so far, it's separate only because all the integrated HW has separate (unrelated) IP
[16:47] <ndufresne> and userspace isn't affected by that, because it first decides which formats it will use the node for, and then uses the associated ioctl/ctrl subset for that format
[16:48] <hverkuil> A JPEG codec behaves differently from a bitstream codec (esp. a stateless bitstream codec). While it is certainly possible to support them all in one video device, I am not convinced that it is a good idea.
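A minimal sketch of the M-variant multiplanar API ndufresne and hverkuil debate at 16:31–16:37: V4L2_PIX_FMT_NV12M describes the luma and chroma planes separately, and, as hverkuil notes, today each plane is a separate allocation. The helper name and the use of the CAPTURE queue are illustrative assumptions; error handling is minimal and fd is an already-open video node.

/* Set NV12M (the "M" variant: one plane_fmt entry per memory plane). */
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

static int set_nv12m(int fd, unsigned int width, unsigned int height)
{
	struct v4l2_format fmt;

	memset(&fmt, 0, sizeof(fmt));
	fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE;
	fmt.fmt.pix_mp.width = width;
	fmt.fmt.pix_mp.height = height;
	fmt.fmt.pix_mp.pixelformat = V4L2_PIX_FMT_NV12M; /* two planes: Y, CbCr */

	if (ioctl(fd, VIDIOC_S_FMT, &fmt))
		return -1;

	/* The driver fills in one bytesperline/sizeimage pair per plane;
	 * with the M variants each plane is allocated on its own. */
	for (unsigned int i = 0; i < fmt.fmt.pix_mp.num_planes; i++)
		printf("plane %u: %u bytes per buffer\n", i,
		       fmt.fmt.pix_mp.plane_fmt[i].sizeimage);

	return 0;
}

ndufresne's point is that nothing in this structure inherently forbids the two planes from living in one allocation with padding between them; it is the current drivers and documentation that tie the M variants to separate allocations.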
[16:49] <hverkuil> When we get a driver like that, I would have to take a good look at it.
[16:49] <ndufresne> well, it is not necessarily stateless, and jpeg is a bitstream that you can parse from a random location, so I don't understand why you say it's so different
[16:49] * ndufresne just trying to find some meat behind this statement
[16:51] <hverkuil> I'm pretty sure none of the JPEG codecs today can start at a random point. AFAIK they all expect a normal JPEG 'picture' to decode.
[16:51] <ndufresne> the same applies to all our H264 decoders, I only meant to describe the bitstream
[16:54] <hverkuil> Anyway, we'll see how it works out. It hasn't crystallized for me either, I just have a gut feeling that there may be problems if you attempt to mix them in a single video node.
[17:02] <ezequielg> following the codec API, userspace first sets the encoded format, and then things just flow from there. As long as you have one encoded side (i.e. the node is a decoder only or an encoder only), I believe it should be OK.
[17:02] <ezequielg> but that might be naive.
[17:02] <ezequielg> we still have to merge h264, hevc, vp8 and vp9 controls... so there is a lot to talk about.
[17:03] <ezequielg> and we haven't discussed stateless encoders.
[17:52] *** _abbenormal has quit IRC (Quit: Leaving)
[19:01] <ndufresne> ezequielg, agreed
[19:04] <ndufresne> ezequielg, I thought you wanted to rename v4l2_m2m_buf_copy_data, but I see you just use it as-is, did I miss something?
[19:06] <ezequielg> hm
[19:06] <ezequielg> they are two different patchsets.
[19:07] <ezequielg> I expect the rename to be rejected or accepted sooner than the mpeg-2.
[19:07] <ezequielg> sorry if this is confusing, I am actually doing lots of changes and it's not easy to keep track of them all.
[21:21] *** ana has left
[21:36] <ezequielg> hverkuil: I just submitted a v2 for the pixel format helpers alone. Note that since we now have the helpers in v4l2-common.c, a small change is needed to rockchip_vpu_enc.c to avoid a collision.
[22:39] *** bingbu has quit IRC (Ping timeout: 244 seconds)
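A minimal sketch of the decoder flow ezequielg describes at 17:02: userspace sets the coded format on the OUTPUT queue first, and the raw CAPTURE format then follows from it. The V4L2_PIX_FMT_MPEG2_SLICE fourcc and the MPLANE buffer types are assumptions for illustration; a real stateless decode would also pass the slice parameter controls through a media request, which is omitted here.

/* Coded format first; the raw format "flows" from it. */
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

static int setup_decoder(int fd, unsigned int width, unsigned int height)
{
	struct v4l2_format fmt;

	/* 1. Set the coded format on the OUTPUT (bitstream) queue. */
	memset(&fmt, 0, sizeof(fmt));
	fmt.type = V4L2_BUF_TYPE_VIDEO_OUTPUT_MPLANE;
	fmt.fmt.pix_mp.pixelformat = V4L2_PIX_FMT_MPEG2_SLICE;
	fmt.fmt.pix_mp.width = width;
	fmt.fmt.pix_mp.height = height;
	if (ioctl(fd, VIDIOC_S_FMT, &fmt))
		return -1;

	/* 2. The CAPTURE (raw frame) format now follows from the coded one. */
	memset(&fmt, 0, sizeof(fmt));
	fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE_MPLANE;
	if (ioctl(fd, VIDIOC_G_FMT, &fmt))
		return -1;

	/* fmt.fmt.pix_mp now holds the driver-chosen raw format. */
	return 0;
}

Because the node has only one encoded side, step 1 unambiguously selects the codec, which is ezequielg's argument for keeping each video node decoder-only or encoder-only.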