Who | What | When |
---|---|---|
*** | al has quit IRC (Quit: al) | [00:45] |
............................................................................................................... (idle for 9h14mn) | ||
*** | prabhakarlad has left | [09:59] |
............. (idle for 1h4mn) | ||
*** | bparrot has quit IRC (Remote host closed the connection) | [11:03] |
........... (idle for 54mn) | ||
hverkuil | ezequielg: ping | [11:57] |
.............................. (idle for 2h25mn) | ||
*** | prabhakarlad has left | [14:22] |
.... (idle for 19mn) | ||
*** | prabhakarlad has left | [14:41] |
....... (idle for 32mn) | ||
ezequielg | hverkuil: pong
sorry for the late pong :-) | [15:13] |
hverkuil: oh, thanks for the feedback.
late start today. | [15:19] | |
hverkuil | ezequielg: I'd forgotten about my ping :-)
Quick question regarding the rockchip patches: are the jpeg codec and the mpeg codec using the same video devices? i.e. the decoder (say) uses videoX and the output format enumeration gives both jpeg and mpeg formats? | [15:28] |
ezequielg | no.
as the cover letter shows in the dot graph. encoding is e.g. videoX, decoding is e.g. videoY. it's a major pain to implement (from the codec API point of view) otherwise, and a major pain for users. in other words, i'd really like to have video devices with only one encoded pad/side | [15:34] |
hverkuil | That's not what I mean. Given the video node for decoding (videoY in your example): can that be used for both JPEG decoding and MPEG decoding, depending on the selected output format?
Or are the JPEG and MPEG decoders handled in separate video nodes? | [15:40] |
ezequielg | ah
yes, a given video node handles multiple video formats for decoding. so, yes, it can be used for e.g. MPEG-2 decoding, and whatever other codec is supported by that VPU engine. hverkuil: note that rockchip has two different engines, one is called VPU and the other is called VDEC. both will be supported by the driver. | [15:40] |
hverkuil | OK. This may cause problems for applications since JPEG is quite different from MPEG (frame-by-frame decoding vs a bitstream) and this may be hard to detect in a generic way. | [15:42] |
ezequielg | Indeed.
So, what I meant is: it will support multiple stateless decoders. | [15:42] |
hverkuil | I'm not aware of any codec drivers today that support both in a single video node. JPEG codecs are normally standalone devices. | [15:43] |
ezequielg | I don't have plans to support JPEG decoding tbh.
but even if we do, I guess we can route that via its own video node. as long as it goes to a single m2m, to serialize the job in a single engine, things will work | [15:43] |
hverkuil | Ah, I see. The current rockchip driver supports a JPEG encoder and your patch series adds an MPEG decoder. Each has its own video node. Right? | [15:45] |
ezequielg | yup | [15:45] |
hverkuil | And if an MPEG encoder would be added as well, then it is probably a good idea to give that its own video node and not to mix it with the JPEG encoder. Ditto when adding support for a JPEG decoder. | [15:46] |
ezequielg | even if we have a single video node per codec, I don't think that affects anything. | [15:49] |
hverkuil | It should be fine to support h264/mpeg/etc in one video node, but adding jpeg to that mix is not a good idea. Just something to keep in mind for the future. | [15:50] |
ezequielg | unless, we add stateless JPEG decoding :-)
with the controls we discussed a while ago. although, I am not sure I will open that can. | [15:57] |
.... (idle for 16mn) | ||
hverkuil: if possible, please review the media controller changes first on that patchset.
because you will dislike that the most :) | [16:13] | |
.... (idle for 16mn) | ||
*** | benjiG has left | [16:30] |
ndufresne | hverkuil, in your review, you seem to emphasize that using the M variant of the pixel format implies that the memory planes are non-contiguous
hverkuil, I personally don't think the usage of the M plane variant should be bound to how many allocations you have, as this API can be used to represent single-allocation planes with arbitrary padding between planes | [16:31] |
hverkuil | ndufresne: that's not the case today: the M variants all assume non-contiguous planes, each allocated separately. | [16:37] |
ndufresne | that is sad, does that mean yet a third API for the case I describe ?
we really need something cleaner when comes time to import external buffer, it's such a pain for userspace, I spent several weeks on implementing this in gstreamer, we we still have case where I forgot to validate something | [16:38] |
hverkuil | We have plans to make something cleaner, yes, and yes, that will be a third unified API. | [16:39] |
ndufresne | will be be able to handle multiple formats in a single queue ? | [16:39] |
hverkuil | I agree completely: we messed up at the time, we should have bitten the bullet at the time and made new versions of the streaming I/O ioctls that can cope with such things. | [16:40] |
ndufresne | sorry, not format, but frame size ? | [16:40] |
hverkuil | That we can already with CREATE_BUFS. | [16:40] |
ndufresne | well, not really
I don't think it's feasible to use the current API or at least, I don't see how and even if we could, you keep increasing the number of buffers, which is not what you want let's say you are encoding | [16:40] |
hverkuil | We're missing DELETE_BUF, I know.
And CREATE_BUFS isn't perfect either. | [16:42] |
ndufresne | and you have a resolution change that you want to happen has gapless as possible, you want to simply start queuing/importing buffers of the new size
And then when the encoder reach that new size, something should happen which may implicate the userspace, but you want the buffers to have been wrapped by the kernel already | [16:42] |
*** | pdemier has left "Leaving" | [16:43] |
hverkuil | Erm, let's get the codec API settled first, shall we? There is so much that can be improved, but one thing at a time. | [16:44] |
ndufresne | but ok, I'm clearly speculating here, I'll check the propose API when time comes
hverkuil, btw, you should dig up why an multi-format codec can't have jpeg, it makes very little sense to me if you look at amlogic driver, it's multi-codec, and each codec has pixel format particularity, and it's surprisingly cleanly handled so far, it's separate for the only reason that all integrated HW have seperate (unrelated) IP and usespace isn't affected by that, because it first decide which formats it will use the node for, and then use the associate ioctl/ctrl subset for that format | [16:44] |
hverkuil | A JPEG codec behaves differently from a bitstream codec (esp. a stateless bitstream codec). While it is certainly possible to support them all in one video device, I am not convinced that it is a good idea.
When we get a driver like that, I would have to take a good look at it. | [16:48] |
ndufresne | well, it is not necessarily stateless, and jpeg is a bitstream that you can parse from random location, so I don't understand why you say it's so different
ndufresne just trying to find some meat behind this statement | [16:49] |
hverkuil | I'm pretty sure none of the JPEG codecs today can start at a random point. AFAIK they all expect a normal JPEG 'picture' to decode. | [16:51] |
ndufresne | same apply for all our H264 decoder, I only meant to describe the bitstream | [16:51] |
hverkuil | Anyway, we'll see how it will work out. It hasn't crystallized for me either, I just have a gut feeling that there may be problems if you attempt to mix them in a single video node. | [16:54] |
ezequielg | following the codec API, userspace first sets the encoded format, and then things just flow from there. As long as you have one encoded side (i.e. the node is a decoder only or encoder only), I believe it should be OK.
but that might be naive. we still have to merge h264, hevc, vp8 and vp9 controls... so there is a lot to talk about. and we haven't discussed stateless encoders. | [17:02] |
.......... (idle for 49mn) | ||
*** | _abbenormal has quit IRC (Quit: Leaving) | [17:52] |
.............. (idle for 1h9mn) | ||
ndufresne | ezequielg, agreed
ezequielg, I thought you wanted to rename v4l2_m2m_buf_copy_data, but I see you just use it as-is, did I miss something? | [19:01] |
ezequielg | hm
they are two different patchsets. I expect the rename to be rejected or accepted sooner than the mpeg-2. sorry if this is confusing, I am actually doing lots of changes and it's not easy to keep track of them all. | [19:06] |
........................... (idle for 2h14mn) | ||
*** | ana has left | [21:21] |
.... (idle for 15mn) | ||
ezequielg | hverkuil: I just submitted a v2 for the pixel format helpers alone. Note that since we now have the helpers in v4l2-common.c, a small change is needed to rockchip_vpu_enc.c to avoid a collision. | [21:36] |
............. (idle for 1h3mn) | ||
*** | bingbu has quit IRC (Ping timeout: 244 seconds) | [22:39] |