<!-- Some styling for better description lists --><style type='text/css'>dt { font-weight: bold;float: left;display:inline;margin-right: 1em} dd { display:block; margin-left: 2em}</style>

   pinchartl: <u>andrzej_p</u>: no it's not, and it has always been rejected
   <br> mostly to avoid such a loopback driver being abused by vendors
   <br> and to push them to implement open V4L2 kernel drivers, instead of minimal kernel code exposing a custom API plus a binary userspace blob that would then handle the camera and expose the data through a V4L2 loopback
   andrzej_p: <u>pinchartl</u>: very good explanation, thanks
   <br> <u>pinchartl</u>: I haven't thought about the dark side of what such a loopback device opens
   ndufresne: <u>andrzej_p</u>: pinchartl: Well, vivid has a loopback feature; it's less known, but it works quite well. With a small hack to limit the available resolutions you can get the same result
   andrzej_p: <u>ndufresne</u>: what's the difference then?
   <br> I mean between vivid and v4l2loopback.
   ndufresne: v4l2loopback handles limiting the formats of the capture based on the output queue
   <br> but is totally screwed in buffer management and V4L2 compliance (and I would not touch this driver on a production machine, even in a VM)
   <br> while vivid is pretty solid, but you have to match the right format/resolution for the loopback to work
   <br> otherwise you get a test pattern
   <br> what I've done so far is monitor what the browser selects, and then match this on the OUTPUT queue; side effect: there is a small amount of time where a test pattern is shown
   <br> you cannot just script-kiddie a loopback with vivid
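   A rough sketch of the vivid loopback setup described above, using v4l2-ctl. The device numbers are hypothetical (they vary per system), and the <code>loop_video</code> control name is taken from the vivid driver documentation and may differ by kernel version:

```shell
# Load vivid; by default one instance provides both a video capture
# node and a video output node (check v4l2-ctl --list-devices).
modprobe vivid

# Enable the loopback: the "Loop Video" control on the capture node
# feeds the output queue back into the capture queue.
v4l2-ctl -d /dev/video0 -c loop_video=1

# As noted above, the formats must match on both queues, otherwise
# the capture side keeps producing a test pattern.
v4l2-ctl -d /dev/video1 --set-fmt-video-out=width=1280,height=720,pixelformat=YUYV
v4l2-ctl -d /dev/video0 --set-fmt-video=width=1280,height=720,pixelformat=YUYV
```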
   andrzej_p: <u>ndufresne</u>: thanks for explaining; now why can't vivid be put to "unfair" use? I assume if it could, it would never have been merged in the first place.
   ndufresne: <u>andrzej_p</u>: I don't know, incompetence?
   <br> <u>andrzej_p</u>: the truth is that V4L2 is complex, and vendors and closed-source driver maintainers won't put the extra effort into abusing Linux
   <br> the ones that do, don't use V4L2; they actually write custom drivers, with a custom library on top that abstracts the OS
   <br> think of Blackmagic, with the DeckLink cards, as an example
   <br> <u>andrzej_p</u>: I guess what pinchartl means is that features like v4l2loopback or the vivid loopback might possibly allow some legit way of having proprietary drivers
   <br> It's quite a bad argument considering that what people want it for is a way to synthesize a camera
   <br> I've seen people writing blogs about how to use loopback with OBS, or people using RTSP/ONVIF security cameras as webcams
   <br> so to me, it's sad that some very legit use cases are considered an abuse, but at the same time I find it ridiculous to have to use the kernel for such a case
   <br> There is light now: browsers do screen sharing, and with Wayland, where you can no longer spy on people's desktops without authorization, came pipewire to tunnel streams from the compositor to browsers
   <br> So to me, the way forward is to find a way to expose synthetic cameras through pipewire, and stop looping through the kernel
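   As a concrete illustration of the loopback uses mentioned above (OBS virtual cameras, RTSP/ONVIF feeds), a minimal sketch using the module parameters documented in the v4l2loopback README; the RTSP URL is hypothetical:

```shell
# Create one loopback device at /dev/video10. exclusive_caps=1 makes
# the node report only capture capabilities once a producer attaches,
# which some browsers require before listing it as a webcam.
modprobe v4l2loopback devices=1 video_nr=10 card_label="Virtual Camera" exclusive_caps=1

# Feed an RTSP security camera into the loopback node with ffmpeg
# (camera URL is a placeholder).
ffmpeg -i rtsp://camera.example/stream -f v4l2 -pix_fmt yuyv422 /dev/video10

# OBS, browsers, or any V4L2-aware tool can then open /dev/video10.
```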
   andrzej_p: <u>ndufresne</u>: my use case is a smartphone acting as a video camera while connected through USB. There is an app (you can easily find it) which pushes video (and audio) through adb.
   <br> on the host side there's another app (that one with code which can be found on GitHub) which demultiplexes the stream and - yes - wants a v4l2 loopback device so that tools such as OBS can broadcast the video.
   ndufresne: fun
   <br> I have also started a project here, which consists of using an OTG-powered Allwinner board plus the UVC gadget
   andrzej_p: Very well, but I want to take advantage of the equipment I already have, which is a smartphone with a camera that is decent enough to provide nice 720p streams on YT
   ndufresne: so I can inject things from the network and make them appear as a UVC camera
   <br> that will probably only work over USB3 though
   <br> unless you compress/decompress?
   andrzej_p: my use case works well with USB 2
   <br> but yes, I suspect some compression might be applied
   ndufresne: <u>andrzej_p</u>: it remains that having to use the kernel here is obviously a hack ;-P
   pinchartl: <u>ndufresne</u>: sshhhhh don't mention the vivid loopback feature :-)
   <br> <u>andrzej_p</u>: how about using the UVC gadget on the phone side?
   andrzej_p: <u>pinchartl</u>: it runs Android, so not easy.
   <br> otherwise that of course would be the best option
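   For reference, the UVC gadget pinchartl suggests is set up through configfs on the device side. A heavily abbreviated, hypothetical sketch following the kernel's gadget configfs conventions; a working gadget also needs the streaming header/class symlinks and a userspace application (e.g. uvc-gadget) feeding frames, and all IDs and sizes here are illustrative:

```shell
# Create a gadget and give it (illustrative) vendor/product IDs.
cd /sys/kernel/config/usb_gadget
mkdir g1 && cd g1
echo 0x1d6b > idVendor    # Linux Foundation
echo 0x0104 > idProduct   # Multifunction composite gadget

# One configuration, one UVC function.
mkdir -p configs/c.1 functions/uvc.0

# Advertise a single uncompressed 720p frame size.
mkdir -p functions/uvc.0/streaming/uncompressed/u/720p
echo 1280 > functions/uvc.0/streaming/uncompressed/u/720p/wWidth
echo 720  > functions/uvc.0/streaming/uncompressed/u/720p/wHeight

ln -s functions/uvc.0 configs/c.1/

# Bind to the first available USB device controller (name varies by SoC).
ls /sys/class/udc | head -n1 > UDC
```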
   pinchartl: in theory you could write a libcamera pipeline handler for this. we're missing the ability to handle cameras without an MC device in the backend, but I think it could be a good path forward
   tfiga: <u>ndufresne</u>: andrzej_p: I'd say there isn't even any incentive for such hacks :)
   <br> any platform that requires V4L2 drivers wouldn't allow such an "implementation"
   <br> and other platforms are okay without V4L2 drivers
   <br> I won't name the platforms, but I guess it's pretty much clear :)
   ***: benjiG has left
   ndufresne: agreed
   ezequielg: <u>koike</u>: tonyk loopback ^^^
   pinchartl: <u>tfiga</u>: there are a few of them :-)
   ndufresne: <u>ezequielg</u>: now you just reminded me that koike wanted to support loopback through MC
   ezequielg: I believe we might even have some patches.
   ndufresne: considering the level of knowledge people scripting around v4l2loopback usually have, configuring VIMC might be too much
   tfiga: <u>pinchartl</u>: we made that mistake back in the day and allowed one of the vendors to "emulate" V4L2 using libv4l2
   <br> we're paying for that even now
   <br> I think everyone needs to make a mistake like this once
   pinchartl: <u>tfiga</u>: we still have no standard solution to handle cameras not backed by a local hardware device. network protocols come to mind. but maybe that's the job of a higher-level framework
   ndufresne: <u>pinchartl</u>: we'd like to demonstrate this over pipewire, that's our goal, but there is still some plumbing
   <br> pipewire has the interface to expose cameras to browsers, at least in a single-stream way, but there is no interface to expose a camera that isn't implemented inside pipewire itself (i.e. that isn't a built-in source)
   <br> basically it's missing the interface for the service that would be responsible for streaming from the remote cameras
   pinchartl: pipewire supports both plugins for internal providers and sources from external processes, right?
   ndufresne: but we can't open that door without opening the door to proprietary solutions, that's something we have to accept
   <br> <u>pinchartl</u>: it has generic nodes that you can configure to pass video streams through between processes
   <br> inside these nodes you can currently add Vulkan filters; that's pretty much all it has for now
   pinchartl: pipewire is on my libcamera todo list, but currently through an external process, not a plugin
   <br> there are too many dynamic allocations today to qualify for a plugin
   ndufresne: <u>pinchartl</u>: it's also not clear that the entire libcamera thing should be inside a pipeline node (realtime)