for you.. I believe it should be possible to get 50 fps progressive output from 50i material using VDPAU deinterlacing.
Yes, my PCI GeForce 8400 card with the G98 GPU can do 1080p@50, but only with the simplest bob deinterlacing. I would prefer to use a more advanced deinterlacing algorithm - temporal_spatial or temporal - but with those I get jerky video.
Hm, do you think you were only able to use simple deinterlacing because of limitations in CPU speed, bus bandwidth (PCI only - I've been thinking about one of the 8400 GSs myself so I can play with it in my PCI-only VDR box..), or processing power within the G98 GPU itself?
I don't know exactly. There are reports that temporal works well on the G98, and I believe them. So I suppose the PCI bus is the limiting factor.
NVIDIA mentions this in http://us.download.nvidia.com/XFree86/Linux-x86/185.18.29/README/appendix-h....: "In order for either VDP_VIDEO_MIXER_FEATURE_DEINTERLACE_TEMPORAL or VDP_VIDEO_MIXER_FEATURE_DEINTERLACE_TEMPORAL_SPATIAL to operate correctly, the application must supply at least 2 past and 1 future field to each VdpVideoMixerRender call. If those fields are not provided, the VdpVideoMixer will fall back to bob de-interlacing."
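To make that concrete, here is a minimal sketch of what that field-history requirement looks like at the API level. This is an illustration only, not code from any particular player: the function pointers are assumed to have been fetched via VdpGetProcAddress beforehand, and the device and surface handles are placeholders for whatever the player's decode queue provides.

  #include <vdpau/vdpau.h>

  /* Create a mixer with temporal deinterlacing requested and enabled.
     video_mixer_create / set_feature_enables are the function pointers
     obtained via VdpGetProcAddress (assumed set up elsewhere). */
  static VdpVideoMixer
  create_deint_mixer(VdpVideoMixerCreate *video_mixer_create,
                     VdpVideoMixerSetFeatureEnables *set_feature_enables,
                     VdpDevice device, uint32_t width, uint32_t height)
  {
      VdpVideoMixerFeature features[] = {
          VDP_VIDEO_MIXER_FEATURE_DEINTERLACE_TEMPORAL
      };
      VdpChromaType chroma = VDP_CHROMA_TYPE_420;
      VdpVideoMixerParameter params[] = {
          VDP_VIDEO_MIXER_PARAMETER_VIDEO_SURFACE_WIDTH,
          VDP_VIDEO_MIXER_PARAMETER_VIDEO_SURFACE_HEIGHT,
          VDP_VIDEO_MIXER_PARAMETER_CHROMA_TYPE
      };
      void const *param_values[] = { &width, &height, &chroma };
      VdpVideoMixer mixer;

      video_mixer_create(device, 1, features, 3, params, param_values, &mixer);

      /* Requesting the feature at creation is not enough;
         it must also be explicitly enabled. */
      VdpBool enable[] = { VDP_TRUE };
      set_feature_enables(mixer, 1, features, enable);
      return mixer;
  }

  /* Per output field: hand the mixer 2 past and 1 future field, as the
     README requires - otherwise it silently drops back to bob. */
  static void
  render_one_field(VdpVideoMixerRender *video_mixer_render,
                   VdpVideoMixer mixer,
                   VdpVideoSurface past[2],   /* 2 most recent past fields */
                   VdpVideoSurface current,
                   VdpVideoSurface future,    /* 1 future field */
                   int top_field,
                   VdpOutputSurface out)
  {
      video_mixer_render(mixer,
                         VDP_INVALID_HANDLE, NULL,   /* no background */
                         top_field
                             ? VDP_VIDEO_MIXER_PICTURE_STRUCTURE_TOP_FIELD
                             : VDP_VIDEO_MIXER_PICTURE_STRUCTURE_BOTTOM_FIELD,
                         2, past,
                         current,
                         1, &future,
                         NULL,                       /* full source rect */
                         out, NULL, NULL,
                         0, NULL);                   /* no layers */
  }

Supplying the history also implies extra reads of the past/future surfaces per output field, which is exactly the kind of bandwidth cost that could hurt on a plain PCI card.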
.. But you might get better results if you output the interlaced material 1:1 using a 1080i50 mode and let the LCD TV do the deinterlacing..
Yes, that's exactly what I want to try, but I couldn't get 1080i@50 running properly - even though Xorg.log reports the 1080i@50 mode as validated, my LCD TV (Philips 9703PFL) always reported a 1080p source on HDMI.
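For reference, a sketch of how such a mode can be forced in xorg.conf: the timing numbers are the standard CEA-861 1080i50 modeline (74.25 MHz pixel clock), while the section identifiers are placeholders rather than anyone's actual config.

  Section "Monitor"
      Identifier "TV"
      # Standard CEA-861 1080i50 timing, 74.25 MHz pixel clock.
      # The "Interlace" flag is what requests interlaced scanout.
      ModeLine "1920x1080i50" 74.25 1920 2448 2492 2640 1080 1084 1094 1125 +hsync +vsync Interlace
  EndSection

  Section "Screen"
      Identifier "Screen0"
      Monitor    "TV"
      SubSection "Display"
          Modes "1920x1080i50"
      EndSubSection
  EndSection

Even with a validated interlaced mode, some driver/TV combinations still end up with the GPU scaling or the sink reporting progressive input, which would match the 1080p indication described above.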
:( Native output and letting the TV do the grunt work would be my preference, too - in what way doesn't it work? No output at all, or very juddery output?
I could get more or less good quality output at 1080i@50, but I decided to go back to 1080p@50 because that mode was better (I repeat, my TV set always reported a 1080p mode, never 1080i).
Goga