> I have an LCD projector with a native resolution of 800x600, connected via RGB.
So your projector has a 4:3 aspect ratio, and with 16:9 (HD) content you get 800x450.
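(That is just the aspect ratio arithmetic: 800 * 9/16 = 450 visible lines, which leaves (600 - 450) / 2 = 75 black lines at the top and at the bottom.)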
> HD 720p content shows smaller, in the middle of the screen, but looks amazing. HD 1080i likewise only fills the center of the screen and has
Because SD works full screen, I think this is something the downscaler does. Have you tried the various xineliboutput settings? How about xine-lib settings, if there are any?
> bad interlacing artifacts regardless of the deinterlacer settings I use. My video card is an nvidia 5500.
I think with XvMC you don't get deinterlacing, because you forward a sort of raw MPEG stream to the video card, and you are stuck if the card cannot do proper deinterlacing itself. Starting from the Nvidia 6600 there are PureVideo hardware deinterlacers, which work OK on the Windows side.
One possible direction would be to investigate xv_deinterlace, to see whether that can be used. But probably not together with XvMC.
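To illustrate what I mean about XvMC: presenting a hardware-decoded picture is a single call that only chooses a field/frame flag, so there is no point where a software deinterlacer could get at the pixels. A rough sketch against the XvMC client library (context, surface and decode setup omitted; the helper name is made up, the XvMC calls and flags are the real ones):

  #include <X11/Xlib.h>
  #include <X11/extensions/XvMClib.h>

  /* Show one hardware-decoded picture. The pixels never come back to
     the CPU, so any deinterlacing has to happen inside the card. */
  static void show_picture(Display *dpy, Window win, XvMCSurface *surface,
                           unsigned short w, unsigned short h)
  {
      XvMCPutSurface(dpy, surface, win,
                     0, 0, w, h,            /* source rectangle      */
                     0, 0, w, h,            /* destination rectangle */
                     XVMC_FRAME_PICTURE);   /* or XVMC_TOP_FIELD /
                                               XVMC_BOTTOM_FIELD     */
      XvMCSyncSurface(dpy, surface);        /* wait for completion   */
  }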
BR, Jori
jori.hamalainen@teliasonera.com wrote:
> I think with XvMC you don't get deinterlacing, because you forward a sort of raw MPEG stream to the video card, and you are stuck if the card cannot do proper deinterlacing itself. Starting from the Nvidia 6600 there are PureVideo hardware deinterlacers, which work OK on the Windows side.
Here is a summary:
http://www.nvidia.com/page/purevideo_support.html
For SDTV, the "spatial temporal" deinterlacing is supported from the 6150 and 6200 chips upwards. A 6600 is needed for HD. Despite rumours in the nvidia forum, nothing has happened on the Linux support front :^(
> One possible direction would be to investigate xv_deinterlace, to see whether that can be used. But probably not together with XvMC.
In the past I have been able to activate some kind of linear blend (loses vertical resolution) and bob (some blinking) deinterlacers with xxmc. Since xine-lib CVS 1.1.3/4 and the new Nvidia binary drivers, I'm not able to activate deinterlacing any more.
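For clarity, by "activate" I mean wiring xine-lib's tvtime post plugin in front of the video out, roughly like this (a from-memory sketch against the public xine.h API, so take the details with a grain of salt; selecting the actual method, LinearBlend and so on, goes through the plugin's "parameters" input and is omitted, and the helper name is made up):

  #include <xine.h>

  /* Sketch: insert the tvtime software deinterlacer between the
     stream's video source and the video output port. With xxmc the
     decoded frames stay on the card, which is presumably why this
     path no longer works for me. */
  static void wire_deinterlacer(xine_t *xine, xine_stream_t *stream,
                                xine_video_port_t *vo_port)
  {
      xine_post_t *post = xine_post_init(xine, "tvtime", 0, NULL, &vo_port);

      if (post)
          xine_post_wire_video_port(xine_get_video_source(stream),
                                    post->video_input[0]);
  }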
This is a bit off-thread, but I guess the pain in the *** that is interlaced video won't be killed in the near future. Has anyone been able to output properly scaled 1080i video with xineliboutput over DVI/HDMI, without deinterlacing in the PC, leaving it to the display device?
I need X.org and like to play games on the HTPC, so fbdev is not an option for me.
BR, Seppo
In 457AAAAB.1080705@iki.fi, Seppo Ingalsuo wrote:
> This is a bit off-thread, but I guess the pain in the *** that is interlaced video won't be killed in the near future. Has anyone been able to output properly scaled 1080i video with xineliboutput over DVI/HDMI, without deinterlacing in the PC, leaving it to the display device?
> I need X.org and like to play games on the HTPC, so fbdev is not an option for me.
Apparently mythtv can sync playback to an X display by using OpenGL or DRI to provide vsync, which should allow interlaced video to be played properly on a suitable display. I think it would only really work on a traditional PAL or NTSC TV, though.
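For the record, the OpenGL route boils down to the GLX_SGI_video_sync extension. A cut-down sketch of the idea (not my player's actual code; you need a current, direct-rendering GLX context, error handling is omitted, and the helper names here are made up, only the GLX entry points are the real ones):

  #include <GL/glx.h>

  /* GLX_SGI_video_sync entry points, looked up at run time. If they
     are missing, as on my laptop, you have to fall back to DRI or to
     free-running output. */
  typedef int (*GetSyncProc)(unsigned int *count);
  typedef int (*WaitSyncProc)(int divisor, int remainder, unsigned int *count);

  static GetSyncProc  get_sync;
  static WaitSyncProc wait_sync;

  static int init_video_sync(void)
  {
      get_sync  = (GetSyncProc)
          glXGetProcAddressARB((const GLubyte *) "glXGetVideoSyncSGI");
      wait_sync = (WaitSyncProc)
          glXGetProcAddressARB((const GLubyte *) "glXWaitVideoSyncSGI");
      return get_sync != NULL && wait_sync != NULL;
  }

  static void present_frame(void)
  {
      unsigned int count;

      wait_sync(1, 0, &count);   /* block until the next vertical retrace */
      /* ... draw and swap here ... */
  }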
I've tried it out in an experimental video player I'm writing (which is easier than getting mythtv to work!). The vsync appears to work but I haven't tried it on a TV yet to see if it solves the interlacing problem. My VDR box uses a Matrox card with DirectFB now. I used to use an ATI card with a SCART adaptor for a while, so I tried it with my ATI-based laptop, but I couldn't get that to do PAL. I do have access to a desktop PC with an ATI card and an old Linux installation, though, so I'll try taking the TV to the computer instead of the other way round...
If anyone else who's interested is using X on a TV with a VGA to RGB SCART adaptor, my code is available via svn from:
https://svn.sourceforge.net/svnroot/boxstar/tags/try1/boxav
You'll need the dev packages for libavcodec/libavformat (ffmpeg), SDL and ALSA. Any other output plugins referred to aren't developed yet. It needs to be installed before it can be run, so that it can locate its plugins. Another note: I had to use --disable-opengl with the configure script on my laptop because the relevant GLX function was missing; ATI cards can use DRI instead.
The binary is called boxplay. It's still very crude, but it should work well enough to see whether it can play interlaced video correctly. It can play VDR files. To stop it while it's running, just click on the screen.
If it proves to work it would be a useful function to have in xine. I don't know xine's code well enough to hook it in, but perhaps Darren Salt could do it?
In 20061216225847.GA10797@realh.co.uk, Tony Houghton wrote:
> I've tried it out in an experimental video player I'm writing (which is easier than getting mythtv to work!). The [DRI] vsync appears to work but I haven't tried it on a TV yet to see if it solves the interlacing problem.
I managed to try it out today. It doesn't quite work, but I think it has the potential to be made to. The problem is that vsync interrupts are generated at 50Hz rather than 25Hz, and I don't know how to distinguish between the top and bottom fields. The mythtv developers might know, though, so I'll ask on their mailing list.
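If the retrace counter's parity does turn out to map consistently onto fields, the divisor/remainder arguments of glXWaitVideoSyncSGI look like they would select one directly. Something like this, reusing the wait_sync pointer from my earlier sketch (untested, and which parity is the top field is exactly the open question):

  unsigned int count;

  /* glXWaitVideoSyncSGI(divisor, remainder, &count) returns once
     count % divisor == remainder, so with divisor 2 each call wakes
     on alternate 50Hz retraces, i.e. on one field of the 25Hz frame. */
  wait_sync(2, 0, &count);   /* "even" retraces */
  /* ... draw field A ... */
  wait_sync(2, 1, &count);   /* "odd" retraces  */
  /* ... draw field B ... */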