On Tue, Jul 22, 2008 at 06:37:05PM +0200, Thomas Hilber wrote:
It appeared to be a privilege of so-called full-featured cards (expensive cards running proprietary firmware) to output true RGB PAL at a variable frame rate, thus always providing full synchronization with the stream.
I assume RGB NTSC should work as well..?
I live in Europe, so PAL is the thing for me, but sometimes you have video in NTSC too..
After some further experimenting I finally found a way to fine-tune the frame rate of my elderly Radeon-type card, this time without any bad side effects on the screen.
Just trimming the length of a few scanlines during the vertical retrace period does the trick.
<snip>
When xine-lib calls PutImage(), it checks whether to increase/decrease the X server's frame rate. This way, after a short adaptation phase, xine-lib can place its PutImage() calls right in the middle between two adjacent vertical blanking intervals. This provides maximum immunity against jitter. And even better: no more frames/fields are lost due to drift between the stream and graphics card frequencies.
Hmm.. can you explain what "increase/decrease the X server's frame rate" means?
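Is the idea something like the sketch below? All the names in it are made up by me just to check that I understand the logic, i.e. this is not what your patch actually looks like:

/* Placeholders for the parts I don't know -- invented names, not a real API. */
static double vblank_phase_of_putimage(void) { return 0.3; /* dummy value */ }
static void   trim_scanlines(int n)          { (void)n;    /* would poke the CRTC regs */ }

/* Called around every PutImage(): drive the phase towards 0.5, i.e. the
 * middle between two vblanks, by making the current frame slightly
 * shorter or longer (one scanline's worth at a time). */
void adjust_card_towards_midpoint(void)
{
    double phase = vblank_phase_of_putimage();  /* 0.0 = at a vblank, 0.5 = halfway */

    if (phase < 0.5)
        trim_scanlines(+1);   /* drop a scanline -> this frame gets shorter ->
                                 the card runs a hair faster -> phase creeps up */
    else if (phase > 0.5)
        trim_scanlines(-1);   /* add a scanline -> frame longer -> phase creeps down */
}

Or did I get it backwards?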
I don't really know how the X server or display drivers work nowadays, but back in the days when I was coding graphics stuff in plain assembly (under MS-DOS) I always did this to get perfectly synchronized output without any tearing:
1. Render frame to a (double) buffer in memory
2. Wait for vertical retrace to begin (beam moving from bottom of the screen to top)
3. Copy the double buffer to the display adapter framebuffer
4. Goto 1
So the video adapter framebuffer was always filled with a complete new frame right before it became visible on the monitor..
This way you always got full frame rate, smooth video, no tearing.. as long as your rendering took less than the duration of a single frame :)
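In C instead of assembly (and from memory, so take the exact DJGPP-style calls with a grain of salt) that loop looked roughly like this for mode 13h:

#include <pc.h>              /* inportb() */
#include <sys/movedata.h>    /* dosmemput() */

#define VGA_STATUS_PORT 0x3DA    /* VGA Input Status Register 1 */
#define VRETRACE_BIT    0x08     /* bit 3 = vertical retrace in progress */
#define SCREEN_BYTES    (320 * 200)

static unsigned char backbuffer[SCREEN_BYTES];

static void wait_for_vretrace(void)
{
    /* if we are already inside a retrace, let it finish first,
       so we always catch the *start* of the next one */
    while (inportb(VGA_STATUS_PORT) & VRETRACE_BIT)
        ;
    while (!(inportb(VGA_STATUS_PORT) & VRETRACE_BIT))
        ;
}

static void present_frame(void)
{
    /* step 1: the frame has already been rendered into backbuffer[] */
    wait_for_vretrace();                           /* step 2 */
    dosmemput(backbuffer, SCREEN_BYTES, 0xA0000);  /* step 3 */
    /* step 4: go render the next frame */
}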
So I guess the question is: can't you do the same nowadays.. lock the PutImage() to vsync?
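By "lock to vsync" I mean something in the spirit of the snippet below. The XvPutImage() call is just the normal Xv path; whether a client can really get at the vblank through libdrm like this (and whether the fd would need DRI authentication first) is an assumption on my part, so treat it as a sketch only:

#include <X11/Xlib.h>
#include <X11/extensions/Xvlib.h>
#include <xf86drm.h>

void put_image_locked_to_vsync(int drm_fd, Display *dpy, XvPortID port,
                               Drawable win, GC gc, XvImage *img,
                               int w, int h)
{
    drmVBlank vbl;

    /* block until the next vertical blank */
    vbl.request.type     = DRM_VBLANK_RELATIVE;
    vbl.request.sequence = 1;
    vbl.request.signal   = 0;
    drmWaitVBlank(drm_fd, &vbl);

    /* push the already-decoded frame out right at the start of the blank,
       same as steps 2+3 of the old DOS loop above */
    XvPutImage(dpy, port, win, gc, img,
               0, 0, w, h,    /* source rectangle      */
               0, 0, w, h);   /* destination rectangle */
    XFlush(dpy);
}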
-- Pasi