Mailing List archive
[vdr] Re: Interlacing
On Wednesday 27 October 2004 17:36, Lucian Muresan wrote:
> A dxr3 or a FF DVB card may do MPEG1/2 decoding very well and may have a
> very good picture quality, but so do most domestic DVB receivers too. If
> I do DVB on a PC, I want it do more than that, decode various other
> formats without transcoding first at the expense of CPU cycles which I'd
> rather put into decoding, decode even HDTV (MB + CPU upgrade) when the
After fighting interlacing for six months with different TV-out techniques, I
must say that "wasting" CPU cycles on encoding the video into MPEG-1 is so
much better than the alternatives that I really can't be made to care about
the cost. With the other alternatives you always seem to end up with some or
all of the following:
1) you lose half the motion (half the fields)
2) you lose detail
3) you get the field order wrong, which causes jagged edges, or the picture
jumps back and forth so much it makes you sick (really, physically)
4) you get annoying artifacts
5) you get tearing because vsync doesn't want to work this day of the week
6) the picture is annoyingly blurry because of the not-so-good TV-out
components used in display adapters
7) CPU usage is high anyway because of all the filters you had to put in
place to get anything watchable, and in the end the picture still doesn't
look as good as it would if the output could just spit out the interlaced
picture on the TV as it is.
So, in short: if you have to deinterlace, you've already lost the game.
And television is always interlaced (sure, HDTV can be progressive too).
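To make that concrete: an interlaced frame is really two pictures sampled at different moments, woven together line by line. Here's a minimal sketch of that structure (my own illustration using NumPy, not anything from vdr or the dxr3 driver):

```python
import numpy as np

def split_fields(frame):
    """Split an interlaced frame into its two fields.

    Even-numbered lines (0, 2, 4, ...) form one field, odd-numbered
    lines the other; in a 50 Hz system the two fields were captured
    20 ms apart, which is why treating the frame as one progressive
    image gives you combing on motion.
    """
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Re-interleave two fields back into one full frame."""
    frame = np.empty((top.shape[0] + bottom.shape[0],) + top.shape[1:],
                     dtype=top.dtype)
    frame[0::2] = top
    frame[1::2] = bottom
    return frame

# Toy 8-line "frame": splitting and re-weaving is lossless.
frame = np.arange(8 * 4).reshape(8, 4)
top, bottom = split_fields(frame)
assert np.array_equal(weave(top, bottom), frame)
```

Weave is exactly what a TV (or a hardware MPEG decoder driving an interlaced output) does for free; every software deinterlacer is some attempt to approximate it on a progressive display.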
Even if they show you a pretty shot-on-film-at-24fps movie, it will still
come out interlaced. This might not always be immediately apparent, but the
chain can still get the field order wrong at some point, and then you'll
have the jagged edges, or the picture moving in order 2 1 4 3 6 5 instead of
1 2 3 4 5 6. And it gets better: most (all?) movies and TV shows that are
shot on film are edited on video. This can cause the interlacing pattern to
change at a scene change. Someone called this a "phase change"; I have no
idea what the official term is. Whatever it's called, what happens is that
some scenes come out so that fields 1 and 2 are about the same, and then
suddenly it changes so that fields 2 and 3 are the same. Have fun going
through every scene figuring out which deinterlacer looks best.
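The pairing pattern above can in principle be detected automatically: for each field, check whether it resembles its predecessor or its successor more. A rough sketch of the idea (my own toy version, not tvtime's actual algorithm):

```python
import numpy as np

def field_pairing(fields):
    """Guess which neighbour each interior field pairs with.

    For each field i (excluding the first and last), compare the mean
    absolute difference to field i-1 and to field i+1; the smaller
    difference suggests which frame the field belongs to.  A shift in
    this pattern mid-stream is the "phase change" described above.
    """
    pairing = []
    for i in range(1, len(fields) - 1):
        d_prev = np.abs(fields[i] - fields[i - 1]).mean()
        d_next = np.abs(fields[i] - fields[i + 1]).mean()
        pairing.append('prev' if d_prev < d_next else 'next')
    return pairing

# Toy stream where fields pair up as (0,1), (2,3), (4,5):
fields = [np.full((4, 4), v, dtype=float) for v in (0, 0, 1, 1, 2, 2)]
print(field_pairing(fields))  # ['prev', 'next', 'prev', 'next']
```

A stable alternating pattern means the pairing is consistent; the point of the post stands, though: doing this comparison on every field in real time is exactly the kind of per-pixel work that eats the CPU.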
There are of course deinterlacers that detect such things, but isn't that
the real waste of CPU cycles? Tvtime seems to have a filter that at least
claims to do something like this. Applied to raw video from an analog bttv
card (meaning the CPU time required to decode the MPEG-2 stream you'd get
from a DVB card isn't even included), this filter made my Athlon XP 2000+
scream in horror as it kept dropping frames all the time. Definitely not
something I'd be running on my 1 GHz vdr box.
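As a back-of-envelope illustration of why such filters hurt (nominal PAL broadcast numbers, nothing measured on the hardware mentioned above):

```python
# Rough per-second pixel budget for software filtering of PAL video.
# PAL: 720x576 luma samples at 25 frames/s (50 fields/s).
width, height, fps = 720, 576, 25
pixels_per_second = width * height * fps
print(pixels_per_second)  # 10368000

# Even at only a handful of operations per pixel, a motion-adaptive
# deinterlacer touches tens of millions of values every second --
# and that is before MPEG-2 decoding is counted at all.
```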
For these reasons I think a DXR3 or a HW+ card is indeed an investment worth
its 20e. You can probably find one for less.