On 2011-01-15 22:36, Tony Houghton wrote:
> I wonder whether it might be possible to use a more economical card
> which is only powerful enough to decode 1080i without deinterlacing it
> and take advantage of the abundant CPU power most people have nowadays
> to perform software deinterlacing.
On Tue, 18 Jan 2011 15:06:50 +0200
Niko Mikkilä wrote:
> On 2011-01-15 22:36, Tony Houghton wrote:
>
> > BTW, speaking of temporal and spatial deinterlacing: AFAICT one
> > means combining fields to provide maximum resolution with half the
> > frame rate of the interlaced fields, and the o
On 2011-01-18 14:49, Tony Houghton wrote:
> I still can't translate that explanation into simple mechanics. Is
> temporal like weave and spatial like bob or the other way round? Or
> something a little more sophisticated, interpolating parts of the
> picture belonging to the "wrong" field fro
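For what it's worth, the simple cases can be sketched in a few lines of toy Python (scanlines as strings; real deinterlacers work on pixel data and usually interpolate rather than duplicate). Weave merges two fields captured at different times (so it uses temporal information), while plain bob works within a single field (purely spatial):

```python
# Toy sketch of weave vs. bob deinterlacing. A "field" is a list of
# scanlines; the top field holds even lines, the bottom field odd lines.
# All names here are illustrative, not from any real deinterlacer API.

def weave(top_field, bottom_field):
    """Merge two successive fields into one full-height frame:
    full vertical resolution, but half the field rate."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)     # even scanline from field 1
        frame.append(bottom_line)  # odd scanline from field 2
    return frame

def bob(field):
    """Line-double a single field into a frame: keeps the full
    field rate, but halves the vertical detail per frame."""
    frame = []
    for line in field:
        frame.append(line)  # original scanline
        frame.append(line)  # nearest-neighbour duplicate
    return frame

top = ["E0", "E2"]     # even lines from field 1
bottom = ["O1", "O3"]  # odd lines from field 2

print(weave(top, bottom))  # ['E0', 'O1', 'E2', 'O3']
print(bob(top))            # ['E0', 'E0', 'E2', 'E2']
```

The more sophisticated modes blend the two: motion-adaptive deinterlacers weave static regions and bob (or interpolate) moving ones.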
Thanks a lot. I used:
--enable-kernel=2.6.36
and it works fine now.
--
Scott
On 12/01/2011 00:11, David Spicer wrote:
It turns out that the problem was known already. The fix is to
rebuild Arch's x86_64 glibc 2.12.2 package with the --enable-kernel
configure option set to one of the "Good
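For anyone else rebuilding the package, the relevant part of the glibc configure step looks roughly like this (paths are illustrative; the essential piece is the --enable-kernel flag with the value Scott used):

```shell
# Illustrative sketch only: configure glibc with a minimum kernel
# version. --enable-kernel drops compatibility code for kernels older
# than the given version, which avoids the problem discussed here.
../glibc-2.12.2/configure \
    --prefix=/usr \
    --enable-kernel=2.6.36
make
```

On Arch the cleanest route is to add the flag to the configure line in the glibc PKGBUILD and rebuild with makepkg, rather than installing a hand-built glibc over the packaged one.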