For a number of reasons, I’ve been dissatisfied with using our Hauppauge PVR-350’s on-board MPEG-2 decoder and TV Out port for MythTV. For one thing, the MythTV mailing list makes it clear that support for this device’s decoder is on the way out. None of the developers use one, so the code is effectively unmaintained.
With this in mind, I purchased an XFX GeForce FX5200 AGP video card, which has a well-supported nVidia chipset, for €39 from Alternate. The card arrived in the post yesterday, so I installed it in the system and got to work configuring it at the end of the evening.
With the loss of the PVR-350’s on-board MPEG-2 decoder, playback now has to be handled by the CPU, but we have a 3 GHz Pentium 4 in the box, so it can take it in its stride. This is pretty much the only disadvantage to switching to this video card.
Here are some of the advantages:
The FX5200 has a DVI Out socket, so we’re now connecting to the TV using DVI. The result is that X and MythTV now look superb on our 94 cm Philips LCD TV. The image is crisp and clear and we can drive the TV at its native resolution, which appears to be 1280×720. Related to this, menus and widgets no longer run off the screen and fonts are properly scaled. Bliss!
Thanks to DDC, the monitor can now tell the video card which video modes it can handle, so the Xorg configuration is now very simple and doesn’t even need HorizSync or VertRefresh parameters for the TV. Here’s the relevant section from Xorg.0.log:
(--) NVIDIA(0): Connected display device(s) on GeForce FX 5200 at PCI:1:0:0:
(--) NVIDIA(0): Philips FTV (DFP-0)
(--) NVIDIA(0): Philips FTV (DFP-0): 135.0 MHz maximum pixel clock
(--) NVIDIA(0): Philips FTV (DFP-0): Internal Single Link TMDS
(WW) NVIDIA(0): Mode "1280x800" is too large for Philips FTV (DFP-0);
(WW) NVIDIA(0): discarding.
(WW) NVIDIA(0): Mode "1280x768" is too large for Philips FTV (DFP-0);
(WW) NVIDIA(0): discarding.
(II) NVIDIA(0): Assigned Display Device: DFP-0
(II) NVIDIA(0): Validated modes:
(II) NVIDIA(0): "1280x720"
(II) NVIDIA(0): Virtual screen size determined to be 1280 x 720
(--) NVIDIA(0): DPI set to (76, 76); computed from "UseEdidDpi" X config option
I’m not sure why those first two modes are tried, since they wouldn’t offer the correct 16:9 aspect ratio. If anyone knows, please tell me. Nor do I understand why 1920x1080i isn’t tried or selected, since our TV is capable of that resolution. I’m not too bothered, though, since we don’t have any input sources that can provide that resolution.
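For the curious, the xorg.conf that produces this really is minimal. Here’s a sketch of roughly what’s left once DDC does the work; the Identifier names are illustrative, not copied from my actual file:

```
# Minimal xorg.conf fragment -- no HorizSync/VertRefresh lines needed,
# because the TV reports its capabilities over DDC (EDID)
Section "Monitor"
    Identifier "PhilipsTV"
EndSection

Section "Device"
    Identifier "FX5200"
    Driver     "nvidia"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "FX5200"
    Monitor    "PhilipsTV"
EndSection
```

The nvidia driver then validates the modes from the EDID data, as the log excerpt above shows.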
MythTV is slowly moving towards OpenGL for its themes and menus, so having an nVidia card with its hardware acceleration allows us to enjoy all of the OpenGL goodies. TV playback can also use certain OpenGL functions to reduce jitter and make other improvements to the image. And, very pleasingly, we can now enjoy all of the OpenGL visualisations offered by the MythMusic module. The TV has become a big flat disco ball for listening to my music collection!
Although MPEG-2 decoding is now handled by the CPU, we can still offload some of the work to the video card’s GPU by using MythTV’s XvMC support. A lot of people seem to have trouble with XvMC, but I found it very easy to get working. The only issue was that the OSD (on-screen display) became greyscale rather than colour, but that’s a known issue and there’s an easy fix for it.
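For anyone trying this themselves, the only real wiring on an nVidia system is telling the XvMC wrapper library which hardware-specific backend to load. A sketch, assuming a standard nVidia driver installation (the library name and path can differ between distributions):

```
# /etc/X11/XvMCConfig -- a single line naming the XvMC backend to load
libXvMCNVIDIA_dynamic.so.1
```

XvMC itself is then selected in mythfrontend’s TV playback settings, assuming MythTV was built with XvMC support.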
During playback, fast-forwarding at greater than 3x speed now properly displays the current location in the recording. With the PVR-350, fast-forwarding beyond 3x speed would freeze the playback image, leaving you with only the playback timer from which to hazard a guess at where in the stream to resume playback. This was very annoying and confined us to 3x fast-forwarding much of the time.
During playback, rewinding at any speed now properly displays the current location in the recording. Similar to the previous point, but even more annoying, was that the PVR-350 would freeze the playback image when rewinding at any speed. If you wanted to replay the last couple of scenes in a programme, you would thus have to guess how long they were and resume playback after rewinding that many minutes.
Not using the PVR-350 MPEG-2 decoder for playback means that the audio and video are now separable. This, in turn, means that we can now send audio to our sound system, rather than being restricted to using the TV’s built-in speakers.
As a result of the previous change, the internal MythTV volume controls now also work. Previously, we had to adjust the volume via the TV, because the PVR-350’s hardware decoder volume was beyond the control of MythTV.
The ivtv driver decoder errors that have plagued us since I first set up this system almost a month ago are now happily also a thing of the past. We would regularly get errors like these in the log:
Sep 18 14:36:52 tourbillon kernel: ivtv0 warning: DEC: Sched Buffer end reached 0x0ad51267
Sep 18 14:36:52 tourbillon kernel: ivtv0 warning: DEC: Mailbox 10: 0x00000000 0x0ad41267 0x0ad41267 0x00000000
Sep 18 14:48:32 tourbillon kernel: ivtv0 warning: DEC: Decoder Invalid type 0xd8031707?
Sep 18 14:48:32 tourbillon kernel: ivtv0 warning: DEC: Decoder Invalid type 0x0e86df64?
In practical terms, these manifested as occasional picture freezes during playback. Sometimes we’d get several within a few minutes. At other times, we’d watch a whole programme without one. They were very unpredictable.
A quick hit of the Rewind button was necessary to unjam the system. It was irritating, but we were able to live with it. Now that we’re no longer using the hardware MPEG-2 decoder of the PVR-350 card, however, we’re avoiding whatever it was in the driver or the firmware that was causing these.
Time-stretching now works properly. Time-stretching is the ability to play back video at faster than normal speed, but with the appropriate audio compensation, so that the pitch remains constant.
The idea is that, if you’re short on time, you can get through a 30-minute programme in, say, 20 minutes if you pay good attention and can stand listening to people who talk very quickly. At 1.5x speed, for example, everyone sounds like the people who read the disclaimer at the end of American radio adverts. You know the ones, where it sounds as if all the gaps between the words in their sentences have been sucked out.
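The arithmetic behind those numbers is simple enough to sketch; the function name here is purely illustrative:

```python
def watch_time(programme_minutes, speed):
    """Wall-clock minutes needed to watch a recording at the given speed."""
    return programme_minutes / speed

print(watch_time(30, 1.5))  # 20.0 -- a 30-minute programme in 20 minutes
```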
Because we’re using a plain old video card now, the entire Linux boot process can be followed on the TV screen. If there’s ever a problem that causes the system to fail to boot, I’ll now be able to debug it without a blindfold.
And that’s about it. I reprogrammed our Harmony 885 universal remote-control to use the external sound system instead of the TV’s audio for MythTV playback and to select the DVI input on the TV instead of the relevant SCART connector, and now we’re all set. I just need to go and buy a new DVI cable, as I borrowed the one from my computer upstairs for testing purposes.
Very high resolutions (above about 1600×1200) require dual-link DVI, which I’m guessing neither your video card nor your DVI cable supports.
Actually, that’s not true. A single-link DVI connection can carry up to 1920×1080. You’re right that my display isn’t dual-link, but nor does it support resolutions higher than 1920×1080, which is already more than I need, so I’m fine with single-link.
A dual-link connection will also provide slightly better signal integrity, but I don’t need that, either. The image on my TV is crisp and sharp with no ghosting or other artefacts.
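To put numbers on that: single-link DVI tops out at a 165 MHz pixel clock, and the standard 1920×1080 timings fit comfortably underneath it. A quick back-of-the-envelope check, using the usual CEA-861 total raster of 2200×1125 (these are published timing figures, not something I measured):

```python
# Pixel clock = total raster pixels per frame x full frames per second.
def pixel_clock_mhz(h_total, v_total, frames_per_second):
    return h_total * v_total * frames_per_second / 1e6

print(pixel_clock_mhz(2200, 1125, 60))  # 148.5 -- 1080p60, under 165 MHz
print(pixel_clock_mhz(2200, 1125, 30))  # 74.25 -- 1080i60 (30 full frames/s)
```

Note that the 74.25 MHz figure for 1080i matches the standard broadcast timing, well within single-link limits.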
The messages you see in the Xorg log above, complaining that certain resolutions are too large for the TV, are simply the result of the DDC (a.k.a. EDID) information provided by the TV being wrong. Yes, believe it or not, Philips TVs are notorious for lying about their own capabilities over DDC.
To test this, I disabled DDC and added a 1920×1080i modeline. I was able to restart X in the new resolution without any issue. Everything looked sharp and crisp, just unbelievably tiny.
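For reference, the timing in question is the standard 74.25 MHz CEA-861 1080i mode. The test looked something like the following in xorg.conf; treat the details as a sketch, and note that the Option name comes from the nvidia driver’s documentation rather than my actual file:

```
# Monitor section: supply the interlaced mode by hand
Section "Monitor"
    Identifier "PhilipsTV"
    # Standard CEA-861 1080i timing: 74.25 MHz pixel clock, 2200x1125 total raster
    ModeLine "1920x1080i" 74.25 1920 2008 2052 2200 1080 1084 1094 1125 Interlace +HSync +VSync
EndSection

# Device section: tell the nvidia driver to ignore the TV's (wrong) EDID data
Section "Device"
    Identifier "FX5200"
    Driver     "nvidia"
    Option     "UseEDID" "FALSE"
EndSection
```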