On 11/12/2013 10:09 PM, Jon N wrote:
> On Nov 12, 2013 7:32 PM, "Stan Hoeppner" <s...@hardwarefreak.com> wrote:
>>
>> On 11/12/2013 5:37 PM, Jon N wrote:
>> ...
>>> There is one area that I'm pretty unsure of. I am planning on purchasing an Nvidia video card and disabling the built-in Intel video support. Since I plan to use this computer as a MythTV frontend/backend (as well as for general web browsing/email), getting the audio out on the Nvidia card's HDMI port is important to my particular setup. So will the audio automatically be switched to the Nvidia card's HDMI connector?
>>
>> No, it won't be automatic. And frankly I don't believe nVidia supports HDMI digital audio pass-through, nor any discrete GPU card. For argument's sake, let's say it does. Then you run into the problem that the onboard audio chip can't pass digital audio through PCIe to the nVidia HDMI port. None of them are designed to do this, that I'm aware of.
>
> Wow, I'm glad I asked that question :-). If I understand this correctly, it doesn't matter if there is any video hardware on the mainboard, in the processor, or none at all; you still can't get sound from an audio chipset on the mainboard to the video card's HDMI connector anyway.
That's not actually how it works. The OS directs digital audio. In the case of a mobo audio chip, digital audio is sent to this chip no matter what, and it either decodes it to analog and dumps it out the discrete analog ports, or it passes the digital stream unmolested through a digital connector: HDMI, coax, or Toslink. In this case there are discrete mobo traces from the audio chip to each of these three outputs. You select the audio output in the driver, and the chip then passes the stream to the connector you choose.

In the case of sending digital audio out the HDMI connector on a graphics card, this is done purely in software, and the data stream never goes to the audio chip. It's sent from the application through the audio driver directly to the video card's HDMI drive chip. The reason I mentioned going from the audio chip to the HDMI of the vid card is that, AFAIK, the ALSA driver doesn't support multiple digital-out devices, and neither nVidia nor ATI supports HDMI audio output in their drivers.

If you use a mobo audio chip, its driver should allow you to select any of the audio ports on the board. You may have to do it statically in a config file instead of selecting it on the fly as in MS Windows (rough sketch below), but you should be able to use any connector on the back panel nonetheless. So again, if you want to send video+audio over a single HDMI cable to the TV, I think this is your only option.

>> If I were you I'd get a mainboard with HDMI out and use the CPU's GPU. Mobos that have onboard HDMI have their audio chips wired to the HDMI port, the chips support PCM/AC3 digital output, and selecting the HDMI output for digital audio is pretty straightforward.
>
> I think pretty much all the Mobos have HDMI in them, especially since they support a processor line that all has built-in video.

Not by a long shot. Only about half the boards available at Newegg have HDMI. And, obviously, all boards lacking integrated GPUs, or supporting AMD/Intel "performance" CPUs with no GPU, do not have HDMI; these are your high-end SLI/X-fire boards. They have integrated audio chips, but their digital out is limited to Toslink/coax. This is no problem in MS Windows, as you can select the HDMI output on the discrete GPU board. With Linux, thus far, it appears one is SOL.

> I was planning on Nvidia simply because a) I use it now and b) I am under the impression that they have better overall support (i.e.: just work better). But I may be underestimating how well the built-in Intel video solution works. And it would save me money by not purchasing a new board, use less electricity (love that), and maybe even make the system quieter (no fan on a separate video card). I will check over at the MythTV mailing list about it.

2D video processing takes almost no GPU horsepower at all. Any modern CPU with an integrated GPU can handle broadcast HDTV or Blu-ray HD video without breaking a sweat, and so can any modern mobo GPU. It's 3D gaming where they show weakness, especially at high resolutions and high texture detail. But if you're only doing video, any of them is more than sufficient.

>> The Intel GPU should be plenty powerful enough for HD1080 output. If you decide it's not, and want to add a discrete card, you'll need a mobo with coaxial digital SPDIF output, or Toslink optical digital output, and a TV or A/V receiver that is capable of using an HDMI input for video while using coax or Toslink for audio. Nearly all modern A/V receivers support this.
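Whichever digital connector you end up on (mobo HDMI, coax, or Toslink S/PDIF), the "statically in a config file" part above mostly amounts to pointing ALSA's default device at that output. A minimal sketch, assuming a stock ALSA setup; the hw:0,1 card/device pair below is only a placeholder, so substitute whatever aplay -l reports on your own board:

    # Find the playback devices first; card/device numbers differ per board.
    aplay -l

    # Quick test of the digital output (use your own card,device pair):
    speaker-test -D hw:0,1 -c 2

    # ~/.asoundrc -- make that output the default for all ALSA applications.
    pcm.!default {
        type plug               # let ALSA convert sample formats/rates as needed
        slave.pcm "hw:0,1"      # placeholder: your S/PDIF or HDMI device from aplay -l
    }
    ctl.!default {
        type hw
        card 0                  # mixer controls for the same card
    }

If the digital out stays silent, check alsamixer first; the IEC958/S-PDIF switch is often muted by default. And IIRC MythTV's own audio settings let you name an ALSA device directly, so you may be able to skip the system-wide default and just point the frontend at the right device.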
WRT LCD/Plasma TVs, I have no idea how many support taking video on HDMI while taking audio on coax or Toslink.

> I currently use my DVI out to HDMI in on my receiver, and s/pdif for the audio, and it works fine. I thought it would be nice to have it all in one.

It is nicer. And it's a couple of clicks to make it work in MS Windows. But with Linux and a discrete GPU I think you'll have to stick with two cables. Using an integrated GPU and mobo HDMI port, you should be able to get by with one cable.

> I think the HDMI supports higher bandwidth for the audio, but I'm not sure anything I'm playing would need it anyway.

Later HDMI revs do support higher audio bandwidth. They did this to support 7.1-channel Dolby TrueHD and DTS-HD, which are part of the Blu-ray Disc specifications. Neither encoding has achieved market penetration. AFAIK the studios are mastering almost no BRDs with either of these newer audio standards, but with the "old" 5.1-channel Dolby Digital and DTS instead. And the few that are mastered with TrueHD or DTS-HD also include standard DD and DTS audio tracks for compatibility with older A/V receivers; these discs typically cost more due to their limited production runs.

The reason TrueHD and DTS-HD didn't take off is that multiple A/V tech outlets performed significant blind testing and published results showing that humans could not perceive any significant sound quality difference between the newer and older digital encodings. Using the old encoding formats leaves more room on the disc for higher-quality video encoding, more special features, multiple languages, etc. Thus it's more profitable for the studios to use the "old" audio encodings.

-- 
Stan