Posted By: nickbuol video line driver vs. signal booster - 08/24/06 03:01 AM
I have a brother-in-law who is trying to set up some new 72" (1080p-capable) HDTVs at his church to display song lyrics, video clips, etc. from a PC in the back of the sanctuary. Anyway, that is about 100' away. He is looking at either a straight VGA cable run or a VGA-to-Cat5e-to-VGA conversion setup.

One thing that came up was the terms "Video Line Driver" and "Signal Booster". What is the difference? Does anyone here have any experience with video at that distance? I guess the TVs have HDMI input (but only do 720p over it) and the PC has DVI output. They won't be doing 1080p any time soon, but they don't want to have to rewire for it down the road if they can run the right cable now.

He likes the idea of VGA to Cat5e to VGA because then the long runs would be over smaller Cat5e cables that don't have bulky plugs on the ends, since everything has to run through conduit. I've already told him about interference issues, especially with any power lines running in parallel, etc.

Any help or thoughts would be appreciated.
Posted By: SirQuack Re: video line driver vs. signal booster - 08/24/06 03:34 AM
We just use a projector mounted on the ceiling, a PC in the back running software that displays what used to be in the bulletins during the service, and an electric screen mounted on the ceiling just in front of the front pews. It works great for our services and was a lot less hassle.
Posted By: BrenR Re: video line driver vs. signal booster - 08/24/06 06:14 AM
Either way won't be too easy... line loss is the big issue. Over VGA, it's the analog signals getting attenuated; over DVI (which is what the HDMI would revert to from a DVI source), it's digital data loss.

And to fix it -- it's the same problem in reverse -- how much do you boost the voltages? If you underamplify the VGA signal, your R, G, and B signals will be low at the far end (not a HUGE deal)... but so will HSync and VSync (which IS a big deal). If you overamplify, you'll have blooming colours and sync issues. A pro would calculate and measure line loss -- as an amateur, it would be a lot of guess and check.
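
To put very rough numbers on the "calculate line loss" part, here's a back-of-the-envelope sketch in Python. The 0.7 Vp-p level is the standard VGA R/G/B signal level, but the 3 dB per 100 ft attenuation figure is only an assumed ballpark for illustration -- the real number has to come from the cable's datasheet at the actual signal frequency.

Code:
signal_vpp = 0.7          # nominal VGA R/G/B level, volts peak-to-peak
loss_db_per_100ft = 3.0   # assumed ballpark attenuation at video frequencies
run_length_ft = 100

total_loss_db = loss_db_per_100ft * run_length_ft / 100
voltage_ratio = 10 ** (-total_loss_db / 20)      # convert dB loss to a voltage ratio
received_vpp = signal_vpp * voltage_ratio

print(f"Loss over {run_length_ft} ft: {total_loss_db:.1f} dB")
print(f"Received level: {received_vpp:.2f} Vp-p ({voltage_ratio:.0%} of original)")
# A line driver's job is to push the far-end level back toward the full 0.7 Vp-p.

That's only the "calculate" half, of course; the "measure" half still means putting a scope on the far end.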

For long DVI runs, most often I've heard of electrical to optical conversion, but I've never worked with the installation of one. Can't be any help there.

Bren R.
Posted By: bridgman Re: video line driver vs. signal booster - 08/24/06 12:57 PM
The easy/sleazy way to do it would be to move the PC to the front but keep it hidden, run a network cable to the back, and add someone's cast-off PC there running Remote Desktop into the "real" PC at the front near the TVs.
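
In case it helps picture the setup, the cast-off PC in the back would essentially just open a full-screen Remote Desktop session to the machine driving the TVs -- something like this (the hostname "front-pc" is made up; substitute whatever the front machine is actually called):

Code:
import subprocess

# Open a full-screen Remote Desktop session from the back-of-the-room PC
# to the "real" PC at the front. "front-pc" is a placeholder hostname.
subprocess.run(["mstsc", "/v:front-pc", "/f"])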

Not as convenient as having direct wiring, though. My first thought would be to run an RG6 coax for each of R, G, and B and see how it works for VGA without amplification. 100 feet is not a really long run for good coax at video frequencies.

Assume you would be running a display resolution of 1280 x 720/768 or something like that? That's only about a 70-80 MHz pixel clock, so I think it would probably work OK.
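
Rough math behind that estimate, if anyone wants to check it -- the ~25% horizontal and ~4% vertical blanking overheads are just typical assumed figures, not an exact mode timing:

Code:
# Ballpark pixel clock for a 1280x720 image refreshed at 60 Hz.
active_h, active_v = 1280, 720
refresh_hz = 60

total_h = active_h * 1.25    # assumed ~25% horizontal blanking overhead
total_v = active_v * 1.04    # assumed ~4% vertical blanking overhead

pixel_clock_mhz = total_h * total_v * refresh_hz / 1e6
print(f"Approximate pixel clock: {pixel_clock_mhz:.0f} MHz")   # ~72 MHz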

EDIT -- one more point... hooking RG6 coax to VGA connectors is a pain -- if you could find a video card with three RCA jacks on the back panel for component out, that might be worth it. If the card uses a dongle to go from a many-pin connector to component, it's not worth the bother.

Edit Edit -- Hold on, I just realized you want to run multiple displays, right? Um... never mind, ignore all of the above; you do need a video distribution amplifier-like thing for that.
Posted By: nickbuol Re: video line driver vs. signal booster - 08/25/06 02:00 AM
Quote:

We just use a projector mounted on the ceiling, a PC in the back running software that displays what used to be in the bulletins during the service, and an electric screen mounted on the ceiling just in front of the front pews. It works great for our services and was a lot less hassle.

Yeah, but some member of their church donated the two 72" HDTVs!!!