I ran into this article from January tonight while cruising Facebook. Michael Martens KB9VBR writes about the advantages and disadvantages of using cheap, easy-to-come-by RG-6 and RG-59 coaxial cable in amateur radio applications. Of course, this type of cable, normally used for cable and satellite TV, closed-circuit TV, and even some popular industrial automation networks (Allen-Bradley ControlNet and Modicon Remote IO), has a 75Ω impedance instead of the 50Ω impedance of the coaxial cable we normally use in radio, such as RG-58, RG-8, RG-8X, etc. Is this a problem? It can be, and the mismatch is more pronounced on VHF and UHF than it is on HF… which is kind of cool, because it's also more easily mitigated on VHF and UHF than it is at HF. From Michael's article:
The first method is to measure and cut your coax so the entire cable run comes out to a multiple of 1/2 wavelength. For the two meter band, a half wave is approximately 38 inches. Keeping your cable length at these 1/2 wave multiples will present a near 50 Ohm match at the transmitter end of the line. But how does this work?
Say you were to take a length of 50 Ohm coax and put a 100 Ohm resistor at the end. If you were to measure the impedance at the other end, what would it be? Not necessarily 50 Ohms. The reason is that coax presents a mix of resistive and reactive elements that change with the length of the line. For example, an odd 1/4 wave multiple of 75 Ohm coax will transform that 100 Ohm resistor to about 56 Ohms (75²/100) at the transmitter end, a near-50 Ohm match. And if you were to substitute that resistor with a 50 Ohm antenna, using 1/2 wave multiples of 75 Ohm coax would give you a 50 Ohm impedance, since a half-wave line repeats its load impedance regardless of the line's own impedance. Not too shabby.
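You can check those two cases yourself with the standard lossless transmission-line equation, Zin = Z0·(ZL + jZ0·tan βl)/(Z0 + jZL·tan βl). A minimal sketch in Python (the function name is my own; lengths here are *electrical* wavelengths, so the cable's velocity factor is already baked in):

```python
import math

def input_impedance(z0, z_load, electrical_wavelengths):
    """Impedance looking into a lossless line of characteristic
    impedance z0 terminated in z_load. Length is electrical, in
    wavelengths (velocity factor already accounted for)."""
    beta_l = 2 * math.pi * electrical_wavelengths  # phase length in radians
    t = math.tan(beta_l)
    return z0 * (z_load + 1j * z0 * t) / (z0 + 1j * z_load * t)

# 100-ohm load through a quarter wave of 75-ohm coax:
print(abs(input_impedance(75, 100, 0.25)))  # ≈ 56.25, i.e. 75**2 / 100
# 50-ohm antenna through a half wave of 75-ohm coax:
print(abs(input_impedance(75, 50, 0.5)))    # ≈ 50, the load repeats
```

The half-wave case falls out of the math because tan βl is zero at multiples of a half wave, so z0 cancels entirely and the load impedance appears unchanged at the input.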
This practice works well at VHF, as the 1/2 waves are relatively short, so you don't need to contend with a bunch of extra coax cable. But as you lower the frequency, those wavelengths increase, to the point where you've got up to 120 feet of extra cable on the 75 meter band.
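One wrinkle worth remembering when you cut these lengths: the physical half wave in the cable is shorter than the free-space half wave by the cable's velocity factor. A quick sketch (the 0.85 velocity factor is my assumption for a typical foam-dielectric RG-6; check your cable's datasheet, since solid-dielectric types run closer to 0.66):

```python
C = 299_792_458  # speed of light, m/s

def half_wave_feet(freq_mhz, velocity_factor=0.85):
    """Physical length of one electrical half wave in coax, in feet.
    0.85 is a typical foam-dielectric figure, not a universal constant."""
    wavelength_m = C / (freq_mhz * 1e6)
    return 0.5 * wavelength_m * velocity_factor * 3.28084  # meters to feet

print(half_wave_feet(146))  # 2 meter band: ≈ 2.9 ft, about 34 inches
print(half_wave_feet(3.6))  # 75 meter band: ≈ 116 ft per half wave
```

That last number is why the multiple-of-a-half-wave trick gets unwieldy on 75 meters: each additional half wave is on the order of a hundred-plus feet of cable.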