Does Cable Length Between Amplifier and Antennas Decrease Signal Strength?
The answer is yes! Typically, signal strength decreases by 3-4 dB for every 100 ft. of cable on the 800 MHz cellular band, and by around 7 dB per 100 ft. on the 1800/1900 MHz PCS band. In other words, the higher the frequency, the more the signal decreases. The AT&T and Verizon 700 MHz LTE bands lose 3-4 dB per 100 ft., similar to the cellular band, while the T-Mobile AWS 4G band loses about 7 dB per 100 ft., similar to the PCS band. When the cable run between the cell amplifier and the antenna is quite long, in-line boosters are often inserted along the run to make up for the loss and carry the signal the rest of the way.
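Since coax loss scales roughly linearly with length, the figures above translate directly into a total-loss estimate for any run. The sketch below (Python; the function and constant names are illustrative, and the per-100-ft values are the approximate midpoints quoted above) shows the arithmetic:

```python
# Rough estimate of total coaxial cable loss, assuming loss
# scales linearly with cable length (standard for coax).
def cable_loss_db(length_ft, loss_per_100ft_db):
    """Total attenuation in dB for a cable run of the given length."""
    return (length_ft / 100.0) * loss_per_100ft_db

# Approximate per-100-ft figures quoted above (band-dependent).
CELLULAR_800_DB = 3.5  # midpoint of the 3-4 dB range
PCS_1900_DB = 7.0

run_ft = 150
print(cable_loss_db(run_ft, CELLULAR_800_DB))  # 5.25 dB on the 800 MHz band
print(cable_loss_db(run_ft, PCS_1900_DB))      # 10.5 dB on the PCS band
```

A 150 ft. run therefore costs roughly twice what a 75 ft. run does on the same band, which is why long runs push installers toward in-line boosters or lower-loss cable.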
Attenuation of CM400 or SC400 cable:
30 MHz = 0.7 dB loss per 100 ft.
50 MHz = 0.9 dB loss per 100 ft.
150 MHz = 1.5 dB loss per 100 ft.
220 MHz = 1.9 dB loss per 100 ft.
450 MHz = 2.7 dB loss per 100 ft.
900 MHz = 3.9 dB loss per 100 ft.
1500 MHz (1.5 GHz) = 5.1 dB loss per 100 ft.
2400 MHz (2.4 GHz) = 6.65 dB loss per 100 ft.
5800 MHz (5.8 GHz) = 10.8 dB loss per 100 ft.
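The table above can be turned into a small lookup helper for frequencies between the listed points. The sketch below (Python; linear interpolation between table rows is an assumption made for simplicity, since real coax attenuation actually grows closer to the square root of frequency) estimates CM400/SC400 loss at any in-range frequency:

```python
# CM400/SC400 attenuation table from above: (MHz, dB loss per 100 ft.)
ATTENUATION = [
    (30, 0.7), (50, 0.9), (150, 1.5), (220, 1.9), (450, 2.7),
    (900, 3.9), (1500, 5.1), (2400, 6.65), (5800, 10.8),
]

def loss_per_100ft(freq_mhz):
    """Estimate dB loss per 100 ft. by linear interpolation of the table."""
    if not ATTENUATION[0][0] <= freq_mhz <= ATTENUATION[-1][0]:
        raise ValueError("frequency outside table range")
    for (f0, a0), (f1, a1) in zip(ATTENUATION, ATTENUATION[1:]):
        if f0 <= freq_mhz <= f1:
            return a0 + (freq_mhz - f0) * (a1 - a0) / (f1 - f0)

print(loss_per_100ft(900))             # 3.9 dB, straight from the table
print(round(loss_per_100ft(1900), 2))  # about 5.79 dB at 1900 MHz
```

Note that CM400/SC400 is relatively low-loss cable, so the interpolated PCS-band figure comes out below the generic ~7 dB per 100 ft. rule of thumb quoted earlier, which also covers thinner cable types.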
For example, 100 ft. of cable between the antenna and the amplifier would decrease the 1800/1900 MHz PCS signal by approximately 7-8 dB and the 800 MHz cellular signal by 3-4 dB. The same 3-4 dB figure applies to the 700 MHz LTE band.
You will find a handy dB attenuation cable calculator online on the Times Microwave site.