
Question about radio waves and signal propagation

Discussion in 'General Electronics Discussion' started by circuit_newb, Jul 25, 2017.

  1. circuit_newb


    Jul 15, 2017
    I'm trying to learn radio and I have a couple of simple questions.

    My question is: what makes an RF wave more or less 'sensitive' to the distance it travels?
    I mean, what is it about a wave that makes it more or less usable at large distances?

    I know the wavelength affects how the signal will travel:
    generally, the longer the wavelength, the less power it takes to 'overcome' the obstacles that block or deflect the
    signal. But why are certain frequencies more or less sensitive to this?

    For instance, short and medium wave radio can travel worldwide. They can skip, and a few hundred watts of RF power on the right day can be heard in China. VHF radio, however, doesn't seem to go far at all: commercial FM stations output multiple kilowatts of power and generally do not cover more than half a single US state.

    Why is this? The wavelength at, say, 100 MHz is 3 meters. That is really 'short wave' compared to, say, the 1 MHz shortwave band. But 3 meters is not as short as cell phone or microwave wavelengths. I can barely hear a 5,000 watt station more than 20 miles from me, but amazingly my cell phone, which runs at 900 or 1800 MHz, can talk to a tower more than a mile away on milliwatts?

    Obviously line of sight has a good deal to do with it at higher frequencies, but it still seems certain radio bands are less useful for distance. Why is this? I have had OTA TV stations transmitting from approximately the same place and direction, at the same power, and I could pick up the one on UHF while none of the VHF ones worked. Can you explain this?
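    (For reference, the wavelengths mentioned in the question are just λ = c / f; a quick check in Python, with the band labels being the conventional names for those frequencies:)

```python
# Quick check of the wavelengths discussed above: lambda = c / f
C = 299_792_458.0  # speed of light in m/s

for label, freq_hz in [("1 MHz (medium wave)", 1e6),
                       ("100 MHz (FM broadcast)", 100e6),
                       ("900 MHz (cellular)", 900e6),
                       ("1800 MHz (cellular)", 1800e6)]:
    print(f"{label}: {C / freq_hz:.2f} m")
```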
  2. davenn

    davenn Moderator

    Sep 5, 2009
    welcome to EP :)

    as the radio wave leaves the antenna, it spreads out and as a result lessens in strength according to the inverse square law

    sort of

    higher frequencies do attenuate more

    not even a few hundred watts ... just a few watts ..... High Frequency (HF) signals (up to around 30 MHz and a bit) reflect very easily off the ionosphere at certain times of the day.
    VHF/UHF signals don't do this; they penetrate the ionosphere and go directly out into space.

    'short waves', as in the bit I quoted from you, is just a name for signals from around 3 - 30 MHz. 100 MHz would be deemed VERY short waves

    there are all sorts of reasons why you can or cannot hear something closer or further from you. Cellular systems use very sensitive receivers and very high gain antennas
    to pick up the few hundred milliwatts from your mobile

    the higher the frequency, the more line-of-sight it becomes

    not really sure what you mean by that ?

    I would need more specific info on the transmitters, their power, location, terrain between them and you etc to give a good answer
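    (The "line of sight" point above can be put into rough numbers. A common approximation for the radio horizon uses a 4/3 effective earth radius to allow for atmospheric refraction, which works out to about d ≈ 4.12 √h km for an antenna h metres high. A sketch, with the mast height as an illustrative value:)

```python
import math

def radio_horizon_km(h_tx_m, h_rx_m=0.0, k=4/3):
    """Approximate radio horizon for antennas h metres high, using the
    common 4/3-earth-radius model to allow for refraction. If both ends
    are elevated, the two horizons add."""
    r_earth_km = 6371.0
    horizon = math.sqrt(2 * k * r_earth_km * h_tx_m / 1000.0)
    if h_rx_m > 0:
        horizon += math.sqrt(2 * k * r_earth_km * h_rx_m / 1000.0)
    return horizon

# e.g. a 100 m FM mast to a ground-level receiver: roughly 41 km
print(f"{radio_horizon_km(100):.1f} km")
```

    This is why raising either antenna helps so much at VHF/UHF: range grows with the square root of height at each end.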

  3. kellys_eye


    Jun 25, 2010
    Signal propagation (power degradation) follows an inverse square law (in free space) over distance.

    For every doubling of distance you receive only 1/4 of the signal power at the receiver.

    This is just the 'basics' of power loss in free space; it is additionally altered by the medium (air, moisture, terrain, buildings etc.) and the type of antenna used - whether an isotropic radiator, a dipole or a yagi (directional) antenna.
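    (The inverse square law above is usually expressed as free-space path loss in dB. A small Python sketch, using the standard FSPL formula between isotropic antennas, showing that doubling the distance costs about 6 dB, i.e. a factor of 4 in power:)

```python
import math

def fspl_db(freq_hz, dist_m):
    """Free-space path loss in dB between isotropic antennas:
    20 * log10(4 * pi * d * f / c)."""
    c = 299_792_458.0
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / c)

# Doubling the distance adds ~6 dB of loss, i.e. only 1/4 of the power arrives
delta = fspl_db(100e6, 20_000) - fspl_db(100e6, 10_000)
print(f"extra loss from doubling distance: {delta:.2f} dB")
```

    Note the frequency term: for the same distance and isotropic antennas, a higher frequency shows a higher path loss, because the receiving antenna's effective aperture shrinks with wavelength.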

    VHF signals have been known to travel thousands of miles under the right circumstances - indeed I've received VHF port control radio signals (50 W) from the Persian Gulf while on board a tanker some 2,000 miles south, off the East coast of Africa! A phenomenon known as 'ducting'.

    It also depends on your receiver. Many 'basic' receivers - such as the one in your car - might require 1 mV at the antenna to produce a 1 W audio output, whilst your mobile phone can do the same with only fractions of a μV.
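    (To compare those receiver figures on a common scale, RF engineers usually convert antenna voltage into dBm, assuming a matched input - 50 Ω is assumed here. A quick sketch:)

```python
import math

def uvolts_to_dbm(v_uv, impedance_ohms=50.0):
    """Convert an RF signal voltage in microvolts (across a matched
    load, 50 ohms assumed) into power in dBm."""
    v = v_uv * 1e-6                               # volts
    p_mw = (v ** 2 / impedance_ohms) * 1000.0     # milliwatts
    return 10 * math.log10(p_mw)

print(f"1 mV  : {uvolts_to_dbm(1000):.1f} dBm")   # 'basic' car-radio level
print(f"0.5 uV: {uvolts_to_dbm(0.5):.1f} dBm")    # phone-class sensitivity
```

    The gap between those two figures is over 60 dB - more than a million times less power needed at the sensitive receiver - which is a big part of why a phone on milliwatts can outperform a car radio listening to kilowatts.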

    Of course, the antennas used at both ends make a HUGE difference, TV antennas usually being designed to concentrate power in a certain direction - remember this happens at BOTH ends, your antenna and the transmitting one - to maximise power transfer and thus decrease the overall transmission losses.

    Have a look at this article for some basics in calculating 'distance travelled' for radio signals:
  4. duke37


    Jan 9, 2011
    At very low frequencies, the signals hug the ground (ground waves). Long and medium wave transmitters are placed where there is good ground conductivity to get good range, and the signals travel over hills.

    On short wave, say 2 MHz and up, the signals tend to be more line of sight, but there are ionised layers in the atmosphere which can give total internal reflection, so there can be a skip distance before the signal comes back down to ground. The strength of the layers varies with excitation from solar radiation. To see what is happening at any time you can use remote receivers; I use the SDR receiver at the secret bunker near Crewe to see the band conditions.
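    (A very rough feel for skip distance comes from a toy flat-earth, mirror-reflection model: up at some take-off angle, reflect from the layer, back down. Real hops involve a curved earth and gradual refraction rather than a mirror bounce, and the layer height and angle below are illustrative values only:)

```python
import math

def single_hop_skip_km(layer_height_km, takeoff_deg):
    """Toy flat-earth, mirror-reflection model of one ionospheric hop:
    the signal goes up at takeoff_deg, reflects at layer_height_km,
    and lands one 'hop' away."""
    return 2 * layer_height_km / math.tan(math.radians(takeoff_deg))

# illustrative values: F2 layer around 300 km, a low 10-degree take-off angle
print(f"{single_hop_skip_km(300, 10):.0f} km per hop")
```

    Even this crude model shows why low take-off angles matter for DX: the shallower the angle, the further away the signal lands.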

    At higher frequencies things are more line of sight, with only slight bending due to variations in air pressure or water content. Bouncing the waves off buildings can give a signal where you would not expect it. Satellite signals are pure line of sight, and I used to get signal interruptions from bats flying in front of the dish. They used to fly in front of a flood light to turn it on, then come back for moths for dinner. I have turned the light off so they have to go further for their food.