[lug] Local phone service with Comcast--any experience?

Nate Duehr nate at natetech.com
Sun Jun 26 02:18:32 MDT 2005


Daniel Webb wrote:

Disclaimer: This is a mega-late-night ramble below, and it's likely that 
I'll goof up some technical details.  Plus I'm not a degreed BSEE or 
anything like that, so what I've learned about RF is by hands-on 
experience over a long period of time.  I have had some great teachers, 
but sometimes people who only did "real-world" learning vs. "book" 
learning get incorrect things stuck in their heads.  I hope most of the 
below is informational and useful.

Anyway, RF is great fun, no matter how you slice it... on with the fun...

> Something to think about: I have line of sight with 12 to 18 dB directional
> antennas, and I still get 20-30 seconds dropout per hour due to all the
> interference on the 2.4 Ghz band.  My normal signal is good enough to get 10
> Mbps sustained, which is a really good signal, but I get noise as high as
> -60dB, and because of the way 802.11b works, it won't even transmit when the
> noise is that high[1].  I was considering setting up voip over my wireless link,
> and I probably will still try, but I don't expect it to be very reliable.

Noise is typically measured by SNR, or Signal-to-Noise Ratio.  It can't 
really be measured by itself because you have to compare the noise to 
the desired signal to get a number.  A ratio.  In practice, signal and 
noise levels are each expressed in dBm (dB relative to one milliwatt), 
and the SNR is simply the difference between the two in dB.
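
A quick sketch in Python (the -50 dBm signal and -60 dBm noise here are 
made-up illustrative numbers, not anyone's real link):

    signal_dbm = -50.0   # received signal level (hypothetical)
    noise_dbm = -60.0    # noise floor (hypothetical)

    # SNR is just the difference between two absolute levels in dBm.
    snr_db = signal_dbm - noise_dbm
    print("SNR: %.1f dB" % snr_db)                    # 10.0 dB

    # The same ratio expressed linearly: 10 dB is a 10x power ratio.
    print("Ratio: %.0fx" % (10 ** (snr_db / 10.0)))   # 10x

A 10 dB SNR means the signal carries ten times the power of the noise.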

In digital systems, this is usually done mathematically by measuring a 
bit error rate and converting it to an equivalent number in dB -- just 
as if the system were analog -- to make it easier for humans to read.

SINAD is another common measurement, used in analog but not often seen 
in digital.  Noise can't really mix with a digital signal if it's 
error-corrected, and distortion doesn't factor in much either, other 
than multipath, which causes timing problems recovering the sent signal 
if it's synchronous.  Noise in digital signals just causes lost data, so 
a BER is actually a much more accurate measure, but we humans like to 
compare apples to apples.  All else being equal, a digital system will 
pass useful information without retransmission about 3-6 dB further out 
than an analog receiver can hear it.

(And the first "digital" signals used on RF were Morse code.  Morse 
code, or "CW" (Continuous Wave) as it's known, still carries more 
information, farther, than just about any transmission mode on RF ever 
created.  It's just not as fast as others, so they can resend their data 
numerous times in the same amount of time and mathematically reassemble 
anything that got wiped out, making their apparent SNR better at a 
higher speed.  Taken to the extreme, this is what 802.11b's 
Direct-Sequence Spread-Spectrum (DSSS) technology helps to do, along 
with its encoding algorithms.)

If the SNR reported by your routers drops by 60 dB, remember that dB is 
a logarithmic scale and 60 is a very large number on a log curve 
compared to a linear scale.  Your link is getting HAMMERED by something. 
60 dB is a big hit.
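
To put a number on it (plain arithmetic, nothing specific to your link):

    # Every 10 dB is a factor of 10 in power; 60 dB is six of those.
    drop_db = 60.0
    factor = 10 ** (drop_db / 10.0)
    print("60 dB = %.0fx in power" % factor)   # 1000000x

A 60 dB change is a factor of a million in power.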

Also, at microwave frequencies, "line of sight" isn't always all you 
need.  Path losses in free air grow with frequency and start to become 
important at 2 GHz and above, especially at the low power levels of 
802.11b, unless you're running illegal amplifiers and exceeding the FCC 
prescribed limits for point-to-point or point-to-multipoint operation. 
Most amp manufacturers are counting on you using very lossy feedline to 
stay legal.

To give an example of the path loss problem, a friend recently did a 
31-mile shot with some gear in the 5.8 GHz band, and the calculated path 
loss was high enough that aiming the dishes was super-critical.  Nothing 
less than each antenna pointed directly into the Fresnel zone of the 
other would create a workable link.
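
For scale, here's the textbook free-space path loss formula run against 
that shot (the 31 miles and 5.8 GHz are from the story above; the 
formula itself is standard):

    import math

    def fspl_db(distance_km, freq_mhz):
        # Free-space path loss in dB:
        #   20*log10(d_km) + 20*log10(f_MHz) + 32.44
        return (20 * math.log10(distance_km)
                + 20 * math.log10(freq_mhz) + 32.44)

    km = 31.0 * 1.609344   # miles to km
    print("5.8 GHz, 31 mi: %.1f dB" % fspl_db(km, 5800))   # ~141.7 dB
    print("2.4 GHz, 31 mi: %.1f dB" % fspl_db(km, 2400))   # ~134.0 dB

Call it 142 dB of loss before you even count feedline and connectors. 
Every dB of antenna gain matters at that point, hence the critical 
aiming.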

> And no, no one else is using my channel or even the two adjacent channels, at
> least not with 802.11b gear.  I suspect cordless phones, but I know there are
> at least two sources, because they affect the gear differently: one of them
> causes the signal strength reported by iwconfig to drop to -110db or so, and
> the other one causes the noise to rise to -60dB or so.

The one that drops the signal strength reported by iwconfig is probably 
a wideband analog carrier -- an analog 2.4 GHz cordless phone.

The one that causes the noise floor to rise could be another digital 
system like a digital 2.4 GHz cordless phone, a microwave oven (they're 
smack dab in the middle of the 2.4 GHz band, and if they leak... they'll 
wipe out an 802.11b network), or any number of things.
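
If you want to catch the two culprits in the act, something like this 
rough sketch will log signal and noise once a second so you can line the 
hits up against phone calls and microwave use.  (It assumes the usual 
Linux /proc/net/wireless layout and an interface name of eth1; exact 
field formats vary a bit between driver versions.)

    import time

    def read_wireless(iface="eth1"):
        # /proc/net/wireless: two header lines, then one per interface,
        # with link quality, signal level, and noise level columns.
        for line in open("/proc/net/wireless").readlines()[2:]:
            fields = line.split()
            if fields[0].rstrip(":") == iface:
                level = float(fields[3].rstrip("."))
                noise = float(fields[4].rstrip("."))
                return level, noise
        return None

    while True:
        reading = read_wireless()
        if reading:
            print("%s  signal %5.0f  noise %5.0f"
                  % ((time.strftime("%H:%M:%S"),) + reading))
        time.sleep(1)

The analog carrier should show up as the signal column collapsing; the 
digital interferer should show up as the noise column climbing.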

The 2.4 GHz band is almost literally a free-for-all.  I'm licensed to 
use it and can transmit right on top of the 2.4 GHz 802.11b band with a 
rather large signal compared to the allowable signal from an 802.11b 
device.

802.11b is not a licensed user; it operates under the FCC's Part 15 
rules in the "ISM" (Industrial, Scientific, and Medical) band, sharing 
it with all the other Part 15 devices there.

It's a mess in any densely populated area of the country.  Most wireless 
ISPs move away from it as soon as they can and buy more expensive gear 
at 5.3 and 5.8 GHz.

Want a much quieter place to play?  Get 802.11a cards.

If your feedline is cheap, your feedline losses will probably outweigh 
the additional gain your dishes pick up at the higher frequency, but if 
you're using high-quality feedline like Superflex, Heliax, or 
honest-to-goodness hardline, you might actually see better signal 
strengths between locations at the higher band too.
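
Here's that tradeoff in rough numbers.  (The 0.6 m dish, 55% efficiency, 
and the per-100-ft coax losses are approximate, catalog-style values for 
illustration, not measurements of any particular product.)

    import math

    def dish_gain_dbi(diameter_m, freq_ghz, efficiency=0.55):
        # Parabolic dish gain: 10*log10(eff * (pi * D / wavelength)^2)
        wavelength = 0.3 / freq_ghz   # meters
        return 10 * math.log10(
            efficiency * (math.pi * diameter_m / wavelength) ** 2)

    # The same physical dish gains about 7.7 dB going from 2.4 to 5.8 GHz.
    print("2.4 GHz: %.1f dBi" % dish_gain_dbi(0.6, 2.4))   # ~21.0
    print("5.8 GHz: %.1f dBi" % dish_gain_dbi(0.6, 5.8))   # ~28.6

    # But 50 ft of typical low-cost coax loses several dB more at 5.8 GHz.
    loss_24, loss_58 = 6.8, 10.8   # rough dB per 100 ft
    print("Extra coax loss: %.1f dB" % ((loss_58 - loss_24) * 50 / 100.0))

So each dish picks up roughly 7.7 dB at the higher band, minus whatever 
extra your feedline eats.  Short or high-quality feedline keeps the win.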

> [1] At least, I think that's how it works... radioheads correct me if that's
> not how the carrier sense part of 802.11b works.

I can't remember ever looking to see if the spec says a client has to 
hear the base in Managed mode before it's allowed to transmit or not.

Certainly they transmit constantly in Ad-Hoc mode when looking for a 
link partner.

And unless beacon broadcasts are turned off, the bases transmit 
constantly, saying "I'm here" roughly every 100 milliseconds by default.

I think it's all mode-dependent and documented in the spec, and the spec 
is probably broken by every other manufacturer.  The only way to know 
for sure would be with a nice 2.4 GHz capable spectrum analyzer.  ;-)

Oh yeah... one more trick for 802.11b: the spec shows that in 1-2 Mb/s 
mode, less of the channel bandwidth is used (you know the channels 
overlap, right, and the only non-overlapping channels in the U.S. are 1, 
6, and 11?).  One side effect of using less bandwidth is that more of 
the RF energy is concentrated into it.  Sometimes when a 10 Mb/s link is 
flaky, forcing the routers on each end down to 1-2 Mb/s mode can gain 
you a considerable jump in SNR.
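
The channel overlap is easy to see from the numbers (this just encodes 
the U.S. 802.11b channel plan; 22 MHz is the nominal DSSS signal width):

    # U.S. 802.11b channels: centers at 2412 + 5*(ch - 1) MHz, ~22 MHz wide.
    def center_mhz(ch):
        return 2412 + 5 * (ch - 1)

    width = 22
    for a in range(1, 12):
        overlaps = [b for b in range(1, 12)
                    if b != a and abs(center_mhz(a) - center_mhz(b)) < width]
        print("ch %2d (%d MHz) overlaps: %s" % (a, center_mhz(a), overlaps))

Run it and you'll see channels 1, 6, and 11 are the only set that stays 
completely out of each other's way.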

Nate


