# Forum

## Negative dBm

9 posts by 6 authors in: Forums > CWNA - Enterprise Wi-Fi Admin
Last Post: November 25
• The book does a nice job of explaining how to perform the calculations, but seems to just jump right into a concept I'm having a bit of trouble with.

I see a lot of negative numbers. For example, the book shows a receive sensitivity as being -82 dBm. Why exactly is a negative number used?

I noticed this yesterday in Netstumbler as well. It shows the signal/noise ratio in negative dBm as well.

I feel kind of dumb for asking these questions, but can anyone help me out?

• Negative dBm values can be read just like positive ones; the sign simply tells you which side of 1 milliwatt the power is on.

As you know, dBm is a power level expressed as a ratio referenced to 1 milliwatt, so for example +3 dBm equals 2 milliwatts, +10 dBm equals 10 milliwatts, +20 dBm equals 100 milliwatts, etc.

Where +dBm indicates "more than", -dBm simply indicates "less than", so a value of -3 dBm means 3 dB less than 1 milliwatt ... or 0.5 milliwatts. Similarly, -10 dBm is 1/10 of a milliwatt (0.1 milliwatts) and -20 dBm is 1/100 of a milliwatt (0.01 milliwatts).
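The conversions above boil down to two one-line formulas; here's a quick Python sketch (the function names are just illustrative):

```python
import math

def dbm_to_mw(dbm):
    """Convert a dBm power level to milliwatts: mW = 10^(dBm/10)."""
    return 10 ** (dbm / 10)

def mw_to_dbm(mw):
    """Convert milliwatts to a dBm power level: dBm = 10*log10(mW)."""
    return 10 * math.log10(mw)

print(dbm_to_mw(-3))    # ~0.5 mW (3 dB below 1 mW)
print(dbm_to_mw(-10))   # 0.1 mW
print(dbm_to_mw(-20))   # 0.01 mW
print(mw_to_dbm(100))   # 20.0 dBm
```

Note that +3 dBm is really a factor of 10^0.3 ≈ 1.995, which is why "3 dB = double the power" is a handy approximation rather than an exact equality.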

In the case of a receiver with a sensitivity of -20 dBm, it would require a power at the receive antenna input of 1/100 (0.01) milliwatts before it could hear the signal. If the receiver had a sensitivity of -82 dBm, it would need a power at the receive antenna of 82 dB below 1 milliwatt ... a very small number! The lower the number, the better the sensitivity ... a receiver with a sensitivity of -82 dBm is much more sensitive than one with a sensitivity of -20 dBm.

In regard to signal to noise ratio, you shouldn't really see a negative value and it would normally be quoted in dB, not dBm. For example, a s/n ratio of 3dB means that the signal is twice that of the noise. A s/n ratio of 10dB means the signal is 10 times the noise. If you were to see a s/n ratio of -10dB it would mean that the signal was one tenth of the noise ... meaning that you wouldn't hear the signal.

Signal to noise ratio is expressed in dB, not in dBm. Signal to noise ratio is just that ... the ratio of the signal to the noise. If you are seeing dBm in this regard it means the absolute value of the signal and/or the noise. For example, the noise might have an absolute value of -50dBm and the signal might have an absolute value of -40dBm in which case the signal to noise ratio (the difference between the two) would be +10dB ... NOT 10dBm.
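The worked example above (a hypothetical -40 dBm signal over a -50 dBm noise floor) can be checked numerically; subtracting two dBm levels cancels the milliwatt reference and leaves a pure ratio in dB:

```python
signal_dbm = -40.0  # absolute signal level, in dBm
noise_dbm = -50.0   # absolute noise floor, in dBm

# SNR is the difference of two dBm levels; the 1 mW references cancel,
# so the result is in dB, not dBm.
snr_db = signal_dbm - noise_dbm
print(snr_db)  # 10.0 dB: the signal is 10x the noise power

# Equivalently, as a linear power ratio:
ratio = 10 ** (signal_dbm / 10) / 10 ** (noise_dbm / 10)
print(round(ratio, 6))  # 10.0
```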

• Ah, thank you! That's exactly what I was looking for.

• hmm.. nice topic..

just wanted to bring up the context of RSSI.. received signal strength indicator..

isn't RSSI time/distance variant, compared to the receiver's sensitivity, which is a design spec mentioned in the receiver's documentation?

also, is a receiver's sensitivity directly related to the antenna used?

let me know thanks

"is a receiver's sensitivity directly related to the antenna used?"

No. The SYSTEM sensitivity would be affected by the antenna.

Receive sensitivity is a subjective measurement: the lowest signal strength that assures a reliable connection. Some manufacturers err on the side of caution; some choose to define a reliable connection as one that comes back on its own, often...

... Receive sensitivity is a subjective measurement. Lowest signal strength that assures a reliable connection ...

... or in the case of a data radio, a more meaningful specification is the lowest signal for a given bit error rate.

• Does this scale assume you will never get more than 1 mW of power? Some routers give you as much as 300 mW, so what would this scale show if you placed the receiver, say, 10 cm from the router?

• By Howard - edited: November 26

Do you mean the dBm scale or an RSSI scale ?

Just guessing at your setup, but I'd estimate -7 or -8 dBm, or on a theoretical 8-bit RSSI scale maybe 254 (it's non-linear). It depends on the chipset manufacturer and their RSSI calculation algorithm.

Many radios, especially older ones, would be totally swamped by this large a signal, and wouldn't read anything.  Some radios might be permanently damaged.

For the heck of it, I have occasionally tested down to -10 dBm (conducted), but usually -30 is about the lowest (strongest) we test to.

People in a hurry sometimes skip the higher power levels, if all they are interested in is the "sensitivity". So they may start their tests at, say, -60 dBm and head towards -90, in 1 dB steps. But when you start at -30, or stronger, you sometimes find devices that will work 20 feet from an AP but fail when directly beneath one.
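For the 300 mW question, a rough sanity check is possible with the textbook free-space path loss formula. Big caveats: 10 cm is inside the antenna's near field, where the formula doesn't strictly apply, and antenna gains/losses and enclosure effects are ignored, so a real reading would typically come in lower (closer to the -7 or -8 dBm estimated above). This is an idealized sketch, not a prediction:

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB (far-field only):
    FSPL = 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

tx_mw = 300.0
tx_dbm = 10 * math.log10(tx_mw)    # ~24.8 dBm transmit power
loss = fspl_db(0.10, 2.437e9)      # ~20.2 dB at 10 cm on 2.4 GHz channel 6
rx_dbm = tx_dbm - loss
print(round(rx_dbm, 1))            # ~+4.6 dBm under these idealized assumptions
```

Either way, the received level can sit well above 0 dBm (i.e., above 1 mW) at that range; the dBm scale itself has no upper limit at 1 mW.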

Note 1:

An issue that comes up any time we are discussing sensitivity testing is up-vs-down, higher-vs-lower, etc. Everyone in the discussion needs to agree on the terminology. For example, when we talk about -30 dBm and then "going lower", do we mean lowering or raising the power level?

Not everyone understands that -30 dBm is a higher power than -40 dBm. It's best to make sure that everyone understands that. Sometimes we need to go to extra lengths so that our explanations are clear.

Note 2:

Some manufacturers make a corny (IMHO) decision to express signal level as a percentage (%). If you ask "percentage of what?", they rarely give an answer that allows you to compare real device performance. Beware!
