
2014-07-14

Wifi and the square of the radius

I have many things to tell you about wifi, but before I can tell you most of them, I have to tell you some basic things.

First of all, there's the question of transmit power, which is generally expressed in watts. You may recall that a watt is a joule per second, and a milliwatt is 1/1000 of a watt. A transmitter can generally be thought of as radiating its signal outward from a center point. A receiver "catches" part of the signal with its antenna.

The received signal power declines with the square of the radius from the transmit point. That is, if you're twice as far away as distance r, the received signal power at distance 2r is 1/4 as much as it was at r. Why is that?

Imagine a soap bubble. It starts off at the center point and there's a fixed amount of soap. As it inflates, the same amount of soap is stretched out over a larger and larger area - the surface area. The surface area of a sphere is 4 π r².

Well, a joule of energy works like a millilitre of soap. It starts off at the transmitter and gets stretched outward in the shape of a sphere. The amount of soap (or energy) at one point on the sphere is proportional to
1 / (4 π r²).

Okay? So it goes down with the square of the radius.
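
If you'd rather see that in code than in algebra, here's a tiny Python sketch (the function name and the 1-watt default are just things I made up for illustration; only the ratios matter):

    import math

    def power_density(r, total_power_w=1.0):
        # Power per unit area (W/m^2) at distance r meters from an
        # isotropic transmitter radiating total_power_w watts.
        return total_power_w / (4 * math.pi * r**2)

    # Twice as far away -> 1/4 the power.
    print(power_density(1.0) / power_density(2.0))  # 4.0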

A transmitter transmitting constantly will send out a total of one joule per second, or a watt. You can think of it as a series of ever-expanding soap bubbles, each expanding at the speed of light. At any given distance from the transmitter, the soap bubble currently at that distance will have a surface area of 4 π r², and so the power will be proportional to 1 / that.

(I say "proportional to" because the actual formula is a bit messy and depends on your antenna and whatnot. The actual power at any point is of course zero, because the point is infinitely small, so you can only measure the power over a certain area, and that area is hard to calculate except that your antenna picks up about the same area regardless of where it is located. So although it's hard to calculate the power at any given point, it's pretty easy to calculate that a point twice as far away will have 1/4 the power, and so on. That turns out to be good enough.)

If you've ever done much programming, someone has probably told you that O(n^2) algorithms are bad. Well, this is an O(n^2) algorithm where n is the distance. What does that mean?

    1cm -> 1 x
    2cm -> 1/4 x
    3cm -> 1/9 x
    10cm -> 1/100 x
    20cm -> 1/400 x
    100cm (1m) -> 1/10000 x
    10,000cm (100m) -> 1/100,000,000 x
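
That table is nothing more than 1/n² at each distance; a quick Python loop (purely illustrative) spits it back out:

    # Received power relative to the power at 1 cm, at each distance.
    for cm in [1, 2, 3, 10, 20, 100, 10000]:
        print(f"{cm}cm -> 1/{cm * cm} x")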

As you get farther away from the transmitter, that signal strength drops fast. So fast, in fact, that people gave up counting the mW of output and came up with a new unit, called dBm (decibels relative to one milliwatt), that expresses the signal power logarithmically:

    n dBm = 10^(n/10) mW

So 0 dBm is 1 mW, and 30 dBm is 1 W (the maximum legal transmit power for most wifi channels). And wifi devices have a "receiver sensitivity" that goes down to about -90 dBm. That's nine orders of magnitude below 0 dBm: a billionth of a milliwatt, i.e. a trillionth of a watt. I don't even know the word for that. A trilliwatt? (Okay, I looked it up, it's a picowatt.)
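
The conversion is easy to sketch in Python if you want to check the arithmetic yourself (the function names are mine, not any particular library's):

    import math

    def dbm_to_mw(dbm):
        # n dBm = 10^(n/10) mW
        return 10 ** (dbm / 10)

    def mw_to_dbm(mw):
        return 10 * math.log10(mw)

    print(dbm_to_mw(0))      # 1.0 mW
    print(dbm_to_mw(30))     # 1000.0 mW, i.e. 1 W
    print(dbm_to_mw(-90))    # 1e-09 mW, i.e. a picowatt
    print(mw_to_dbm(1000))   # 30.0 dBm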

Way back in university, I tried to build a receiver for wired modulated signals. I had no idea what I was doing, but I did manage to munge it, more or less, into working. The problem was, every time I plugged a new node into my little wired network, the signal strength at each node would drop to 1/n. This seemed unreasonable to me, so I asked around: what am I doing wrong? What is the magical circuit that will let me split my signal down two paths without reducing the signal power? Nobody knew the answer. (Obviously I didn't ask the right people :))

The answer is, it turns out, that there is no such magical circuit. The answer is that 1/n is such a trivial signal strength reduction that essentially, on a wired network, nobody cares. We have RF engineers building systems that can handle literally a 1/1000000000000 (from 30 dBm to -90 dBm) drop in signal. Unless your wired network has a lot of nodes or you are operating it way beyond distance specifications, your silly splitter just does not affect things very much.

In programming terms, your runtime is O(n) + O(n^2) = O(n + n^2) = O(n^2). You don't bother optimizing the O(n) part, because it just doesn't matter.
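
To put rough numbers on it, here's a quick sketch (assuming an ideal splitter that divides the power evenly; real ones are a bit worse):

    import math

    def splitter_loss_db(n):
        # An ideal n-way splitter gives each output 1/n of the power.
        return 10 * math.log10(1 / n)

    print(splitter_loss_db(2))   # about -3 dB
    print(splitter_loss_db(8))   # about -9 dB
    # ...versus the ~120 dB (from 30 dBm down to -90 dBm) that the
    # radio people have to deal with.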

(Update 2014/07/14: The above comment caused a bit of confusion because it talks about wired networks while the rest of the article is about wireless networks. In a wireless network, people are usually trying to extract every last meter of range, and a splitter is a silly idea anyway, so wasting -3 dB is a big deal and nobody does that. Wired networks like the one I was building at the time tend to have much less path loss (and it's linear instead of quadratic), so they can tolerate a bunch of splitters. For example, good old passive arcnet star topology, or ethernet-over-coax, or MoCA, or cable TV.)

There is a lot more to say about signals, but for now I will leave you with this: there are people out there, the analog RF circuit design gods and goddesses, who can extract useful information out of a trillionth of a watt. Those people are doing pretty amazing work. They are not the cause of your wifi problems.
