is interesting. The IEEE 802.11 group is canvassing proposals for a new wireless LAN sub-standard, 802.11n, that would allow up to 540 megabits per second. That is very, very fast. The bandwidth allocated to 802.11 comes in two chunks: one around the microwave oven frequency of 2.4 GHz (UHF) and another up around the 5 GHz region (SHF J-band). The channels are only 20 MHz wide. Putting half a gigabit per second through a channel that narrow is going to be a fairly amazing technical challenge. Of course, the actual data rate is going to be nowhere near the 540 Mbps on-air value, as a lot of the available capacity will be taken up with overhead (I'd hazard a guess that throughput will max out at around 150 Mbps). Even so, these sorts of speeds are astonishing.

In 1998/99, when I was working on a precursor to 802.11 called HIPERLAN, the proposal was for speeds of ~15 Mbps of data at the MAC level over a 25 Mbps PHY layer. With off-the-shelf technology, that was completely unattainable then. The biggest problem we faced was the sheer computational power required. With the system we were working on, in order to mitigate what is known as multi-path propagation (whereby the signal from transmitter to receiver can travel via several different paths, and thus ends up smeared out in time), you need a signal-processing device called an equaliser. In essence, this time-shifts portions of the received signal to reconstitute it at a given instant. Our calculations showed that we needed GigaFLOPs of compute power to do this, which was way in advance of the speed of a general-purpose computer's CPU. Custom hardware can be optimised to work a lot faster on specific problems, but the other worry we had was power consumption. The power required by MOS technologies goes up with clock speed. Mere battery drain wasn't the problem; the figures we had suggested you'd be able to fry eggs on your wireless LAN card.
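To make the equaliser idea concrete, here's a toy sketch: an echo path smears BPSK symbols in time, and an adaptive FIR filter trained with the LMS algorithm learns to undo the smearing. Everything here (the two-path channel, tap count, step size) is an illustrative assumption, not the actual HIPERLAN design.

```python
import numpy as np

rng = np.random.default_rng(0)

symbols = rng.choice([-1.0, 1.0], size=20000)      # BPSK training sequence
channel = np.array([1.0, 0.0, 0.5])                # direct path + echo 2 samples later
received = np.convolve(symbols, channel)[:len(symbols)]

n_taps, mu, delay = 11, 0.01, 5                    # filter length, step size, decision delay
taps = np.zeros(n_taps)

for i in range(n_taps - 1, len(symbols)):
    window = received[i - n_taps + 1:i + 1][::-1]  # newest sample first
    y = taps @ window                              # equaliser output
    e = symbols[i - delay] - y                     # error vs. known training symbol
    taps += mu * e * window                        # LMS tap update

# After convergence, hard decisions should match the (delayed) transmitted symbols
errors = 0
for i in range(len(symbols) - 2000, len(symbols)):
    window = received[i - n_taps + 1:i + 1][::-1]
    errors += (np.sign(taps @ window) != symbols[i - delay])
ber = errors / 2000
print(f"bit error rate after training: {ber:.4f}")
```

The compute estimate falls straight out of this structure: every received sample costs roughly 2 x n_taps multiply-accumulates for the filter plus the same again for the update, so at tens of megasamples per second with a long filter you're into the GigaFLOP range the text mentions.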
Of course, Moore's Law will usually come to your rescue, but I never expected it to arrive so soon.
I'd like to know what techniques these guys are going to use to get their data-rates this high. I presume that at a minimum they will be using spread spectrum OFDM with a very high coding gain, and a lot of diversity combining techniques. If they're using additional channel coding (which I'm sure they are), then the decoder technology is going to be interesting. I'd love to see a Viterbi decoder running at Gigahertz clock frequencies.
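For anyone who hasn't met one, here is a miniature hard-decision Viterbi decoder for the classic rate-1/2, constraint-length-3 convolutional code (generators 7 and 5 octal). It's purely illustrative; a decoder running at the speeds discussed above would be heavily parallelised hardware, not a Python loop.

```python
K = 3                          # constraint length
G = (0b111, 0b101)             # generator polynomials (7, 5 octal)
N_STATES = 1 << (K - 1)        # 4 encoder states

def _branch(state, bit):
    """Next state and the two coded output bits for one input bit."""
    reg = ((state << 1) | bit) & ((1 << K) - 1)
    out = [bin(reg & g).count("1") % 2 for g in G]
    return reg & (N_STATES - 1), out

def encode(bits):
    state, coded = 0, []
    for b in bits:
        state, out = _branch(state, b)
        coded += out
    return coded

def viterbi_decode(coded):
    INF = float("inf")
    metric = [0.0] + [INF] * (N_STATES - 1)   # start in the all-zero state
    paths = [[] for _ in range(N_STATES)]
    for t in range(0, len(coded), 2):
        rx = coded[t:t + 2]
        new_metric = [INF] * N_STATES
        new_paths = [None] * N_STATES
        for s in range(N_STATES):
            if metric[s] == INF:
                continue
            for b in (0, 1):
                ns, out = _branch(s, b)
                m = metric[s] + sum(o != r for o, r in zip(out, rx))
                if m < new_metric[ns]:         # keep only the survivor path
                    new_metric[ns], new_paths[ns] = m, paths[s] + [b]
        metric, paths = new_metric, new_paths
    return paths[metric.index(min(metric))]

message = [1, 0, 1, 1, 0, 0, 1, 0] + [0, 0]    # two tail bits flush the encoder
coded = encode(message)
coded[3] ^= 1                                  # inject a single channel bit error
decoded = viterbi_decode(coded)
print(decoded[:8])                             # the original message, error corrected
```

The add-compare-select in the inner loop is the bit that has to run once per state per coded symbol; that is what makes a multi-hundred-megabit Viterbi decoder such an interesting piece of silicon.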
Something I've said before: it's engineers and scientists who are most blown away by the pace of technological advance. The ordinary guy in the street has no idea how, say, a mobile phone works. People like me, who've been exposed to their inner workings, know just how much time and effort have gone into their design. The GSM mobile phone standard, for example, represents thousands of man-years of work. It runs to several feet of shelf space, and you need to be a competent electronic engineer to make head or tail of it. This stuff is really quite insanely hard. It's amazing that there exist so many thousands of people on the planet who are smart enough to do it.
I've been out of this field for a long time, so it's entirely possible that some of my technical conjectures are way off-base. Nonetheless, sending half a billion bits of info per second over any
radio interface is astonishing.
The simplest way to increase data rate is to increase channel bandwidth (the Shannon-Hartley equation). When I was still in the field, we were talking about wireless LANs operating in the 60 GHz region. This is attractive because there's not much going on up there, so you can have nice wide channels. Another handy thing is that there's a strong absorption line from atmospheric oxygen in this region, which helps with frequency re-use (and hence areal transmission density) by cutting effective range. Trouble is, working at EHF (30-300 GHz) is very hard. After all, at the top end, you're only a decade away from far infra-red. The wavelength is only 5 mm at 60 GHz.
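The Shannon-Hartley trade-off can be put in rough numbers. The SNR figures below are illustrative assumptions, not anything from the 802.11n proposals, but they show why a single narrow channel can't plausibly carry 540 Mbps on its own:

```python
import math

def shannon_capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + 10 ** (snr_db / 10))

# Capacity ceiling for one 20 MHz channel at a generous 30 dB SNR:
c = shannon_capacity(20e6, 30)                    # roughly 200 Mbit/s
print(f"20 MHz @ 30 dB: {c / 1e6:.0f} Mbit/s")

# SNR you'd need to push 540 Mbit/s through that same 20 MHz channel:
eff = 540e6 / 20e6                                # 27 bit/s/Hz
snr_db = 10 * math.log10(2 ** eff - 1)            # around 81 dB -- wildly impractical
print(f"SNR needed for 540 Mbit/s in 20 MHz: {snr_db:.0f} dB")
```

Which is exactly why wider channels (as at 60 GHz) or several parallel transmission paths are the natural ways out.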