OK, you've read the headline. So let's get the caveats out of the way first.

Any network environment is going to have lots of variables so absolutes in networking decisions are few and far between. Most of what you read has to be considered in the context of your own environment and your own beliefs about the 'right' way to do networking. Particularly when it comes to wireless networking, you're well advised to do some of your own benchmarking before making big decisions. What works well under a certain set of circumstances may not match your own.

That having been said, the wireless network environment has aspects to it that you don't even have to consider in the wired world. One is that radio-frequency signals don't stop and start at strictly defined points, as traffic traversing a cable attached to two ports does. Another is that radio signals radiate in three dimensions, not just back and forth on a cable.

These traits mean that wireless networks can unexpectedly affect the workings of other wireless products - whether they are yours or someone else's.

This is one concern that has come up surrounding products based on Atheros' so-called 'Super G' chip, a derivative of the 54 Mbit/s 802.11g standard that "bonds" channel 6 with half of channel 1 and half of channel 11 to deliver higher theoretical throughput (108 Mbit/s).
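For the curious, the spill-over can be sketched with some simple frequency arithmetic. The 20 MHz and 40 MHz signal widths below are rough assumptions of mine, not exact spectral masks:

```python
# Back-of-the-envelope look at where a bonded Super G signal sits in the
# 2.4 GHz band. Channel centres are at 2407 + 5*n MHz; the 20/40 MHz
# widths are rough assumptions, not exact spectral masks.
def centre_mhz(channel):
    """Centre frequency of a 2.4 GHz Wi-Fi channel, in MHz."""
    return 2407 + 5 * channel

def span_mhz(channel, width):
    """(low, high) band edges for a signal of the given width."""
    c = centre_mhz(channel)
    return c - width / 2, c + width / 2

bonded = span_mhz(6, 40)           # Super G: roughly 40 MHz centred on channel 6
for ch in (1, 11):
    low, high = span_mhz(ch, 20)   # a standard 20 MHz-wide channel
    overlaps = low < bonded[1] and high > bonded[0]
    print(f"channel {ch} ({low:.0f}-{high:.0f} MHz) "
          f"{'overlaps' if overlaps else 'clears'} the bonded signal")
```

Run it and both channel 1 and channel 11 land partly inside the bonded signal, which is why a Super G transmitter can step on neighbours sitting on either of the band's outer non-overlapping channels.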

The speed up is controversial (see our news story on the subject).

The Tolly Group has tested the effects of Super G products on nearby 802.11g standard-compliant products, as well as in networks made up of multiple Super G products only. The degradation measured in these tests is significant and worth noting.

Note that Atheros' Super G chips currently ship mostly in consumer-class products. One reason is that consumers are very interested in downloading multimedia entertainment - and the higher speeds boost that ability.

But Super G is not an industry standard. If you don't want to use standards and not doing so affects only you, so be it. On the other hand, since wireless can affect your neighbours - whether in a house, flat, building or office suite next door - is installing it the right thing to do?

The Tolly Group plans to release a formal report later this month. It was commissioned to run the Super G tests by cutthroat Atheros competitor Broadcom (which admits having some skin in the game here). So there is an agenda afoot other than the good of the industry. But that doesn't mean that you shouldn't be informed about this issue, which has some teeth.

When the Tolly Group tested a single Broadcom-based Belkin F5-D7320 802.11g standard-compliant access point by itself, its average aggregate throughput was 22.96 Mbit/s. When an Atheros-based NetGear WGT624 access point and client card were running 30 feet away from the Belkin, though, the Belkin's performance dropped nearly 95 percent, to 1.27 Mbit/s.
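As a sanity check on that figure, with the numbers taken straight from the test results above:

```python
# Checking the quoted degradation figure against the raw test numbers.
alone = 22.96      # Mbit/s: Belkin access point tested by itself
contended = 1.27   # Mbit/s: same access point with Super G gear 30 feet away
drop_pct = (alone - contended) / alone * 100
print(f"throughput drop: {drop_pct:.1f}%")  # prints "throughput drop: 94.5%"
```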

Tolly Group head Kevin Tolly says his organisation used the NetIQ Chariot traffic simulator to conduct the tests and that the tests were repeated 33 times - three times on each of 11 channels. Then, he says, the organisation spot-tested the Super G impact on the Belkin device at 60 and 75 feet with the same result.

In addition, Tolly tested environments with multiple access points. He ran three Belkins with no Super Gs involved on channels 1, 6, and 11 (the three non-overlapping channels in 802.11g's 2.4 GHz band) and got an aggregate throughput of 76.53 Mbit/s, or about 25.5 Mbit/s each.

Then he tested three Super G-based NetGear WGT624s alone, with no Belkins involved. All were on channel 6, because Super G requires it. The aggregate throughput was 48.62 Mbit/s - oddly, less than you would get from three plain 802.11g-compliant products. So in a multiple-access-point scenario you don't seem to gain any bandwidth by running Super G: the product appears to interfere with itself badly enough to wipe out the throughput benefit.
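The per-access-point arithmetic, using the aggregate figures reported above, makes the gap plain:

```python
# Per-access-point throughput from the two three-AP tests quoted above.
standard_aggregate = 76.53   # Mbit/s: three 802.11g APs on channels 1, 6, 11
super_g_aggregate = 48.62    # Mbit/s: three Super G APs, all on channel 6
print(f"standard, per AP: {standard_aggregate / 3:.2f} Mbit/s")  # 25.51
print(f"Super G,  per AP: {super_g_aggregate / 3:.2f} Mbit/s")   # 16.21
```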

Atheros vice president of marketing and business development Colin Macnab had just a couple of comments on all of this. He acknowledged the Super G interference issue, but asserted that it doesn't reflect any real-world problem because walls usually separate one office from another - the implication being that the interference won't penetrate them. This argument is pretty weak. I'm writing this article from an Internet café, and I can see access points across the street. Presumably if their signal is reaching me, their interference can reach me, too.

Macnab also pointed out that 802.11g-compliant products interfere with each other (which is of course potentially true of any and all types of products that attempt to occupy the same frequency band, at the same time). He showed me a demo of how adding a Broadcom-based Buffalo 802.11g access point on channel 6 at a distance of eight to 10 feet away from another Buffalo device on channel 1 severely degraded a streaming video clip.

One could argue that most people would avoid placing their access points that close together. Tolly's benchmarking was also carried out in a controlled environment, with measured results and repeated runs; the demo I saw at Atheros was informal, performed in a lab with so many access points running for other tests that even getting on channel 6 took about 10 minutes.

There's more to the story. Tolly says he will conduct more tests using different variables. There's also the argument that, because higher throughput means each transmission occupies the airwaves for less time, two access points near each other are statistically less likely to interfere with each other.
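That statistical argument can be sketched with a toy model. The 5 Mbit/s offered load and the independence of the two transmitters are my assumptions, not anything Tolly measured:

```python
# Toy model: if each access point must carry a fixed offered load, a faster
# link means the air is busy a smaller fraction of the time, and the chance
# of two independent neighbours transmitting simultaneously shrinks with
# the product of their airtime fractions.
def airtime_fraction(offered_load, link_rate):
    """Fraction of time the air is busy carrying the offered load."""
    return offered_load / link_rate

load = 5.0                     # Mbit/s each AP must move (assumed)
for rate in (22.96, 48.62):    # effective rates measured in the tests above
    p = airtime_fraction(load, rate)
    print(f"at {rate} Mbit/s: busy {p:.0%} of the time, "
          f"simultaneous-transmission chance {p * p:.1%}")
```

Even in this toy model the chance shrinks but never reaches zero - and it assumes the faster link actually delivers its higher rate, which is exactly what the Tolly results call into question.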

So is the Super G issue a big one? It depends. As the story unfolds, I'll keep you posted. My best advice is to think of the potential impact on your neighbour if you decide to deploy. And realise you might not gain much throughput if you have two or more Super G access points running within 30 to 75 feet of one another.

If you have a single building in the middle of the desert and want to just run one access point, Super G would be a fine way to go - for now, that's what we know for sure.