A question has come up about how the XTBM displays high-level noise. Because noise is sampled directly inside the X10 reception window, it cannot be sampled during reception of X10 “1” bits. As a result, the noise readout is not updated while receiving a continuous high noise level that looks enough like an X10 signal for the XTBM to try to decode it. Since the XTBM is unlikely to find a valid start pattern, the decode will quickly fail, reporting either “BSC” (bad start code) or “NOISE” (high noise). In that situation, those warnings are the only indication that there is a serious noise problem that must be resolved. Again, this issue only involves high-level noise that looks sufficiently like an X10 command for the XTBM to try to decode it.
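For those who like to see the logic spelled out, here is a rough C sketch of the behavior described above. The names and values are mine for illustration only and are not taken from the XTBM firmware:

/* Rough sketch (not the actual firmware) of the current behavior:
 * the noise sample is taken inside the X10 reception window, so it is
 * skipped whenever that window is occupied by what looks like a "1" bit. */
#include <stdio.h>
#include <stdbool.h>

static unsigned noise_readout;            /* last value shown on the display */

static unsigned read_noise_adc(void)      /* placeholder for the ADC read    */
{
    return 42;                            /* arbitrary stand-in value        */
}

static void receive_half_cycle(bool looks_like_x10_one_bit)
{
    if (looks_like_x10_one_bit)
        return;                           /* window busy: readout goes stale */
    noise_readout = read_noise_adc();     /* sample inside the rx window     */
}

int main(void)
{
    /* Continuous noise that mimics X10 "1" bits: the readout never updates,
     * and the only clue is the failed decode ("BSC" or "NOISE"). */
    for (int i = 0; i < 10; i++)
        receive_half_cycle(true);
    printf("displayed noise = %u (stale)\n", noise_readout);
    return 0;
}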
As an alternative, I have a firmware mod that moves the noise sample point to just after the X10 transmission window when receiving X10 “1” bits. The downside is that the noise value can change while receiving an X10 command. For example, the Cellet cellphone charger described in one of my troubleshooting tutorials produces more noise as the AC voltage rises, so the noise displayed from that unit increases significantly during reception of a continuous series of X10 commands. However, noise displayed from a continuous source, such as a “wireless” intercom, remains almost the same.
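Here is a matching sketch of the modified sample point, again with illustrative names and values rather than the actual firmware code:

/* Rough sketch (not the actual firmware) of the mod: when a "1" bit
 * occupies the reception window, sample the noise just after the X10
 * transmission window instead of skipping the sample altogether. */
#include <stdio.h>
#include <stdbool.h>

static unsigned noise_readout;

static unsigned adc_in_rx_window(void)     { return 12; }
static unsigned adc_after_tx_window(void)  { return 20; } /* can pick up noise
                                              that rises with the AC voltage,
                                              as with the Cellet charger     */

static void receive_half_cycle(bool looks_like_x10_one_bit)
{
    if (looks_like_x10_one_bit)
        noise_readout = adc_after_tx_window(); /* mod: sample late, so the
                                                  readout keeps updating     */
    else
        noise_readout = adc_in_rx_window();    /* original sample point      */
}

int main(void)
{
    receive_half_cycle(true);
    printf("noise while receiving X10 '1' bits = %u\n", noise_readout);
    receive_half_cycle(false);
    printf("noise on an idle half-cycle        = %u\n", noise_readout);
    return 0;
}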
While this mod can result in some erroneous noise readouts, I think it will make the XTBM more useful for those of you battling a high noise level, because the actual noise level will be displayed at all times.
Your thoughts or suggestions on this issue are welcome.
Jeff