On 01/11/2011 05:24 PM, David VanHorn wrote:
>
> I'm interested to know, from anyone who actually has this device in
> production, how you are doing frequency setting.
>
I have a board with an AT32UC3A0512 and an AT86RF231, and another with
an ARM9 and an AT86RF231.
When the board powers up in the test fixture, it looks for a 32kHz
signal on an input. This is supplied by the test fixture.
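Something like the following is enough to decide whether we are in the
fixture (a sketch only; the pin number and GPIO helpers are
placeholders, not the real board code):

#include <stdbool.h>
#include <stdint.h>

#define CAL_REF_PIN      42          /* placeholder pin number */
#define DETECT_WINDOW_US 10000UL     /* 10 ms observation window */

extern bool gpio_read(unsigned pin);        /* assumed board support */
extern void delay_us(unsigned long us);

static bool fixture_reference_present(void)
{
    bool last = gpio_read(CAL_REF_PIN);
    uint32_t edges = 0;

    /* Poll the pin; a live 32kHz square wave shows roughly 640 edges
     * in 10 ms, a floating or static pin shows almost none. */
    for (uint32_t t = 0; t < DETECT_WINDOW_US; t += 10) {
        bool now = gpio_read(CAL_REF_PIN);
        if (now != last) {
            edges++;
            last = now;
        }
        delay_us(10);
    }
    return edges > 100;    /* generous threshold */
}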
A piece of code in the UC3 measures the frequency of the CPU clock
against the 32kHz signal and, at the same time, measures the 1MHz signal
coming from the RF231 against the CPU clock. This takes 0.5 seconds for
a 0.1 ppm measurement. The sum of the two errors is the error of the
1MHz output against the 32kHz reference, since at these magnitudes the
ppm errors simply add.
Both inputs are normal IO pins. I only track phase error against
predicted phase so I do not need to capture all the edges.
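In outline the measurement looks like this (a sketch, not the
production routine; timer_now(), wait_for_ref_edge(), delay_ms() and
the 66MHz figure are placeholders for your own board support):

#include <stdint.h>

#define F_CPU_NOM 66000000ULL   /* nominal CPU clock, placeholder */
#define F_REF     32768ULL      /* fixture reference */

extern uint32_t timer_now(void);         /* free-running CPU-clock timer */
extern void     wait_for_ref_edge(void); /* blocks until next 32kHz edge */
extern void     delay_ms(unsigned ms);

/* CPU clock error against the 32kHz reference, in 0.01 ppm units. */
static int32_t measure_cpu_vs_ref(void)
{
    wait_for_ref_edge();
    uint32_t t0 = timer_now();

    delay_ms(500);               /* no edge capture needed meanwhile */

    wait_for_ref_edge();
    uint32_t elapsed = timer_now() - t0;

    /* We never counted the reference edges, but as long as the total
     * error stays well below half a reference period over the window
     * (~30 ppm here) we can snap to the nearest whole number of
     * periods - that is the "predicted phase" part. */
    uint64_t n         = ((uint64_t)elapsed * F_REF + F_CPU_NOM / 2) / F_CPU_NOM;
    uint64_t predicted = (n * F_CPU_NOM + F_REF / 2) / F_REF;

    int64_t delta = (int64_t)elapsed - (int64_t)predicted;
    return (int32_t)(delta * 100000000LL / (int64_t)predicted);
}

The same routine, pointed at the 1MHz pin instead of the 32kHz pin,
gives the RF231-against-CPU number.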
First the center setting is measured. Then, depending on whether the
setting should be higher or lower, either the lowest or the highest
setting is measured. From those two points the sensitivity (frequency
change per adjustment step) is calculated, the optimum setting is
interpolated, and the result is stored to flash.
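The search itself is then just two measurements and a straight line
(again a sketch; I am assuming the RF231's 4-bit XTAL_TRIM field with
values 0..15, and the driver names are placeholders):

extern void    rf231_set_trim(uint8_t trim);    /* writes XTAL_TRIM */
extern void    flash_store_trim(uint8_t trim);
extern int32_t measure_1mhz_error(void);        /* combined error in
                                                   0.01 ppm, from the
                                                   routine above */

static void calibrate_trim(void)
{
    const int32_t center = 8;

    rf231_set_trim((uint8_t)center);
    int32_t e_center = measure_1mhz_error();

    /* Measure the extreme on the side the correction must come from
     * (the sign convention is assumed here). */
    int32_t far = (e_center > 0) ? 15 : 0;
    rf231_set_trim((uint8_t)far);
    int32_t e_far = measure_1mhz_error();

    /* Sensitivity: error change per trim step. */
    int32_t sens = (e_far - e_center) / (far - center);

    /* Optimum: where the interpolated error crosses zero. */
    int32_t best = (sens != 0) ? center - e_center / sens : center;
    if (best < 0)  best = 0;
    if (best > 15) best = 15;

    rf231_set_trim((uint8_t)best);
    flash_store_trim((uint8_t)best);
}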
1 second to calibrate, +/-5ppm including the test fixture reference,
which is a well-cared-for DS3231. A reasonable OCXO and divider would
make a better reference, but I was pressed for time. Maybe divide down
the 10MHz output of the factory-floor reference (the best counter)?
The work is in the measurement routine; it took me several hours to get
that right. But involving a PC and GPIB would have taken time as well.
Idea:
I was thinking of doing over-the-air calibration by timestamping
packets. The symbol time is 16 usec, but the uncertainty after
despreading is much less, so that should be feasible too. One node
generates a mark, and another node with a good clock says "The mark we
all just heard arrived at time T." Sort of the way IEEE1588 does it.
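The arithmetic on the receiving node is trivial (a sketch, assuming
SFD-capture timestamps in local timer ticks and a reference interval
announced in microseconds):

#include <stdint.h>

/* Local timestamps of two received marks, in this node's timer ticks,
 * compared against the interval between the same two marks as
 * announced by the good-clock node. Returns the local clock error in
 * 0.01 ppm units. */
static int32_t ota_freq_error(uint32_t local_t0_ticks,
                              uint32_t local_t1_ticks,
                              uint32_t ref_interval_us,
                              uint32_t ticks_per_us)
{
    int64_t measured = (int64_t)(uint32_t)(local_t1_ticks - local_t0_ticks);
    int64_t expected = (int64_t)ref_interval_us * ticks_per_us;

    return (int32_t)((measured - expected) * 100000000LL / expected);
}

If the post-despreading uncertainty really is on the order of a
microsecond, two marks ten seconds apart already resolve 0.1 ppm, and
averaging over more marks does better.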
/Kasper Pedersen