I have a TI MSP432P401R board. I have looked through the various other questions related to RTC calibration and I have not found an answer to this seemingly simple question.
The MSP432 SDK driverlib provides two ways to adjust the RTC calibration; the clock source is the external 32 kHz crystal. Both calibration adjustments take a direction of either +1 ppm or -1 ppm and multiply it by an offset value of 1 to 240.
The MSP432 peripheral driverlib User's Guide describes the temperature calibration routine, but rtc.h contains a separate, almost identical function that is not related to temperature. The User's Guide and the header give the same explanation of the function's use, e.g.
//*****************************************************************************
//
//! Sets the specified temperature compensation for the RTC.
//!
//! \param offsetDirection is the direction that the calibration offset will
//! go. Valid values are
//! - \b RTC_C_COMPENSATION_DOWN1PPM - calibrate at steps of -1
//! - \b RTC_C_COMPENSATION_UP1PPM - calibrate at steps of +1
//! \param offsetValue is the value that the offset will be a factor of; a
//! value is any integer from 1-240.
//!
//! This function sets the calibration offset to make the RTC as accurate as
//! possible. The offsetDirection can be either +1-ppm or -1-ppm, and the
//! offsetValue should be from 1-240 and is multiplied by the direction setting
//! (i.e. +1-ppm * 8 (offsetValue) = +8-ppm).
//!
//! \return true if calibration was set, false if it could not be set
//!
//
//*****************************************************************************
extern bool RTC_C_setTemperatureCompensation(uint_fast16_t offsetDirection,
                                             uint_fast8_t offsetValue);
The second function has an identical description, except for the temperature verbiage, and is declared as:
//*****************************************************************************
//
//! Sets the specified calibration for the RTC.
//! ... rest is the same ...
//*****************************************************************************
extern void RTC_C_setCalibrationData(uint_fast8_t offsetDirection,
                                     uint_fast8_t offsetValue);
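For reference, a call to the temperature-compensation variant would look roughly like the sketch below. The include path is my guess at the SimpleLink MSP432 SDK layout; the constant name and return semantics are taken from the header excerpt above.
#include <ti/devices/msp432p4xx/driverlib/driverlib.h>  /* umbrella driverlib header (path assumed) */

void example_temperature_compensation(void)
{
    /* Request -8 ppm: a direction of -1 ppm multiplied by an offsetValue of 8. */
    if (!RTC_C_setTemperatureCompensation(RTC_C_COMPENSATION_DOWN1PPM, 8))
    {
        /* Per the header, false means the compensation could not be set; retry or flag it. */
    }
}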
The TI MSP432 Technical Reference adds a bit of additional information, explaining that a +1 ppm offset will increase the clock frequency, while a -1 ppm offset will decrease it. The relevant information from the Technical Reference:
[Technical Reference excerpt on RTC offset calibration, included as an image in the original post]
That seems to explain that adding to the oscillator frequency will speed the clock up (more ticks per second, so the clock ticks off time faster), and vice versa. However, there is no explanation of how to arrive at how much to compensate.
How to determine the OffsetValue for clock calibration?
This is where I'm confused. I don't know how a calibration of +1 ppm or -1 ppm will impact the clock rate (likely due to unfamiliarity with the jargon; Aero background, not EE). Since I'm looking to compensate for the RTC running roughly 0.4 s too fast over 10 hours, how do you go about arriving at the OffsetValue?
Ideally we have 32768 cycles per second for the oscillator (or one tick per $3.051758 \times 10^{-5}$ seconds). The 0.4 s drift over 10 hours gives a fractional drift of roughly $1.1111 \times 10^{-5}$ (seconds gained per second).
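Spelled out, that is:
$$\text{drift} = \frac{0.4\ \text{s}}{10 \times 3600\ \text{s}} \approx 1.1111 \times 10^{-5}$$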
Where I'm lost is: how does a ppm relate to either value or their difference? Is there a way to take this information and arrive at a proper OffsetValue between 1 and 240 for the clock calibration?
Update
From further looking it seems we can compute the ppm from the actual frequency $F_a$ and the nominal frequency $F_{nom}$, where $F_a = F_{nom} \cdot (1 + \text{drift})$. In this case the actual frequency would be 32768.364088 Hz.
Then to find the PPM difference you would have:
$$\begin{align} \text{PPM} & = \frac{F_a - F_{nom}}{F_{nom}} \cdot 10^6\\ & = \text{drift} \cdot 10^6\\ & \approx 11.1 \end{align}$$
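As a sanity check, the same arithmetic in C (drift_ppm() is just a throwaway helper of mine, not part of driverlib):
#include <math.h>
#include <stdio.h>

/* Convert a measured time error into a ppm magnitude.
 * error_s   = seconds gained (or lost) over the observation window,
 * elapsed_s = length of the observation window in seconds. */
static double drift_ppm(double error_s, double elapsed_s)
{
    return (error_s / elapsed_s) * 1e6;
}

int main(void)
{
    double ppm = drift_ppm(0.4, 10.0 * 3600.0);   /* 0.4 s fast over 10 hours */
    printf("offset magnitude: %.1f ppm (round to %d)\n", ppm, (int)lround(ppm));
    /* prints: offset magnitude: 11.1 ppm (round to 11) */
    return 0;
}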
So roughly 11 ppm. In that case the direction would be -1 and the OffsetValue would be 11? Is that right (sane)?
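In driverlib terms I assume that maps to something like the call below; the direction constant name is my guess by analogy with the temperature macros shown earlier, so check rtc.h for the exact spelling:
/* Clock runs ~11 ppm fast, so pull it down by 11 ppm.
 * RTC_C_CALIBRATION_DOWN1PPM is assumed by analogy with RTC_C_COMPENSATION_DOWN1PPM. */
RTC_C_setCalibrationData(RTC_C_CALIBRATION_DOWN1PPM, 11);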
Result
That was in fact the correct direction and roughly the correct OffsetValue. Given that the exact drift was an "eyeball" measurement of the difference between two running clocks, the true drift was probably closer to 0.2 to 0.3 s over 10 hours, because after adjusting the clock as above it did slow down and the drift was reduced, but it overshot a perfect correction.
After running the same test overnight, this time the clock lost 0.1 s over 12 hours, roughly a four-fold improvement. So instead of an OffsetValue of 11 ppm, the OffsetValue actually needed looks to be between 6 ppm and 9 ppm.
We will re-run the test with an OffsetValue of 7 ppm, and I suspect that will put the clock dead-on, or close enough that over the longest expected runtime of this microcontroller project a user would not notice any drift. Less than 0.1 s of drift over 12 hours is well within acceptable margins here, and much better than the accuracy range of a non-oven-controlled 32 kHz crystal. The reference for both the PPM calculations and the varying tolerances of different oscillators was Clock accuracy in ppm.
If there is anything unexpected in this next test, I'll update, otherwise, this matter is resolved. If someone would like to write up an answer that corrects any of the observations or material referenced here, I will gladly select it.
Further update
Setting the OffsetValue to 5 leaves the clock within 0.1 s slow over 29 hours. I'll test with 4 just to find where the slow-to-fast breakpoint is. But honestly, drift within a fraction of a second over 29 hours is acceptable for the length of time this will run. The ultimate solution, if a longer runtime is wanted, will be to write code to sync the time from an external source daily. The external 32 kHz crystal on this board is fine with a time sync every 24 hours.