Timer.Timer() reliability

Hi,

I'm using a timer to measure a duration in my stretch app.
Naively, I assumed it would be more or less accurate.

After a while I realized that I had forgotten to subtract the time my callback itself takes.
The API Documentation says:
The given callback will be called no sooner than the given number of milliseconds.


Alright, so if my method needs ~50 ms or so, each iteration takes about 1050 ms, which means that every 20 seconds I'm stretching a second longer.
To check whether there really is a measurable difference, I timed it against a stopwatch and found something strange:
The timer was faster than my stopwatch :eek:
Exactly the opposite of the result I expected! Over one minute, it was 3-4 seconds fast.

I'm curious how to avoid that properly. Should I reduce the timer interval dramatically and use getClockTime() to check that a "real" second has elapsed?
Not the most battery-friendly approach.
  • Instead of using timer ticks to calculate elapsed time, how about calling Sys.getTimer() when you start, and again whenever you want to know the elapsed time? If you're recording a .fit file, you can also use timerTime or elapsedTime from Activity.Info.

    There will always be some drift in the actual timer tick intervals.
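
    The suggestion above could be sketched roughly like this in Monkey C (the class and variable names are made up for illustration; Timer.Timer, System.getTimer(), and method() are the actual Connect IQ APIs). The timer tick only drives the display update, while the displayed value is always derived from the wall-clock difference, so tick drift and callback overhead never accumulate:

        using Toybox.Timer;
        using Toybox.System;

        class StretchCounter {
            private var _timer;
            private var _startMs;

            function start() {
                _startMs = System.getTimer();   // milliseconds since device boot
                _timer = new Timer.Timer();
                // The 1 s tick is only a refresh trigger; its intervals are never summed.
                _timer.start(method(:onTick), 1000, true);
            }

            function onTick() {
                // True elapsed time, immune to tick drift and callback duration.
                var elapsedSec = (System.getTimer() - _startMs) / 1000;
                System.println(elapsedSec);     // update the UI from this value instead
            }

            function stop() {
                _timer.stop();
            }
        }

    Even if a tick fires at 1050 ms or 950 ms, the worst the user sees is a momentarily late screen refresh; the count itself stays correct.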