I always thought compute() was called once per second on most devices unless they were compute-bound. I never tested it.
We're seeing that a compute-intensive data field running on an old/slow EDGE 820 is not accumulating a value as quickly as it should. Other devices match the post-ride server calculations, but the 820 is about 20% too low. So it seemed (to me) like the device was bogged down, and I was betting compute() was being called about every 1.2 seconds on it.
So I wrote a little metric, displayed on screen in the bottom left, that shows the interval between compute() calls using "elapsedTime". I expected to see about 1.00 seconds on my EDGE 1030, and about 1.20 on the EDGE 820. The approach is simple: save the prior elapsedTime inside the method I call from compute(), take the difference from the current elapsedTime, and display that value / 1000.0 (elapsedTime is in milliseconds, so this gives seconds).
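Roughly, the measurement looks like this (a stripped-down SimpleDataField sketch; my real field draws the value in onUpdate(), but the interval math is the same, and the class/variable names here are just for illustration):

```
using Toybox.WatchUi;

class IntervalField extends WatchUi.SimpleDataField {
    private var _lastElapsed = null;  // prior elapsedTime (ms)
    private var _interval = 0.0;      // last measured interval (s)

    function initialize() {
        SimpleDataField.initialize();
        label = "interval";
    }

    // compute() is called by the system; measure the gap between
    // successive calls using Activity.Info.elapsedTime (milliseconds).
    function compute(info) {
        if (info.elapsedTime != null) {
            if (_lastElapsed != null) {
                _interval = (info.elapsedTime - _lastElapsed) / 1000.0;
            }
            _lastElapsed = info.elapsedTime;
        }
        return _interval;
    }
}
```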
SURPRISE SURPRISE
In the simulator, it shows a steady 1.00.
On my EDGE 1030, that interval varies every second (i.e., every onUpdate() display refresh) between about 0.25 and 1.3. That seems to imply compute() is being executed anywhere from every 1/4 second to every 1.3 seconds... Does that even make sense?
I'll recheck my code and also capture an average interval value now. But the code is very straightforward, so I don't know why it should show an interval time of less than 1.0.
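For the averaging, something like this should do (again just a sketch with illustrative names; it accumulates the deltas and divides by the count):

```
using Toybox.WatchUi;

class AvgIntervalField extends WatchUi.SimpleDataField {
    private var _lastElapsed = null;  // prior elapsedTime (ms)
    private var _sumMs = 0;           // accumulated deltas (ms)
    private var _count = 0;           // number of intervals seen

    function initialize() {
        SimpleDataField.initialize();
        label = "avg interval";
    }

    function compute(info) {
        if (info.elapsedTime != null) {
            if (_lastElapsed != null) {
                _sumMs += info.elapsedTime - _lastElapsed;
                _count += 1;
            }
            _lastElapsed = info.elapsedTime;
        }
        // average interval in seconds (0.0 until we have one delta)
        return (_count > 0) ? (_sumMs / (_count * 1000.0)) : 0.0;
    }
}
```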