I'm currently working on an application that finds relevant events in accelerometer data. I recently moved the calculations to happen in real time, to avoid tripping the watchdog timer.
I have the same algorithm written in Python, which I know works properly, so I can compare the generated FIT files against its results.
The issue is that, seemingly at random, the results are offset by 0, 25, or 50 samples. There seems to be no pattern to which offset occurs.
e.g.:
given: [65, 80, 100]
expected: [15, 30, 50]
or
given: [30, 65, 70]
expected: [5, 40, 55]
or
given: [10, 15, 25]
expected: [10, 15, 25]
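Since the Python version is the known-good reference, a quick way to characterize the drift is to diff the two index lists. This helper is my own invention (not part of either implementation), shown only to illustrate the comparison:

```python
def index_offsets(given, expected):
    """Element-wise difference between detected and reference event indices."""
    return [g - e for g, e in zip(given, expected)]

print(index_offsets([65, 80, 100], [15, 30, 50]))  # [50, 50, 50]
print(index_offsets([30, 65, 70], [5, 40, 55]))    # [25, 25, 15]
print(index_offsets([10, 15, 25], [10, 15, 25]))   # [0, 0, 0]
```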
I'm pretty stumped. I suspect there may be some sort of delay between initializing the logging and the accelerometer, but that didn't seem to be an issue before I moved to real-time calculations. I'm using a sample rate of 25 Hz and a period of 1 second.
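If that hypothesis is right, the arithmetic works out: at 25 Hz with a 1 second period, each callback delivers 25 samples, so starting the logger one or two whole batches before (or after) the first accelerometer batch would shift every detected index by a multiple of 25. A minimal sketch of that reasoning, assuming the 25-sample batch size above:

```python
BATCH = 25  # 25 Hz sample rate * 1 s period -> 25 samples per callback

def shift_indices(true_indices, extra_batches):
    """Indices as they would appear if `extra_batches` whole batches of
    padding preceded the real data."""
    return [i + extra_batches * BATCH for i in true_indices]

print(shift_indices([15, 30, 50], 2))  # [65, 80, 100], matching the first example
print(shift_indices([10, 15, 25], 0))  # [10, 15, 25], matching the third
```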
Here is some simplified code to get the general idea.
private var curr = 0;
private var results = [];
private var power = [];

public function accelHistoryCallback(sensorData) as Void {
    var x = sensorData.accelerometerData.x;
    var y = sensorData.accelerometerData.y;
    var z = sensorData.accelerometerData.z;
    // compute the magnitude of each sample vector
    var p = [];
    for (var i = 0; i < x.size(); ++i) {
        p.add(Math.sqrt(Math.pow(x[i], 2) + Math.pow(y[i], 2) + Math.pow(z[i], 2)));
    }
    power.addAll(p);
    var slice = power.slice(curr, null);
    // performs calculations on the slice and updates the relevant variables
    calculate(slice, curr);
    // calculate() uses a buffer, so slide back 5 samples
    curr = power.size() - 5;
}
public function calculate(data, curr) as Void {
    // do some calculations; i = index into data, which is a slice of the total data
    // if the calculations find a relevant event:
    results.add(i + curr); // offset by curr to store the index into the full series
}
public function start() as Void {
    // reset state before enabling the sensor so a callback can't fire first
    power = [];
    curr = 0;
    results = [];
    initializeAccel();
    initializeLogging();
    session.start();
}
public function stop() as Void {
    disableAccel();
    session.save();
}
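Since the Python implementation is the known-good reference, it may also help to mirror the Monkey C index bookkeeping there and feed both the same synthetic batches, to separate bookkeeping bugs from timing-of-first-batch bugs. A minimal sketch, assuming calculate() simply thresholds the power signal (the `Detector` class and the threshold are my invention, not the real algorithm):

```python
BUFFER = 5  # calculate() looks back 5 samples, so the last 5 are re-fed

class Detector:
    """Python mirror of the Monkey C callback's index bookkeeping."""

    def __init__(self):
        self.power = []
        self.curr = 0
        self.results = []

    def on_batch(self, magnitudes, threshold=10.0):
        self.power.extend(magnitudes)
        chunk = self.power[self.curr:]   # same as power.slice(curr, null)
        for i, v in enumerate(chunk):
            idx = i + self.curr          # index into the full series
            # dedupe, since the 5-sample overlap re-scans old samples
            if v > threshold and idx not in self.results:
                self.results.append(idx)
        self.curr = max(0, len(self.power) - BUFFER)

d = Detector()
d.on_batch([0.0] * 25)                           # first 1 s batch, no events
d.on_batch([0.0] * 10 + [99.0] + [0.0] * 14)     # event at overall index 35
print(d.results)  # [35]
```

If this mock reproduces the right indices for synthetic input, the offset likely comes from when the first real batch arrives relative to session.start(), not from the slice/curr arithmetic.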