App to learn and recognize specific movement using sensorData.accelerometerData

Former Member
Does anyone know of code samples showing how to put an app in a "learning mode", perform a movement, and then recognize that movement later, recording a timestamp for each occurrence? I've been looking at using sensorData.accelerometerData for this, but I'm not sure how to implement the learning stage.

I've looked at the PitchCounter example, but it would be great to get some guidance on the learning-mode aspects, as well as on how to use Math.FirFilter to define how similar a movement needs to be in order to be counted.
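There is no built-in "learning mode" in Connect IQ, but the matching half of the idea can be prototyped off-device. Below is a minimal plain-Python sketch (not Monkey C) of the two pieces involved: a FIR filter to smooth the samples (the role Math.FirFilter plays on the device) and a normalized cross-correlation score that answers "how similar does the movement need to be". The function names and the 0-to-1 threshold convention are illustrative assumptions, not Connect IQ APIs.

```python
import math

def fir_filter(samples, coeffs):
    """Apply a simple FIR filter to a list of samples.

    This mirrors the kind of smoothing Math.FirFilter performs on the
    device; the coefficients here are whatever you design offline.
    """
    n = len(coeffs)
    return [
        sum(samples[i + j] * coeffs[j] for j in range(n))
        for i in range(len(samples) - n + 1)
    ]

def similarity(template, window):
    """Normalized cross-correlation in [-1, 1]; 1.0 means identical shape.

    Comparing shapes (rather than raw magnitudes) makes the match
    tolerant of how hard a given person performs the movement.
    """
    mt = sum(template) / len(template)
    mw = sum(window) / len(window)
    num = sum((t - mt) * (w - mw) for t, w in zip(template, window))
    dt = math.sqrt(sum((t - mt) ** 2 for t in template))
    dw = math.sqrt(sum((w - mw) ** 2 for w in window))
    return num / (dt * dw) if dt and dw else 0.0
```

Usage idea: record a template during the learning session, then slide a window of the same length over incoming (filtered) samples and count a movement whenever `similarity` exceeds a chosen threshold, e.g. 0.8. The threshold is exactly the "how similar does it need to be" knob the question asks about.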
  • I don't want to pop your balloon, but I don't think CIQ provides a sampling rate fast enough to effectively recognize movement patterns. See this thread
  • Although there are custom accelerometer devices/packages that can integrate the signals to determine velocity and position, they are expensive and the calculations are done in software on a computer. They typically require more than one unit.
  • Former Member

    I do understand that general analysis of accelerometerData is expensive, which is why I was thinking of putting the device in a learning mode to limit the amount of data that needs to be analyzed. RaceQs, what I would like is something like: put the app in "learning mode", pull the main sheet, and stop learning mode. I would then like the app to count, and timestamp, every time I pull the main sheet. The challenge is that a "main sheet pull" varies a lot from person to person and from boat to boat, so I'm not looking for something small like a tap, but for bigger, longer-lasting movements. (I'm not actually building a sailing app; it's just an illustration, since I use RaceQs :) )

    Andy275, I'm fine with doing the movement-pattern analysis in, for example, the mobile app and loading the threshold and filter values back to the watch app once the pattern has been synthesized.

    I haven't found a way to do this yet, and it feels like I'm quite close with accelerometerData and Math.FirFilter, if only I could avoid having to enter the movement-identifying values at compile time...
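The phone-side step described above, turning a few learning-mode recordings into threshold and filter values that get loaded back to the watch, could look something like the following plain-Python sketch. It averages the recordings into a template and derives the acceptance threshold from how well the training recordings themselves match that template. All names, and the 0.05 margin, are illustrative assumptions.

```python
import math

def _corr(a, b):
    """Normalized correlation of two sequences, truncated to equal length."""
    n = min(len(a), len(b))
    a, b = a[:n], b[:n]
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = math.sqrt(sum((x - ma) ** 2 for x in a))
    db = math.sqrt(sum((y - mb) ** 2 for y in b))
    return num / (da * db) if da and db else 0.0

def build_template(recordings):
    """Average several learning-mode recordings of the same movement,
    truncated to the shortest one, into a single reference template."""
    n = min(len(r) for r in recordings)
    return [sum(r[i] for r in recordings) / len(recordings) for i in range(n)]

def pick_threshold(recordings, template, margin=0.05):
    """Accept anything at least as similar as the worst training
    recording, minus a small margin for person-to-person variation."""
    return min(_corr(r, template) for r in recordings) - margin
```

The template and the single threshold number are small enough to ship back to the device app as persisted settings, which avoids having to bake the movement-identifying values in at compile time.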
  • Former Member
    ImxMonkey: As for what you are suggesting with the brief logging session (learning mode), I would think it is possible, so long as you do not run out of memory. Actually constructing all of the filtering you need to carry out recognition later will be a hefty process, and your idea of sending the data back to the phone is a good one.

    One thing I might suggest to make testing your filter-construction process easier would be to initially log repeated samples of the general motions to either a FIT file or, if you can handle the lower rate, an app log file. It is obviously a bit manual, but hopefully gathering that test data should not take too long.
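If the logged samples end up as simple comma-separated lines (timestamp plus x/y/z per sample), the offline testing suggested above could start with a loader like this plain-Python sketch. The `ms,x,y,z` line format is an assumption for illustration, not what a FIT file actually contains; it reduces each sample to a magnitude so the later template matching is orientation-independent.

```python
import math

def parse_log(lines):
    """Parse 'ms,x,y,z' log lines into (timestamp_ms, magnitude) pairs.

    Malformed lines (headers, debug output, truncated rows) are skipped,
    since hand-collected app logs tend to contain some of those.
    """
    samples = []
    for line in lines:
        parts = line.strip().split(",")
        if len(parts) != 4:
            continue
        try:
            t = int(parts[0])
            x, y, z = (float(p) for p in parts[1:])
        except ValueError:
            continue
        # Magnitude of the acceleration vector, ignoring device orientation.
        samples.append((t, math.sqrt(x * x + y * y + z * z)))
    return samples
```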