Dynamic Array Sizes

I'm starting to use the .add() to extend arrays where the size is naturally dynamic as it collects sensor data over time. Rather than declare a fixed size array to the max size. For example, a User Setting to define a chunk size (every 20 secs, every 90 secs, whatever)... And save the avg power over that period in an array. Over a 24 hour race that could end up with 5000 elements. Just an example. Most activities won't need more than a couple hundred. This approach may allow devices with smaller memory footprints to run the data field.

I could add a memory check to prevent exceeding the memory footprint of a field before adding an element to the array. And remove the oldest elements. I'm not sure there is an efficient way to remove say the oldest "x" elements, other than to copy the array into another one? Is there a better way to approach this, using a different structure?

  • If the average is simply over a period of time then you don't need an array.  You can maintain the average like so (as an example):

    _lapSpeedAvgCnt++;
    // running mean: newAvg = oldAvg + (sample - oldAvg) / count
    _lapSpeedAvg = _lapSpeedAvg + ((_info.currentSpeed - _lapSpeedAvg) / _lapSpeedAvgCnt);
  • When you use add() with an array, the array momentarily takes about twice the memory: a new, larger array is created, the contents of the old array are copied into it, and then the old array is released.

  • What's the point? It makes your app more complicated and it increases code size. Why not just allocate the max array size up front, based on the max memory per device?
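
    As a rough Monkey C sketch of that idea (the constant, names, and the "drop when full" policy are mine, just for illustration):

    ```
    // Hypothetical fixed-size buffer, allocated once at startup and never resized.
    const MAX_SAMPLES = 4320;          // e.g. 24 h of 20 s chunks; tune per device
    var _samples = new [MAX_SAMPLES];  // one allocation, no add() reallocation cost
    var _count = 0;

    function addSample(avg) {
        if (_count < MAX_SAMPLES) {
            _samples[_count] = avg;
            _count++;
        }
        // else: buffer full -- drop the sample, or wrap around and overwrite
    }
    ```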

  • jim_m_58 is right. It is very inefficient in terms of both memory usage and execution time to be constantly using Array.add().

    The bigger your array gets, the higher the cost of add().

    Ironically, it seems that you think Array.add() will help you save memory (or store more data on devices with less memory), but actually the opposite is the case.

    I'm starting to use the .add() to extend arrays where the size is naturally dynamic as it collects sensor data over time. Rather than declare a fixed size array to the max size. For example, a User Setting to define a chunk size (every 20 secs, every 90 secs, whatever)... And save the avg power over that period in an array. Over a 24 hour race that could end up with 5000 elements. Just an example. Most activities won't need more than a couple hundred. This approach may allow devices with smaller memory footprints to run the data field.

    I could add a memory check to prevent exceeding the memory footprint of a field before adding an element to the array. And remove the oldest elements. I'm not sure there is an efficient way to remove say the oldest "x" elements, other than to copy the array into another one? Is there a better way to approach this, using a different structure?

    Just so I understand the goal here - are you calculating rolling averages for consecutive chunks of time, and storing them for the duration of the entire activity?

    e.g. If the chunk size is 60 seconds, then you'd calculate and store the rolling average for 0:00 to 0:59, 1:00 to 1:59, etc.

    That's what I gather from this: "Over a 24 hour race that could end up with 5000 elements." i.e. 24 hours * 60 minutes per hour * 60 seconds per minute / 20 seconds per sample = 4320 samples

    My 2 cents:

    1) obviously to calculate a rolling average (with a sliding window), you only need an array whose size is equal to the time window in seconds (assuming that you take one sample per second).

    In this case, there is no need to grow an array dynamically or remove items on the fly. Just use a fixed-size array as a circular queue - wrap the array in a class that keeps track of the current start index and the current size, and manages operations like adding and reading data. (If you need to keep multiple circular queues, you can save even more memory by not using a class at all, but wrapping the array in another array/tuple and using global functions to access the data structure.)
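
    A hedged Monkey C sketch of such a circular queue (class and member names are made up; this is one way to do it, not SDK code):

    ```
    // Fixed-capacity circular queue: once full, add() overwrites the oldest
    // element, so no copying or reallocation ever happens.
    class CircularQueue {
        hidden var _buf;
        hidden var _start = 0;  // index of the oldest element
        hidden var _size = 0;   // current number of elements

        function initialize(capacity) {
            _buf = new [capacity];
        }

        function add(value) {
            var cap = _buf.size();
            if (_size < cap) {
                _buf[(_start + _size) % cap] = value;
                _size++;
            } else {
                _buf[_start] = value;           // overwrite the oldest element
                _start = (_start + 1) % cap;
            }
        }

        // i = 0 is the oldest stored element, i = size() - 1 the newest
        function get(i) {
            return _buf[(_start + i) % _buf.size()];
        }

        function size() {
            return _size;
        }
    }
    ```

    This also answers the "remove the oldest x elements" question: you don't remove anything, the oldest entries simply get overwritten in place.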

    But as pstopanni pointed out, you don't need an array to just calculate an average over a period of time, assuming that none of the "windows" are overlapping.

    e.g. If you really are just calculating non-overlapping chunks (0:00 to 0:59, 1:00 to 1:59, etc.), then you don't need arrays to store the sample data for each chunk.
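
    For the non-overlapping case, all you need is a sum and a count that reset at each chunk boundary. A minimal Monkey C sketch (names and the chunkDone flag are mine, just to show the shape):

    ```
    // Per-chunk running average with no arrays: accumulate, then reset
    // when the chunk (e.g. every 20 s) ends.
    var _chunkSum = 0.0;
    var _chunkCnt = 0;

    function onSample(value, chunkDone) {
        _chunkSum += value;
        _chunkCnt++;
        if (chunkDone) {
            var chunkAvg = _chunkSum / _chunkCnt;
            // ...use or record chunkAvg here...
            _chunkSum = 0.0;
            _chunkCnt = 0;
        }
    }
    ```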

    If the window is sliding (so the next window overlaps with the previous one) - i.e. 0:00 to 0:59, 0:01 to 1:00, etc. - then you will need an array to store the sample data for each chunk.

    2) what exactly are you doing with all these averages?

    With 20 second non-overlapping chunks, after 24 hours you will have stored 4320 averages, but why do you need all that data? Are you taking the average of the chunk averages or something? Obviously if you only wanted to show the latest one to the user, you wouldn't need to store all of them. But in that case, it would make sense for the window to slide once per second instead - e.g. With 60 second chunks, you'd calculate rolling averages for 0:00 to 0:59, 0:01 to 1:00, 0:02 to 1:01, etc., and you'd only need an array of 60 elements.
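
    For that sliding case, a common trick is to also keep a running sum, so each update is O(1): subtract the sample that falls out of the window, add the one that enters it. A rough Monkey C sketch (variable names are made up):

    ```
    // 60-element sliding-window average with an O(1) update per sample.
    const WINDOW = 60;
    var _window = new [WINDOW];  // the last WINDOW samples
    var _pos = 0;
    var _sum = 0.0;
    var _filled = 0;

    function addSample(value) {
        if (_filled == WINDOW) {
            _sum -= _window[_pos];   // drop the outgoing sample
        } else {
            _filled++;
        }
        _window[_pos] = value;       // store the incoming sample
        _sum += value;
        _pos = (_pos + 1) % WINDOW;
        return _sum / _filled;       // current rolling average
    }
    ```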

    If you are taking the average of the "chunk averages", whether you need to store previous chunk averages again depends on whether your "windows" for the 2nd-level average are overlapping or not.

    e.g. Suppose you want to take the average of every 5 chunk averages (for some reason), but not overlapping: e.g. avg(chunk 0 to chunk 4), avg(chunk 5 to chunk 9), etc. In this case, you don't need to store the previous chunk averages

    But if you want to do the same, but with a sliding window - e.g. avg(chunk 0 to chunk 4), avg(chunk 1 to chunk 5), etc - then you would need to store the previous chunk averages (but only enough to fill the window for your 2nd-level average).

    3) assuming that this use case makes sense, the solution is to allocate the max array size possible as flocsy suggested, and to use my suggestion of implementing a circular queue so you can efficiently discard older data.

    If you need to store huge amounts of endlessly growing data in your app for whatever reason, the obvious constraint is the available app memory (as you said) - since there's a limit to that, it makes sense to use a huge fixed-size data structure

    4) I guess you could also try to use Storage as a fallback, but there's a size limit to that too, and it's extremely slow
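
    If you did go that route, it would look something like this (the key naming scheme is mine; Storage values have per-value size limits, so you'd write in batches, not per sample):

    ```
    using Toybox.Application.Storage;

    // Hypothetical spill-to-Storage fallback: persist a batch of chunk
    // averages under a numbered key. Storage is slow, so write rarely.
    function spillBatch(batchIndex, chunkAvgs) {
        Storage.setValue("chunks_" + batchIndex, chunkAvgs);
    }

    function loadBatch(batchIndex) {
        return Storage.getValue("chunks_" + batchIndex);
    }
    ```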

    --

    Just because using Array.add() seems convenient doesn't mean it's the best choice in a highly constrained environment with little memory and a slow CPU. It's funny how a lot of the basic features that Garmin put in Connect IQ are actually not a good idea when app memory is tight - you can save a lot of memory by *not* using these basic features.

  • If the average is simply over a period of time then you don't need an array.

    Yeah but for some unspecified reason, it seems that he wants to keep all the previous results in memory.

  • TL;DR in general you would not need arrays at all (except for rolling averages with sliding windows), and if you did, it would be better to use a fixed-size data structure like a circular queue (implemented with a fixed-size array).

    Regardless of what you're doing, it's not really clear why you'd need to store all 4320 samples for averages of 20 second (non-overlapping) "chunks" over 24 hours, unless you are doing some kind of rolling average of the chunks, where the window is constantly growing (yet the start of the window also slides in an overlapping way).

    It would really help if you clarified exactly what you're trying to calculate here. i.e. *why* do you need to store all the previous chunks?