Memory consumed by Storage.setValue

So I had an 'Out of memory' crash in my Glance code. According to System.getSystemStats().freeMemory, after reading an array of 1500 numbers, free memory went from 21 KB to 14 KB, so the array used around 7 KB. However, while storing the array back (with 14 KB of free memory still available), it crashed with 'Out of memory'. WHAT? As a test, I did a Storage.deleteValue BEFORE doing Storage.setValue and it didn't crash anymore. Weird that a setValue, which is supposed to overwrite what's there in the first place, takes that much free memory... Well, at least now I got it not to crash anymore lol.
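
For reference, the workaround boils down to something like this (a minimal sketch; the key name and types are placeholders, not my actual glance code):

    import Toybox.Application;
    import Toybox.Lang;

    // Sketch of the delete-before-write workaround. Deleting the key first
    // avoided the extra peak that setValue() seems to incur when the key
    // already exists in storage. "myArray" is a placeholder key name.
    function saveArray(values as Lang.Array) as Void {
        Application.Storage.deleteValue("myArray");      // drop the old stored copy first
        Application.Storage.setValue("myArray", values); // then write the new value
    }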

  • After reading the 9015-byte array from storage, free memory goes down by 9024 bytes (note that free memory is always rounded to the nearest 8 bytes - could be that memory is allocated in 8-byte chunks). So far this makes sense.

    After deleting the item from storage, free memory increases by 16 bytes. Not sure what's happening here, except maybe some sort of storage *index* is cached? 16 bytes is not a lot of memory though.

    After setting the value, memory goes down by 16 bytes again. There's that theoretical storage index data again.

    I think it's safe to say that storage isn't cached in application RAM in the way that was suggested.

    There's definitely a lot of overhead for writing data (somehow much more than there is for reading data, unless I am misunderstanding something), whether or not the key already exists. If the key does not exist, though, the overhead is noticeably smaller.

    Idk if Storage.setValue() does something insane like read the entire existing value and keep it in memory while writing the new value? That would be super weird.

    I will say that the "resting" memory usage after each call makes perfect sense, it's just the peak values which make no sense.

  • Regarding the difference in peak memory between:

    1) just calling Application.Storage.getValue

    2) calling Application.Storage.getValue, deleteValue, then setValue (the optimal case)

    The difference is exactly 9 KB, which is exactly the size of the array.

    So maybe I am being stupid and the significant thing is that in case 1) the array isn't in memory yet when getValue is called, whereas in case 2) the array is already in memory by the time deleteValue and setValue run.

    But in the case of 2), shouldn't setValue() be able to work with the array that is already in memory, and therefore have the exact same overhead as 1)?

    I probably have to think about this a bit harder.

    Here are my previous peak memory experiments, restated as code:

        // Assume only *one* of the following functions is called for each of 4 test runs,
        // and assume that "test" is a 9015-byte array in storage
        function test() {
            // do nothing
            // 9.5 KB peak
        }
        function test2() {
            var bytes = Application.Storage.getValue("test"); 
            // 25.3 KB peak
        }
    
        function test3() {
            var bytes = Application.Storage.getValue("test"); 
            Application.Storage.setValue("test", bytes); 
            // 43.0 KB peak
            
            // ??? shouldn't this peak be the same as test2()'s peak?
            // it shouldn't take more memory to write (serialize)
            // a value than to read it.
            // maybe I am missing something (i.e. thinking about it incorrectly)
        }
    
        function test4() {
            var bytes = Application.Storage.getValue("test"); 
            Application.Storage.deleteValue("test");
            Application.Storage.setValue("test", bytes);
            // 34.2 KB peak
    
            // at the very least, this peak should be the same as test3()
            // ...
        }

  • I should probably be comparing peak memory to "resting" memory used to determine the real overhead of the various steps.
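
    Something along these lines is what I mean (just a sketch, the helper names are mine); the simulator's memory viewer still has to supply the peak numbers, since getSystemStats() only reports current usage:

        import Toybox.Lang;
        import Toybox.System;

        // Sketch: record a baseline, then print how much "resting" memory each
        // step retains relative to it. Peak memory still comes from the sim's
        // memory viewer; these prints only show what is still allocated afterwards.
        var baseline = 0;

        function markBaseline() as Void {
            baseline = System.getSystemStats().usedMemory;
        }

        function printDelta(tag as Lang.String) as Void {
            var used = System.getSystemStats().usedMemory;
            System.println(tag + ": +" + (used - baseline) + " bytes over baseline");
        }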

  • So I did some tests and here are my conclusions. (I could be missing something, though.)

    1) The overhead of getValue is about 100% of the data size. [*]

    [*] above and beyond the size of the returned data itself. So if you load a 10 KB array, your memory spike would be about 10 KB (overhead) + 10 KB (for the array itself). But out of that spike, 10 KB will be retained after getValue() returns (assuming you keep the returned data in memory). So in this case I think the overhead is really 100% (not 200%).

    2) The overhead of setValue for a key that exists is 300% of the data size.

    This does *not* include the size of the existing data in memory. Since setValue doesn't return any data, this is pure overhead.

    3) The overhead of deleteValue for a key that exists is 200% of the data size!

    That's right, it costs memory just to delete data. I hope I'm wrong about this.

    4) The overhead of setValue for a key that does not exist is 200% of the data size.

    (This does *not* count the size of the existing data in memory.)

    I think only 1) makes sense. 2) is crazy, 3) is insane, and 4) still seems unacceptable.

    In an ideal world: 2) and 4) would be 100% and 3) would be 0%.

    Unless again I am missing something.
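
    If these numbers hold, a practical rule of thumb is to keep roughly 2x the data size free before a delete-then-write, and roughly 3x before overwriting an existing key. Something like this check (based purely on my measurements above, not on anything documented):

        import Toybox.Application;
        import Toybox.System;

        // Rough guard based on the overheads measured above (not documented
        // behaviour): a delete-then-write peaks at roughly 2x the data size.
        function trySave(key, value, approxSizeInBytes) {
            if (System.getSystemStats().freeMemory < 2 * approxSizeInBytes) {
                return false; // not enough headroom, skip the write for now
            }
            Application.Storage.deleteValue(key); // drops the setValue peak from ~3x to ~2x
            Application.Storage.setValue(key, value);
            return true;
        }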

    Here's the code I used for testing. I had to do a lot of manual commenting out of code so that the numbers for individual test runs were comparable. (Although they don't really need to be, since everything is relative to the used memory at the start of each test run).

    I did use a 9 KB array here. Sorry if that makes things confusing.

    import Toybox.Activity;
    import Toybox.Lang;
    import Toybox.Time;
    import Toybox.WatchUi;
    import Toybox.Application;
    import Toybox.Math;
    import Toybox.System;
    
    /*
    var globalBytes = [
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
        1,2,3,4,5,6,7,8,9,10,
    ]; // 9015 bytes in memory
    */
    
    // DAT file about 9 KB
    
    class TestDFWView extends WatchUi.SimpleDataField {
    
        // Set the label of the data field here.
        function initialize() {
            SimpleDataField.initialize();
            label = "My Label";
    
            // Save the test array before test runs, when necessary.
            // e.g. if you do the tests in order,
            // you would need to save the array before test1()
            // and test4(), since test1() is the first
            // test which needs the array and test3() deletes the array.
            //
            // this line and the array itself are always commented out
            // during the test runs below
            // Application.Storage.setValue("test", globalBytes); 
    
            // tests were performed by uncommenting exactly one of the following lines.
            // Used memory at the beginning of each test run as recorded below
            // differs because I changed some code between test runs :/,
            // but it doesn't matter because overhead is calculated relative
            // to the used memory at the beginning of each test run.
            //
            // test0();
            // test1();
            // test2();
            // test3();
            test4();
        }
    
        function compute(info) {
            return 0.0;
        }
        function printUsedMemory() {
            System.println(System.getSystemStats().usedMemory);
        }
    
        
        function test0() {
            // do nothing
            printUsedMemory(); // 6376
            
            // 8.0 KB peak
            
            // this doesn't really matter
            //
            // it just demonstrates that there's really nothing
            // else in the app that produces big memory spikes
        }
        
        function test1() {
            printUsedMemory(); // 6384
            var bytes = Application.Storage.getValue("test"); 
            printUsedMemory(); // 15408
            return bytes; // prevent bytes from being freed before we can print the memory out
           
            // 23.8 KB peak
    
            // The total cost of calling getValue() seems to be ~23.8 KB - 6384 ~= 24371 - 6384 = 17987
    
            // But since we got a 9015 byte array out of the deal, the real "overhead" (*) is roughly:
            // 24371 - 6384 - 9015 = 8972
            //
            // this is close to 24371 - 15408 = 8963 
            //
            // (*) overhead = memory used above and beyond the size of the 
            // data that's returned
        }
    
        function test2() {
            printUsedMemory(); // 6400
            var bytes = Application.Storage.getValue("test"); 
            printUsedMemory(); // 15424
            Application.Storage.setValue("test", bytes); 
            printUsedMemory(); // 15424
    
            return bytes; // prevent bytes from being freed before we can print the memory out
            // 41.5 KB peak
            
            // total cost of calling setValue in this case is actually ~41.5 KB - 15424 ~= 27072 ~= 3 * 9000 bytes
            //
            // - since we get no data back, 
            // we can say the additional memory is pure overhead: 27072 bytes (3 * the data size)
            // - in a perfect world, the overhead would be 0 or some constant 
            //   (small) size. e.g. a few buffers for serializing data on the fly
            
        }
    
        function test3() {
            printUsedMemory(); // 6504
            var bytes = Application.Storage.getValue("test"); 
            printUsedMemory(); // 15528
            Application.Storage.deleteValue("test");
            printUsedMemory(); // 15512
    
            return bytes; // prevent bytes from being freed before we can print the memory out
            
            // 32.8 KB peak!!!
    
            // This is interesting. Apparently it costs more memory to read and delete a value
            // than it does just to read it...
    
            // here we guess that the overhead for deleting the value was 
            //   ~32.8 KB - 15512 ~= 18075 bytes ~= 2 * 9000 bytes
            // this doesn't make sense at all
        }
    
    
        function test4() {
            printUsedMemory(); // 6504
            var bytes = Application.Storage.getValue("test"); 
            printUsedMemory(); // 15528
            Application.Storage.deleteValue("test");
            printUsedMemory(); // 15512
            Application.Storage.setValue("test", bytes);
            printUsedMemory(); // 15528
    
            return bytes; // prevent bytes from being freed before we can print the memory out
            
            // 32.8 KB peak
    
            // Here we guess the cost of calling setValue was 
            // ~32.8 KB - 15512 ~= 18075 bytes ~= 2 * 9000 bytes
        }
    }

  • I tried to confirm my "200% of data size" overhead for Storage.deleteValue().

    Looks like it's true, unless I am doing something wrong.

        // just for completeness, delete value without ever getting it,
        // to convince ourselves the overhead is truly due to deleteValue
        // and not getValue
        function test5() {
            printUsedMemory(); // 6552
            Application.Storage.deleteValue("test");
            printUsedMemory(); // 6536
            
            // 24.0 KB peak!!!
            // 24 KB - 6552 ~= 18 KB ~= 2 * data size (9 KB)
    
            // Yes, it looks like the overhead of simply *deleting* a value from Storage
            // is about 200% of the data size...
        }

  • Since the simulator presents application storage as a dictionary (and calls it the object store, even though the docs make clear that the object store is the old properties mechanism, not the new Storage), I kinda wondered if CIQ does something crazy like deserializing all of the storage data every time you access one key.

    Doesn't look like it, since reading and writing one integer doesn't have the same overhead as reading, writing and deleting the big 9000-byte array, even though the array is also in storage (control test sketched below).

    Also, the DAT file for the app in the sim is like 100 KB after all this testing, even though the total amount of data is only about 9 KB. I hope real devices don't work like that.
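
    The single-value control looked roughly like this (reconstructed sketch, in the same style as the tests above; the key name is arbitrary). The point is that its peak was nowhere near the array tests', so the overhead scales with the size of the value being accessed, not with the total amount of data in storage:

        // Control test (sketch): read/delete/write a single Number while the
        // 9 KB array is still sitting in storage under its own key. The peak
        // stays small, so the overhead tracks the accessed value's size.
        function testSmallValue() {
            printUsedMemory();
            var n = Application.Storage.getValue("smallKey");
            Application.Storage.deleteValue("smallKey");
            Application.Storage.setValue("smallKey", 42);
            printUsedMemory();
            return n;
        }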

  • I think it kind of makes sense that writing has a bigger overhead than you expected:

    When you pass the array, it has to serialize it. It starts with a small buffer that it appends to: since an array can hold elements of any type (forget strict type checks :), it can't allocate a string (or byte array) of the correct size in advance. So, as you explained last week, growing the buffer gradually means it has to create a bigger one and copy the already-serialized chunk before appending the new value. That copy is probably what takes roughly twice the serialized array's size while the last item is being serialized (roughly the grow-by-copy step sketched at the end of this post).

    But I can't explain how deleting a value before write changes anything.
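
    If that's what the serializer does, the costly moment would look roughly like this grow-by-copy step (pure speculation about the runtime internals, just to illustrate why the copy can briefly need about twice the serialized size):

        import Toybox.Lang;

        // Speculative illustration only: a serializer that grows its buffer by
        // reallocating and copying briefly holds both the old and new buffer,
        // so the final grow step alone can cost ~2x the serialized size.
        function appendByte(buffer as Lang.ByteArray, b as Lang.Number) as Lang.ByteArray {
            var grown = new [buffer.size() + 1]b;  // new, larger allocation...
            for (var i = 0; i < buffer.size(); i++) {
                grown[i] = buffer[i];              // ...plus a full copy of the old data
            }
            grown[buffer.size()] = b;
            return grown; // the old buffer only becomes garbage after this returns
        }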

  • I understand that serializing data has a certain overhead. I don't see why it shouldn't have the exact same overhead as deserializing data.

    Unless I am missing something:

    - getValue (read and deserialize) = 100% overhead (very high but understandable)

    - setValue (serialize and write) of existing key = 300% overhead

    - deleteValue of existing key = 200% overhead

    - setValue of non-existent key = 200% overhead

    Even 200% overhead seems extreme to me.

    "It starts with a small buffer that it appends to: since an array can hold elements of any type, it can't allocate a string (or byte array) of the correct size in advance"

    It could, if two passes were made (a rough sketch of this follows at the end of this post):

    - one pass to determine the total size of the object (including children); ofc this is possible in principle, as the sim memory viewer does it

    - one pass to actually serialize the data

    Perhaps this is a case where memory is sacrificed for efficiency.

    I still don't see how setValue (serialize) can have 2-3 times the overhead of getValue (deserialize).

    Unless I am missing something, the overhead for serialization and deserialization should be symmetrical, unless ofc there is some metadata in storage which makes deserialization more efficient.

    But again, it seems like that sort of metadata could be generated for serialization by adding an extra pass over the data.
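
    A two-pass version could look roughly like this (just a sketch of the idea, definitely not how CIQ actually serializes anything): the first pass walks the value and adds up its encoded size, so the second pass can serialize into a single buffer allocated up front, with no grow-and-copy.

        import Toybox.Lang;

        // Sketch of the sizing pass (the byte counts per type are made up).
        // Pass 2 would allocate one buffer of sizeOf(value) bytes and encode
        // into it at a running offset, so nothing ever has to be reallocated.
        function sizeOf(value) {
            if (value instanceof Lang.Array) {
                var total = 5; // hypothetical tag + element count for the container
                for (var i = 0; i < value.size(); i++) {
                    total += sizeOf(value[i]);
                }
                return total;
            } else if (value instanceof Lang.Number) {
                return 5; // hypothetical tag byte + 4-byte integer
            }
            return 0; // other types left out of this sketch
        }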

  • I added a bug report for this:

    https://forums.garmin.com/developer/connect-iq/i/bug-reports/application-storage-unreasonable-high-overhead-for-deletevalue-200-of-object-size-and-setvalue-200--300-of-object-size 

    Maybe we can get some clarity from the CIQ team on whether this is expected behaviour.

  • Great find, thank you for posting!