Sorry if this has been covered, but I don't think it has.
I have a background service that's pulling data from a REST API. The API was returning a dictionary as JSON data. This works as long as the JSON payload is very small - around 400 bytes. But at around 600 bytes, it fails with -403 (NETWORK_RESPONSE_OUT_OF_MEMORY).
If I change the request to expect 'text/plain', and change the response to match, it succeeds. Of course, now I have a blob of unparsed JSON that I have to deal with.
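For reference, here's a minimal sketch of that workaround in Monkey C - the URL, function names, and the "Accept" header are my own placeholders, not anything from the SDK docs:

```
using Toybox.Communications;
using Toybox.System;

function requestData() {
    var options = {
        :method => Communications.HTTP_REQUEST_METHOD_GET,
        // Ask for plain text so the SDK hands back the raw string
        // instead of parsing it into a Dictionary in the background process.
        :headers => { "Accept" => "text/plain" },
        :responseType => Communications.HTTP_RESPONSE_CONTENT_TYPE_TEXT_PLAIN
    };
    Communications.makeWebRequest("https://example.com/api/data", null, options, method(:onReceive));
}

function onReceive(responseCode, data) {
    if (responseCode == 200) {
        // data is the unparsed JSON string - parse it manually here.
        System.println(data);
    } else {
        System.println("Request failed: " + responseCode);
    }
}
```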
It seems to me that the background process is running out of memory while parsing the JSON response into a dictionary object. Has anyone else run into this?
The dictionary has 38 key/value pairs - not an unreasonable amount. Obviously, none of the values are very big if the entire JSON blob is only 600 characters.
The server-side code is mine, so I can certainly change it to something easy to parse - like a comma-separated list. It seems like a shame to do that when the SDK is supposed to take care of converting data types to/from JSON, but if it does that in a really memory-inefficient way, I may not have a choice.
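If I went that route, parsing on the watch side could look something like this - a rough sketch assuming a made-up "key=value,key=value" format, using only the String find()/substring() calls:

```
// Hypothetical format: "key1=val1,key2=val2,..."
function parseList(text) {
    var dict = {};
    var remaining = text;
    while (remaining.length() > 0) {
        var comma = remaining.find(",");
        var pair;
        if (comma == null) {
            // Last pair in the string.
            pair = remaining;
            remaining = "";
        } else {
            pair = remaining.substring(0, comma);
            remaining = remaining.substring(comma + 1, remaining.length());
        }
        var eq = pair.find("=");
        if (eq != null) {
            dict.put(pair.substring(0, eq), pair.substring(eq + 1, pair.length()));
        }
    }
    return dict;
}
```

Building the dictionary incrementally like this might keep peak memory lower than whatever the SDK's JSON path is doing, but that's just a guess on my part.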
This is a data field, so perhaps there are very tight memory constraints (as opposed to apps/widgets). I'm testing on a Fenix 5X in the simulator, but I know the SDK documentation says the background process has even less memory available than the data field itself.
Are the background process memory limits documented anywhere? Has anyone else run into this yet? Thanks...