makeWebRequest call crashes Garmin Connect Mobile

I wrote an app that uses makeWebRequest to load JSON data. When I run it in the simulator everything works well and I get my results back.

When I run it on my watch (a vivoactive HR) with the latest version (3.11) of iOS GCM on the latest iOS 10, I get back a -2 error. If I have GCM running in the foreground on my phone, I can see it crash as soon as the request is made.

How do I figure out why GCM is crashing? Is there anything I can do differently in my app? I know there were bugs with GCM on iOS 10 when requesting web data, but the most recent build seems to have fixed that (at least for the built-in weather widget).

Here is an example of the data that it is getting back:
http://tidesandcurrents.noaa.gov/api/datagetter?begin_date=20161006%2000%3A00&end_date=20161006%2006%3A00&station=8454000&product=predictions&datum=mllw&units=english&time_zone=lst&application=web_services&format=json


and the code:
function requestNextTideData()
{
    var begin_date = startTimes[iTimes];
    var end_date = endTimes[iTimes];

    System.println("makeRequest()");
    var url = "tidesandcurrents.noaa.gov/.../datagetter";
    var station = stations[iStation]["id"];
    System.println("begin_date = " + begin_date + ", end_date = " + end_date + ", station = " + station);

    // Kick off the request; onReceive is called asynchronously with the response
    // code and the parsed JSON (as a dictionary).
    Comm.makeWebRequest(
        url,
        {
            "begin_date" => begin_date,
            "end_date" => end_date,
            "station" => stations[iStation]["id"],
            "product" => "predictions",
            "datum" => "mllw",
            "units" => "english",
            "time_zone" => "lst_ldt",
            "application" => "web_services",
            "format" => "json"
        },
        {
            :method => Comm.HTTP_REQUEST_METHOD_GET,
            :responseType => Comm.HTTP_RESPONSE_CONTENT_TYPE_JSON
        },
        method(:onReceive));

    System.println("makeRequest() done");
}
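
For context, here is roughly what the callback wired up via method(:onReceive) looks like (a minimal sketch, not my exact handler); the -2 comes back as the responseCode there:

// Minimal sketch of the response callback registered above with method(:onReceive).
// responseCode is the HTTP status on success (200); errors arrive as negative values
// (this is where I see the -2 on the watch).
function onReceive(responseCode, data)
{
    System.println("onReceive: responseCode = " + responseCode);
    if (responseCode == 200) {
        // data is the parsed JSON as a dictionary
    }
}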
  • Try adding headers

    Hi,

    I had problems with web requests too, and I found I had to add headers to the request. I've modified your code below (but not tested it).

    Cheers,

    Chris

    function requestNextTideData()
    {
        var begin_date = startTimes[iTimes];
        var end_date = endTimes[iTimes];

        // Explicit headers for the GET request
        var headers = {"Content-Type" => Comm.REQUEST_CONTENT_TYPE_URL_ENCODED, "Accept" => "application/json"};

        System.println("makeRequest()");
        var url = "tidesandcurrents.noaa.gov/.../datagetter";
        var station = stations[iStation]["id"];
        System.println("begin_date = " + begin_date + ", end_date = " + end_date + ", station = " + station);
        Comm.makeWebRequest(
            url,
            {
                "begin_date" => begin_date,
                "end_date" => end_date,
                "station" => stations[iStation]["id"],
                "product" => "predictions",
                "datum" => "mllw",
                "units" => "english",
                "time_zone" => "lst_ldt",
                "application" => "web_services",
                "format" => "json"
            },
            {
                :headers => headers,
                :method => Comm.HTTP_REQUEST_METHOD_GET,
                :responseType => Comm.HTTP_RESPONSE_CONTENT_TYPE_JSON
            },
            method(:onReceive));
        System.println("makeRequest() done");
    }
  • Thanks for that idea. I tried it and it still works in the simulator, but it's still failing on the watch itself.

    I've also tried making my request windows smaller to reduce the amount of data that I'm getting in any single call (in case there is an unpublished maximum when doing web requests over Bluetooth), but it still fails with small requests.
  • It's the toString() issue mentioned here:
    https://forums.garmin.com/showthread.php?361379-Invalid_http_body_in_request&highlight=makewebrequest

    My problem went away when I did a toString() on the station id. It's slow now, but it works.
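
    Roughly, the change was just converting the station id to a string before building the parameters (a sketch, not my exact code):

    // Sketch: pass the station id to makeWebRequest as a String instead of a Number.
    // Only the parameter dictionary changes; the rest of requestNextTideData() is as posted above.
    var station = stations[iStation]["id"].toString();

    var params = {
        "begin_date"  => begin_date,
        "end_date"    => end_date,
        "station"     => station,          // .toString() is the fix
        "product"     => "predictions",
        "datum"       => "mllw",
        "units"       => "english",
        "time_zone"   => "lst_ldt",
        "application" => "web_services",
        "format"      => "json"
    };

    Comm.makeWebRequest(
        url,
        params,
        {
            :method => Comm.HTTP_REQUEST_METHOD_GET,
            :responseType => Comm.HTTP_RESPONSE_CONTENT_TYPE_JSON
        },
        method(:onReceive));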
  • Doing makeWebRequest()/makeJsonRequest() over BLE is very slow in comparison to what you see in the sim, and that's the way it is.

    Making calls on a PC over wifi will be a lot faster than a watch going over BLE to a phone and then out over a cell network and back. Bluetooth isn't that fast...
  • Doing makeWebRequest()/makeJsonRequest() over BLE is very slow in comparison to what you see in the sim, and that's the way it is.


    I wasn't expecting it to be anywhere close to PC speed, and I know that BLE is limited to around 200 kbps. However it's taking 7-12 seconds to load under 3 KB of data, and it's hard to describe that as anything but very slow (it works out to about 2-3 kbps, two orders of magnitude slower than BLE's limit).
  • Former Member over 8 years ago
    The results you are describing do seem a bit slow, but 200 kbps is quite optimistic.

    Here is a good article about BLE throughput
    https://punchthrough.com/blog/posts/maximizing-ble-throughput-on-ios-and-android

    It shows maximum throughput on various phones ranging from about 3,000 to 16,000 bytes/second.

    Another factor that could be affecting your performance is iOS limiting connections from backgrounded applications. Garmin Connect Mobile services these requests, and latency increases significantly when it is in the background on iOS devices.
  • Thanks Brian, that is a helpful document for showing the theoretical differences between devices. I am testing on iOS.

    Does the watch support having multiple makeWebRequest calls pending at once, to maximize throughput instead of paying a full round trip per request? I'm having to break my request up into 4 requests because otherwise I hit maximum object exceptions in the JSON parser, but I know all 4 requests up front and could make them simultaneously.

    That brings up another issue, which is that it would be helpful to know how multithreaded CIQ and Monkey C really are. Async callbacks like those on makeWebRequest make it appear so, but there is no mention of locking anywhere in the documentation. Is the watch app run on a message pump so that I don't need to worry about locking? If I pend multiple makeWebRequest calls, will they complete in a defined order, or could it be any order?
  • Former Member over 8 years ago
    You can make multiple requests at the same time, but there is a maximum of about 3 pending before you will start to see queue-full responses.

    Monkey C is not multithreaded. All asynchronous callbacks complete on the same execution thread for an application. However, I do not believe multiple makeWebRequest calls are guaranteed to complete in the order they were made. If the first request takes longer to respond than the second, I would expect the second to return first.
  • Thanks. I switched to this model and it is a bit faster (27 seconds total to get 12 KB of data, vs about 40 seconds previously). The maximum number of objects is the real limitation here, because the JSON parser hits that limit when given large arrays of string data. Sample JSON data that I'm getting can be seen here:
    http://tidesandcurrents.noaa.gov/api/datagetter?begin_date=20161006%2000%3A00&end_date=20161006%2006%3A00&station=8454000&product=predictions&datum=mllw&units=english&time_zone=lst&application=web_services&format=json

    It contains about 120 strings, which eats about half of the object space available to my app. As a result I can only afford to parse 6 hours of this JSON data at a time. I parse the JSON and convert each value to a float or number, which appear to be "free" when packed into large arrays (an array of strings, moments, or other objects is expensive). There is a rough sketch of that conversion at the end of this post.

    I can confirm that responses can come back in a different order than the requests were made. Interestingly they all seem to arrive very close together (within about a second of each other), but it takes a lot longer (~20 seconds vs ~10 seconds) for me to get the first response.

    A feature request -- it would be nice to be able to pass a parameter into makeWebRequest that was passed back into my completion function. An alternative would be a language feature that let me make these functions programmatically. I had to do an ugly hack to work around this so that I could pass some state related to the request into the response function.
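
    For reference, the conversion step mentioned above looks roughly like this (a sketch, not my exact code; it assumes the response dictionary has a "predictions" array of {"t", "v"} entries, as in the NOAA sample linked above):

    // Sketch: keep a packed array of Floats instead of the parsed JSON structures.
    // Assumes data["predictions"] is an array of dictionaries like
    // {"t" => "2016-10-06 00:00", "v" => "3.857"}, per the NOAA sample response.
    function onReceive(responseCode, data)
    {
        if (responseCode == 200 && data != null) {
            var predictions = data["predictions"];
            var heights = new [predictions.size()];
            for (var i = 0; i < predictions.size(); i++) {
                // Convert the "v" string to a Float; a Float array is much cheaper
                // in object count than keeping the strings/dictionaries around.
                heights[i] = predictions[i]["v"].toFloat();
            }
            // ... store heights for this time window and kick off the next request ...
        } else {
            System.println("request failed: " + responseCode);
        }
    }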
  • A feature request -- it would be nice to be able to pass a parameter into makeWebRequest that was passed back into my completion function. An alternative would be a language feature that let me make these functions programmatically. I had to do an ugly hack to work around this so that I could pass some state related to the request into the response function.


    Could you do something like giving each request its own handler?

    onReceive1(), onReceive2(), onReceive3() for example, and then in each of those call a common function and pass it a parameter saying which request it was?
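
    Something like this (an untested sketch; "onReceiveCommon" is just a placeholder name):

    // One thin wrapper per request, all funneling into a common handler that also
    // gets the request index, so the response can be matched to its time window.
    function onReceive1(responseCode, data) { onReceiveCommon(0, responseCode, data); }
    function onReceive2(responseCode, data) { onReceiveCommon(1, responseCode, data); }
    function onReceive3(responseCode, data) { onReceiveCommon(2, responseCode, data); }
    function onReceive4(responseCode, data) { onReceiveCommon(3, responseCode, data); }

    function onReceiveCommon(requestIndex, responseCode, data)
    {
        // requestIndex says which request/time window this response belongs to,
        // even though responses can come back in any order.
        // ... handle data for slot requestIndex ...
    }

    When making the requests you would pass the matching callback: method(:onReceive1) for the first time window, method(:onReceive2) for the second, and so on.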