"A Note to Developers" and the va6

Ok, I started to add the va6 to my apps, and here's a bit of what I found. Not a bug, just a bit more detail.

I'm using the 8.1.1 SDK.

With the va6, unlike the va5 and other devices, a long press of the lower button does not trigger onMenu.

So I added the code mentioned in the va6 announcement for setActionMenuIndicator() to initialize in the main view in my app.  Here's what that indicator looks like: the line on the right side of the screen, in the middle.

I use a BehaviorDelegate in the app, and all I added there was this:


	function onActionMenu() {
		return onMenu();
	}

In the app, onMenu() is used directly on other devices; it builds and pushes a Menu2.
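Putting those pieces together, here's a rough sketch of the whole thing. The option keys and the "has" guard are from my reading of the 8.1 docs, so double-check them, and MyMenu2Delegate stands in for whatever Menu2 delegate your app already has:

	import Toybox.WatchUi;
	import Toybox.Lang;

	class MyMainView extends WatchUi.View {
		function initialize() {
			View.initialize();
			// Only devices with the new API (like the va6) have this,
			// so guard it for multi-device builds.
			if (WatchUi.View has :setActionMenuIndicator) {
				setActionMenuIndicator({:enabled => true});
			}
		}
	}

	class MyDelegate extends WatchUi.BehaviorDelegate {
		function initialize() {
			BehaviorDelegate.initialize();
		}

		// va6: fired by the left swipe on the action menu indicator
		function onActionMenu() {
			return onMenu();
		}

		// Other devices: long press of the lower/menu button.
		// Builds and pushes the app's Menu2 as usual.
		function onMenu() {
			var menu = new WatchUi.Menu2({:title => "Menu"});
			menu.addItem(new WatchUi.MenuItem("Item", null, :item, null));
			// MyMenu2Delegate is the app's own Menu2InputDelegate subclass (not shown)
			WatchUi.pushView(menu, new MyMenu2Delegate(), WatchUi.SLIDE_UP);
			return true;
		}
	}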

Then in the sim, starting at the indicator on the right, I drag to the left to trigger this.  The sim does seem to be kind of touchy as far as speed and angle, so it might take a couple tries to get it right the first time.

  • that's probably why it doesn't work (at least in the sim)

    It works for me in the sim (Windows and Mac). Not sure why it's crashing for you, but that does sound like a serious bug.

  • At first it looked like it worked. The indicator is displayed, and if I ignore the incorrect positions for all kinds of areas (like the buttons), then opening it also works.

    But then I started to add my logic, which means I want the indicator and the trigger to work at some times in the same view and not be there at other times. Before the activity is started, I want the user to be able to open the menu (similar to what Jim did); while the activity is active I want to disable it, but if the activity is paused, enable it again.

    And the problems start there. From the documentation it sounds like passing {:enabled=>false} or null should behave the same way, but I don't think they do.
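    To make the difference concrete, this is roughly the toggling I'm trying to do; the call sites are just wherever the app tracks activity state, and only the setActionMenuIndicator() calls are the point:

    	// Before the activity starts, or while it's paused:
    	setActionMenuIndicator({:enabled => true});

    	// While the activity is running: per my reading of the docs,
    	// either of these should disable the indicator, but in the
    	// 8.1.1 sim they don't seem to behave the same:
    	setActionMenuIndicator({:enabled => false});
    	setActionMenuIndicator(null);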

    Also, calling it multiple times makes it crash in some cases (not my app, but the simulator). And this "some cases" is 100% reproducible: it depends on where I call it (i.e. the constructor, onLayout, onShow, or other functions), and also on whether I accidentally call it twice with the same value, or call it twice with different values one after the other.

    Because of the crashes I refactored parts of my code, and that made it a bit better, so one could argue I had bugs in my app. But I'd argue, as you wrote, that there is some serious bug in there. Even if I call it from the wrong place (and let's say the docs get updated to state that it must not be called before onLayout, or whatever it may be), it should not crash even my app (at most the action menu shouldn't work), and certainly not the simulator.

    Anyway, even if I got this to work, GPS still doesn't work in the va6, so I decided not to include va6 in my next release. I feel it would harm my app's reputation more than a user asking for va6 support would.

  • Sorry, when I said "it works" I just meant it doesn't crash for me in the one case I tried. It was intended to be more like "huh weird why it crashes in some cases and not others" rather than "this feature is working 100% and you can't convince me otherwise!!!!1!"

    I didn't mean to minimize the obvious frustration you are encountering with this feature (and va6 in general).

  • I've tested a number of real apps with the logic I posted in the sim, and they are now in the app store.  If someone with a real va6 finds something with that logic, I'll look into it, but for now, no complaints or issues.

  • I've tested a number of real apps with the logic I posted in the sim, and they are now in the app store.  If someone with a real va6 finds something with that logic, I'll look into it, but for now, no complaints or issues.

    Yeah it's up to you, I really don't care.

    I was only saying that clearly *Garmin* thinks that when the user swipes left on an action menu indicator in Vivoactive 6, the user expects an action menu to be opened. All indications are that this is exactly what will happen in the native UI on a real device.

    I think the common-sense reason should be entirely obvious ("action menu indicators" are for "action menus", not other kinds of menus), but if that's not enough, the documentation for onActionMenu() tells the dev to push an action menu using showActionMenu(). Like, do you really think it's a coincidence that all 3 of the following functions have the word "action" in their names:

    - setActionMenuIndicator(): enable or disable action menu indicator

    - onActionMenu(): triggered when the user swipes left on the action menu indicator. the doc for this literally tells the dev to call showActionMenu(): "Invoke WatchUi.showActionMenu to push an action menu."

    - showActionMenu(): pushes an action menu view
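    For what it's worth, here's roughly what that intended wiring looks like as I read the docs. This is a sketch from memory, so treat the exact constructor options as assumptions:

    	class MyDelegate extends WatchUi.BehaviorDelegate {
    		function onActionMenu() {
    			var menu = new WatchUi.ActionMenu({:theme => null});
    			menu.addItem(new WatchUi.ActionMenuItem({:label => "Save"}, :save));
    			menu.addItem(new WatchUi.ActionMenuItem({:label => "Discard"}, :discard));
    			// Exactly what the doc says: "Invoke WatchUi.showActionMenu
    			// to push an action menu."
    			WatchUi.showActionMenu(menu, new MyActionMenuDelegate());
    			return true;
    		}
    	}

    	class MyActionMenuDelegate extends WatchUi.ActionMenuDelegate {
    		function initialize() {
    			ActionMenuDelegate.initialize();
    		}
    		function onSelect(item) {
    			// handle item.getId() here
    		}
    	}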

    You're the one who was saying just recently that it's better for CIQ apps to act in ways that users are familiar with (with respect to button / touch input). I think that on Vivoactive 6, all the native code which displays an action menu indicator will open an action menu, not some other kind of menu (i.e. a full-screen menu). Obviously there must still be ways to open a full-screen menu, such as opening device settings from the activity list, or opening activity settings from...wait for it...an action menu (just like the Strength activity on FR955.) 

    I'm aware that CIQ apps do things differently from the built-in UI all the time. And if that's what you want to do, that's fine. I was only pointing out that if anyone pushes a full-screen menu from onActionMenu(), it will almost certainly be different than the behaviour of the built-in UI.

    And of course users probably won't complain; they don't know or care how things are supposed to work. If some little thing is different from what they're used to, they'll just shrug their shoulders. Again, I know so many runners who don't know that you hold UP for the settings / context-sensitive menu on a 5-button watch, and Garmin obviously realized this, as in the past few years they invented so many ways to open the settings / context-sensitive menu without holding UP.

    But the point is being consistent with the built-in behaviour leads to a better user experience, whether users are consciously aware of it or not.

    There's a CIQ device app with glance (former widget) where the pages are scrolled by pressing START, but UP and DOWN do nothing. Nonetheless, the UI has the standard vertically arranged page indicator dots on the left side of the screen, as if you're scrolling vertically. Even better, if you swipe up, it "scrolls up" (goes up one dot to the previous page), and if you swipe down, it "scrolls down" (goes down one dot to the next page). Obviously this is backwards, as swiping up is supposed to scroll down and swiping down is supposed to scroll up, and that's how it works with built-in apps. Furthermore, it makes zero sense that pages can be scrolled by swiping up and down, but pressing the UP and DOWN buttons does nothing.

    It's still a good app, and I bet nobody has complained. But I notice these discrepancies every time I open the app.

    Finally, why do you think Garmin got rid of onMenu / the menu button gesture in VA6 in the first place? Clearly it's because they want only one way to open a context-sensitive menu, and only one kind of context-sensitive menu, because it's simpler that way. And because the presence of the unique indicator will always let users know when a context-sensitive menu is available. So if a 3rd party app does something else other than open an action menu when the user swipes on the action menu indicator, that really seems to go against what Garmin is trying to do here.

    As a counter example, every modern 5-button watch uses hold UP to display a full-screen settings / context-sensitive menu, when one is available. There are a few exceptions, like the map pan/zoom page, where holding UP just pans/zooms real quick.

    So what if I made an app where holding UP on a 5-button watch opens an action menu? That wouldn't be "wrong", I can do whatever I want with my own app. But it would be a little weird compared to all the built-in apps and glances. On my FR955, holding UP never opens an action menu. 

  • With regard to usability, I don't know if things have changed a lot, but isn't it standard to be able to use buttons for everything you can do with touch? Why has Garmin ignored that, for example with the touch-and-hold gesture? I like their thinking in behavior programming a lot, as it saves a lot of time to respond to behaviors instead of catching the raw interactions, but they should somehow comply with industry standards imho. To give an example: for me, every touch device has problems in humid conditions. When I wear a raincoat, touch, touch-and-hold, and swipe all get triggered when the coat gets wet, so I need to disable touch in those conditions with the lock option. And if I want to do something while things are wet, touch won't work for me either, so I need to be able to use the buttons. That's why there are buttons imho.

  • @beva Garmin agrees with you, if you buy the more expensive/profitable 5 button watches ...