supporting both physical button and touch input

I find it very hard to support "correct"/user friendly input for an app across different devices.

One example:

I made it work with BehaviorDelegate. As an example, let's look at onSelect. It works fine on devices with a physical select button.

But it also "works" on devices without a select button. For example, on Edge devices (ee2) a touch of the screen generates onSelect. This could be OK, though I don't like it.

And on watches (e.g. fr965) that both have a touch screen and a physical select button, it kind of works: whether I click the button or touch the screen anywhere, an onSelect event is generated.

However, I'd like to be more user friendly and know where the screen was tapped, so I can react accordingly. This could be achieved by disabling onSelect (for example by removing the onSelect method using an annotation): then onTap is called when the screen is touched, which is good, but when the user clicks the physical button, nothing happens.

Ideally I would hope to be able to disable the translation of taps to onSelect from code. Or that onTap would be called before onSelect. But to my knowledge neither of these is possible. Are they?
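To make concrete what I mean, here's a rough sketch (untested, class and handler names are just illustrative) of handling raw input instead of the onSelect behavior: onTap gives the tap coordinates, and onKey catches the physical select button, assuming onSelect is not implemented so taps aren't translated away:

```monkeyc
import Toybox.Lang;
import Toybox.WatchUi;

// Sketch of the idea, not a drop-in solution: handle raw taps for
// coordinates and raw key presses for the physical select button,
// instead of relying on the onSelect behavior (which swallows both).
class TouchAwareDelegate extends WatchUi.BehaviorDelegate {

    function initialize() {
        BehaviorDelegate.initialize();
    }

    // Called on touch devices when the screen is tapped
    // (only if onSelect is NOT implemented; otherwise the tap
    // is translated to onSelect first).
    function onTap(clickEvent as WatchUi.ClickEvent) as Boolean {
        var coords = clickEvent.getCoordinates(); // [x, y]
        // react based on where the screen was tapped...
        return true;
    }

    // Called when a physical key is pressed.
    function onKey(keyEvent as WatchUi.KeyEvent) as Boolean {
        if (keyEvent.getKey() == WatchUi.KEY_ENTER) {
            // treat the physical select button as "select"
            return true;
        }
        return false;
    }
}
```

The catch, as described above, is that the per-device key-to-behavior mapping isn't documented, so whether KEY_ENTER is what the select button actually reports has to be verified per device.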

So the only way I can think of to support "both" touch and physical buttons is to add a stupid, unnecessary user setting where they can disable the physical buttons. When they choose this setting, onSelect returns false => onTap will be called.

Does anyone have a good solution for this?

  • Yes, this (and the fact that I don't have 100 physical devices to test on) is exactly why I asked whether [some of this] is documented somewhere, other than anecdotal evidence from years ago, from comments or memories of developers...

  • It's even worse than I thought. Even on devices I have, e.g. fenix6. According to https://developer.garmin.com/connect-iq/reference-guides/devices-reference/#f%C4%93nix%C2%AE66solar6dualpower there are "6" buttons: enter, up, menu, down, clock, esc. "Of course" only 4 of the 5 physical buttons take part (the light button doesn't). I had no idea about the clock button, and my guess was that it's a long press of the back button (as on the real device that usually brings up the watch face - even if not the one I selected, just one of the built-in ones). However, running the Input sample in the simulator surprised me: a long press on the down button is mapped to KEY_CLOCK. Yet the real device doesn't do anything when I long press the down button... Very strange.

    Also: forums.garmin.com/.../bug-the-lap-and-start-stop-buttons-are-switched-in-the-simulator-on-edgeexplore2

  • Does onSwipe up/down work for you on vivoactive? In the simulator (6.4.2) it only gives me left and right swipes - even in the Input sample.

  • As unbelievable as this is going to sound, the OG vivoactive only supported horizontal swipes, not vertical swipes. I never saw one in person, but I deduced this from the behavior in the sim and from videos of the UI, where all scrolling was horizontal instead of vertical.

    This is one of those cases where using the behavior delegate callbacks (e.g. previous page, next page) is better than handling "raw" input. For vivoactive, a left swipe should trigger the next-page callback, and a right swipe should trigger the previous-page callback (which is ofc different from pretty much every other Garmin touchscreen watch, which uses up/down swipes for next/previous page).
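    A minimal sketch of that approach (untested; the point is just that the firmware translates whatever the device's "next/previous page" gesture or button is into these callbacks, so you don't touch raw swipes at all):

    ```monkeyc
    import Toybox.Lang;
    import Toybox.WatchUi;

    // Let the firmware map the device-specific gesture/button to the
    // logical "next page" / "previous page" behaviors.
    class PageDelegate extends WatchUi.BehaviorDelegate {

        function initialize() {
            BehaviorDelegate.initialize();
        }

        function onNextPage() as Boolean {
            // left swipe on the OG vivoactive; other gestures or the
            // down button on other devices
            return true;
        }

        function onPreviousPage() as Boolean {
            return true;
        }
    }
    ```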

  • That doesn't help in my case, because in my app I need both left/right and up/down, and the BehaviorDelegate maps left/up to onPrevPage and right/down to onNextPage even on devices that have physical up/down buttons and where swipes work in all 4 directions. And since I see that it doesn't have onDrag yet either, the only logical thing I can do is exclude it from my app.

  • Yeah I'm saying vivoactive physically (or otherwise intrinsically) doesn't support vertical swipes, for whatever reason, not just at a CIQ level. Don't quote me on this but it might have something to do with the touchscreen hardware.

  • I made this TouchAwareBehaviorDelegate gist: https://gist.github.com/flocsy/40f4d1082187a35336f3c4813e1ddccc

    You'll get the idea, and if you need more actions you can easily add them.

  • the BehaviorDelegate maps the left, up to onPrevPage and right, down to onNextPage even on devices that have physical up, down buttons and the swipe works to all 4 directions.

    Sorry to revisit this a year later, but since you bumped this thread, that simply is not true.

    Try the SDK Input sample on fr955 or fenix7s (5-button watches with touch, supporting 4-direction swipes, both of which existed a year ago when that comment was made).

    You will see that:

    - swipe left maps to no behaviour

    - swipe right maps to onBack

    - swipe up maps to onNextPage

    - swipe down maps to onPreviousPage

    This pretty much reflects the way those swipes work on most screens in the native UI. (The point, obviously, is to allow your app to behave similarly to the native UI.)

    The same is true for vivoactive5, a touchscreen device with only 2 buttons (no buttons for scrolling).

    If I'm not mistaken, it should only be vivoactive and fr630 that map swipe left to onNextPage and swipe right to onPreviousPage. (fr630 supports all 4 swipe directions; its native UI just uses swipe left and right to change pages.)

    Ofc that doesn't mean onNextPage / onPrevPage are sufficient or suitable for your stated goal of needing to detect both left/right and up/down swipes. They are obviously good for the purpose reflected by their names: detecting whether the user triggered the logical "next page" or "previous page" actions (via swipe or button press), in a context where those actions make sense.

  • Yes, the Input SDK sample is a good way to understand input!  I was just going to mention that!

    So now what I need is a way to determine on which devices I can display the setting to disable touch. In other words, I need a way to differentiate between "hybrid" and "mandatory touch" devices. Ideally this would be some data I can dig up with a script from compiler.json and simulator.json, or from debug.xml.

    My current idea is to look in simulator.json at .keys and see whether there are (key) behaviors for all 3 behaviors I use in my app: onSelect, onMenu, onBack. If yes, then it's a hybrid, in which case I can display the option to disable touch.

    But I'm not sure if this will work for all devices, and there might be a better way. Do you have any idea?
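    The runtime half of what I have in mind would be something like this (sketch only; isTouchScreen is a real DeviceSettings field, but HAS_KEY_BEHAVIORS is a hypothetical per-device constant I'd have to generate at build time from simulator.json, e.g. via a jungle file, since I don't know of a runtime API that tells me whether the key behaviors exist):

    ```monkeyc
    import Toybox.Lang;
    import Toybox.System;

    // Generated per device at build time (assumption): true when
    // simulator.json shows key behaviors for onSelect/onMenu/onBack.
    const HAS_KEY_BEHAVIORS = true;

    function shouldShowDisableTouchSetting() as Boolean {
        var settings = System.getDeviceSettings();
        // Only "hybrid" devices qualify: a touch screen AND physical
        // buttons covering the behaviors the app needs. isTouchScreen
        // alone can't distinguish hybrid from touch-only devices.
        return settings.isTouchScreen && HAS_KEY_BEHAVIORS;
    }
    ```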