supporting both physical button and touch input

I find it very hard to support "correct"/user-friendly input for an app across different devices.

One example:

I made it work with BehaviorDelegate. As an example, let's look at onSelect. It works fine on devices with a physical select button.

And it also "works" on devices without a select button: for example, on Edge devices (ee2) a touch of the screen generates onSelect. This could be OK, though I don't like it.

And on watches (e.g. fr965) that have both a touch screen and a physical select button, it kind of works: whether I press the button or touch the screen anywhere, an onSelect event is generated.

However, I'd like to be more user friendly and know where the screen was tapped, so I can react accordingly. This could be achieved by disabling onSelect (for example, by removing the onSelect method using an annotation): onTap is then called when the screen is touched, which is good, but when a user presses the physical button, nothing happens.

Ideally, I would hope that I could disable the translation of taps to onSelect from the code, or that onTap would be called before onSelect. But to my knowledge, neither is possible. Is either?

So the only way I can think of supporting "both" touch and physical buttons is to add a stupid, unnecessary user setting that disables the physical buttons. When the user chooses this setting, onSelect would return false, so onTap would be called.

Does anyone have a good solution for this?

  • Implement onTap() and onKey(), but not onSelect().

    In onKey(), treat KEY_ENTER and KEY_START as the "select" button.

    In one of my apps, I also have code which maps the following key values to KEY_ENTER, but honestly I'm not sure if it's still necessary: 45, 47 and 49.
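    A minimal sketch of that approach (the handleSelect/handleTapAt helper names are mine, i.e. hypothetical; the KEY_ENTER/KEY_START handling is as described above):

        using Toybox.WatchUi;

        class MyInputDelegate extends WatchUi.BehaviorDelegate {
            function initialize() {
                BehaviorDelegate.initialize();
            }

            // Handle "select"-style physical keys here and return true;
            // return false for every other key so the friendlier
            // BehaviorDelegate callbacks (onBack, onMenu, ...) still fire.
            function onKey(keyEvent) {
                var key = keyEvent.getKey();
                if (key == WatchUi.KEY_ENTER || key == WatchUi.KEY_START) {
                    handleSelect(); // hypothetical helper
                    return true;
                }
                return false;
            }

            // With onSelect() left unimplemented, taps reach onTap(),
            // so the tap coordinates are available.
            function onTap(clickEvent) {
                var coords = clickEvent.getCoordinates(); // [x, y]
                handleTapAt(coords[0], coords[1]); // hypothetical helper
                return true;
            }
        }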

  • Is there any documentation of which devices have which keys, and which physical key generates which key number? In the WatchUi documentation (https://developer.garmin.com/connect-iq/api-docs/Toybox/WatchUi.html) I can only see key numbers up to 23. And if I go this way, it means that I'll need to implement the mapping from physical keys to behaviors that is otherwise done by the BehaviorDelegate (not only for onSelect, which was one example, but also for onMenu, onBack, onPreviousPage and onNextPage).

  • class MyBehaviorDelegate extends BehaviorDelegate {
        var mWasOnSelect = false;

        function onTap(clickEvent) {
            if (mWasOnSelect) {
                mWasOnSelect = false;
                // ... handle as a "select" (onSelect already fired for this event)
                return true;
            } else {
                // ... handle as a plain tap
                return true;
            }
        }

        function onSelect() {
            mWasOnSelect = true;
            return false;
        }
    }

  • I thought about this direction, but on devices that have a physical select button and no touch screen this wouldn't work (though that could maybe be fixed with excludeAnnotations generated per device), nor would it work on a watch with a touch screen when I press the physical button (e.g. fr965).
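    For what it's worth, per-device exclusion can be sketched roughly like this (the annotation name and device ID here are illustrative, and the exact jungle syntax should be checked against the Connect IQ docs):

        // In the delegate: compiled out on devices that exclude :buttonSelect
        (:buttonSelect)
        function onSelect() {
            // ...
            return true;
        }

    and in monkey.jungle:

        # touch-first handling on this device: drop the onSelect override
        fr965.excludeAnnotations = buttonSelect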

  • Is there any documentation of which devices have which keys and which physical key generates which key number? In the documentation of WatchUi (https://developer.garmin.com/connect-iq/api-docs/Toybox/WatchUi.html) I can only see key numbers up to 23.

    No, you have to guess / test it in the sim. I think for most devices, the START/STOP button should be KEY_ENTER.

    As far as the va3 hack goes, it's been so long that I have no idea if it's still valid (or if it ever was), but I can't imagine I did it for no reason. A comment in my code does indicate that it's an "attempt" to work around a problem, so I'm not even sure if it helped. Honestly, one of the problems I had was that my app was both a device app and a widget, and the va3 doesn't allow its sole button/key to be passed to widgets (since when you're in a widget, the key always takes you back to the watchface.) There was a point when I didn't realize that, so this workaround may have been a fruitless attempt to get around that issue, and thus completely useless and irrelevant. Sure wish I had written a longer comment / git log lol.

    However, I did try the Input sample (device app) with va3 in the sim, and when I press the sole key, KEY_ENTER is detected. However, it doesn't fire onSelect as it would on a five button watch. On va3, onSelect is only triggered by tapping the watch screen.

    Maybe just ignore that hack unless you find yourself needing it.

    And if I go this way it means that I'll need to implement the mapping that is otherwise done by the BehaviorDelegate from physical keys to behaviors. (Not only for onSelect, which was one example but for onMenu, onBack, onPreviousPage, onNextPage)

    I don't think that's true at all. In onKey(), handle the select button(s) (return true), and for every other key, return false, so you can handle them using the friendlier BehaviorDelegate callbacks.

    You may even like your custom "select key" behavior better than the default onSelect behavior on some devices, such as va3, where the sole key triggers KEY_ENTER but not onSelect.

  • So you have to check

    Toybox.System.DeviceSettings.isTouchScreen

    but I don't know what isTouchScreen returns if the watch has a touch screen but the user has switched it off.
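    A runtime check would look something like this (with the same caveat as above: I haven't verified what isTouchScreen reports when the user turns the touchscreen off in the watch settings):

        using Toybox.System;

        function supportsTouch() {
            // DeviceSettings.isTouchScreen is a Boolean
            return System.getDeviceSettings().isTouchScreen;
        }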

  • I think the onKey solution is simpler and more robust. If you don't like the behavior of onSelect(), don't use it, but emulate the desired subset of the same behavior in onKey().

  • One thing that's important: make sure that each callback in the delegate returns the proper true/false. Real devices (starting with the va3, if I recall) could get into an odd state if you don't, and you won't see this in the sim.

    If you want to support touch only, button only, and touch/button, it is helpful to have one of each type to test with, IMHO.

    The va3, with only one button, gets interesting. A screen tap is seen as onSelect, onBack is a swipe to the right, and onMenu is a touch and hold.

    For other devices, you may only want to pay attention to these: onSelect, onBack, and onMenu, since users of those devices are already used to how they interact elsewhere, and that includes button-and-touch devices where the user may favor one or the other. A user may be used to a tap to do a select, or to the upper-right button, and you probably want to be consistent with whichever they use.

    Here for example are the callbacks for an app I first did on the original va, and runs fine on the latest button/touch devices as well as button only devices.

        // assumes: using Toybox.WatchUi as Ui;
        function onSelect() {
            // handle this
            Ui.requestUpdate();
            return true;
        }

        function onMenu() {
            // handle this
            Ui.requestUpdate();
            return true;
        }

    Keep in mind it's not so much how you want things to work, but what users of that device expect because it works the same way in other things.

  • Keep in mind it's not so much how you want things to work, but what users of that device expect because it works the same way in other things.

    There's plenty of native or big 3rd party apps on 5-button touchscreen devices (like Forerunner 955) which respond to touch differently depending on where you tap the screen, yet also activate a specific function when you press START, which isn't necessarily the same as the tap action.

    An easy example is Spotify (which is a CIQ app). Pressing START activates the control next to the START button, while tapping on the screen activates the control you tapped. Yes, it's a music provider and not a device app, but I'm only speaking from the perspective of the end user, who has the ability to tap on different parts of the screen to activate different behavior.

    Another example is most native glances which have a chart, such as the heart rate and steps glances. If you tap on a specific part of the chart, the value at the point you tapped is displayed. But on the HR glance's chart, pressing START opens a popover menu. This is an example where tapping and START do completely different things.

    If flocsy uses onSelect() as you suggested, then it's impossible for him to implement behavior such as that described above, where tapping on a specific part of the screen does one thing, but pressing START does another.

    In other words, just because the CIQ API implementation chooses to map a tap to onSelect() for the convenience of developers, doesn't mean that native apps always interpret a tap the same as pressing START, nor does it mean that users necessarily expect that.

    If you want to support touch only, button only, and touch/button. it is helpful to have one of each type to test with. IMHO.

    There's no such thing as a "touch only" Garmin device (that's supported by CIQ) last time I checked, at least as far as watches go.

    As far as (CIQ) watches go, it's more like this:

    1) buttons with mandatory touch

    (like vivoactive and venu series, where touch is required for certain actions such as scrolling, but buttons are still used for certain things.) This covers most (if not all) Garmin "lifestyle" watches (supported by CIQ) and Forerunner 630. In this case, it's not possible to completely turn off the touchscreen (I don't count *locking* the touchscreen to mean the same thing, since at some point, to use crucial features of the watch, you have to unlock the touchscreen.)

    2) buttons only

    (e.g. Forerunner 245). These are usually 5 button watches, with the exception of 920XT. This covers most older Garmin running, multisport and outdoor watches.

    3) buttons with optional touch

    (these are usually, if not always, 5-button watches, e.g. Forerunner 255/265). This covers most current Garmin running, multisport and outdoor watches. In this case, it is possible to turn off the touchscreen; when you do so, you don't lose any crucial functionality, just some "nice-to-have" features in the native glances and music providers.

    Yes, in the case of 1), it's possible that certain apps may never register a button press at all, even though the device has a button. For example, the vivoactive 3 only has one button, and when you're in a widget, that button always returns you to the watchface. So CIQ widgets on the vivoactive 3 will never see a button press. I still don't think that counts as a "touch only" (device).

  • I know that, but that's not the point. For example, my fr965 has both touch (and it's enabled) and 5 keys, so I want the user to be able to use whichever is more intuitive for them. Anyway, I think it works now using what FlowState recommended.