I find it very hard to support "correct"/user-friendly input for an app across different devices.
One example:
I made it work with BehaviorDelegate. As an example, let's look at onSelect. It works fine on devices with a physical select button.
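For reference, a minimal sketch of the kind of delegate I mean (class and view wiring are just illustrative):

```
import Toybox.Lang;
import Toybox.WatchUi;

class MyInputDelegate extends WatchUi.BehaviorDelegate {
    function initialize() {
        BehaviorDelegate.initialize();
    }

    // Fires for the physical select/start button; on touch devices a tap
    // anywhere on the screen is also translated into this behavior.
    function onSelect() as Boolean {
        // do the "select" action here
        return true;
    }
}
```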
And it also "works" on devices without a select button: for example, on Edge devices (ee2) a touch of the screen generates onSelect. This could be OK, though I don't like it.
And on watches (e.g. the fr965) that have both a touch screen and a physical select button it kind of works: whether I press the button or touch the screen anywhere, an onSelect event is generated.
However, I'd like to be more user-friendly and know where the screen was tapped, so I can react accordingly. This could be achieved by disabling onSelect (for example, by removing the onSelect method with an annotation): onTap is then called when the screen is touched, which is good, but when a user presses the physical button, nothing happens.
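Roughly what I mean for the touch case (a sketch: with no onSelect handler in the delegate, taps arrive at onTap with their coordinates):

```
import Toybox.Lang;
import Toybox.WatchUi;

class TouchInputDelegate extends WatchUi.BehaviorDelegate {
    function initialize() {
        BehaviorDelegate.initialize();
    }

    // No onSelect() here, so screen taps are delivered to onTap() instead.
    function onTap(clickEvent as WatchUi.ClickEvent) as Boolean {
        var coords = clickEvent.getCoordinates(); // [x, y] of the tap
        // react based on where the screen was tapped
        return true;
    }
}
```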
Ideally, I'd hope to be able to disable the translation of taps into onSelect from code, or to have onTap called before onSelect, but to my knowledge neither of these is possible. Are they?
So the only way I can think of to support "both" touch and physical buttons is to add a stupid, unnecessary user setting where the user can disable the physical buttons. When they choose this setting, onSelect returns false, so onTap gets called.
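Something like this sketch, where "useTouch" is a hypothetical property I'd have to add to the app settings, and the fall-through to onTap is the behavior described above:

```
import Toybox.Application;
import Toybox.Lang;
import Toybox.WatchUi;

class SettingAwareDelegate extends WatchUi.BehaviorDelegate {
    function initialize() {
        BehaviorDelegate.initialize();
    }

    function onSelect() as Boolean {
        // "useTouch" is a hypothetical user setting I'd have to define.
        var useTouch = Application.Properties.getValue("useTouch");
        if (useTouch == true) {
            // Report the event as unhandled so a tap should fall through
            // to an onTap() like the one in the previous sketch.
            return false;
        }
        // Physical-button mode: do the "select" action here.
        return true;
    }
}
```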
Does anyone have a good solution for this?