Monday, January 13, 2014

A Method for More Intelligent Touch Event Processing

Link to slide deck: http://goo.gl/y4Mx4G

Link to Java source code (Main.java): https://goo.gl/T9iwhL   NOTE: Lines end with \n only, not \r\n

The above PDF slide deck summarizes ideas for reducing how often unintended UI widgets are accidentally activated on touch devices.

The Java unit test draws a semi-transparent overlay graphic which visualizes which widget would be activated by a touch at each pixel of a mockup e-mail app.

Summary

• Desktop pointing devices (mice) have precise, single-pixel accuracy - touch devices do not


• Depending on device attributes, touch users are lucky to achieve an accuracy of 10-30+ pixels


• This causes many occurrences of the user intending to activate widget A but inadvertently activating nearby widget B


• This problem exists because touch device and OS OEMs assume that the legacy desktop single-pixel precision model will work well on touch devices - this is a poor assumption


• My recent experiment suggests that the frequency of inadvertent widget activations (touch events mapped to unintended widgets) can be reduced


• The slide deck linked above summarizes a project I did over this past weekend. It demonstrates that, for one simple UI at least, an algorithm which maps touch event (x,y) points to widgets by considering both touchpoint-to-widget centroid distances and which widget's bounding rectangle contains the touch point can give the user a parameterizable/tunable margin-of-error border around widgets, and that this has the potential to substantially reduce activation of unintended widgets (see the code sketch after this list)


• I might add that activating an unintended widget can also be dangerous if that widget were to, for example, open a malicious URL or e-mail


• More work is needed to evaluate and refine the proposed method in a variety of UI contexts, but I believe the presented algorithm has merit
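The slide deck and Main.java above contain the actual implementation; what follows is only a minimal sketch of the idea in Java. The Widget class, the MAX_DIST threshold, and the containment-first priority are illustrative assumptions on my part, not necessarily the exact identifiers or policy used in Main.java.

import java.awt.Point;
import java.awt.Rectangle;
import java.util.Arrays;
import java.util.List;

class TouchMapper {

    static class Widget {
        final Rectangle rect;                  // widget bounding rectangle
        Widget(Rectangle r) { rect = r; }
        Point centroid() {                     // center of the bounding rectangle
            return new Point(rect.x + rect.width / 2, rect.y + rect.height / 2);
        }
    }

    static final int MAX_DIST = 40;            // tunable margin of error: max centroid distance, in pixels

    // Returns the widget a touch at (x, y) should activate, or null if no widget is close enough.
    static Widget mapTouch(List<Widget> widgets, int x, int y) {
        // 1) If a widget's bounding rectangle contains the touch point, activate that widget.
        for (Widget w : widgets) {
            if (w.rect.contains(x, y)) return w;
        }
        // 2) Otherwise fall back to the widget with the nearest centroid, provided it lies
        //    within MAX_DIST pixels; this gives each widget a margin-of-error border.
        Widget best = null;
        double bestDist = MAX_DIST;
        for (Widget w : widgets) {
            double dist = w.centroid().distance(x, y);
            if (dist < bestDist) {
                bestDist = dist;
                best = w;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // Two nearby widgets, e.g. a checkbox and a button in an e-mail list row
        Widget checkbox = new Widget(new Rectangle(10, 10, 20, 20));
        Widget button   = new Widget(new Rectangle(60, 10, 80, 20));
        List<Widget> widgets = Arrays.asList(checkbox, button);
        // A touch just outside the checkbox's rectangle still maps to the checkbox
        System.out.println(mapTouch(widgets, 38, 20) == checkbox);  // prints: true
    }
}

With MAX_DIST set to 0 the fallback never fires and the behavior degenerates to the usual rect-contains routing, which is what makes the margin tunable per device or per UI.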


So, extremely weary of using touch device UIs which frequently activate the wrong widget, I spent some time this weekend developing and validating (on one simple UI) the simple algorithm above, which imho offers a better approach for mapping touch event (x,y) coordinates to UI widgets.

Below is a screenshot from this mini weekend project. It shows a semi-transparent overlay which encodes, via color, the widget to which a touch event at each pixel would be mapped if the current naive Widget.rect.contains( Point pt ) logic were replaced with a simple algorithm based on touch point distance to widget centroids plus which widget's bounding rectangle contains the touch point (x,y).

Original Google+ post on this topic

Below is a screenshot from the unit test's touch-to-widget event mapping code:




As can be seen in the above screenshot, touch points anywhere within the semi-transparent red circle for the top checkbox would be routed to that checkbox. This provides users with a margin of error which should reduce the frequency of the wrong/unintended widget receiving the touch event. Note that the left and right overlays use two different centroids for the checkboxes - left: the centroid of the checkbox alone; right: the centroid of the table cell which contains the checkbox widget.
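For reference, an overlay like the one in the screenshot could be generated roughly as follows. This is an illustrative sketch only, not the actual unit test code, and it builds on the hypothetical TouchMapper sketch shown earlier in this post.

import java.awt.Color;
import java.awt.image.BufferedImage;
import java.util.List;

class OverlayRenderer {
    // Paints every pixel with a semi-transparent color identifying the widget that
    // TouchMapper.mapTouch() would route a touch at that pixel to (alpha = 96 of 255).
    static BufferedImage renderOverlay(int width, int height, List<TouchMapper.Widget> widgets) {
        BufferedImage overlay = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);
        Color[] palette = { new Color(255, 0, 0, 96), new Color(0, 255, 0, 96),
                            new Color(0, 0, 255, 96), new Color(255, 255, 0, 96) };
        for (int y = 0; y < height; y++) {
            for (int x = 0; x < width; x++) {
                TouchMapper.Widget w = TouchMapper.mapTouch(widgets, x, y);
                if (w != null) {
                    // One color per widget, cycling through the palette if there are more widgets than colors
                    overlay.setRGB(x, y, palette[widgets.indexOf(w) % palette.length].getRGB());
                }
            }
        }
        return overlay;  // draw this image over the mockup UI to get the visualization
    }
}

Pixels left fully transparent are those which would not be routed to any widget under the tunable margin.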

Comments are welcome, either below or via email: 2to32minus1@gmail.com



Copyright © 2013-2014 Richard Creamer - All Rights Reserved
