Touch Input System


TL;DR

Custom gesture recognition and support for touch picking with multiple cameras.

Rationale

One of the first systems I worked on was the Touch Input System. Before writing my own, I took the time to research and evaluate some off-the-shelf solutions. I experimented with Unity’s XRInteractionToolkit package but found it was too high level, didn’t support all the gestures or interactions I wanted, wasn’t easily extendable, and was still in preview anyway. Next, I looked into the third-party package TouchScript. It seemed like it could do most of what I wanted and was extendable, but it represented the addition of a large codebase relative to its purpose. This meant that if I ran into any difficulties it would be hard to sift through the mountains of foreign code, and if I ran into any bugs I would be at the mercy of the developer of a package that was no longer in active development. I wanted the simplicity of a purpose-built system that solved just the problems we had, with the flexibility to modify and adapt the system as needed. For all these reasons I decided to work on a custom input system.

Features

At the time of writing, the custom system supports single- and multi-finger taps, a single-finger press, single- and multi-finger drags, pinching, and twisting. It includes 3D touch picking with support for multiple layers and cameras, which is useful when views are overlaid on top of each other (for instance, a mini-map on top of an AR view). The system is low level and is not a state machine, meaning multiple gestures can occur simultaneously with no coupling between them. More complex gesture relationships (like allowing only the exclusive rotation (twist) or scaling (pinch) of an object) become the responsibility of the consumers of the touch input system. This keeps the touch input system relatively app-agnostic and extendable.
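To give a rough idea of how picking across multiple cameras can work, here is a minimal sketch (not the project's actual code) that tries each camera from front-most to back-most and raycasts only against the layers that camera renders. The TouchPicker class and TryPick method names are hypothetical; only the Unity engine calls are real.

// Minimal sketch of 3D touch picking across multiple cameras
// (e.g. a mini-map camera layered over an AR camera).
using System.Linq;
using UnityEngine;

public static class TouchPicker
{
    // Returns the first collider hit for a screen-space touch position,
    // trying cameras from front-most (highest depth) to back-most.
    public static bool TryPick(Vector2 screenPosition, out RaycastHit hit)
    {
        // Front-most cameras get the first chance to consume the touch.
        var cameras = Camera.allCameras.OrderByDescending(c => c.depth);

        foreach (var camera in cameras)
        {
            // Skip cameras whose viewport does not contain the touch
            // (e.g. a mini-map occupying only a corner of the screen).
            var viewportPoint = camera.ScreenToViewportPoint(screenPosition);
            if (viewportPoint.x < 0f || viewportPoint.x > 1f ||
                viewportPoint.y < 0f || viewportPoint.y > 1f)
                continue;

            // Only pick objects on layers this camera actually renders.
            var ray = camera.ScreenPointToRay(screenPosition);
            if (Physics.Raycast(ray, out hit, Mathf.Infinity, camera.cullingMask))
                return true;
        }

        hit = default;
        return false;
    }
}

A consumer of the input system could call TryPick with a gesture's touch position and then decide which gestures to honor on the hit object, keeping policy like "pinch or twist, but not both" out of the input system itself.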

