
The True Meaning of Multi-Touch

Tue, 08/31/2010 - 12:17pm
Larry Mozdzyn, Ocular Inc.
Despite what you may read in articles or hear in advertisements, the typical gesturing on most smartphones is not a true multi-touch implementation. A gesture-only touch controller interprets the relative movement of one or two touches on the panel and reports that movement as one of a set of predefined gestures. For example, the “pinch” gesture, where the thumb and forefinger move together to resize a window, seems to imply a true multi-touch event, but in actuality the controller only recognizes the movement of those touches relative to each other. If the relative movement fits the predefined characteristics of a pinch, then a pinch event is reported by the controller. In the case of true multi-touch, data such as touch location, touch area, touch angle and other touch attributes are reported in detail for every contact. Simple gesture-only touch controllers do not have the sophistication or processing power to provide this enhanced functionality.
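
To make the distinction concrete, consider the following sketch, which contrasts the two reporting styles in C. The structures and field names here are hypothetical, invented purely for illustration; real controllers use vendor-specific register maps and report formats, typically delivered over I2C or as USB HID reports.

/*
 * Hypothetical illustration: a "gesture only" controller collapses
 * everything into a single predefined event, while a true multi-touch
 * controller reports each contact in detail and leaves the
 * interpretation to the host.
 */
#include <stdio.h>
#include <stdint.h>

/* What a gesture-only controller might report for a pinch. */
typedef enum { GESTURE_NONE, GESTURE_PINCH, GESTURE_FLICK } gesture_t;
typedef struct {
    gesture_t type;      /* which predefined gesture was matched    */
    int16_t   magnitude; /* e.g. relative change in finger spacing  */
} gesture_event_t;

/* What a true multi-touch controller might report per contact. */
typedef struct {
    uint8_t  id;         /* tracking ID, stable while the finger stays down */
    uint16_t x, y;       /* precise location on the panel                   */
    uint16_t area;       /* contact area (a rough pressure proxy)           */
    int16_t  angle;      /* orientation of the contact ellipse, degrees     */
} touch_point_t;

typedef struct {
    uint8_t       count;      /* how many fingers are currently down */
    touch_point_t points[10]; /* one detailed record per finger      */
} multi_touch_report_t;

int main(void)
{
    gesture_event_t g = { GESTURE_PINCH, -42 };          /* fingers moved together */
    multi_touch_report_t m = { 2, { { 0, 310, 420, 35,  12 },
                                    { 1, 590, 460, 30,  -8 } } };

    printf("gesture-only: type=%d magnitude=%d\n", g.type, g.magnitude);
    for (int i = 0; i < m.count; i++)
        printf("multi-touch:  id=%d x=%d y=%d area=%d angle=%d\n",
               m.points[i].id, m.points[i].x, m.points[i].y,
               m.points[i].area, m.points[i].angle);
    return 0;
}

The essential point is that the gesture-only report has already discarded the per-finger detail by the time it reaches the host, while the true multi-touch report hands over every contact's data and leaves the interpretation to the application.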

A true multi-touch UI can recognize multiple touch points with much greater precision and resolution than a single touch or gesturing-enabled panel. The true multi-touch panel senses multiple, simultaneous touches, and the controller identifies the precise location of each touch on the screen, independently tracks each touch's movement and duration, calculates the speed and direction of a moving touch and performs other functions that enrich the user interface. The richer, more compelling nature of a true multi-touch interface is convincing design engineers to implement true multi-touch panels in a broad range of applications, many of which have never used a touchscreen interface before.
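
As a rough illustration of what that detailed report stream enables, the sketch below computes the speed and heading of one finger from two successive frames, matching the contact across frames by its tracking ID. Again, the types and function names are hypothetical rather than any particular controller's API.

/*
 * Minimal sketch of host-side tracking made possible by true
 * multi-touch reporting: because each contact keeps its tracking ID
 * from frame to frame, the host can compute per-finger speed and
 * direction.
 */
#include <stdio.h>
#include <math.h>
#include <stdint.h>

typedef struct {
    uint8_t  id;           /* tracking ID reported by the controller */
    float    x, y;         /* position in millimetres                */
    uint32_t timestamp_ms; /* time the frame was captured            */
} touch_sample_t;

/* Speed (mm/s) and heading (degrees) of one finger between two frames. */
static void touch_velocity(const touch_sample_t *prev,
                           const touch_sample_t *curr,
                           float *speed, float *heading_deg)
{
    float dt = (curr->timestamp_ms - prev->timestamp_ms) / 1000.0f;
    float dx = curr->x - prev->x;
    float dy = curr->y - prev->y;

    *speed       = (dt > 0.0f) ? sqrtf(dx * dx + dy * dy) / dt : 0.0f;
    *heading_deg = atan2f(dy, dx) * 180.0f / 3.14159265f;
}

int main(void)
{
    /* Two frames of the same finger (id 0), 20 ms apart. */
    touch_sample_t prev = { 0, 30.0f, 40.0f, 1000 };
    touch_sample_t curr = { 0, 33.0f, 44.0f, 1020 };

    float speed, heading;
    touch_velocity(&prev, &curr, &speed, &heading);
    printf("finger %d: speed=%.0f mm/s heading=%.0f deg\n",
           prev.id, speed, heading);
    return 0;
}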

The Crystal Touch line of capacitive touchscreens from Ocular is capable of a true multi-touch user interface. Pictured here, a Crystal Touch panel is packaged with an ARM processor in Ocular's Casca…

Since true multi-touch is a superset of single touch and gesturing, incorporating display panels today that can implement a true multi-touch UI gives manufacturers a viable and cost-effective platform for years to come. New multi-touch inputs and actions will certainly evolve and become an expected part of the UI for certain applications, especially those that have not featured touch panel UIs in the past.

For example, the human/machine interface for most medical imaging systems has traditionally depended on a keyboard and a mouse or roller-ball device. Giving such a system a true multi-touch interface would not only make the operator more efficient and effective, it would also enrich the visual experience for the patient, enhancing his or her understanding of the images on the screen. Shackling a medical imaging system to a single touch panel would unnecessarily hamper how the system can be improved and enhanced in the future, when new types of onscreen multi-touch actions and gestures are introduced and users begin to require them.

The smartphone has certainly taught the industry that a highly capable and compelling platform, one that engages and draws users into an intimate relationship with the system, can quickly evolve into the basis for a wide range of applications that were not even imagined when the platform was first developed. The application store or application library is now a permanent fixture in the marketplace, and a feature that most consumers have come to expect and demand. System suppliers that can stock their stores with a wide range of software applications, add-on modules and plug-ins will have a decided advantage over their competitors. Limiting a system to a single touch UI would restrict the applications that could be deployed on that platform in the future; fewer apps in the store is not good for the system supplier.

First-generation touch smartphones used a single touch interface augmented with gestures and single touch actions. It was the smartphone, however, that began to blur the definitions: because gestures such as pinch, zoom and flick were featured and used two fingers, smartphone users assumed they were working with a multi-touch interface.

The sky’s the limit
Now that smartphones have conditioned most consumers to the many benefits of a touch panel UI, the logical question becomes: what’s next? And the honest answer is: the sky’s the limit.

Some of the possible applications for a true multi-touch UI may seem futuristic and somewhat removed from today's applications, but with true multi-touch capabilities the UI does not limit the imaginations of the designers developing advanced systems in areas such as online gaming, military and training simulations, medical imaging systems, architectural and other computer-based design systems, drawing and drafting applications, and more.

It is important to remember that when the two-finger gesturing interface was introduced, there were relatively few applications that took advantage of it; now there are hundreds. Since the hardware building blocks – the sophisticated panel controllers and the high-resolution, crystal-clear projected capacitive touchscreens – can be deployed at attractive price points compared to single touch solutions, product developers are in a good position to integrate advanced functionality into their systems today and rest assured they will be able to accommodate the most advanced applications in the future. The same cannot be said for single touch panels. Not every product needs a true multi-touch interface to meet its design criteria, which is why product designers need to understand the difference between a single touch and a true multi-touch interface and design accordingly.

A touchy subject
The distinction between true multi-touch, single touch, gesturing and other words that are often thrown around indiscriminately is not a mere matter of semantics. Without clearly understanding the differences among these terms, a design team might deploy a touch panel solution today only to find out later that the system cannot be easily and cost-effectively enhanced in the future with new features and functionality, or that the system cannot run the newest applications because it does not support a true multi-touch UI. Learning the true definitions will benefit not only product designers, but consumers as well.