Cutting the fat: The case for a microcontroller-less capacitive touch screen controller

Wed, 05/01/2013 - 12:13pm
Eric Siegel, Touch Screen Control Business Development Manager, Texas Instruments

Touchscreens are everywhere! While the means by which they detect a touch has evolved over the years, more recently you might say they have revolved, as old technologies are reopened and rechristened (I'm looking at you, optical touch). Touchscreens are finding their way into our homes and our everyday consumer devices. Need some evidence? Check out the latest and greatest digital still cameras that run Android. Instead of just playing Angry Birds, you will be able to photo-bomb with them, too. Oh, the places you will go! Regardless of how these new applications become visible to the public eye, if you get out your consumer microscope, look beneath the sizzle, and start analyzing your steak, you'll see that these advancements are actually occurring because of new technologies.

Touchscreen control technology has been largely pioneered by the mobile market. This, of course, is due to the high volumes and yearly turnaround associated with it. It doesn't take a prodigy to see the first relevant steps of touchscreens for consumers were with resistive technology. Its handwriting functions led to a whole new way to "write the alphabet." It became a cool unspoken token of your social status, not unlike knowing how to "properly" order at Starbucks in today's society. If you don't know, then you just aren't that cool. Resistive technology held its seat on the “iron throne” of the mobile market only to be knocked off in recent years by its new usurper: capacitive touch (cap touch) screen control. Initial uses of this technology were proprietary, but then more companies saw how beneficial it is to implement, bent the knee, and dedicated themselves to it. So what makes it special? Let's take a closer look at capacitive touch and its maturity and variants.

First implementations of cap touch looked at one touch at a time. After all, that is how resistive works, and why ever would you need more than one touch at a time? This self-capacitance-only approach looked at one sensing channel’s capacitive value with respect to ground. A novel approach, until people got greedy for multi-touch. In this case, surface capacitance, or plain self-capacitance, creates a ghost effect (Figure 1).



Figure 1. Images of a self-capacitance only sensor, with one touch point (1a), and a dual touch point (1b), resulting in ghost points (white circles).
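The ghost effect in Figure 1 can be sketched in a few lines of Python. A self-capacitance scan only reports which row channels and which column channels saw a change, so with two simultaneous touches the controller faces four candidate intersections and cannot tell the real pair from the ghost pair. (The coordinates below are arbitrary examples, not panel-specific values.)

```python
from itertools import product

def self_cap_candidates(active_rows, active_cols):
    """All (row, col) intersections a self-capacitance scan can report.

    The scan only knows which row and column channels detected a
    capacitance change, so two simultaneous touches yield four
    candidate points: the two real touches plus two "ghost" points.
    """
    return sorted(product(active_rows, active_cols))

# Two real touches at (2, 3) and (7, 9) activate rows {2, 7} and columns {3, 9}.
real = {(2, 3), (7, 9)}
candidates = self_cap_candidates({2, 7}, {3, 9})
ghosts = [p for p in candidates if p not in real]
print(candidates)  # [(2, 3), (2, 9), (7, 3), (7, 9)]
print(ghosts)      # [(2, 9), (7, 3)]
```

Note that the ghost pair (2, 9) and (7, 3) is just as consistent with the scan data as the real pair, which is exactly why self-capacitance alone cannot resolve multi-touch.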

To solve this issue, the concept of mutual capacitance was incorporated to look at the combined capacitive value between each row and column. This gives the system a higher level of precision, but takes the number of measurements from additive to multiplicative: rows * columns instead of rows + columns (Figure 2).

Figure 2. Mutual capacitance sensor layout. Instead of 10 + 15 = 25 sensing channels, you now have 10 * 15 = 150 sensing nodes.
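As a quick sanity check on the numbers in Figure 2, the channel-versus-node counts for the same 10 x 15 panel work out as follows:

```python
def self_cap_channels(rows, cols):
    # Self-capacitance: each row and each column is one channel
    # measured with respect to ground, so counts simply add.
    return rows + cols

def mutual_cap_nodes(rows, cols):
    # Mutual capacitance: every row/column intersection is a sensed
    # node, so counts multiply.
    return rows * cols

print(self_cap_channels(10, 15))  # 25 channels
print(mutual_cap_nodes(10, 15))   # 150 nodes
```

The multiplicative growth is the price of precision: a modest panel already has six times as many measurement points under mutual capacitance.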

From these basic touch detection systems, requirements evolved to include gesture recognition, object rejection, and other options. Initially, these things required extra horsepower, so touch designers took existing microcontrollers and retrofitted the necessary analog tools onto them to handle the job. They may have gotten the job done, but was this the most efficient way? Not necessarily. I am often reminded that you don't have to be the best, just better than the competition.

There is something to be said about mob mentality when it comes to staying safe. But in the business world, blending in with the herd won't always get you where you want to be. New technologies and implementations are always coming about. When it comes to cap touch, using an integrated microcontroller can get the job done, but at what expense? Integrated Flash and RAM drive up costs, from both a power and a dollar perspective. Moreover, systems with a touch screen typically have some sort of embedded controller already designed in to handle the necessary touch calculations or complicated touch recognition. In fact, when you look at today’s current market trends, application processors are carving out their own dedicated touch engines. Now why is that?

It's about system optimization. Why have redundant parts if, instead, you can: 1) save money, and 2) charge your portable device less often? In other words, talk to your friends a little longer, or watch another movie. Along these lines, touch screen controller companies are following suit. They are looking into that need specifically and creating analog front-end (AFE)-based projected capacitive touch screen controllers.

What happens when a touch IC with an embedded microcontroller is turned on, versus one built around a digital state machine (an AFE-based design)?

Figure 3. Average power consumption with applications processor on (light gray) versus operational modes for a Touch IC with embedded MCU (3a: dark gray), and a Touch IC based on an AFE with a digital state machine (3b: red).

Figure 3 shows that both traces are quite noisy while the applications processor is up and running. But what happens if we shut down the applications processor?

Figure 4. Power consumption with applications processor off: 4a) Touch IC with integrated MCU; 4b) AFE-based Touch IC with state machine.

Here, it’s a completely different story. Now we are talking two orders of magnitude of difference: <0.1 mW vs. <10 mW. Figure 4 clearly shows that the extra hardware on the touch screen controller with integrated MCU (Figure 4a) burns more power over a longer period of time than an AFE-based design (Figure 4b). This use case is much more meaningful when one considers that a device typically sits in this state 90 percent of the time. Due to low-power innovation in such AFE-based designs, new hooks like a double-tap-to-wake function can be added.
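A rough duty-cycle average shows why that 90-percent figure matters. The <10 mW and <0.1 mW idle figures come from Figure 4; the 50 mW active figure below is a placeholder assumption (not from the article), used identically for both designs since the applications processor dominates consumption while it is awake.

```python
# Duty-cycle-weighted average power: the idle state dominates because
# the device spends ~90% of its time there (per the article).
idle_fraction = 0.9
active_mw = 50.0   # assumed active-state power, same for both designs

# MCU-based touch IC idles near 10 mW; AFE-based idles near 0.1 mW.
avg_mcu_mw = idle_fraction * 10.0 + (1 - idle_fraction) * active_mw
avg_afe_mw = idle_fraction * 0.1 + (1 - idle_fraction) * active_mw

print(f"MCU-based touch IC: {avg_mcu_mw:.2f} mW average")  # 14.00 mW
print(f"AFE-based touch IC: {avg_afe_mw:.2f} mW average")  # 5.09 mW
```

Even with a generously assumed active figure, the idle-state difference nearly triples the average power of the MCU-based part.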

We’ve shown from a power perspective why it can make sense to use an AFE-only solution. Are you thinking that without the MCU present you will be relying more on the applications processor and overburdening it? Let’s evaluate that thought.

Applications processor:
• ARM A9 dual-core
• 1 GHz
• Total MIPS/Power
   o 1000 MHz * (2.5 DMIPS/MHz) * 2 = 5000 DMIPS, consuming ~0.6 W

Competitive MCU integrated touch screen controller:
• Specifications
   o ARM Cortex-M3
   o 60 MHz
   o 1.25 DMIPS/MHz
   o 149 µW/MHz

• Assuming that the TSC CPU is 100 percent loaded
   o 60 MHz * 1.25 DMIPS/MHz = 75 DMIPS
   o 60 MHz * 149 µW/MHz = ~9 mW + Flash + RAM

• Resources used if running all touch code on the Apps processor
   o 75 DMIPS / 5000 DMIPS = 1.5%
   o 0.6W * 1.5% = 9 mW

Running the same code on the apps processor actually saves the power and cost of dedicated Flash and RAM, and consumes only 1.5 percent of the available DMIPS on your processor. All that may seem rather reasonable and low power, but let’s assume the 100 percent CPU load is split equally between filtering on one side, and gesture recognition plus coordinate smoothing on the other. Since an AFE-based design has hardware built in for filtering, that portion of the CPU load is no longer required. Now you can cut the loading and power in half!

• Resources used if running gesture/smoothing
   o 37.5 DMIPS / 5000 DMIPS = 0.75%
   o 4.5 mW
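The arithmetic above can be reproduced in a short script. The 50/50 split between filtering and gesture/smoothing work is the article's stated assumption, not a measurement:

```python
# Applications processor: dual-core ARM A9 at 1 GHz, 2.5 DMIPS/MHz.
apps_dmips = 1000 * 2.5 * 2            # 5000 DMIPS
apps_power_w = 0.6                     # ~0.6 W total

# Competing touch controller: Cortex-M3 at 60 MHz, 1.25 DMIPS/MHz,
# 149 uW/MHz, assumed 100% loaded.
tsc_dmips = 60 * 1.25                  # 75 DMIPS
tsc_power_mw = 60 * 149 / 1000         # ~8.9 mW, before Flash/RAM overhead

# Moving all touch code onto the apps processor instead:
full_load = tsc_dmips / apps_dmips               # 1.5% of available DMIPS
full_power_mw = apps_power_w * full_load * 1000  # 9 mW

# An AFE does the filtering half in hardware, halving the software load:
half_load = full_load / 2                        # 0.75%
half_power_mw = apps_power_w * half_load * 1000  # 4.5 mW

print(f"{full_load:.2%} of apps CPU, {full_power_mw:.1f} mW")
print(f"{half_load:.2%} of apps CPU, {half_power_mw:.1f} mW")
```

The takeaway matches the bullets: offloading filtering to AFE hardware drops the apps-processor burden from 75 to 37.5 DMIPS and from 9 mW to 4.5 mW.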

We discussed the benefits of an AFE-based design, but note that this architecture is not a pro-cap panacea. Just like other architectures, it has its downsides. The power and cost savings are proven, but since the flash now lives in the embedded/applications processor, updates to the system become a bit more involved. Changes to the system at large create the need to update drivers and generally tweak the system code. Along those lines, if you max out the state machine, any raw-data “secret sauce” algorithms need a safe home – if you don’t want them open-sourced. Every situation is different, and each designer has their own values and design constraints to weigh. AFE-based capacitive touch screen controllers, like the TSC3060, are a viable option for low-power designs, reducing cost while extending battery life.


About the author
Eric Siegel has a Master’s of Science from the University of Florida and has held many roles at Texas Instruments. He started out as a test engineer, moved into microcontroller marketing, and most recently is responsible for business development of TI’s Touch Screen Controller portfolio. Outside of work he enjoys movies and is an active boxer. Eric can be reached at

