Synaptics Blog

Exploring The Latest in Human Interface Technologies

We've been moving rapidly since reaching our first billion units sold in 2011. Now, less than three years later, we've delivered the next billion. Synaptics has its sights set on growing even more and reaching the next billion-device milestone even faster!

Sensors in Sports: How Technology Will Revolutionize the Games We Love

We all know how the NFL’s Super Bowl — America’s most-watched sporting event — goes: One team kicks off, and the ball tumbles into the returner’s hands. From the opening whistle, massive, rigid bodies plated in plastic pads and metal helmets smash into one another at top speed, each with the express intent of driving its target to the ground.

This makes for an entertaining spectacle, but one or more players will inevitably suffer an injury during the course of a given football game. This could be something as harmless as a cut or bruise, or something as life-threatening as a spinal injury or severe concussion, which has become the top health concern in American sports.
Now, imagine the action of the Super Bowl, and look a decade into the future:

One team kicks off, and the WiFi-enabled ball tumbles into the returner’s hands. From the opening whistle, massive, rigid bodies plated in plastic pads embedded with heart monitors, wearing metal helmets wired with pressure-sensitive sensors, smash into one another at top speed (measured, of course, by an accelerometer on each player’s wrist), each with the express intent of driving its target to the ground — where a set of precision cameras will measure the exact location of the ball for placement on the next play.

Those injuries will still occur. But with hundreds of sensors on the people, field and machines involved, the severity of injuries will be measured and relayed to team physicians in an instant, better equipping the NFL to make quick, smart decisions about treatment and to collect information that helps prevent future incidents.

If the risk of concussion is present, the player might wear an octopus-like “helmet” wrapped around his head. This device, complete with 256 perfectly-placed electrolyte-soaked sensors, will run comprehensive, accurate tests on the player’s brain to determine the best course of action. Sensor-infused technological advances like this may change the way sports are played and officiated forever; in some ways, they already are:

• FIFA used cameras to help determine whether a ball was officially in the goal during this year’s World Cup.

• At the NFL Draft Combine, several athletes displayed Under Armour shirts with built-in breathing monitors and accelerometers – this technology is tweaked and improved for every new combine.

• Major League Baseball is looking at sleeve technology that will measure a pitcher’s health in real-time and potentially cut down on the torn UCL epidemic.

• In professional tennis, Ralph Lauren just introduced high-performance shirts at the U.S. Open that contain sensors knitted into the core of the product to read biological and physiological information – a major breakthrough in the Quantified Self movement.

• Professional cycling already features on-bike point-of-view camera shots and collects massive amounts of data on riders’ physiological signs. That data is published and broadcast to the public, highlighting the incredible demands the sport places on these athletes.

Yours truly is up for an SXSWi 2015 session on sensors in sports and the science behind them. I’ll discuss the benefits of allowing technology to infiltrate the sports world, including my own predictions on which applications may be on the horizon. If you want to know everything about the Trillion Sensor Movement and its relation to your favorite sport, please visit the SXSW panel picker and give Synaptics a big thumbs-up! Voting ends this Friday, September 5th!


Want more Synaptics news? Follow @SynaCorp on Twitter!

On the Up and Up: We’ve Shipped Two BILLION Units

There’s an old adage that says the first million is the hardest to make – and the same holds for the first billion. It took us 17 years to ship our first billion units, a milestone we reached back in 2011. Driven by the industry’s broadest human interface solutions portfolio, it took us just three more years to cross the threshold of two billion shipped units. As we continue to up our game and do what we do best – lead the industry with our touch and fingerprint authentication solutions – our next billion is just around the corner.

The mobile market has been incredibly dynamic for years now and that activity doesn’t appear to be going away anytime soon. Our high-performance touch solutions have helped to enhance the mobile experience, and are making their way into the hands of more consumers across the globe faster than ever before. Additionally, our recent acquisition of Validity has left us in a very favorable position to take advantage of exciting opportunities in the burgeoning biometrics market – an area that will help make interacting with your phone effortless.

All of this comes on the heels of significant company milestones which have accelerated our explosive momentum over the past year. The latest Samsung Galaxy S5 smartphone includes our Natural ID™ fingerprint solution along with our industry-leading ClearPad® technology for the second consecutive iteration of the flagship device. In the PC industry, the latest HP EliteBook line incorporates ForcePad®, making it the first TouchPad to include pressure recognition, while ClickPad™ 2.0 won a CES Innovation Award for being the most advanced capacitive-sensing notebook touchpad technology available.

These game-changing technologies have translated into notable customer wins from the biggest global OEMs, including Acer, Amazon, Dell, HTC and others – all of which will help drive our business to new heights.

Now that we’re done patting ourselves on the back, it’s time to get back to work and set our sights on shipping the next billion units!

Want more Synaptics news? Follow @SynaCorp on Twitter!

Samsung Galaxy S5: Busting the Biometric Myths

The future of technology rests in your hands. And your eyes. And your ears and your heart and your hair and your toes and every inch of human body in between.

Of course, I’m talking about biometrics, which is already gaining mass adoption in the newest mobile devices as a means of authenticating users as they communicate, shop and browse on a smartphone.

Most recently, Synaptics’ Natural ID solutions were implemented in the Samsung Galaxy S5, allowing users to access their phones with a fingerprint swipe rather than fumbling with hard-to-remember passcodes.

The biometric method is superior for many reasons, with convenience and improved security functions chief among them. But many users are questioning the technology and its hackability, and whether it could lead to more important data being stolen.

That’s what I’m here for. To bust those myths that swirl around biometrics. Here are five of the most common fables, debunked:

1. Can my severed finger unlock my phone?

Ouch. Not only is this completely unlikely, it’s entirely gruesome. Luckily, the technology behind these sensors requires a live finger. There is no way for the reader to pick up the correct image from a dead pulse, though a fingerprint won’t become completely unrecognizable until a few bloody minutes have passed. Unless you have zombie limbs. And in that case, you have bigger problems. Personally, if threatened, I’d be more than happy to unlock the device rather than lose a finger over my Facebook posts.

2. What about an exact replica of my fingerprint?

If you have the specialized, expensive equipment to pull a fingerprint replica, then this plan just might work. However, all the components in the Galaxy S5 allow for the process to be canceled and renewed in those situations, and you have the option of applying your finger at a certain angle and pressure to make it even harder to hack. Even if someone goes to all that trouble just to steal access to your phone, it would be a lot harder to get into than hacking a four-digit passcode.

3. Are you going to save my fingerprint data in the cloud?

Not to worry. We only save a local copy (the template) of your print on the device, so there is no massive, hackable cloud hovering in the interwebs somewhere. If your device is hacked, your identity is still safe: thieves will not be able to reconstruct a clean image of your fingerprint from the digital template on the device. And unlike your fingerprint itself, the template can be replaced.
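To make the "template, not image" idea concrete, here's a deliberately simplified Python sketch — not Synaptics' actual matcher, which uses far more sophisticated minutiae algorithms. The point it illustrates is that enrollment stores only coarse, derived features, so the raw print can't be rebuilt from what's saved on the device:

```python
# Illustrative only: a fingerprint "template" keeps derived features
# (e.g., quantized minutiae points), never the raw image, so the image
# cannot be reconstructed from the stored data.

def extract_template(minutiae):
    """Reduce raw minutiae (x, y, angle) to a compact template.
    Coarse quantization discards the image detail on purpose."""
    return frozenset((x // 10, y // 10, angle // 30) for x, y, angle in minutiae)

def match(template, candidate_minutiae, threshold=0.8):
    """Accept if enough of the candidate's features appear in the template."""
    candidate = extract_template(candidate_minutiae)
    if not candidate:
        return False
    overlap = len(template & candidate) / len(candidate)
    return overlap >= threshold

enrolled = extract_template([(12, 34, 90), (56, 78, 45), (90, 12, 180)])
# Same finger, slight sensor noise within the quantization bins:
assert match(enrolled, [(13, 35, 91), (57, 79, 46), (91, 13, 181)])
# A different finger doesn't match:
assert not match(enrolled, [(300, 300, 0), (310, 320, 60), (330, 340, 120)])
```

Because the template is just a lossy feature set, losing it is recoverable: you re-enroll and get a new one, which is exactly why it's safer than a password that can be replayed.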

4. Will a stolen phone put my data at risk?

Again, unless the thief has the wherewithal and technology to replicate your fingerprint, this won’t pose any problems beyond ruining your day. The local copy of your data won’t be accessible without your print, and you can turn it off remotely if necessary. And it remains much better protected than phones belonging to the 50 percent of users today who don’t even bother to use their PIN.

5. What if I break my phone and can’t use the scanner?

The touch driver was designed to read the print through cracks and dents for those users possessing above-average clumsiness. It won’t be as pleasant to swipe across, but the reading itself shouldn’t be affected. If absolutely necessary, you can initiate a recovery mechanism in this instance. Just try to drop it on couch cushions instead of concrete from now on.

More questions for us? Follow @SynaCorp on Twitter!

Pro Tips: Achieving the Best Fingerprint ID Experience on the Samsung Galaxy S5

It should come as no surprise that the mobile industry is in the midst of a dramatic shift toward biometrics, with Samsung becoming the latest smartphone manufacturer to leverage fingerprint ID technology in its newest flagship device, the Galaxy S5.

As you may have seen, Synaptics Natural ID™ solution is behind the fingerprint security feature for the S5, which allows users to swipe their finger across the phone’s home button to unlock it, as well as provide access and authorization for mobile payments.

Synaptics Natural ID sensors are compliant with the FIDO Alliance, the industry standard for online authentication. FIDO-compliant websites and web apps, like PayPal, seamlessly interact with the fingerprint sensor to replace annoying, insecure and hard-to-remember passwords.
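For the curious, the FIDO pattern boils down to a challenge-response exchange. The sketch below is a simplification — real FIDO authenticators use public-key signatures, while this dependency-free example substitutes a shared-secret HMAC — but it shows the essential property: the fingerprint never leaves the device; a successful match merely unlocks a local key that answers the server's challenge.

```python
import hashlib
import hmac
import os

# Simplified sketch of the FIDO pattern. Real FIDO uses public-key
# signatures; a shared-secret HMAC stands in here to keep the example
# dependency-free.

DEVICE_KEY = os.urandom(32)   # provisioned at registration time
SERVER_KEY = DEVICE_KEY       # server's copy (a public key in real FIDO)

def device_sign(challenge, fingerprint_ok):
    """The sensor's match result gates access to the local key;
    the fingerprint itself is never transmitted."""
    if not fingerprint_ok:
        return None
    return hmac.new(DEVICE_KEY, challenge, hashlib.sha256).digest()

def server_verify(challenge, response):
    expected = hmac.new(SERVER_KEY, challenge, hashlib.sha256).digest()
    return response is not None and hmac.compare_digest(expected, response)

challenge = os.urandom(16)    # fresh per login attempt, prevents replay
assert server_verify(challenge, device_sign(challenge, fingerprint_ok=True))
assert not server_verify(challenge, device_sign(challenge, fingerprint_ok=False))
```

Because every login uses a fresh challenge, a captured response is useless for replay — the server only ever learns "this device's key answered correctly," never anything biometric.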

With any new technology, there can be a slight learning curve. To help ensure that every user has a seamless, frustration-free transition, we pulled together a quick video that showcases our tips and tricks for achieving the best experience possible with the S5’s fingerprint ID functionality. Check it out:

Have an S5 of your own? Let us know what you think about the fingerprint functionality.

Don’t forget to follow Synaptics on Twitter @SynaCorp.

The Chimera Chronicles: How Synaptics’ First DIY Processor Was Born

In every modern Synaptics touch or TDDI chip, there lurks a mythical beast revered throughout history for its undying fierceness. It was born out of stone, fire and human ingenuity, molded at the hand of Federico Faggin, Synaptics’ co-founder and first CEO.

The legendary Chimera microprocessor, much like the Greek mythological figure from which it takes its name, combines many different parts to create one magical, undefeatable whole. Its purpose was, and still is, to pull together multiple processes into one centralized solution, and to do so with power and efficiency.

The story starts in 1996, when Synaptics created its first all-in-one touch controller. The goal was to integrate touch analog circuits, a microprocessor for running firmware, and program and data memories all into a single silicon chip. Finding a commercial core with the immense strength and small stature needed to perform our job proved difficult. So, a crazy alternative was put into action.

Creating a processor from scratch seemed like a foolhardy task, but we had inspiration – Faggin had built several noteworthy microprocessors in his time, including the Intel 4004, the world’s first microprocessor chip.

Nine months after the conception of the plan, we had created our first integrated touch chip, called the T1004 (which has a processor still shipping in today’s T1007!). Eventually, we embarked on a more scalable design. We christened that final product with the “Chimera” moniker, after the mythological beast of Greek lore.

Chimera 1.0 was smaller and cheaper than the standard alternative, yet proved to be much more adept at the intricate control functions that encompass Synaptics firmware. As the touch world continues to move from simple profile sensors to multi-touch transcapacitive imagers (plus proximity, gloves, In-Cell, Single-Layer, and so on), our firmware’s needs have consequently grown and changed in character, and Chimera has evolved to match them.

We’re proud of what we created almost 20 years ago, and the Chimera monster has stood the test of time thus far. So, what is next? Bringing more scalability and better power management to larger memories is a natural fit for our current chip road map.

Beyond that: Chimera is all ours, so we can adapt it to meet the market’s every demand.

As they say, heroes are remembered, but legends never die.

You can follow Synaptics on Twitter @SynaCorp

Synaptics 2014 Human Interface Index Sheds Light on Touchscreen Use in Asia

Did you know that a majority of Asians (61 percent, in fact) find themselves interacting with a touchscreen in the morning before interacting with their spouse or family? This is just one example of the pervasiveness of capacitive touchscreens in the daily lives of the Asian population, highlighted this week in Synaptics’ first-ever Asia Human Interface Index.

The Human Interface Index surveyed more than 2,000 consumers across China, Hong Kong, Taiwan and South Korea to gain an understanding of how Asians use capacitive touchscreens in their daily lives and their overall perceptions of the technology.

The infographic below highlights some of our key findings.

For example, what do Asians love most about touchscreens? Ease-of-use, fast performance and the intuitive, interactive experience topped the list. They also see the technology as significantly enhancing the user experience when it comes to gaming, handwriting with a stylus, and interaction with photos and videos.

Where do they want to see updates? Resizable buttons, integration with voice command and multi-gesture touch capability were most frequently noted.

Surveys like our Human Interface Index help Synaptics better understand the end user so that we can continue to provide the absolute best touchscreen experience possible. Plus, it’s just fun to see the results. So keep an eye out for more interesting data from Synaptics in the future!




You can follow Synaptics on Twitter at @SynaCorp.

Five Questions with Synaptics Biometrics Guru, Sebastien Taveau

With last year’s acquisition of Validity Sensors, Synaptics gained a world-class team of engineers and business experts in the fast-growing world of biometrics and fingerprint authentication. Among them is Sebastien Taveau, a 20+ year tech industry veteran and a well-known thought leader on the topic of mobile authentication, payments and security.

We are very excited to have Sebastien join the team here at Synaptics, and we recently caught up with him to learn a bit more about his new role at Synaptics and his vision for how biometrics technology will impact our lives in the future.

Here’s some of what he had to say…

1. What is your role at Synaptics?

I am the chief evangelist for Synaptics’ newly established Biometrics Product Division (BPD). If you’re wondering what that actually means – it means that I help to bridge the technology and products developed within Synaptics BPD to real-world use cases. I help explain, in everyday terms, how biometrics and fingerprint ID technology will impact our day-to-day lives and society as a whole. I do this through speaking engagements, interviews and more. It’s also my job to maintain a deep understanding of the biometrics industry and its progression, and regularly feed this knowledge back into the BPD.

2. Can you give us an update on Synaptics’ integration of Validity?

Since the acquisition of Validity, Synaptics has seamlessly integrated Validity’s engineering team to accelerate innovation for our line of Natural ID fingerprint sensors. The team continues to seek ways to push the envelope for fingerprint ID. For example, the Holy Grail in this market is to place the fingerprint sensor beneath the active display, or “glass,” of a device’s touchscreen rather than in a discrete button. This has never been done before, but once it’s achieved, fingerprint ID will be that much more powerful and seamless for the user. With Validity as the expert in fingerprint ID and Synaptics as the expert in touchscreens, it’s the perfect coupling to achieve this major next step in mobile authentication.

3. What progress do you see happening in the biometrics industry in the next year?

This year will be big for kicking off the ascent of biometrics into the mainstream. We’re seeing a huge amount of interest from some of the world’s biggest brands. For example, take the FIDO Alliance, of which Synaptics is a founding member. The goal of the FIDO Alliance is to establish open, scalable and interoperable standards for mobile online authentication. In the one year since FIDO was established we’ve seen membership skyrocket from six members in 2013 to more than 100 members in 2014, including brands like MasterCard, RSA and Bank of America. We’ll continue to see increased standardization and regulation, as well as increased access to open APIs – measures that will continue to fuel innovation and exciting new biometric use cases in the months to come.

4. Beyond unlocking phones, how will fingerprint ID technology impact lives in the future?

In the future, fingerprint ID will become more invisible, but it will be there in the background, connecting us to our surroundings in ways we haven’t even imagined yet, especially as more of the environment around us becomes interconnected through sensors. The online experience will become hyper customized – an experience I refer to as “the Internet of me.” While active authentication will still be necessary for things like mobile commerce transactions, it’s the emergence of passive (i.e. invisible) authentication where things really start to look exciting! For example, imagine renting a car: as you touch the door, the seat, the in-cabin temperature and even the radio are automatically set to your preferences before you get inside. Connecting to your smartphone, the car then automatically uploads your itinerary to the navigation system – this is hyper customization of the connected experience; this is the Internet of me.

5. So tell us, can someone chop off our finger to gain access to our data?

This is a funny question that I hear all the time. I recently wrote a blog post that provides an in-depth answer, but the short answer is no. The fingerprint sensor technology is built in a way that the fingerprint image has to be taken from a live finger. As I’ve said before, if someone is compelled to chop off your finger to access your smartphone, you likely have bigger problems!

If you’re interested in catching up with Sebastien and hearing more, he’ll be sharing his thoughts at the following upcoming industry events:

Bringing Hover Tech to Next-Gen Notebooks

Over the past year, we’ve brought several new solutions to market that are changing the way users interact with their devices. That innovation-first mentality keeps our User Experience and Concept Prototyping teams looking ahead to find the next big trend in human interface technology.

We’ve already seen the successful implementation of our 3D Touch solution, which debuted in last year’s Samsung Galaxy S4 and Galaxy Note 3 devices – and the response from users was incredible. As a result, we believe hover technology can make a big impact in the notebook PC space as well.

In fact, one of our most popular demos at CES last month was our proximity hover prototype laptop – otherwise known as our “Orzo” demo. For the demo, our team of engineers built our proximity hover technology into an existing notebook touchpad. The idea was to showcase how a hover-enabled touchpad can accurately detect a finger from a distance of as much as 3-4 cm, allowing for a host of new ways for users to interact with and control their notebooks. This also opens up new opportunities in Windows 8 gestures, gaming and waking the PC from sleep mode – all without touching it.
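Conceptually, hover detection exploits the fact that a finger disturbs the touchpad's capacitive field more strongly the closer it gets. Here's a toy sketch of the firmware-side classification – the threshold values are invented for illustration and are not Synaptics calibration data:

```python
# Illustrative sketch: classify a normalized capacitance reading into
# touch / hover / none using two calibrated thresholds. Signal strength
# falls off with finger distance, so a weak-but-present signal means the
# finger is hovering above the pad rather than touching it.

TOUCH_THRESHOLD = 0.80   # signal level when the finger contacts the surface
HOVER_THRESHOLD = 0.15   # weakest signal still attributable to a finger

def classify(signal):
    """Map a normalized capacitance reading (0.0-1.0) to an input state."""
    if signal >= TOUCH_THRESHOLD:
        return "touch"
    if signal >= HOVER_THRESHOLD:
        return "hover"   # finger detected above the pad, e.g. within a few cm
    return "none"

assert classify(0.95) == "touch"
assert classify(0.40) == "hover"
assert classify(0.05) == "none"
```

Real firmware would add per-unit calibration, filtering and hysteresis so a finger sitting near a threshold doesn't flicker between states, but the two-threshold idea is the core of it.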

Check out the video below for a quick demo of our proximity hover technology to see for yourself – as always, feel free to let us know what you think!

That’s a Wrap: Synaptics Takes Human Interface Tech to New Heights at CES 2014

January is known for many things: the start of the new year, winter weather and, for those of us in the technology industry, the start of CES! Each year thousands of companies descend on Las Vegas for a whirlwind week where new gadgets, technologies and trends take center stage at what is largely considered the Super Bowl of the consumer electronics industry – and this year was no different.

It was great to see that human interface solutions were a huge trend at CES this year, with many companies showcasing a variety of different implementations for how consumers can interact with devices. As a company at the forefront of a bigger industry shift in human interface technology, we get excited to see how other companies are taking their solutions to the next level. Our partner Tobii, for example, had a really great demo of its eye-tracking solution that could pinpoint exactly where you were looking, down to the centimeter, allowing users to zoom in on maps, scroll through their browsers and even change the gaming experience. The Verge has a really great round-up of some of its favorite human interface demos from the show.

2013 was clearly a big year for Synaptics – from the “touch first” computing experience to fingerprint authentication and gesture control, we’ve continued to redefine the human interface space. This year, we leveraged CES to showcase all of our industry-leading solutions that hit the market over the past year, including ForcePad, ClickPad 2.0, 3D Touch and Active Pen. These technologies represent new user experiences that are challenging the designs of previous touch solutions – all of which are available to consumers today.

While 2013 represented the year of industry firsts, 2014 will likely be the year of new innovations for the human interface market. As part of our showcase, many of these solutions were on display in our booth including new practical uses for our ThinTouch technology, 1mm tip stylus support for smartphones, demos of new implementations of our biometrics and fingerprint ID solutions and a new prototype showcasing proximity hover technology which allows users to control their notebooks without having to physically touch the device.

Here are just a few shots of our booth to give you a sense of what was on display.
The spirit of CES has always been about what’s on the horizon and we feel the same way with our solutions. The technologies that were demonstrated at our booth are only a preview of what consumers can expect to see in upcoming devices and we look forward to working with our OEM partners to bring human interface innovations to new heights this year.


Putting Users at the Heart of Design – Taking A Closer Look at the Synaptics User Experience Design Team

Over the years, Synaptics has developed many new human interface technologies. These technologies provide new and powerful ways for users to interact with their devices – from our 3D Touch™ finger hover technology to the Synaptics ForcePad® with pressure sensing. Oftentimes, these technologies get their start in the labs of Synaptics’ User Experience Design team. It’s here that we take amazing technology and transform it into a great end-user experience.

The User Experience Design Team is made up of a diverse set of backgrounds and skillsets from mechanical engineering, computer science, cognitive science and even philosophy. We work together to study user behavior, evaluate needs and problems, and put the user at the heart of our design decisions. By looking at Synaptics’ technical solutions from the perspective of the user, we can uncover new human interface implementations that will make our device experiences easier, more fluid and more exciting for all of us.

Driving innovation and new features

Whenever Synaptics develops new technology, like ForcePad or ThinTouch®, the User Experience team is part of the process early on. We work with engineering to shape the technology so that it solves user needs and provides a great user experience. We develop new interactions that take advantage of the technology, which drives innovation on our technology and creates new features for users. This process helps push the technical solution forward as engineering iteratively improves on the technology to meet the high standard of a great user experience.

Creating products that are more useful and desirable

When we are working on a product, whether the technology is new or established, the User Experience team does research and testing to create a product that is useful and desirable. We study existing usage behaviors and actions in research studies to learn where our technology will be most beneficial. We study the usability and user experience of specific features in laboratory studies to make our features more desirable and interactive.

Supporting our customers & partners by collaborating on UX issues

Our team frequently collaborates with partners and customers to create new user experiences that extend beyond the touch input device. We work with User Experience and Usability teams at these companies to share knowledge and build new user experience solutions. Often, customers come to us hoping to create a completely novel user experience that builds on our expertise with touch devices, and fits into their technology.

Creating the best user experiences & building our product brand through user experience

Ultimately, our goal is to create the best possible user experiences for touch and input devices. We are always looking at our existing technologies to see how we can make them work better to solve past issues, and we stay up to date on new input technologies; even those that are not directly related to Synaptics technology. Our work always puts the user experience first, and we are the internal advocate for user needs at Synaptics. Through our research and design work, we strive to perfect user interactions.

As far as some of the exciting areas we’re exploring now, look for 3D gestures, gaze tracking, active pen inputs and fingerprint ID technologies, among others, to revolutionize how we interact with our devices very soon. Keep an eye on the Synaptics blog for more updates from our team!

The Synaptics UXD team (from bottom left, clockwise): Mohamed Sheik-Nainar, Justin Mockler, Anna Ostberg, Dan Odell, Eric Faggin

Follow Synaptics on Twitter @SynaCorp