Synaptics Blog

Exploring The Latest in Human Interface Technologies


The Chimera Chronicles: How Synaptics’ First DIY Processor Was Born

In every modern Synaptics touch or TDDI chip, there lurks a mythical beast revered throughout history for its undying fierceness. It was born out of stone, fire and human ingenuity, molded at the hand of Federico Faggin, Synaptics’ co-founder and first CEO.

The legendary Chimera microprocessor, much like the Greek mythological figure from which it takes its name, combines many different parts to create one magical, undefeatable whole. Its purpose was, and still is, to pull together multiple processes into one centralized solution, and to do so with power and efficiency.

The story starts in 1996, when Synaptics created its first all-in-one touch controller. The goal was to integrate touch analog circuits, a microprocessor for running firmware, and program and data memories all into a single silicon chip. Finding a commercial core with the immense strength and small footprint the job demanded proved difficult. So, a crazy alternative was put into action.

Creating a processor from scratch seemed like a foolhardy task, but we had inspiration – Faggin had built several noteworthy microprocessors in his time, including the Intel 4004, the world’s first microprocessor chip.

Nine months after the conception of the plan, we had created our first integrated touch chip, called the T1004 (whose processor still ships in today’s T1007!). Eventually, we embarked on a more scalable design. We christened that final product with the “Chimera” moniker, after the mythological beast of Greek lore.

Chimera 1.0 was smaller and cheaper than the standard alternative, yet proved to be much more adept at the intricate control functions that Synaptics firmware demands. As the touch world continues to move from simple profile sensors to multi-touch transcapacitive imagers (plus proximity, gloves, In-Cell, Single-Layer, and so on), our firmware’s needs have grown and changed in character, and Chimera has evolved to match them.

We’re proud of what we created almost 20 years ago, and the Chimera monster has stood the test of time thus far. So, what is next? Bringing more scalability and better power management to larger memories is a natural fit for our current chip road map.

Beyond that: Chimera is all ours, so we can adapt it to meet the market’s every demand.

As they say, heroes are remembered, but legends never die.

You can follow Synaptics on Twitter @SynaCorp

Synaptics 2014 Human Interface Index Sheds Light on Touchscreen Use in Asia

Did you know that a majority of Asians (61 percent, in fact) find themselves interacting with a touchscreen in the morning before interacting with their spouse or family? This is just one example of the pervasiveness of capacitive touchscreens in the daily lives of the Asian population, highlighted this week in Synaptics’ first-ever Asia Human Interface Index.

The Human Interface Index surveyed more than 2,000 consumers across China, Hong Kong, Taiwan and South Korea to gain an understanding of how Asians use capacitive touchscreens in their daily lives and their overall perceptions of the technology.

The infographic below highlights some of our key findings.

For example, what do Asians love most about touchscreens? Ease-of-use, fast performance and the intuitive, interactive experience topped the list. They also see the technology as significantly enhancing the user experience when it comes to gaming, handwriting with a stylus, and interaction with photos and videos.

Where do they want to see updates? Resizable buttons, integration with voice command and multi-gesture touch capability were most frequently noted.

Surveys like our Human Interface Index help Synaptics better understand the end user so that we can continue to provide the absolute best touchscreen experience possible. Plus, it’s just fun to see the results. So, keep an eye out for more interesting data from Synaptics in the future!

 

[Infographic: Synaptics 2014 Asia Human Interface Index]

 


Five Questions with Synaptics Biometrics Guru, Sebastien Taveau

With last year’s acquisition of Validity Sensors, Synaptics gained a world-class team of engineers and business experts in the fast-growing world of biometrics and fingerprint authentication. Among them is Sebastien Taveau, a 20+ year tech industry veteran and a well-known thought leader on the topic of mobile authentication, payments and security.

We are very excited to have Sebastien join the team here at Synaptics, and we recently caught up with him to learn a bit more about his new role at Synaptics and his vision for how biometrics technology will impact our lives in the future.

Here’s some of what he had to say…

1. What is your role at Synaptics?

I am the chief evangelist for Synaptics’ newly established Biometrics Product Division (BPD). If you’re wondering what that actually means – it means that I help to bridge the technology and products developed within Synaptics BPD to real-world use cases. I help explain, in everyday terms, how biometrics and fingerprint ID technology will impact our day-to-day lives and society as a whole. I do this through speaking engagements, interviews and more. It’s also my job to maintain a deep understanding of the biometrics industry and its progression, and to regularly feed this knowledge back into the BPD.

2. Can you give us an update on Synaptics’ integration of Validity?

Since the acquisition of Validity, Synaptics has seamlessly integrated Validity’s engineering team to accelerate innovation in our line of Natural ID fingerprint sensors. The team continues to seek ways to push the envelope for fingerprint ID. For example, the Holy Grail in this market is to place the fingerprint sensor beneath the active display or “glass” of a device’s touchscreen rather than in a discrete button. This has never been done before, but once it’s achieved, fingerprint ID will be that much more powerful and seamless for the user. With Validity as the expert in fingerprint ID and Synaptics as the expert in touchscreens, it’s the perfect coupling to achieve this major next step in mobile authentication.

3. What progress do you see happening in the biometrics industry in the next year?

This year will be big for kicking off the ascent of biometrics into the mainstream. We’re seeing a huge amount of interest from some of the world’s biggest brands. For example, take the FIDO Alliance, of which Synaptics is a founding member. The goal of the FIDO Alliance is to establish open, scalable and interoperable standards for mobile online authentication. In the one year since FIDO was established we’ve seen membership skyrocket from six members in 2013 to more than 100 members in 2014, including brands like MasterCard, RSA and Bank of America. We’ll continue to see increased standardization and regulation, as well as increased access to open APIs – measures that will continue to fuel innovation and exciting new biometric use cases in the months to come.

4. Beyond unlocking phones, how will fingerprint ID technology impact lives in the future?

In the future, fingerprint ID will become more invisible, but it will be there in the background, connecting us to our surroundings in ways we haven’t even imagined yet, especially as more of the environment around us becomes interconnected through sensors. The online experience will become hyper customized – an experience I refer to as “the Internet of me.” While active authentication will still be necessary for things like mobile commerce transactions, it’s the emergence of passive (i.e. invisible) authentication where things really start to look exciting! For example, imagine renting a car: as you touch the door, the seat position, in-cabin temperature and even the radio are automatically set to your preferences before you get inside. Once the car connects to your smartphone, your itinerary is automatically uploaded to the navigation system. This is hyper customization of the connected experience; this is the Internet of me.

5. So tell us, can someone chop off our finger to gain access to our data?

This is a funny question that I hear all the time. I recently wrote a blog post that provides an in-depth answer, but the short answer is no. The fingerprint sensor technology is built in a way that the fingerprint image has to be taken from a live finger. As I’ve said before, if someone is compelled to chop off your finger to access your smartphone, you likely have bigger problems!

If you’re interested in catching up with Sebastien and hearing more, he’ll be sharing his thoughts at upcoming industry events.

Bringing Hover Tech to Next-Gen Notebooks

Over the past year, we’ve brought several new solutions to market that are changing the way users interact with their devices. That innovation-driven mentality keeps our User Experience and Concept Prototyping teams looking ahead for the next big trend in human interface technology.

We’ve already seen the successful implementation of our 3D Touch solution, which debuted in last year’s Samsung Galaxy S4 and Galaxy Note 3 devices – and the response from users was incredible. As a result, we believe that hover technology can also make a big impact in the notebook PC space.

In fact, one of our most popular demos at CES last month was our proximity hover prototype laptop – otherwise known as our “Orzo” demo. As part of the demo, our team of engineers developed a notebook with our proximity hover technology built into an existing touchpad. The idea was to showcase how a hover-enabled touchpad can accurately detect a finger from a distance of as much as 3-4 cm, allowing for a host of new ways for users to interact with and control their notebooks. This also presents new opportunities in Windows 8 gestures, gaming and waking the PC from sleep mode – all without touching it.

Check out the video below for a quick demo of our proximity hover technology to see for yourself – as always, feel free to let us know what you think!

That’s a Wrap: Synaptics Takes Human Interface Tech to New Heights at CES 2014

January is known for many things: the start of a new year, winter weather and, for those of us in the technology industry, the start of CES! Each year, thousands of companies descend on Las Vegas for a whirlwind week where new gadgets, technologies and trends take center stage at what is widely considered the Super Bowl of the consumer electronics industry – and this year was no different.

It was great to see that human interface solutions were a huge trend at CES this year, with many companies showcasing a variety of different ways for consumers to interact with devices. As a company at the forefront of a bigger industry shift in human interface technology, we get excited to see how other companies are taking their solutions to the next level. Our partner Tobii, for example, had a really great demo of its eye-tracking solution that could pinpoint exactly where you were looking, down to the centimeter, allowing users to zoom in on maps, scroll through a browser and even change the gaming experience. The Verge has a really great round-up of some of its favorite human interface demos from the show.

2013 was clearly a big year for Synaptics – from the “touch first” computing experience to fingerprint authentication and gesture control, we’ve continued to redefine the human interface space. This year, we leveraged CES to showcase all of our industry-leading solutions that hit the market over the past year, including ForcePad, ClickPad 2.0, 3D Touch and Active Pen. These technologies represent new user experiences that challenge the designs of previous touch solutions – all of which are available to consumers today.

While 2013 was the year of industry firsts, 2014 will likely be the year of new innovations for the human interface market. Many of these solutions were on display in our booth, including new practical uses for our ThinTouch technology, 1mm-tip stylus support for smartphones, demos of new implementations of our biometrics and fingerprint ID solutions, and a new prototype showcasing proximity hover technology, which allows users to control their notebooks without having to physically touch the device.

Here are just a few shots of our booth to give you a sense of what was on display.

[Photos: Synaptics booth at CES 2014]

The spirit of CES has always been about what’s on the horizon, and we feel the same way about our solutions. The technologies demonstrated at our booth are only a preview of what consumers can expect to see in upcoming devices, and we look forward to working with our OEM partners to bring human interface innovations to new heights this year.

 

Putting Users at the Heart of Design – Taking A Closer Look at the Synaptics User Experience Design Team

Over the years, Synaptics has developed many new human interface technologies. These technologies provide new and powerful ways for users to interact with their devices – from our 3D Touch™ finger hover technology to the Synaptics ForcePad® with pressure sensing. Oftentimes, these technologies get their start in the labs of Synaptics’ User Experience Design team. It’s here that we take amazing technology and transform it into a great end-user experience.

The User Experience Design team brings together a diverse set of backgrounds and skill sets, spanning mechanical engineering, computer science, cognitive science and even philosophy. We work together to study user behavior, evaluate needs and problems, and put the user at the heart of our design decisions. By looking at Synaptics’ technical solutions from the perspective of the user, we can uncover new human interface implementations that will make our device experiences easier, more fluid and more exciting for all of us.

Driving innovation and new features

Whenever Synaptics develops new technology, like ForcePad or ThinTouch®, the User Experience team is part of the process early on. We work with engineering to shape the technology so that it solves user needs and provides a great user experience. We develop new interactions that take advantage of the technology, which drives innovation on our technology and creates new features for users. This process helps push the technical solution forward as engineering iteratively improves on the technology to meet the high standard of a great user experience.

Creating products that are more useful and desirable

When we are working on a product, whether the technology is new or established, the User Experience team conducts research and testing to make the product useful and desirable. We study existing usage behaviors and actions to learn where our technology will be most beneficial, and we evaluate the usability and user experience of specific features in laboratory studies to make them more desirable and engaging.

Supporting our customers & partners by collaborating on UX issues

Our team frequently collaborates with partners and customers to create new user experiences that extend beyond the touch input device. We work with User Experience and Usability teams at these companies to share knowledge and build new user experience solutions. Often, customers come to us hoping to create a completely novel user experience that builds on our expertise with touch devices, and fits into their technology.

Creating the best user experiences & building our product brand through user experience

Ultimately, our goal is to create the best possible user experiences for touch and input devices. We are always looking at our existing technologies to see how we can make them work better to solve past issues, and we stay up to date on new input technologies; even those that are not directly related to Synaptics technology. Our work always puts the user experience first, and we are the internal advocate for user needs at Synaptics. Through our research and design work, we strive to perfect user interactions.

As far as some of the exciting areas we’re exploring now, look for 3D gestures, gaze tracking, active pen inputs and fingerprint ID technologies, among others, to revolutionize how we interact with our devices very soon. Keep an eye on the Synaptics blog for more updates from our team!

The Synaptics UXD team (from bottom left, clockwise): Mohamed Sheik-Nainar, Justin Mockler, Anna Ostberg, Dan Odell, Eric Faggin



Synaptics Engineers Amaze at Second Annual Hackathon

By: Justin Mockler, Usability Engineer

Last month, Synaptics held its second annual Hackathon at company headquarters in San Jose and remotely in other regional offices. For those who don’t already know, a Hackathon is a day-long contest in which contestants work alone or in teams to make the tech projects they’ve dreamt about become reality.

Participants present their concepts and short demos to a panel of judges for prizes and recognition. Not only is it a great time for our engineers, but also a way to foster camaraderie and give people an opportunity to collaborate with co-workers they normally wouldn’t work with.

Each region had its own judging panel that awarded local prizes – $4,500 in cash in total, along with a trophy and medals for each regional winner.

In San Jose, the Hackathon saw 36 participants on 14 teams, including two visitors from the Rochester office. Worldwide, 88 people on 28 different teams took part. Those participants explored and utilized many Synaptics technologies and products, including 3D Touch hover technology (aka AirView), ClearPad capacitive touchscreens and the pressure-sensitive ForcePad, to generate their ideas and demos. Biometrics and eye-tracking implementations also made an appearance at this year’s Hackathon.

Of particular interest in San Jose were the prize-winning projects: one, which won the region’s grand prize, used global hover gestures for innovative settings control; another boasted a custom algorithm that enabled multi-stylus control on a phone.

Overall, it was a great event that built on the success of last year. If you’ve ever participated in a Hackathon, we’d love to hear some of your ideas for what we can do next time.


Access Granted: Synaptics Steps Into The Biometrics Market With Validity Acquisition

There’s no question biometrics is emerging as a major area of interest for the consumer electronics market. While technologies like fingerprint sensors were once found only guarding the nuclear reactors of James Bond movies, experts now believe that consumers can expect to see more biometrics technology incorporated into everyday electronic devices in the coming months. Gone are the days of entering your passcode every time you pick up a phone or pull out your credit card to make an online purchase.

At Synaptics, we are always looking for the next big trend that will dictate how consumers will interact with devices. For quite some time, we’ve been exploring biometrics, and today we’re thrilled to announce our intent to purchase Validity Sensors, Inc., a leading provider of biometric fingerprint authentication solutions for tablets, smartphones and notebook PCs. With its industry-leading technology, solid corporate strategy, strong engineering culture and a customer base that mirrors ours, Validity is a perfect fit for Synaptics, allowing us to enter the thriving biometrics market and grow our leadership as the go-to company for human interface technology innovation.

Upon closing this acquisition, Synaptics will continue to offer and support existing fingerprint detection solutions from Validity, but will also look for new and creative ways to integrate biometrics into our existing portfolio of touchscreen and touchpad-related solutions. We see many unique implementations ahead, from mobile payment transaction solutions, to numerous cloud-based services and more.

In addition, we’re excited to announce the creation of an all-new division within Synaptics that is focused exclusively on biometrics. The Biometrics Product Group, led by our CTO Stan Swearingen, will welcome Validity’s world-class engineering team. According to Swearingen, “This new engineering group will be organized to leverage Validity’s world-class innovation and Synaptics’ in-depth, system-level expertise and will be focused on driving biometrics technology for mass adoption.”

At the end of the day, what gets us really excited is having the opportunity to bring on such a talented group of individuals into our organization. Biometrics and touch technologies go hand in hand, and we’re eager to deliver new innovations that these two teams will develop together. As you can tell, we’re thrilled about what the future holds for biometrics, but we want to hear from you. What types of biometric implementations are you excited to see?


You Are What You Wear: How Wearable Tech Will Evolve to Become Your Sixth Sense

In the future, wearable technology will become the new sixth sense, giving people seamless access to data on the fly. Some say a sixth sense is the power of perception beyond the five senses, but in fact, wearable tech creates the opportunity to combine the senses and many modes of perceptual computing into a unified device that can truly be an extension of ourselves.

The wearable tech trend is driven by consumers’ desire to simplify their lives by storing and analyzing data to better interpret the world around them. It’s the same concept behind Google’s use of the massive amounts of data it collects from its users, or even the basic concept of the Nike FuelBand, which lets users track how far they run so they can analyze their performance more easily and ultimately train harder.

However, the key in this transformation is usability, and consumer electronic devices will need to evolve to provide the symbiotic experience wearable tech can offer. Today, there are several innovations in human interface technology that are making this trend a reality – soon, consumers can expect to interact with their devices in new and unexpected ways, including:

Functionality in Harsh Environments:
We’ll see wearable computing devices that can weather the elements and survive in multiple environments – waterproof, sweat-proof, even capable of operating in extreme temperatures. You’re already seeing Synaptics-powered smartphones on the market with extremely sensitive touchscreens that can be used while wearing gloves or even while dunked under water, and this same type of technology will make its way into wearable computing devices. Imagine being able to use the touchscreen on your smartwatch while swimming laps, or activate your Google Glass without having to take off your gloves while skiing.

A New Level of Touchscreen Precision:
These smaller touchscreens will also have to deliver a new level of responsiveness, capable of distinguishing random interference from intended user input. Gesture control, for example, will surface in wearable computing devices and will require the ability to differentiate intended input from unintended noise. We’ve already begun to see successful gesture capabilities appear in smartphones – the recent Air View gesture technology found in the Samsung Galaxy S4 smartphone is powered by Synaptics’ 3D Touch technology.

Curved and/or Bendable Screens:
Wearable devices, like watches, will need curved screens to conform to users’ arms or other body parts. Curved lenses can also widen viewing angles and reduce glare when these devices are used outdoors or in direct sunlight. Additionally, curved lenses can make scratches less noticeable, as these smart devices could scratch even more easily than today’s mobile phones.

Enhanced Sensory Feedback:
Touchscreens are ubiquitous these days, but in the next few years we’ll see them evolve to combine touch with other senses and interactions, including voice, sight and gesture. For example, haptic technology provides the touchscreen user with tactile feedback through force, vibration or motion. So in a few years, you might be able to experience the sensation of touching multiple surfaces through your glass touchscreen. Synaptics is currently working with a company called Tactus to build touchscreens with a special fluid-filled layer that can rise to give users the actual sensation of keyboard keys and potentially other types of surfaces. Imagine the implications haptic feedback could have for the visually impaired, who are currently unable to use touchscreens.

As the leader in touch and human interface technology, Synaptics is looking more closely at all of these possibilities for the future and working with an exciting array of partners in other human interface areas from haptics to eye tracking, voice recognition, video input, health monitoring and even biometrics, to innovate and transform the way people interact with technology.

We still believe touch will continue to be the de facto human to computer input mechanism, but when paired with the exciting array of advanced human interface technologies being developed, we can expect to see amazing new user experiences ahead for consumers.


My Afternoon Talking About Touch

By: Dr. Andrew Hsu

Every day, alongside my concept prototyping team, I live and breathe research on touch technology. One of the more enjoyable aspects of my job at Synaptics is participating in panel sessions where I get to share my opinions on the current and future state of human interfaces.

Recently, I was invited to participate in a panel discussion hosted by BTIG on the future of touch technology. Moderated by Cathal Phelan, former CTO of Cypress, the panel included Owen Drumm, CTO of RAPT; Daniel Gelbtuch, CFO of Neonode; and Rob Petcavich, CTO of UniPixel.

Over the 90-minute session, my fellow panelists and I covered a lot of ground, responding to questions ranging from the relevance of stylus input to whether voice would ever replace touch, and even whether there would ever be a standard screen size. There was also a lively discussion on whether there was any innovation left in touch, or whether touch was simply becoming a race to lower costs.

Early in the session, I broke down Synaptics’ technical expertise into three informal silos: cost innovation, performance innovation for our existing products, and advanced technology and partnerships. Synaptics’ market leadership is based on more than just servicing our OEM customers, product innovation and excellent delivery execution. Our unique focus on research and innovation in advanced human interface trends is what gives our technology the Synaptics touch. With human-machine interfaces and touch technology gaining prominence in the design of electronic devices, it’s quite a treat to educate and evangelize others on all the hard work done at Synaptics to make device interaction as widespread and as intuitive as possible.

It was also educational for me to learn how non-capacitive touch players like Neonode and RAPT view the market. Overall, the panel responses seemed to validate Synaptics’ overall technical strategy for remaining relevant in the area of human interfaces.

I’m always thrilled to be a part of such a dynamic discussion about the future of our field – and, of course, it’s always nice to see that gorgeous view of San Francisco from the 48th floor of the Transamerica Pyramid!