Business & Technology Insight Forums Cambridge 2017

5, 6, 7 December 2017
Cambridge, UK

Wolfson College, Cambridge

Next-Generation Human-Machine Interface Technologies

Rapid technological development in areas such as consumer electronics, automotive, robotics, home automation and wearable devices has greatly changed the way we interact with machines and computers. The human-machine interface began as a "computer interface", as early computers were not interactive.

But gradually, "human-machine interface" became "human-machine interaction". Adding interactive features to the human-machine interface was the first revolution of HMI. Over the past several decades, human-machine interfaces have transitioned from the earliest text user interface (TUI) / command-line interface, to the graphical user interface (GUI), and are now on the way to the natural user interface (NUI), with natural, intuitive, immersive and intelligent beyond-touch methods and techniques. The natural user interface is the second revolution of HMI, which we are experiencing at the moment.

Artificial intelligence (AI) plays an important role in the development of human-machine interactions, especially via machine learning and deep learning. AI is evolving from handcrafted-knowledge-based reasoning to statistical learning on big data. In the near future, AI will also gain the capability of contextual adaptation. This masterclass will also introduce how machine learning works and what role AI plays in human-machine interactions.

All the technologies within the human-machine interaction category are undergoing revolutions. These include peripherals such as keyboards and game console controllers, touch screens, and beyond-touch modalities such as voice, speech and conversational interactions, gesture, vision, force sensing, haptics, perceptive computing and many others. Competitive markets are pushing device and software players to innovate rapidly. Next-generation user interfaces are a core feature enabling them to differentiate themselves from traditional players and offer customers amazing user experiences.

Technologies covered in this masterclass include:

Touch-based interactions

The first "touch display" was reported by E.A. Johnson in 1965, using a cathode-ray-tube (CRT) monitor with a capacitive touch overlay. Over the past decade, touch-screen technologies have been widely deployed to offer an intuitive user experience. Many different technologies can detect touch inputs on a display, including capacitive, resistive, acoustic and optical techniques. The working principles of touch sensing technologies will be explained, and their advantages, disadvantages, system architectures and integration approaches will be compared.

In addition, on-cell and in-cell integration (embedded touch) and the corresponding changes to the value chain will be analysed.

Voice/speech-based interactions

A speech-based interface provides seamless interactions between people and devices. For instance, "Give me directions to LAX" will display the map and driving directions to Los Angeles International Airport, whereas with traditional interfaces users have to switch between multiple command windows and type in text inputs to get the same results. (Automatic) speech recognition, a key technology in voice interfaces, has been in development for decades, but only recently has its accuracy improved to an applicable stage. Natural language understanding and speech synthesis further expand the usage of speech-based HMI.

Vision-based interactions

Vision plays a dominant role in human interactions with the world. Displays are already primary interfaces in our daily lives, especially with the integration of touch screens that turns them into two-way interaction devices. 3D vision and gesture interactions are making displays even more powerful.


Haptic interactions

This session will focus mainly on emerging haptic technologies through case studies. The corresponding applications and players will be introduced as well.

Multimodal interactions

It will be difficult to find another single dominant human-machine interaction technology after touch. Natural and intuitive human-machine interactions need to be multimodal, combining gesture, speech, eye gaze, facial expressions, emotions and touch. Besides the above-mentioned input modalities, emerging brain-computer interface technologies and new keyboard technologies will also be briefly discussed.

This masterclass will guide you to explore the next big opportunity in human-machine interactions that can be applied in consumer electronics, automotive, robotics, home automation, wearables, etc. It will cover explanations and assessments of disruptive technologies, as well as case studies, vertical application market status and analysis, value chains, opportunities and challenges. It is designed for those who want to gain an in-depth understanding of disruptive technologies in user interfaces, who need to expand their existing business areas, and who would like to see the big picture in order to assess the markets, challenges and opportunities.

For those attending:

  • The PDF files of the presentations will be available to download from this website after the sessions end.

Register for the Forum

Exhibitor Opportunities

Report Package

Forum 5 / Agenda

7 December / Morning

Please join IDTechEx Analysts for lunch and networking from 13:00-14:00

  • 08:30 - 09:10
  • Registration
  • 09:10 - 09:20

Introduction to human-machine interface and human-machine interaction

Compiled by Dr Xiaoxi He, Senior Technology Analyst, IDTechEx.

  • 09:20 - 09:35

Introduction to artificial intelligence and its role in human-machine interactions

Compiled by Dr Xiaoxi He, Senior Technology Analyst, IDTechEx.

  • 09:35 - 10:35

Touch sensing

Compiled by Dr Xiaoxi He, Senior Technology Analyst, IDTechEx.

  • Introduction to touch technologies
  • Classifications by sensing capabilities
  • Detailed explanation of touch technologies (resistive, capacitive, acoustic, optical) and comparison
    • Resistive touch: analog resistive
    • Capacitive touch: surface and projected capacitive touch, self and mutual capacitive touch
    • Acoustic touch: surface acoustic wave
    • Optical touch: infrared, camera-based optical, vision-based optical
  • Embedded touch technologies and change of value chain
  • Force sensing
  • The future of touch technologies
  • ITO alternatives
  • 10:35 - 10:55
  • Networking Break
  • 10:55 - 11:25

Voice/speech-based interactions

Compiled by Dr Xiaoxi He, Senior Technology Analyst, IDTechEx.

  • Introduction to voice user interface
  • Enabling technologies
    • Speech recognition
    • Speech synthesis
    • Natural language understanding
  • Hardware improvement
  • Market analysis
    • The value chain
    • Revenue model
    • Important players
  • Applications
  • 11:25 - 11:50

Vision and gesture interactions

Compiled by Dr Xiaoxi He, Senior Technology Analyst, IDTechEx.

  • Introduction
  • Vision-based gesture recognition / control
  • 3D sensing and imaging
  • Eye gaze and tracking
  • 11:50 - 12:10


Haptic interactions

Compiled by Dr Xiaoxi He, Senior Technology Analyst, IDTechEx.

  • Introduction to haptics
  • Haptic technologies and working principles
  • Case studies, applications and players
  • 12:10 - 12:20

New keyboard technologies

Compiled by Dr Xiaoxi He, Senior Technology Analyst, IDTechEx.

  • Introduction to traditional keyboards
  • Flexible keyboard
  • Projection keyboard
  • Dynamic keyboard
  • Capacitive keyboard
  • 12:20 - 12:30

Multimodal inputs and conclusions

Compiled by Dr Xiaoxi He, Senior Technology Analyst, IDTechEx.

  • 12:30 - 13:00
  • Questions and Answers
    followed by networking and lunch

Timings and the agenda are subject to change

About IDTechEx

Since 1999 IDTechEx has provided independent market research, business intelligence and events on emerging technologies to clients in over 80 countries. Our clients use our insights to help make strategic business decisions and grow their organizations. IDTechEx is headquartered in Cambridge, UK with additional offices in Boston, USA; Berlin, Germany; Tokyo, Japan; Seoul, Korea.
