The University of Southampton
Health Sciences: Our research

Research project: A Tongue Movement Command and Control System Based on Aural Flow Monitoring

Currently Active: 
Yes

"This project uses a device that monitors air pressure in the outer ear to detect tongue movements, which are used to control a communication and environmental control system."

Project Overview

Although there is a well-recognized need in society for effective tools which will enable the physically impaired to be more independent and productive, existing technology still fails to meet many of their needs.

In particular, nearly all mechanisms designed for human control of peripheral devices require users to generate the input signal through bodily movements, most often with their hands, arms, legs, or feet. Such devices clearly exclude individuals with limited limb control. Spinal cord injuries, repetitive strain injuries, severe arthritis, loss of motion due to stroke, and central nervous system (CNS) disorders all represent examples of these impairments.

The aim of this project is to develop a unique tongue-movement communication and control strategy in a stand-alone device that can be used with all manner of common household devices, such as lights, televisions and computers. The technology is based on detecting specific tongue movements by monitoring air pressure in the outer ear, and then providing control instructions corresponding to that tongue movement. The aim is to enable patients with quadriplegia, arthritis, limited movement due to stroke, or other conditions causing limited or painful hand/arm movement, to interface with their environment for better independence and quality of life.

Tongue movement is detected unobtrusively by monitoring air pressure in the human outer ear, and control instructions corresponding to each movement are then issued. This multidisciplinary project will establish new paradigms in research and implementation in:

Physiological Signal Processing: The core of this research focuses on developing signal filtering and pattern recognition algorithms to correlate air flow in the ear canal with tongue movement. Novel signal processing and pattern recognition algorithms are being developed to monitor and identify, in real-time, tongue movement through observation of pressure signatures in the outer ear.
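The correlation of ear-canal pressure signatures with tongue movements can be illustrated with a minimal sketch. The project's actual algorithms are not published here, so the following template-matching classifier is purely illustrative: the detrending step, the templates, and the movement labels are all assumptions, not the project's method.

```python
# Illustrative sketch only: classify a tongue-movement pressure trace by
# normalised correlation against per-class templates. All names, waveforms
# and the detrending step are assumptions for demonstration.

def detrend(signal):
    """Remove the mean so slow pressure drift does not dominate."""
    mean = sum(signal) / len(signal)
    return [s - mean for s in signal]

def correlation(a, b):
    """Normalised dot product between two equal-length traces."""
    num = sum(x * y for x, y in zip(a, b))
    den = (sum(x * x for x in a) * sum(y * y for y in b)) ** 0.5
    return num / den if den else 0.0

def classify(trace, templates):
    """Return the label of the template best matching the trace."""
    trace = detrend(trace)
    scores = {label: correlation(trace, detrend(t))
              for label, t in templates.items()}
    return max(scores, key=scores.get)

# Toy templates for two hypothetical tongue movements.
templates = {
    "flick-left":  [0, 1, 2, 1, 0, -1, -2, -1],
    "flick-right": [0, -1, -2, -1, 0, 1, 2, 1],
}
observed = [0.1, 1.1, 1.9, 0.9, 0.0, -1.0, -2.1, -0.9]
print(classify(observed, templates))  # prints "flick-left"
```

A real system would replace the toy templates with features learned from recorded pressure signatures, as the project's wavelet-packet publications below suggest.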

Self-Tuning Algorithms: In addition to pattern recognition algorithms, we will develop self-tuning user-training programs to adaptively calibrate themselves for new users.
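Adaptive calibration for a new user can be sketched in one line of update logic. The exponential-moving-average scheme and the rate parameter alpha below are assumptions for illustration, not the project's published approach.

```python
# Illustrative sketch only: after each command the user confirms, nudge the
# stored template toward the new trace so the system adapts to that user.
# The blending rate alpha is an assumed parameter.

def update_template(template, new_trace, alpha=0.2):
    """Blend the latest confirmed trace into the stored template."""
    return [(1 - alpha) * t + alpha * x for t, x in zip(template, new_trace)]

template = [0.0, 1.0, 2.0, 1.0]
confirmed = [0.0, 1.5, 2.5, 1.5]   # this user's slightly larger waveform
template = update_template(template, confirmed)
print([round(v, 3) for v in template])  # prints [0.0, 1.1, 2.1, 1.1]
```

Repeating this update over a short training session would gradually specialise a generic template to an individual user's pressure signatures.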

Human-Machine Interface: The project will produce a human-machine interface system based on aural flow monitoring.
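The interface layer, once a movement is recognised, reduces to routing labels to device commands. A minimal dispatch-table sketch is shown below; the device names and actions are illustrative assumptions, not part of the project.

```python
# Illustrative sketch only: map recognised tongue-movement labels to
# household-device commands. Devices and actions are assumed examples.

COMMANDS = {
    "flick-left":  ("lights", "toggle"),
    "flick-right": ("television", "channel_up"),
    "double-tap":  ("computer", "click"),
}

def dispatch(label):
    """Translate a recognised movement into a (device, action) pair."""
    return COMMANDS.get(label, ("none", "ignore"))

print(dispatch("flick-left"))  # prints ('lights', 'toggle')
```

Unrecognised labels fall through to a no-op, which matters for safety in an assistive device: a misclassified movement should do nothing rather than something unintended.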

This work aims to improve independence and quality of life for a substantial patient population.

Project team

Vaidyanathan R, Lutman M, Stokes M

Project funder

Engineering & Physical Sciences Research Council (EPSRC)

Associated research themes

Health Technologies
Communication

Related research groups

Active Living and Rehabilitation

Conferences and events associated with this project:

Mace, M., Mamun, K., Vaidyanathan, R., Wang, S. and Gupta, L. (2010) Real-time implementation of a non-invasive tongue-based human-robot interface. In, 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). New York, US, IEEE, 5486. (doi:10.1109/IROS.2010.5648834)

Mace, M., Mamun, K., Wang, S. and Gupta, L. (2010) Human-machine interface for tele-robotic operation using tongue movement ear pressure (TMEP) signals. In, 11th Conference Towards Autonomous Robotic Systems (TAROS'2010), Plymouth, GB, 6pp.

Mamun, K., Mace, M., Craig, R., Lutman, M.E., Vaidyanathan, R. and Wang, S. (2009) Tongue movement ear pressure signal classification using wavelet packet transform. In, British Society of Audiology Short Papers Meeting on Experimental Studies of Hearing and Deafness, Southampton, UK, 17 - 18 Sep 2009.

Mamun, K., Mace, M., Lutman, M.E., Vaidyanathan, R. and Wang, S. (2009) Pattern classification of tongue movement ear pressure signal based on wavelet packet feature extraction. In, 5th IEEE EMBS UK & Republic of Ireland Postgraduate Conference on Biomedical Engineering and Medical Physics, Oxford, UK, 12 - 14 Jul 2009, 33-34.

Mamun, K., Mace, M., Lutman, M.E., Vaidyanathan, R. and Wang, S. (2009) Bayesian classification of tongue movement based on wavelet packet transformation. At INSPIRE, London, UK, 21 - 24 Sep 2009.

Mamun, K., Banik, M., Mace, M., Lutman, M.E., Vaidyanathan, R. and Wang, S. (2010) Multi-layer neural network classification of tongue movement ear pressure signal for human machine interface. In, 13th International Conference on Computer and Information Technology (ICCIT 2010), Dhaka, BD, 23 - 25 Dec 2010, 5pp.

Mamun, K.A., Mace, M., Lutman, M.E., Vaidyanathan, R., Gupta, L. and Wang, S. (2010) Multivariate Bayesian classification of tongue movement ear pressure signals based on the wavelet packet transform. In, 2010 IEEE International Workshop on Machine Learning for Signal Processing (MLSP). New York, US, IEEE, 208-213. (doi:10.1109/MLSP.2010.5589102)

