The University of Southampton

On Gait and Biometrics for Surveillance seminar by Professor Mark Nixon on Friday 24 January in Malaysia

Published: 19 December 2013

One of the UK's leading pioneers in biometrics will deliver a guest seminar at the University of Southampton Malaysia Campus on Friday, 24 January, 2014. Biometrics involves the identification of humans by their characteristics or traits including the way we walk and the shape of our ears, both of which Professor Nixon believes can be used as part of security surveillance and building access control.

Mark Nixon, Professor in Computer Vision at the University of Southampton, and his team are part of the University's Academic Centre of Excellence in Cyber Security Research, as designated by the UK government's intelligence agency GCHQ and the Department for Business, Innovation and Skills. The team is actively developing new techniques for static and moving shape extraction, which have found applications in automatic face recognition, automatic gait recognition and medical image analysis. The team worked on face recognition from its early days, later pioneered gait recognition, and more recently joined the pioneers of ear biometrics.

Register your interest in attending this event via our contact form.

In particular, Professor Nixon's work in tracking a person's gait has led to the development of the country's first biometric tunnel, on the University's campus in Southampton. The tunnel contains 12 cameras that track and measure a person's gait as they walk along it, capturing images of the ears as well. The tunnel is brightly coloured to maximise the contrast between the subject's clothing and the background, making it easier for the researchers to capture a 3D image as the subject walks. Professor Nixon says that a person's gait and ears are as unique as their fingerprints, adding that they are harder to fake than more traditional forms of identification.

“The prime advantage of gait as a biometric is that it can be used for recognition at a distance, whereas other biometrics cannot,” says Professor Nixon, whose research interests are in image processing and computer vision. “There is a rich selection of approaches and many advances have been made, as will be reviewed in this talk.

“Soft biometrics is an emerging area of interest in biometrics, in which we augment computer-vision-derived measures with human descriptions,” Professor Nixon continues. “Applied to gait biometrics, this can again be used where other biometric data is obscured or at too low a resolution. The human descriptions are semantic: a set of labels which are converted into numbers.

“Naturally, there are considerations of language and psychology when the labels are collected,” he concludes. “After describing current progress in gait biometrics, this talk will describe how the soft biometric labels are collected, and how they can be used to enhance recognising people by the way they walk. As well as reinforcing biometrics, this approach might lead to a new procedure for collecting witness statements, and to the ability to retrieve subjects from video using witness statements.”

Professor Nixon has published over 400 papers in peer-reviewed journals, conference proceedings and technical books. His vision textbook with Alberto Aguado, Feature Extraction and Image Processing (Academic Press), reached its third edition in 2012 and has become a standard text in computer vision. With T. Tan and R. Chellappa he wrote the 2005 book Human Identification Based on Gait, part of the Springer Series on Biometrics.

Among Professor Nixon's research contracts, he was Principal Investigator with John Carter on the DARPA-supported project Automatic Gait Recognition for Human ID at a Distance; he previously worked on the EU FP7 Scovis project and currently works on the EU-funded Tabula Rasa project. He has chaired or programme-chaired BMVC 98, AVBPA 03, IEEE Face and Gesture FG06, ICPR 04, ICB 09 and IEEE BTAS 2010, and has given many invited talks.

