About the project
Autonomous systems such as self-driving cars, drones, autonomous boats, and unmanned underwater vehicles can help reduce costs and protect humans from hazardous situations.
Where close collaboration between multiple assets is required across different operational domains, ensuring the safety and security of these complex multi-component systems is essential if each asset is to complete its individual tasks successfully.
Secure communication, which can affect the safety of missions as well as other safety conditions, must be considered as early as possible.
Applications are invited for a fully funded PhD studentship to analyse and develop frameworks to ensure the trustworthiness of collaborative autonomous missions.
The work will involve applying analysis techniques, as well as verification and validation methods, to ensure the trustworthiness of autonomous systems and their conformance to security and safety requirements. This research will address challenges inherent to autonomy and intelligent systems, and will include:
- cooperative behaviour
- evolving missions
- collaborations with industrial partners
This work could lead to new methods and techniques applicable across domains that rely on autonomous systems and require safety and security guarantees.
Successful candidates will have a strong background in computer science, mathematics, physics, or engineering, with an interest in the interplay between software design and autonomous systems.
You will join and be supported by a dynamic, leading research team within the School of Electronics and Computer Science, with opportunities to collaborate with world-leading industrial partners. The University of Southampton is a member of the Russell Group and is ranked among the top 100 universities.