This PhD opportunity at Cranfield University invites candidates to explore the integration of AI into certification and lifecycle monitoring processes for safety-critical systems. The project covers areas such as AI-driven verification, predictive maintenance, and compliance assurance, aiming to enhance system reliability and safety. Situated within the IVHM Centre and supported by collaborations with industry partners including Boeing, Rolls-Royce, and Thales, as well as UKRI, this research offers a unique platform to contribute to the advancement of intelligent assurance methodologies in sectors such as aerospace, healthcare, and industrial automation.
In safety-critical domains such as aviation and medical devices, rigorous certification processes and continuous lifecycle monitoring are essential to ensure compliance and operational integrity. Applying AI in these areas enhances the ability to predict system behaviour, detect anomalies, and streamline certification workflows. AI-driven tools can analyse vast datasets to identify potential issues before they escalate, enabling proactive maintenance and compliance assurance. Integrating AI into certification and monitoring processes is transforming how safety and reliability are managed throughout a system's lifecycle.
This PhD project explores the application of AI to enhance certification processes and lifecycle monitoring of safety-critical systems. The research will focus on developing AI-powered verification tools, health monitoring algorithms, and compliance assurance techniques that ensure system reliability throughout the operational lifespan. A key aspect of the project will be the incorporation of communication security measures, specifically targeting resilience against jamming and spoofing attacks. Students will investigate how AI can streamline certification workflows and enable proactive maintenance, with applications in sectors such as aviation, healthcare, and industrial systems.
Research Focus Areas:
- AI-Powered Verification Tools: Develop AI algorithms that automate the verification process, ensuring systems meet required safety and performance standards.
- Health Monitoring Algorithms: Implement AI-based monitoring systems that continuously assess the health of components, predicting failures before they occur (a minimal illustrative sketch follows this list).
- Compliance Assurance Techniques: Design AI-driven methods to ensure ongoing compliance with industry regulations and standards throughout the system's lifecycle.
- Secure Communication Monitoring: Integrate AI techniques to monitor and enhance the security of communication channels, focusing on detecting and mitigating jamming and spoofing threats in real time.
- Integration of Trusted Execution Environments (TEEs): Investigate the use of TEEs to create secure zones within embedded systems, facilitating secure data processing and storage, and supporting compliance with stringent certification requirements.
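To make the health monitoring focus area concrete, the sketch below shows one possible data-driven approach: an unsupervised anomaly detector trained on nominal sensor data that flags deviating readings for further diagnosis. It is a minimal illustration only, assuming Python with scikit-learn; the signal names, fault values, and contamination setting are hypothetical and do not represent a project dataset or deliverable.

```python
# Minimal, illustrative sketch of data-driven health monitoring:
# an unsupervised anomaly detector flags sensor readings that deviate
# from nominal behaviour. All signals and values are hypothetical.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=0)

# Nominal training data: e.g. vibration RMS [g], temperature [degC], supply current [A]
nominal = rng.normal(loc=[0.5, 70.0, 2.0], scale=[0.05, 2.0, 0.1], size=(1000, 3))

# New observations to screen, including two injected fault signatures
incoming = np.vstack([
    rng.normal(loc=[0.5, 70.0, 2.0], scale=[0.05, 2.0, 0.1], size=(5, 3)),
    [[0.9, 95.0, 3.5],     # overheating with elevated vibration
     [0.1, 40.0, 0.2]],    # apparent sensor dropout / loss of load
])

detector = IsolationForest(contamination=0.01, random_state=0).fit(nominal)
labels = detector.predict(incoming)             # +1 = nominal, -1 = anomalous
scores = detector.decision_function(incoming)   # lower scores = more anomalous

for i, (label, score) in enumerate(zip(labels, scores)):
    status = "ANOMALY" if label == -1 else "nominal"
    print(f"sample {i}: {status} (score {score:+.3f})")
```

In the research itself, detectors of this kind would be paired with explainability and traceability evidence so that their outputs can support certification and compliance arguments rather than stand alone.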
Cranfield University offers a distinctive research environment renowned for its world-class programmes, cutting-edge facilities, and strong industry partnerships, attracting top-tier students and experts globally. As an internationally recognised leader in AI, embedded system design, and intelligent systems research, Cranfield fosters innovation through applied research, bridging academia and industry. Students will have access to state-of-the-art laboratories, hardware/software resources, and design facilities, supporting AI-powered electronics research.
This project will be conducted within Cranfield's Integrated Vehicle Health Management (IVHM) Centre, established in 2008 in collaboration with industry leaders such as Boeing, Rolls-Royce, BAE Systems, Meggitt, and Thales. The IVHM Centre is globally recognised for defining the subject area and continues to expand its research horizons. It plays a pivotal role in the £65 million Digital Aviation Research and Technology Centre (DARTeC), leading advancements in aircraft electrification, autonomous systems, and secure intelligent hardware. Through collaborations with the Aerospace Integration Research Centre (AIRC), Airbus, and Rolls-Royce, students gain industry exposure and further research opportunities.
Additionally, the IVHM Centre hosts Seretonix, a research group specialising in secure electronic design, AI-driven system resilience, and intelligent hardware security. Through the EUROPRACTICE partnership, the IVHM Centre provides access to advanced CAD tools, integrated circuit prototyping, and technical training, equipping students with cutting-edge skills.
To support hands-on experimentation and applied research, the IVHM Centre offers access to a suite of specialised facilities:
- UAV Fuel Rig with Five Degradation Faults: Simulates various degradation scenarios in unmanned aerial vehicle (UAV) fuel systems, enabling research into fault detection, isolation, and prognostics.
- Machine Fault Simulator for Rotating Machinery Faults: A versatile platform that replicates common faults in rotating machinery, such as imbalance and misalignment, facilitating the development and validation of diagnostic and prognostic algorithms.
- Electronic Prognostics Systems: Facilities equipped to assess the health and predict the remaining useful life of electronic components, supporting studies in electronic system reliability and maintenance strategies.
- Filter Rig: An experimental setup to study filter clogging phenomena, allowing for the collection of data to develop and validate prognostic models for filter degradation (see the prognostics sketch following this section).
- Integrated Drive Generator (IDG) Rig: Simulates the operation of an aircraft's IDG, used to investigate fault detection, diagnostics, and prognostics in power generation systems.
- Auxiliary Power Unit (APU) Rig: Replicates the functions of an aircraft's APU, enabling research into fault detection, diagnostics, and health management of auxiliary power systems.
- Boeing 737-400 Aircraft Instrumentation and Environmental Control Systems (AID, ECS): A full-scale Boeing 737-400 aircraft equipped with instrumentation for studying environmental control systems and other onboard systems, providing a realistic environment for research and training.
- SIU 737-200 ECS: A ground-based Boeing 737-200 Environmental Control System used for simulating faults and studying system behaviour under various conditions, aiding in the development of diagnostic and prognostic techniques.
- Hawk ECS: An Environmental Control System from a BAE Systems Hawk aircraft, used for research into thermal management and system health monitoring, supporting studies in military aircraft systems.
Engaging with these facilities allows students to acquire practical skills and technical expertise, enhancing their research capabilities and employability in the field of intelligent systems and AI-integrated electronics.
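As an illustration of the kind of prognostic modelling these rigs support, the sketch below fits a simple degradation trend to a monitored health indicator and extrapolates it to a failure threshold to estimate remaining useful life. It is a minimal sketch assuming Python with NumPy; the pressure-drop data, threshold, and units are hypothetical, and studies on the filter or electronic prognostics rigs would use measured data and richer models.

```python
# Minimal, illustrative prognostics sketch: fit a degradation trend to a
# monitored health indicator (e.g. filter differential pressure) and
# extrapolate to a failure threshold to estimate remaining useful life (RUL).
# The data, threshold, and units are hypothetical.
import numpy as np

# Simulated health-indicator history: pressure drop [kPa] over operating hours
hours = np.arange(0, 200, 10, dtype=float)
rng = np.random.default_rng(seed=1)
pressure_drop = 5.0 + 0.04 * hours + rng.normal(0.0, 0.3, hours.size)

FAILURE_THRESHOLD_KPA = 20.0  # assumed limit at which the filter is deemed clogged

# First-order trend fit; richer models (exponential growth, particle filters,
# Bayesian updating) are the kind explored in the actual research.
slope, intercept = np.polyfit(hours, pressure_drop, deg=1)

if slope <= 0:
    print("No upward degradation trend; RUL cannot be estimated from this fit.")
else:
    hours_at_threshold = (FAILURE_THRESHOLD_KPA - intercept) / slope
    rul = max(hours_at_threshold - hours[-1], 0.0)
    print(f"Projected threshold crossing at {hours_at_threshold:.0f} h; "
          f"remaining useful life approx. {rul:.0f} h")
```

The same extrapolate-to-threshold idea underlies prognostics work on the fuel, IDG, and APU rigs, where the health indicator and failure criterion change but the lifecycle monitoring logic is shared.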
This project focuses on integrating AI into the certification processes and lifecycle monitoring of safety-critical systems. Research will involve developing AI-powered verification tools and health monitoring systems that ensure compliance with regulatory standards throughout a system's operational life. The project will also address secure communication protocols to safeguard against jamming and spoofing, enhancing the trustworthiness of these systems. Expected outcomes include streamlined certification processes, improved system reliability, and reduced downtime, benefiting industries such as aviation, automotive, and medical devices. By aligning with the increasing emphasis on safety and compliance in technology deployment, this research equips students with the skills to influence policy and practice in high-stakes environments.
Bridging AI innovation with real-world assurance, this PhD focuses on designing verifiable and certifiable AI-embedded systems for safety-critical applications. You’ll collaborate closely with industrial and regulatory partners in aerospace, automotive, and digital infrastructure, participating in audits, standards evaluations, and compliance testing. The project encourages international dissemination and engagement, with funded opportunities to present at conferences like SAFECOMP, DATE, and DSN. Training in AI explainability, lifecycle analysis, and compliance frameworks (e.g., ISO 26262, DO-254) will be central to your development, ensuring you graduate ready to drive change in AI-powered system certification and governance.
Through this PhD, students will master the intersection of AI, verification, and regulatory compliance, gaining a rare combination of skills in intelligent system validation, safety assurance, and lifecycle analytics. They will also develop transferable abilities in technical reporting, stakeholder communication, standardisation processes, and risk management. These proficiencies open pathways to leadership roles in AI engineering for regulated industries, including aerospace certification, automotive compliance, and digital health systems. The project’s emphasis on explainability, traceability, and system integrity positions graduates as key contributors to the future of responsible AI deployment.
At a glance
- Application deadline: 25 Mar 2026
- Award type(s): PhD
- Start date: 01 Jun 2026
- Duration of award: 3 years full-time
- Eligibility: UK, Rest of world, EU
- Reference number: SATM594
Entry requirements
Applicants should have a first or second class UK honours degree or equivalent in a related discipline. This project would suit individuals with a background in electronic engineering, systems engineering, software engineering, data science, or related fields. It is particularly well-suited to those with an interest in AI verification, safety assurance, and regulatory compliance for safety-critical systems. Applicants with industrial or academic experience in aerospace, automotive, healthcare, or critical infrastructure are encouraged to apply. Familiarity with Python, MATLAB, or system modelling tools is advantageous. Most importantly, candidates should be driven by a desire to advance trustworthy and certifiable AI technologies.
Funding
Self funded.
Cranfield Doctoral Network
Research students at Cranfield benefit from being part of a dynamic, focused and professional study environment and all become valued members of the Cranfield Doctoral Network. This network brings together both research students and staff, providing a platform for our researchers to share ideas and collaborate in a multi-disciplinary environment. It aims to encourage an effective and vibrant research culture, founded upon the diversity of activities and knowledge. A tailored programme of seminars and events, alongside our Doctoral Researchers Core Development programme (transferable skills training), provide those studying a research degree with a wealth of social and networking opportunities.
How to apply
For further information please contact:
Name: Dr Mohammad Samie
Email: m.samie@cranfield.ac.uk
Phone: +44 (0) 1234 758571
If you are eligible to apply for this studentship, please complete the online application form.